Show HN: Safe-NPM – only install packages that are +90 days old

90 points · posted 2 months ago
by kevinslin

68 Comments

sebmellen

2 months ago

"Here, install my new 1-day old NPM package that doesn't let you install packages younger than 90 days."

Pardon me, I couldn’t help myself :D

mort96

2 months ago

I get that it's a joke, but I feel the need to defend this project anyway.

The problem with NPM isn't any one young package. The problem with NPM is that any time you run 'npm install', you download potentially thousands of packages, and you get the most recent patch release from all of them. Installing one 1-day-old NPM package to forever avoid day 1 releases of thousands of packages seems like a worthwhile trade.

Still, I would maybe choose the tried and true PNPM instead, which supports this too.

moritzwarhier

2 months ago

> The problem with NPM isn't any one young package. The problem with NPM is that any time you run 'npm install', you download potentially thousands of packages, and you get the most recent patch release from all of them.

Isn't this simply wrong?

Last I checked, lock files work. They didn't for a long time, until a couple of years ago, as far as I know.

If you delete your lock file or explicitly run a package upgrade, sure, you get the latest versions compatible with your semver ranges.

> Installing one 1-day-old NPM package to forever avoid day 1 releases of thousands of packages seems like a worthwhile trade.

If you want to be extra sure, you can simply not use semver ranges in your package.json, or only for select packages.

As far as I know, this is recommended anyway.
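
For illustration, this is the difference between an exact pin and a range in package.json (package names and versions here are just placeholders):

  {
    "dependencies": {
      "left-pad": "1.3.0",
      "react": "^19.0.0"
    }
  }

"1.3.0" only ever resolves to that exact version; "^19.0.0" lets a fresh install pick up whatever newer 19.x exists at install time.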

chatmasta

2 months ago

Lockfiles work if you combine them with version pinning (exact version, no semver range), or always run `npm ci` unless you’re intentionally attempting to update your packages.

I’ve always preferred exact versions because I’d rather updates be opt-in than an opt-out footgun. Otherwise any new dev on the project might accidentally pull some new version of a package that satisfies the semver requirement but modifies the lockfile, then they’ll check it into the code, and it’s another thing to fix at review time… there’s just a lot less friction if you use exact versions. It makes hermetic/reproducible builds and static dependency analysis easier, too.
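
If you want that as the default, npm has a config option for it; with it set, `npm install <pkg>` writes "1.2.3" into package.json instead of "^1.2.3":

  # .npmrc
  save-exact=true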

Of course you need some update hygiene, preferably via an automated bot that opens PRs and runs tests. Renovate works well.

(btw, this same issue occurs with Docker base images; it’s better to base images on the sha256sum of the target image rather than a floating tag. Renovate can update those too.)
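
A minimal sketch of the Docker side, with the digest value being a placeholder rather than a real one:

  # floating tag: what you get can change between builds
  FROM node:22-alpine

  # pinned by digest: always the exact image you tested
  FROM node:22-alpine@sha256:<digest-you-resolved-and-tested>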

mort96

2 months ago

Doesn't NPM only respect lock files when you run 'npm ci'? I thought 'npm install' just used the constraints in package.json

moritzwarhier

2 months ago

You are right that 'npm install' can upgrade versions even when a lock file is present, but AFAIK this should only happen if the lock file is not compatible with the package.json. I haven't seen it in a long time, and AFAIK it can't happen without you changing the package.json.

But yes, it's a reason to pin dependencies and use npm ci / yarn immutable etc.

Updates of transitive dependencies are afaik not automatically installed when there is a working lock file: this is the thing that changed some versions ago I think (I mixed up Node and npm versions in my initial comment).

So yes, to be sure that you never install anything else, it's best to use 'npm ci' or 'yarn install --immutable', which will fail if the lock file is broken or not present.

But 'npm install' does not give the latest patch release compatible with your package.json precedence over the lockfile.

What it does do is upgrade if you edit the version range by hand to be incompatible with the lock file, e.g. if you increase the major version of a package.

But if you have, say, Typescript ^5 in your package.json, but 5.4 in your lock file, 'npm install' won't upgrade it.

https://docs.npmjs.com/cli/v11/commands/npm-install

> If the package has a package-lock, or an npm shrinkwrap file, or a yarn lock file, the installation of dependencies will be driven by that, respecting the following order of precedence:

> npm-shrinkwrap.json

> package-lock.json

> yarn.lock

'npm ci' and friends are safer, as they will always fail when they can't install from the lock file without any conflicts or changes; that's correct.
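
For reference, the strict variants in the three common package managers:

  # fails (instead of "fixing" things) when package-lock.json is missing or out of sync
  npm ci

  # Yarn Berry: refuse to modify yarn.lock
  yarn install --immutable

  # pnpm: refuse to modify pnpm-lock.yaml
  pnpm install --frozen-lockfile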

Don't know how other package managers behave in this regard, except for yarn and pnpm.

PHP's Composer AFAIK behaves similarly to npm?

h33t-l4x0r

2 months ago

You would `npm link` that thing in real life I think.

moritzwarhier

2 months ago

As someotherguyy already mentioned, this is a default feature in pnpm.

And as far as cat-and-mouse-games go in other package managers, I'd say that pinning dependencies and disabling postinstall scripts is a much better option. Sure, not a foolproof one either, but as good as it gets.
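
If I remember the setting name correctly, the pnpm feature is configured roughly like this (name and unit are from memory; verify against the pnpm docs before relying on it):

  # pnpm-workspace.yaml
  minimumReleaseAge: 129600   # ~90 days, in minutes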

edit: misspelled someotherguyy's user name

buu700

2 months ago

I recently learned that this is (for all intents and purposes) a feature in npm as well, specifically the `--before` flag to `npm install`: https://docs.npmjs.com/cli/v11/commands/npm-install#before. That was harder than it should've been to figure out; it really needs better marketing.
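
A minimal example of what that looks like in practice (the date here is an arbitrary placeholder; the flag accepts a date or timestamp):

  # resolve every dependency as if the registry were frozen on this date
  npm install --before=2025-06-01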

Related to that is the proposal for `stabilityDays`, which seems way more practical: https://github.com/npm/cli/issues/8570#issuecomment-33004136.... So rather than merely saying "I only want package versions more than N days old", you'd be adding the requirement that "...and also they should have gone at least N days without a subsequent patch release". e.g. if mylib@6.0.0 is released, only to be quickly followed by 6.0.1 and 6.0.2, you ideally wouldn't want to risk ever installing the probably-broken 6.0.0 or 6.0.1 based on luck of the draw; the better behavior would be to stick with the last 5.x release until 6.0.2 has aged past the threshold.

moritzwarhier

2 months ago

Thanks for sorting out the detailed info! While I use an alternative package manager at $job, I personally wish for npm to be a stable mainline for Node package management and to sherlock the advantages of pnpm, yarn etc.

2muchcoffeeman

2 months ago

Why is the community persisting with such poor solutions?

moritzwarhier

2 months ago

What would be a better solution? Do other package managers reliably restrict access to the host system beyond the scope of the project folder?

Many quirks come from abilities that were once deemed useful, such as compiling code in other languages after package install.

Sure, today I can disable install scripts if I want, but it doesn't change much when I eventually run code from the package anyway.

But even restricting access to the file system to the project's root folder would leave many doors open, with or without foreign languages: Node is designed as a general purpose JS runtime, including server-side and build-time usage.

The utility of node.js was initially to provide a JS API that, unlike the web platform, is not sandboxed. And npm is the default package manager.

This not only allows server-side usage, but is also essential to many early dev scenarios. Back in the day, it might have been SCSS builds using node-gyp (wouldn't recommend). Today it's things like the Go-based TypeScript compiler or SSGs.

So, long story short: as many people before me already said, it's an ecosystem/cultural problem.

One thing against npm in this regard was/is its broken lock-file handling until I think version 12 or 16. That led to unintended transitive dependency version changes, breaking any reproducibility.

Same for compiling foreign languages.

These problems are solved today / not different from other package managers and -registries, as far as I know.

The culture of taking breaking changes and dependency bloat lightly has not changed as much, I think, although it's improved.

This most important point seems to come down to three things, IMO:

- junior developers without experience in library development reaching large audiences

- specs, languages, runtime, and the package managers itself going through disruptions and evolutions

- rapidly releasing breaking majors, often caused by the above factors

The combination of these, plus the role of the project lead/team who actually decides on the dependencies.

There are probably also many projects with unclear roles and many people who can push manifest changes, coupled with habitual access to CI/CD pipelines.

LelouBil

2 months ago

Deno has capabilities, but I don't use it so I don't know if they are useful in practice or if everyone just always allows everything.

m4rtink

2 months ago

Established Linux distributions.

moritzwarhier

2 months ago

Sure. But I'm not sure I'd want to burden their package maintainers with maintaining all kinds of JS/TS packages.

And if you go for custom registries, what's the big difference to npm registry?

I don't understand it :)

One good thing about npm ecosystem IMO is that it's frowned upon to depend on system globals.

m4rtink

2 months ago

Maybe that's another indication of something wrong: the NPM ecosystem's granularity being so high that there are not enough humans alive to safely maintain all the packages? And some rethinking might be needed even there.

tkzed49

2 months ago

Not controlling transitive deps makes this vastly less useful because direct deps can specify version ranges (e.g. latest minor version). Personally I'd stick with pnpm's feature.

zelphirkalt

2 months ago

This is why one should pin all direct and transitive dependencies with their checksums and not upgrade every day willy-nilly. There is no need to spell out exact version numbers for transitive dependencies, as long as a lock file pins their exact versions and checksums and one doesn't upgrade all the time. Make upgrading dependencies a conscious choice, and perhaps have a policy of upgrading at most every X days.
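
To be fair, that is essentially what npm's lockfile already records for every package, direct or transitive (structure abbreviated; the checksum is a placeholder):

  "node_modules/left-pad": {
    "version": "1.3.0",
    "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
    "integrity": "sha512-<checksum>"
  }

The gap is usually in actually installing from it (npm ci) instead of re-resolving ranges.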

tkzed49

2 months ago

I don't think it's accurate to envision that the average team using the npm ecosystem is upgrading their dependencies daily. Rather, the problem is that modifying your direct deps (e.g. adding a package, upgrading a package) requires modifying transitive deps.

So yeah, ~everyone is using a lockfile with checksums. But even if I think really hard about installing XYZ@1.2.3 package, and check that the lockfile diff is reasonable, I'm not manually auditing the whole supply chain (I'd get fired for getting nothing done). And a single dependency change that I choose to make can affect a substantial number of transitive deps.

zelphirkalt

2 months ago

My idea is that they do _not_ upgrade their dependencies daily, because that is what is causing the issue. People don't pin all their versions and checksums properly, and the next time they run `npm install` they get a new version of some library. I don't even want to see any "@^1.2" or whatever the syntax was. Also, they should be running `npm ci`.

I have seen this multiple times with people from various backgrounds, in frontend as well as backend. People still think like "Let's auto-upgrade patch releases, so that we always get the bugfixes" or "Let's upgrade quickly, so that we deal with changes right away, before accumulating the work". But they don't think properly about security and reproducibility.

zelphirkalt

2 months ago

This works only if there are other people who use a dependency "too early", fall victim to some exploit, and then notice it within those 90 days. Imagine if everyone only used packages older than 90 days. Then we would have no frontrunner to run into the issues before us.

A cooldown time alone is not actually a sufficient solution. What people really need to stop doing is leaving versions and checksums unpinned and installing whatever newer version is available. That would still cause a problem even if the install cutoff were shifted back 90 days for all packages. If, however, one only updates dependency versions as a conscious choice, there are far fewer points in time when versions change, and therefore the chance of catching something is also much lower. Combine that with a cooldown time/minimum age for versions, and you have an approach.

LelouBil

2 months ago

Yes and no; usually when malicious packages go public, it's some third-party cybersecurity firm that scans packages that found them.

mrconter11

2 months ago

But safe-npm is not 90 days old yet.. :/

jagged-chisel

2 months ago

Consider this a 3-month lead on the ability to utilize it

asdkkthrowaway

2 months ago

Doesn't this just mean you're 90 days late on any patches?

beepbooptheory

2 months ago

This article, which discusses the idea behind this, was on the front page recently:

https://blog.yossarian.net/2025/11/21/We-should-all-be-using...

Most of the time, you need quick patches because of fairly recent dependency changes, so if you just wait and kind of "debounce" your dependency updates, you can cover a lot of supply chain vulnerabilities etc.

ntonozzi

2 months ago

It's not debouncing, it's delaying. Ideally you can still update a specific dependency to a more up to date version if it turns out an old version has a vulnerability.

moritzwarhier

2 months ago

Auto-updating is bad.

Scheduled, audited updates are good.

Installing random npm packages as suggested here is also bad. Especially with "--global", although I'm not sure if that makes any difference because Node by default of course can access all of your file system.

ttoinou

2 months ago

If everybody does that, won't we take 90 days more to detect problems/hacks of npm packages?

lelandbatey

2 months ago

No, because the folks detecting the problems typically do so by actively scanning new releases (usually security companies do this). Few such problems are detected by people who do a "normal" update and receive compromised code, investigate, and then report the problem. It does happen, but it's not the "usual" way these supply chain attacks are discovered, especially not the really big ones.

user

2 months ago

[deleted]

mikeweiss

2 months ago

Umm... Tell me how the most recent supply chain attack was discovered again?

aussieguy1234

2 months ago

This does mean that security patches released yesterday won't get installed.

It's the opposite of "keep your software up to date".

waterTanuki

2 months ago

Just use pnpm. I've never once had compatibility issues with it on linux/mac/windows over the past 6 years.

PunchyHamster

2 months ago

Now with 9000% more zero-days!

> Installs the newest "aged" version

Probably want to install the version that has the CVEs fixed instead, i.e. find the CVEs for your packages and install the latest version that has all of them fixed, but nothing later.

Technically someone could fake a CVE to get people to upgrade, but that's a far more involved process.
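
The closest existing tooling for this "upgrade only as far as the fixes" idea is probably npm audit, which checks the lockfile against the registry's advisory (CVE) database:

  # report known advisories for what's in the lock file
  npm audit

  # apply the smallest semver-compatible upgrades that resolve them
  npm audit fix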

codezero

2 months ago

Does anyone have any statistics on how long a compromised package has been in the wild on average?

spullara

2 months ago

this is a good idea but i just did this:

  alias npm='npm --before="$(date -v-1w +%Y-%m-%d)"'
  alias pnpm='pnpm --before="$(date -v-1w +%Y-%m-%d)"'
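
Same idea with GNU date on Linux, where the -v flag isn't available (this just mirrors the aliases above):

  alias npm='npm --before="$(date -d "1 week ago" +%Y-%m-%d)"'
  alias pnpm='pnpm --before="$(date -d "1 week ago" +%Y-%m-%d)"'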

user

2 months ago

[deleted]

vrighter

2 months ago

the more people use this, the less useful it becomes for everyone. If everyone uses this, then everyone would still be using a particular package for the first time at the same time. What then? Release another package that extends the delay to 6 months?

mdx97

2 months ago

Malicious packages aren’t just found because someone gets pwned, there are organizations out there proactively scanning for this stuff.

robkop

2 months ago

You could dual-brand it as vibe-npm: only install packages that are in your model's training dataset.

m4rtink

2 months ago

So how is not using those Debian packages because they are too old working out? ;-)

ghtbircshotbe

2 months ago

Recently decided to learn typescript. You would hardly know there's a Debian package from reading typescript's website.

c-hendricks

2 months ago

Why would the typescript webpage mention Debian at all?

m4rtink

2 months ago

Upstreams often mention distros in which their software is packaged, how the package is called and sometimes even the commands to install it.

c-hendricks

2 months ago

They do mention how to install it; it's done via Node. To me that says Node is the one that should have distro-specific notes on how to install.

Don't know why the TS team would point to a 3-year-old fork of something they don't have control over.

ghtbircshotbe

2 months ago

In telling you how to install it. That's kind of the point - they all assume you're going to use npm to install it.

c-hendricks

2 months ago

Doesn't seem very far fetched to use a node package manager to install a node package tho?

ghtbircshotbe

2 months ago

Maybe not. But I'm not going to if I can help it, which is entirely doable by using Debian's package manager. And I avoid running things as root if I can help it. It's bad enough I need to run apt as root. The recent news hasn't increased my trust in npm.

user

2 months ago

[deleted]

arrty88

2 months ago

With the help of AI, I see no reason to install most deps nowadays besides types, React, and the MUI framework. Everything can be built from scratch quickly.

actionfromafar

2 months ago

I think this is a pretty common approach nowadays, and one of the reasons why I believe my job is safe for now. I expect to be called up to fix some of the resulting mess. It's a two-edged sword, for sure.

zelphirkalt

2 months ago

You still will have to maintain it then though.

catlifeonmars

2 months ago

Now you have shifted your supply chain issues to your coding agent.

solumunus

2 months ago

And do you think the severity of the issue is anywhere near the same?

zelphirkalt

2 months ago

I think this remains to be seen. Wasn't there a paper linked here on HN recently that claimed that even a few examples are sufficient to poison LLMs? (I didn't read that paper, and merely interpreted the meaning of the title.)

solumunus

2 months ago

I don't think it remains to be seen. I think it's obvious that the completely explicit exploit is going to be more effective.

cheesekunator

2 months ago

Why does elapsed time mean a library is safe? This is so ridiculous. It doesn't protect you against anything. I'm sure there are 1000s of old libraries out there with hidden vulnerabilities or malicious code.

Waterluvian

2 months ago

Literally nothing can mean a “library is safe.”

The idea of “safe” in terms of risk and security has misled a lot of people into this wrong idea that there’s a binary state of safe and unsafe.

It’s all about risk management. You want to reduce risk as inexpensively as possible. One of many inexpensive approaches is “don’t install dependencies that are new.” Along with “don’t install dependencies that nobody else uses.” You might also apply the rule, “don’t install dependencies that aren't shipped with the OS.” Or “don’t use dependencies that haven’t been formally proven.” Etc.

Indeed, calling it “Safe-NPM” can be misleading. As if using it achieves some binary state of safety.

femiagbabiaka

2 months ago

Most supply chain attacks have a very limited window in which they’re exploitable. This is not a panacea, but it is a good idea.

PunchyHamster

2 months ago

Hedging bets between zero-days and compromises (which have a big chance of being found in those 90 days). But yeah, not a good idea.

user

2 months ago

[deleted]