postepowanieadm
15 hours ago
If everyone is going to wait 3 days before installing the latest version of a compromised package, it will take more than 3 days to detect an incident.
acdha
11 hours ago
Think about how the three major recent incidents were caught: not by individual users installing packages but by security companies running automated scans on new uploads flagging things for audits. This would work quite well in that model, and it’s cheap in many cases where there isn’t a burning need to install something which just came out.
kjok
7 hours ago
I think there's some confusion here. No automated scan was able to catch the attack. It was an individual who notified these startups.
acdha
6 hours ago
Quite possibly - there have been several incidents recently and a number of researchers working together, so it's not clear exactly who found what first, and it's definitely not as simple to fix as tossing a tool in place.
The CEO of socket.dev described an automated pipeline flagging new uploads for analysts, for example, which is good but not instantaneous:
https://news.ycombinator.com/item?id=45257681
The Aikido team also appear to be suggesting they investigated a suspicious flag (apologies if I’m misreading their post), which again needs time for analysts to work:
https://www.aikido.dev/blog/npm-debug-and-chalk-packages-com...
My thought was simply that these were caught relatively quickly by security researchers rather than by compromised users reporting breaches. If you didn't install updates within a relatively short period after they were published, the subsequent response would keep you safe. Obviously that's not perfect, and a sophisticated, patient attack like the one liblzma suffered would likely still be possible, but there really does seem to be value in something like Debian's unstable/stable divide, where researchers and thrill-seekers would get everything ASAP but most people would give it some time to be tested. What I'd really like to see is a community model for funding that, and especially for supporting independent researchers.
nikanj
5 hours ago
Automated scans have detected 72251 out of the previous 3 supply-chain attacks
Imustaskforhelp
3 hours ago
Reminds me of that Michael Burry quote.
davidpfarrell
3 hours ago
Wow, so couldn't said security cos establish their own registry that we could point to instead, where packages would only get updated after they reviewed and approved them?
I mean, I'd prolly be okay paying a yearly fee for access to such a registry.
anematode
15 hours ago
A lot of people will still use npm, so they'll be the canaries in the coal mine :)
More seriously, automated scanners seem to do a good job already of finding malicious packages. It's a wonder that npm themselves haven't already deployed an automated countermeasure.
kjok
7 hours ago
> automated scanners seem to do a good job already of finding malicious packages.
That's not true. This latest incident was detected by an individual researcher, just like many similar attacks in the past. Time and again, it's been people who flagged these issues and later reported them to security startups, not automated tools. Don't fall for the PR spin.
If automated scanning were truly effective, we'd see deployments across all major package registries. The reality is, these systems still miss what vigilant humans catch.
anematode
5 hours ago
The latest incident was detected first by an individual researcher (haven't verified this myself, but trusting you here) -- or maybe s/he was just the fastest reporter in the west. Even simple heuristics like the sudden addition of high-entropy code would have caught the most recent attacks, and obviously there are much better methods too.
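A heuristic like that is easy to sketch. Here's a toy illustration (not any real scanner's implementation; the 4.5 bits/char threshold and the literal-matching regex are made up for this sketch) of Shannon-entropy scoring over long string literals, where base64- or hex-packed payloads score high and ordinary prose scores low:

```javascript
// Toy high-entropy heuristic (illustrative only, not a production scanner).
// Shannon entropy in bits per character; packed/encrypted payloads score high.
function shannonEntropy(s) {
  const counts = new Map();
  for (const ch of s) counts.set(ch, (counts.get(ch) || 0) + 1);
  let h = 0;
  for (const n of counts.values()) {
    const p = n / s.length;
    h -= p * Math.log2(p);
  }
  return h;
}

// Flag long string literals above a threshold (4.5 bits/char is a made-up cutoff).
function flagSuspiciousLiterals(source, threshold = 4.5) {
  const literals = source.match(/"[^"]{40,}"|'[^']{40,}'/g) || [];
  return literals.filter((lit) => shannonEntropy(lit.slice(1, -1)) > threshold);
}
```

A long base64-looking blob trips the check while repetitive natural text does not, which is roughly the signal a "sudden addition of high-entropy code" rule would key on.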
hobofan
7 hours ago
> If automated scanning were truly effective, we'd see deployments across all major package registries.
No we wouldn't. Most package registries are run by either bigcorps at a loss or by community maintainers (with bigcorps again sponsoring the infrastructure).
And many of them barely go beyond the "CRUD" of package publishing due to lack of resources. The economic incentives of building up supply chain security tools into the package registries themselves are just not there.
kjok
6 hours ago
You're right that registries are under-resourced. But, if automated malware scanning actually worked, we'd already see big tech partnering with package registries to run continuous, ecosystem-wide scanning and detection pipelines. However, that isn't happening. Instead, we see piecemeal efforts from Google with assurance artifacts (SLSA provenance, SBOMs, verifiable builds), Microsoft sponsoring OSS maintainers, Facebook donating to package registries. Google's initiatives stop short of claiming they can automatically detect malware.
This distinction matters. Malware detection is, in the general case, an undecidable problem (think the halting problem and Rice's theorem). No amount of static or dynamic scanning can guarantee catching malicious logic in arbitrary code. At best, scanners detect known signatures, patterns, or anomalies. They can't prove the absence of malicious behavior.
So the reality is: if Google's assurance artifacts stop short of claiming automated malware detection is feasible, it's a stretch for anyone else to suggest registries could achieve it "if they just had more resources." The problem space itself is the blocker, not just lack of infra or resources.
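To make the "signatures, not semantics" point concrete, here's a toy example (deliberately simplistic; real scanners use richer rules but face the same fundamental limit): a substring signature for `child_process` is defeated by trivial string construction, even though both snippets do the same thing.

```javascript
// Toy signature scanner: flags sources containing a known-bad substring.
// Illustrative only; real scanners are far richer but share the same limit.
const signatureHit = (src) => src.includes("child_process");

// Two semantically equivalent programs; only one matches the signature.
const overt  = `require("child_process").exec("id")`;
const covert = `require(["child_", "process"].join(""))["ex" + "ec"]("id")`;

console.log(signatureHit(overt));  // true  -- flagged
console.log(signatureHit(covert)); // false -- same behavior, missed
```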
mcintyre1994
13 hours ago
In the case of the chalk/debug etc hack, the first detection seemed to come from a CI build failure it caused: https://jdstaerk.substack.com/p/we-just-found-malicious-code...
> It started with a cryptic build failure in our CI/CD pipeline, which my colleague noticed
> This seemingly minor error was the first sign of a sophisticated supply chain attack. We traced the failure to a small dependency, error-ex. Our package-lock.json specified the stable version 1.3.2 or newer, so it installed the latest version 1.3.3, which got published just a few minutes earlier.
DougBTX
11 hours ago
> Our package-lock.json specified the stable version 1.3.2 or newer
Is that possible? I thought the lock files restricted to a specific version with an integrity check hash. Is it possible that it would install a newer version which doesn't match the hash in the lock file? Do they just mean package.json here?
streptomycin
10 hours ago
If they were for some reason doing `npm install` rather than `npm ci`, then `npm install` does update packages in the lock file. Personally I always found that confusing, and yarn/pnpm don't behave that way. I think most people do `npm ci` in CI, unless they are using CI specifically to test whether `npm install` still works, which I guess might be a good idea if you use npm, since it doesn't like obeying the lock file.
Rockslide
10 hours ago
How does this get repeated over and over, when it's simply not true? At least not anymore. npm install will only update the lockfile if you make changes to your package.json. Otherwise, it will install the versions from the lockfile.
mirashii
7 hours ago
> How does this get repeated over and over, when it's simply not true?
Well, for one, the behavior is somewhat insane.
`npm install` with no additional arguments does update the lockfile if your package.json and your lockfile are out of sync with one another for any reason, and so to get a guarantee that it doesn't change your lockfile, you must do additional configuration or guarantee by some external mechanism that you don't ever have an out of date package.json and lock. For this reason alone, the advice of "just don't use npm install, use npm ci instead" is still extremely valid, you'd really like this to fail fast if you get out of sync.
`npm install additional-package` also updates your lock file. Other package managers distinguish these two operations, with the one to add a new dependency being called "add" instead of "install".
The docs add to the confusion. https://docs.npmjs.com/cli/v11/commands/npm-install#save suggests that writing to package-lock.json is the default and you need to change configuration to disable it. The notion that it won't change your lock file if you're already in sync between package.json and package-lock.json is not actually spelled out clearly anywhere on the page.
> At least not anymore.
You've partially answered your own question here.
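As a CI recipe this boils down to something like the following sketch (behavior as described above; check the npm docs for your version):

```shell
# CI sketch: prefer the lockfile-exact install.
npm ci          # installs exactly what package-lock.json records;
                # fails fast if package.json and the lockfile disagree
# versus
npm install     # reconciles package.json with the lockfile, and may
                # rewrite package-lock.json if the two are out of sync
```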
Rockslide
7 hours ago
> You've partially answered your own question here.
Is that the case? If it were ever true (outside of outright bugs in npm), it must have been many many years and major npm releases ago. So that doesn't justify brigading outdated information.
chowells
5 hours ago
I mean, it's my #1 experience using npm. I never once have used `npm install` and had a result other than it changing the lockfile. Maybe you want to blame this on the tools I used, but I followed the exact installation instructions of the project I was working on. If it's that common to get it "wrong", it's the tool that is wrong.
streptomycin
5 hours ago
My bad, it really annoyed me when npm stopped respecting lockfiles years ago so I stopped using it. That's great news that they eventually changed their mind.
However in rare cases where I am forced to use it to contribute to some npm-using project, I have noticed that the lockfile often gets updated and I get a huge diff even though I didn't edit the dependencies. So I've always assumed that was the same issue with npm ignoring the lockfile, but maybe it's some other issue? idk
Rockslide
4 hours ago
Well there are other lockfile updates as well, which aren't dependency version changes either. e.g. if the lockfile was created with an older npm version, running npm install with a newer npm version might upgrade it to a newer lockfile format and thus result in huge diffs. But that wouldn't change anything about the versions used for your dependencies.
cluckindan
8 hours ago
Are you 100% on that?
Rockslide
7 hours ago
Yes. As someone who's using npm install daily, and given the update cadence of npm packages, I would end up with dirty lock files very frequently if the parent statement were true. It just doesn't happen.
hobofan
7 hours ago
Since nobody else answers your question:
> Do they just mean package.json here?
Yes, most likely. A package-lock.json always specifies an exact version with hash and not a "version X or newer".
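For illustration, a lockfile entry for the error-ex package mentioned upthread looks roughly like this (the integrity digest here is a placeholder, not a real hash):

```json
"node_modules/error-ex": {
  "version": "1.3.2",
  "resolved": "https://registry.npmjs.org/error-ex/-/error-ex-1.3.2.tgz",
  "integrity": "sha512-<placeholder-digest>"
}
```

There is no range syntax in a lockfile entry; "1.3.2 or newer" can only come from package.json.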
Mattwmaster58
10 hours ago
> Is that possible?
This comes up every time npm install is discussed. Yes, npm install will "ignore" your lockfile and install the latest dependencies it can that satisfy the constraints of your package.json. Yes, you should use npm clean-install. One shortcoming is that the implementation insists on deleting the entire node_modules folder, so package installs can actually take quite a bit of time, even when all the packages are being served from the npm disk cache: https://github.com/npm/cli/issues/564
vasachi
15 hours ago
If only there was a high-ranking official at Microsoft, who could prioritize security[1]! /s
[1] https://blogs.microsoft.com/blog/2024/05/03/prioritizing-sec...
kibwen
11 hours ago
Also, if everyone is going to wait 3 days before installing the latest version of a compromised package, it will take more than 3 days to broadly disseminate the fix for a compromise in the wild. The knife cuts both ways.
singulasar
14 hours ago
Not really, app sec companies scan npm constantly for updated packages to check for malware. Many attacks get caught that way. e.g. the debug + chalk supply chain attack was caught like this: https://www.aikido.dev/blog/npm-debug-and-chalk-packages-com...
blamestross
12 hours ago
1) Checks and audits will still happen (if they are happening at all)
2) Real chances for owners to notice they have been compromised
3) Adopt early before that commons is fully tragedy-ed.