ashishb
10 hours ago
Here's my `npm` command these days. It reduces the attack surface drastically.
  alias npm='docker run --rm -it -v ${PWD}:${PWD} --net=host --workdir=${PWD} node:25-bookworm-slim npm'
  - No access to my env vars
  - No access to anything outside my current directory (usually a JS project).
  - No access to my .bashrc or other files.
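A slightly tighter variant of the same alias, as a sketch (assumptions: your packages don't need root inside the container or host networking, so `--net=host` is dropped and the container runs as the invoking user):

```shell
# Same idea, but using Docker's default bridge network instead of the
# host's, and running as the invoking user instead of root.
alias npm='docker run --rm -it -v "${PWD}:${PWD}" --workdir "${PWD}" -u "$(id -u):$(id -g)" node:25-bookworm-slim npm'
```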
dns_snek
24 minutes ago
This folk-wisdom scapegoating of post-install scripts needs to stop or people are going to get really hurt by the false sense of security it's creating. I can see the reasoning behind this, I really do, it sounds convincing but it's only half the story.
If you want to protect your machine from malicious dependencies you must run everything in a sandbox all the time, not just during the installation phase. If you follow that advice then disabling post-install scripts is pointless.
The supply chain world is getting more dangerous by the minute and it feels like I'm watching a train derail in slow motion with more and more people buying into the idea that they're safe if they just disable post-install scripts. It's all going to blow up in our collective faces sooner or later.
phiresky
10 hours ago
That seems a bit excessive to sandbox a command that really just downloads arbitrary code you are going to execute immediately afterwards anyways?
Also I can recommend pnpm, it has stopped executing lifecycle scripts by default so you can whitelist which ones to run.
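For anyone unfamiliar, a sketch of what that allowlist looks like in recent pnpm (v10+; the package names below are only examples):

```shell
# pnpm v10+ skips dependency lifecycle/build scripts by default.
# Specific packages can be approved interactively:
#
#   pnpm approve-builds
#
# which records the allowlist in package.json, roughly:
#
#   "pnpm": { "onlyBuiltDependencies": ["esbuild", "sharp"] }
```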
tetha
2 hours ago
At work, we're currently looking into firejail and bubblewrap a lot, and within the ops team we're looking at ways to run as much as possible, if not everything, through these tools tbh.
Because the counter-question could be: Why would anything but ssh or ansible need access to my ssh keys? Why would anything but firefox need access to the local firefox profiles? All of those can be mapped out with mount namespaces from the execution environment of most applications.
And sure, this is a blacklist approach, and a whitelist approach would be even stronger, but the blacklist approach to secure at least the keys to the kingdom is quicker to get off the ground.
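A minimal sketch of that mount-namespace idea with bubblewrap (assumes `bwrap` is installed; the masked path is just an example):

```shell
# Run a command with ~/.ssh masked by an empty tmpfs: the process sees
# the rest of the filesystem, but the keys simply aren't there.
bwrap --dev-bind / / \
      --tmpfs "$HOME/.ssh" \
      ls -a "$HOME/.ssh"    # lists only . and .. inside the sandbox
```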
ashishb
9 hours ago
> Also I can recommend pnpm, it has stopped executing lifecycle scripts by default so you can whitelist which ones to run.
Imagine you are in a 50-person team that maintains 10 JavaScript projects, which one is easier?
  - Switch all projects to `pnpm`? That means switching CI, and deployment processes as well
  - Change the way *you* run `npm` on your machine and let your colleagues know to do the same
larusso
4 hours ago
I don’t get your argument here. 10 isn’t a huge number in my book, though of course I don’t know what else that entails. I would opt for a secure process change over a soft local workflow restriction that may or may not be followed by every individual. And I would definitely protect my CI system in the same way as local machines. Depending on the nature of the CI, these machines can have far-reaching access rights. It really depends on how you do CI and how lax its security is.
azangru
9 minutes ago
> which one is easier?
> Switch all projects to `pnpm`?
Sorry; I am out of touch. Does pnpm not have these security problems? Do they only exist for npm?
jve
an hour ago
Your logic is backwards here. I would rather have a single person deal with the pnpm migration and CI than instruct everyone else and hope they all do the right thing. And think about what happens when the next person joins... so I'd go for the first option for sure.
And npm can be configured to prevent install scripts to be run anyways:
> Consider adding ignore-scripts to your .npmrc project file, or to your global npm configuration.
But I do like your option to isolate npm for local development purposes.
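For reference, the setting mentioned above is a one-liner in the project's .npmrc (npm can also write it for you):

```shell
# One line in the project's .npmrc disables lifecycle scripts:
#
#   ignore-scripts=true
#
# or let npm write it (requires a reasonably recent npm):
#
#   npm config set ignore-scripts true --location=project
```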
afavour
9 hours ago
There are a great many extra perks to switching to pnpm though. We switched on our projects a while back and haven’t looked back.
fragmede
7 hours ago
Am I missing something? Don't you also need to change how CI and deployment processes call npm? If my CI server and then also my deployment scripts are calling npm the old insecure way, and running infected install scripts/whatever, haven't I just still fucked myself, just on my CI server and whatever deployment system(s) are involved? That seems bad.
ashishb
6 hours ago
Your machine has more projects, data, and credentials than your CI machine, as you normally don't log into Gmail on your CI. So, just protecting your machine is great.
Further, you are welcome to use this alias on your CI as well to enhance the protection.
arghwhat
2 hours ago
Attacking your CI machines means poisoning the artifacts you ship and the systems they get deployed to, gaining access to all the source the CI builds and can access (often more than you have locally), and reaching all the infrastructure it can reach.
CI machines are very much high-value targets of interest.
fragmede
an hour ago
> Further, you are welcome to use this alias on your CI as well to enhance the protection.
Yes, but if I've got to configure that across the CI fleet as well as in my deploy system(s) in order to avoid receiving, and also distributing, malware, what's the difference between doing that and switching to pnpm in all the same places?
Or more explicitly, your first point is invalid. Whether you ultimately choose to run npm in docker or switch to pnpm, it doesn't count to half-ass the fix and only tell your one friend on the team to switch; you have to get all the developers to switch AND fix your CI system AND your deployment system(s) (if they are exposed).
This comment proffers no opinion on which of the two solutions should be preferred, just that the fix needs to be made everywhere.
ashishb
9 hours ago
> That seems a bit excessive to sandbox a command that really just downloads arbitrary code you are going to execute immediately afterwards anyways?
I won't execute that code directly on my machine. I will always execute it inside the Docker container. Why do you want to run commands like `vite` or `eslint` directly on your machine? Why do they need access to anything outside the current directory?
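For what it's worth, the alias generalizes to project-local tools too; here's a sketch in bash (the name `drun` is made up, and it assumes tools like vite/eslint are installed as devDependencies in ./node_modules):

```shell
# Run any binary from the project's node_modules/.bin inside the same
# throwaway container, so dev tools never execute directly on the host.
drun() {
  docker run --rm -it \
    -v "${PWD}:${PWD}" --workdir "${PWD}" \
    node:25-bookworm-slim "./node_modules/.bin/$1" "${@:2}"
}

# Usage:
#   drun eslint .
#   drun vite build
```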
bandrami
8 hours ago
I get this but then in practice the only actually valuable stuff on my computer is... the code and data in my dev containers. Everything else I can download off the Internet for free at any time.
ashishb
8 hours ago
No.
The most valuable data on your system, for a malware author, is the login cookies and saved auth tokens of various services.
hinkley
7 hours ago
Maybe keylogging for online services.
But it is true that work and personal machines have different threat vectors.
spicybright
5 hours ago
Yes, but I'm willing to bet most workers don't follow strict digital life hygiene and cross contaminate all the time.
apsurd
8 hours ago
it annoys me that people fully automate things like type checkers and linting into post-commit hooks, or worse, outsource them entirely to CI.
Because it means the hygiene is thrown over the fence after the commit.
AI makes this worse, because agents also run these checks "over the fence".
However you run it, i want a human to hold accountability for the mainline committed code.
throwaway290
8 hours ago
It's weird that it's downvoted because this is the way
apsurd
8 hours ago
maybe i'm misunderstanding the "why run anything on my machine" part. is the container on the machine? isn't that running things on your machine?
is he just saying always run your code in a container?
minitech
8 hours ago
> is the container on the machine?
> is he just saying always run your code in a container?
yes
> isn't that running things on your machine?
in this context where they're explicitly contrasted, it isn't running things "directly on my machine"
simpaticoder
10 hours ago
pnpm has lots of other good attributes: it is much faster, and also keeps a central store of your dependencies, reducing disk usage and download time, similar to what java/mvn does.
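That central store can be inspected directly; a sketch (assumes pnpm is installed):

```shell
# pnpm keeps one content-addressable store per disk and hard-links files
# into each project's node_modules instead of copying them.
#
#   pnpm store path     # print the store's location
#   pnpm store prune    # remove packages no longer referenced anywhere
```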
Kholin
7 hours ago
I've tried using pnpm to replace npm in my project. It really speeds up dependency installation on the host machine, but it was much slower in the CI containers, even after configuring the cache volume. That made me go back to npm.
worthless-trash
4 hours ago
> That seems a bit excessive to sandbox a command that
> really just downloads arbitrary code you are going to
> execute immediately afterwards anyways?
I don't want to stereotype, but this logic is exactly why the JavaScript supply chain is in the mess it's in.
kernc
7 hours ago
> alias npm=...
I use sandbox-run: https://github.com/sandbox-utils/sandbox-run
The above simple alias may work for node/npm, but it doesn't generalize to many other programs available on the local system, with resources that would need to be mounted into the container ...
ashishb
7 hours ago
> The above simple alias may work for node/npm, but it doesn't generalize for many other programs that are available on the local system, with resources that would somehow have to get mounted into the container ...
Thanks. You are right, running inside Docker won't always work for local commands. But I am not even using local commands.
In fact, I have already removed `yarn`, `npm`, and several similar tools from my machine.
It is best to run them inside Docker.
> I use sandbox-run: https://github.com/sandbox-utils/sandbox-run
How does this work if my local command is a macOS binary? How would it run inside a Docker container?
fingerlocks
5 hours ago
Or use ‘chroot’. Or run it as a restricted user via ‘chown’. Your grandparents’ solutions to these problems still work.
bitbasher
8 hours ago
There are so many vectors for this attack to piggyback on.
If I had malicious intentions, I would probably typosquat popular plugins/LSPs that execute code automatically when the editor runs. A compromised neovim or vscode gives you plenty of user permissions, a full scripting language, the ability to make HTTP calls, system calls, etc. Most LSPs are installed globally; it doesn't matter if you downloaded the project via a docker command.
ashishb
7 hours ago
> A compromised neovim or vscode gives you plenty of user permissions, a full scripting language, ability to do http calls, system calls, etc. Most LSPs are installed globally, doesn't matter if you downloaded it via a docker command.
Running `npm` inside Docker does not solve this problem. However, running `npm` inside Docker does not make this problem worse either.
That's why I said running `npm` inside Docker reduces the attack surface of the malicious NPM packages.
lelanthran
3 hours ago
Won't that still download malicious packages that are deps?
silverwind
3 hours ago
This will break native dependencies when the host platform is not the same as the container platform.
sthuck
10 hours ago
That definitely helps and is worth doing. On a Mac, though, I guess you'd need to move the entire development workflow into containers because of native dependencies.
chuckadams
10 hours ago
My primary dev environment is containers, but you can do a hell of a lot with nix on a mac.
genpfault
10 hours ago
> triple-backtick code blocks
If only :(