Hacker News

I don't know why anyone would auto-download untrusted packages as part of their own build/release chain and not review the code manually. Anything could be in there.

The way people use NPM is pure fucking insanity.




I mean, the alternative is reviewing 3000 packages by hand. No thanks. I’ll trust Snyk to let me know if anything is compromised.


I suspect a huge fraction of those packages need to be rolled up into a standard library for which some foundation can take money and do proper release engineering.


But then aren’t you still trusting a third-party to properly audit their own code and ensure it doesn’t get compromised? Where’s the line?


It's easier to trust 1 org than 1000 different developers.


I don't know where the line is, but I know users of eg Debian don't find trojans and cryptominers appearing on their systems after routine updates. They have thousands of obscure packages, but they don't let just anyone upload something.


You have to trust someone at some point. I think that was part of the point of the "Reflections on Trusting Trust" lecture.


When these compromised npm packages happen, the blast radius would be a fraction of what it is today if the JavaScript standard library wasn't so anaemic.


Scanners are overrated. They only do basic heuristics like checksumming and pattern matching. It's like antivirus for your code and we all know how useless that is.


When you've got thousands of packages, I'll take a scanner over nothing. For a big ass monorepo like the one I maintain at work, it's a great tool.

Snyk does report on vulnerabilities that don't necessarily affect your usage of a package, but IMHO, that's erring on the safe side and I'm ok with that.


Most importantly, they check version numbers against known compromised versions like this one, just in case you aren't on HN to see it.
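That version check is conceptually simple: diff your installed versions against an advisory feed. A sketch, assuming a lockfile-style `{name: version}` map and a hypothetical in-memory blocklist (real scanners pull advisories from a live feed):

```javascript
// Hypothetical blocklist of known-compromised releases (name -> bad versions).
const COMPROMISED = {
  'some-lib': new Set(['1.2.3']),
};

// Scan an installed {name: version} map and report exact matches.
function findCompromised(installed, blocklist = COMPROMISED) {
  return Object.entries(installed)
    .filter(([name, version]) => blocklist[name]?.has(version))
    .map(([name, version]) => `${name}@${version}`);
}
```

The value is in the feed, not the code: the scanner vendor watches for incidents like this one so you don't have to.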


Snyk is okay but I've had my issues with them. Never put all of your eggs in one basket.


Could you elaborate on those issues?


Totally agree, this is a huge reason I use Yarn. Reproducibility.

The catch is, particularly with yarn 3, if you want the soundness and capability to vendor zip files (completely removing the concept of node_modules) you need to use PnP and deal with some packages that don’t define their dependencies very well.

There are still packages that will try to import/require stuff they don't declare, which is unsound. So when you get the warning, you need to declare it yourself in the root yarn config.
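The "declare it yourself" fix lives in `.yarnrc.yml` under `packageExtensions`, which patches a dependency's manifest at resolution time. A sketch; the package name and the missing dependency here are placeholders for whatever the PnP warning actually names:

```yaml
# .yarnrc.yml
packageExtensions:
  "broken-pkg@*":
    dependencies:
      lodash: "^4.17.21"
```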

It’s a bit annoying, but the payoff is soundness.

You can clone to a new machine (be it for a dev, or CI/CD) and be good to go without touching the network.

It’s like going from js to ts. Extra rigour for a price, but more often than not the price is worth it.


Does yarn still not honor the yarn.lock of transitive dependencies?


Even if they were trusted, it's still a horrific waste of resources. Maybe it's related to the insane trend-chasing attitude ("gotta get the latest!!!!!111") that has infected JS/web development. Combine that with "free" cloud CI services and the spreading habit of not knowing where the data actually is or what's going on (the popularity of "quick tutorials" that have completely clueless noobs running one-line commands which can silently manipulate gigabytes of data), and it's not hard to see the result.


Convenience and invented here syndrome.

https://en.m.wikipedia.org/wiki/Invented_here


I recently looked at cloudflare workers. Then I remembered that the JS ecosystem is built around NPM. No thanks.



