
It's funny, the package `is-odd` has dependencies of its own. It depends on a package called `is-number`. Go figure.

I finally understand how `node_modules/` directories grow to hundreds of megabytes for even the smallest apps.

Turtles all the way down.




It is a trade-off.

In most other ecosystems, such as Java, Python, or PHP, if you have a dependency A that depends on B 1.0, and a dependency C that depends on B 2.0, you cannot have both installed. You either pick one or the other, or some intermediate version of B that works in all cases. Often you end up not being able to use one of them or, worse, unable to upgrade either A or C, because the constraints make the dependencies unresolvable.

In node you can do this: A and C can each use their own, incompatible version of B. If both A and C depend on the same (or a compatible) B version, then you only get one copy, as in the other ecosystems. And you can always enforce or override the final version if necessary.
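As a sketch of that last point (the package names `a`, `b`, and `c` here are hypothetical), npm nests conflicting copies under each consumer, and since npm 8 a `package.json` `overrides` field can force a single version when you need to:

```json
{
  "name": "my-app",
  "dependencies": {
    "a": "^1.0.0",
    "c": "^1.0.0"
  },
  "overrides": {
    "c": {
      "b": "2.1.0"
    }
  }
}
```

Without the override, an install where `a` wants `b@1.x` and `c` wants `b@2.x` typically produces `node_modules/b` (v1, hoisted) plus `node_modules/c/node_modules/b` (v2), and both work side by side.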

This, plus the fact that packages usually ship documentation, type definitions, and both source and transpiled code, is why node_modules is usually pretty big.

But you don't ship node_modules "as is" to the browser (a 200 MB node_modules does not mean a 200 MB bundle). And usually this is not a problem for any modern laptop or server.

Is this better? Is this worse? Hard to say; it is just a different tool with a different trade-off.

But a bare "10 GB of node_modules is bad" only tells the bad side of it, and usually comes from a lack of understanding of how this works and how it compares to other ecosystems.


Anyone not committing the entire node_modules to their repo is in for a really bad time when npm goes down or gets compromised.


No, that's fixing the problem in the wrong way (by using your version control tool as, more or less, an FTP server), complicating code reviews and needlessly growing your repository.

If you want to be safe if NPM goes down, what you need is to keep an archive of built "binary" releases, either zip files, docker images, or whatever.
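A minimal sketch of that approach (all paths and package names here are illustrative, not from the thread): after a successful install has produced a known-good `node_modules/`, archive it together with the lockfile that generated it, so a registry outage can't block a rebuild or redeploy. A fake tree is created first so the example is self-contained:

```shell
# Stand-in for a real `npm ci` result: a node_modules tree plus lockfile.
mkdir -p app/node_modules/is-odd
echo '{"name":"is-odd","version":"3.0.1"}' > app/node_modules/is-odd/package.json
: > app/package-lock.json

# Snapshot the exact dependency tree as a binary release artifact.
tar czf deps-snapshot.tgz -C app node_modules package-lock.json

# Restoring on a machine with no registry access:
mkdir -p restored
tar xzf deps-snapshot.tgz -C restored
ls restored/node_modules   # → is-odd
```

The same idea works with a Docker image layer or any artifact store; the point is that the archive lives next to your other release artifacts, not inside the source repository.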

You can also have internal NPM proxies or caches too.
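For example, with a caching proxy such as Verdaccio running in-house (the URL below is hypothetical), a project-level `.npmrc` is enough to route all installs through it; the proxy fetches each package from the public registry once and then serves it from its own storage:

```ini
; .npmrc — send all registry traffic through the internal cache
registry=https://npm-cache.internal.example.com/
```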

Committing external dependencies to the codebase is a terrible solution. And no, Google doing it doesn't mean it is good for everyone else; it's probably the opposite of what you need.


No one said committing to the code repo; deployment repos are a thing, and if you're not revisioning your deployments then you are screwed anyway.


> No one said committing to the code repo

Yes, you did. Quoting: "...Anyone not committing the entire node_modules to their repo..."

> you're not revisioning your deployments

That's what I'm saying, quoting: "...If you want to be safe if NPM goes down, what you need is to keep an archive of built "binary" releases, either zip files, docker images, or whatever..."

Using Git/SVN/etc repository for this (which is what I kind of get from your responses) is just using the wrong tool.



