I thought a lot about it, and it seems as secure as node_modules, because anybody can publish to npm anyway. You can even depend on a non-npm source (GitHub, URLs...) from an npm-based package.

If you want to "feel" as safe, Deno has import maps, which work much like package.json.
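For example, a minimal import map might look something like this (the file name, the std version, and the mapping are just placeholders):

  {
    "imports": {
      "http/": "https://deno.land/std@0.50.0/http/"
    }
  }

You then point Deno at it with something like `deno run --import-map=import_map.json main.ts`, so a specifier like `http/server.ts` resolves through the map instead of being a hard-coded URL in every file.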

Overall, I think Deno is more secure because it cuts out the middleman (npm), and you can stand up an npm-style mirror with low effort; a simple fork will do. That means you can not only pin precisely which code you want, but also make sure nobody else knows which packages you use.
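As a sketch (the URLs are illustrative, and the second one assumes a self-hosted mirror you control):

  import { serve } from "https://deno.land/std@0.50.0/http/server.ts";
  // or resolve the same pinned version from your own mirror:
  // import { serve } from "https://deno.mycorp.example/std@0.50.0/http/server.ts";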

Take it with an open mind; it's a new "JSX" or async-programming moment. People will hate it at first, then start to see the value of the design down the road.




> I thought a lot about it, and it seems as secure as node_modules, because anybody can publish to npm anyway

npm installs aren't the same as installing from a random URL, because:

* NPM (the org) guarantees that published versions of packages are immutable, and will never change in future. This is definitely not true for a random URL.

* NPM (the tool) stores a hash of the package in your package-lock.json, and installing via `npm ci` (which enforces the lockfile and never updates it in any case) guarantees that the package you get matches that hash.

Downloading from a random URL can return anything, at the whim of the owner or of anybody else who can successfully MITM your traffic. Installing a package via npm carries that risk only the very first time you ever install it. Once you've done that, and you're happy that the version you're using is safe, you have clear guarantees about future behaviour.
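For reference, that hash lives in the lockfile as an `integrity` field, roughly like this (the package name, version, and digest here are purely illustrative):

  "left-pad": {
    "version": "1.3.0",
    "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
    "integrity": "sha512-<base64 digest of the tarball>"
  }

`npm ci` recomputes the digest of whatever it downloads and refuses to install if it doesn't match.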


My assumption would be that new middlemen will arise, but this time you can pick which one to use.


btw: https://github.com/denoland/deno/issues/1063

They know there is a bad MITM vector and won't fix it.


This is why I think a content-addressable store like IPFS would shine when paired with Deno.
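Purely as a sketch (the CID and module path are placeholders, and fetching through a public gateway is an assumption):

  // the CID below is a placeholder, not a real published module
  import { pad } from "https://ipfs.io/ipfs/bafybeib.../mod.ts";

Because the CID is derived from the content itself, a local IPFS node (or a gateway you trust) can only hand you the exact bytes that hash to it; there's nothing for the publisher to swap out later.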


That solves this specific problem nicely, although AFAIK IPFS doesn't guarantee long-term availability of any content, right? If you depend on a package version that's not sufficiently popular, it could disappear, and then you're in major trouble.

It'd be interesting to look at ways to mitigate that by requiring anybody using a package version to rehost it for others (since they have a copy locally anyway, by definition). But then you're talking about some kind of IPFS server built into your package manager, which now needs to be always running, and this starts to get seriously complicated & practically challenging...


One advantage of having a centralized repository is that the maintainers of that repository have the ability to remove genuinely malicious changes (even if it's at the expense of breaking builds). Eliminating the middle man isn't always a great thing when one of the people on the end is acting maliciously.


I'm just thinking out loud here, but it seems to me that you could just make sure you're importing all your dependencies from trusted package repos, right? And since the URL for a package is right there in the `import` statement, it seems like it'd be pretty easy to lint for untrusted imports.
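A rough sketch of what that lint could look like, assuming a simple host allowlist and a regex over import specifiers (the allowed hosts are hypothetical, and a real tool would walk the module graph properly instead of regexing source):

  // untrusted_imports.ts -- toy lint: flag remote imports from hosts outside an allowlist
  const ALLOWED_HOSTS = new Set(["deno.land", "registry.mycorp.example"]); // hypothetical allowlist

  const IMPORT_RE = /from\s+["'](https?:\/\/[^"']+)["']/g;

  const source = await Deno.readTextFile(Deno.args[0]);
  for (const match of source.matchAll(IMPORT_RE)) {
    const url = new URL(match[1]);
    if (!ALLOWED_HOSTS.has(url.hostname)) {
      console.error(`untrusted import: ${match[1]}`);
    }
  }

Run it with something like `deno run --allow-read untrusted_imports.ts main.ts`.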

I don't detest NPM the way some people do, but I have always worried about the implications of the fact that nearly the entire community relies on their registry. If it ever fell over completely, it would hamstring a huge portion of the JS community.



