Deno 1.28: Featuring 1.3M New Modules (deno.com)
290 points by mhoad on Nov 14, 2022 | 150 comments



I’m slightly disappointed that the vision of leaving NodeJS and npm behind ended up failing. The last year ended up being a bunch of concessions to make Deno more appealing to the masses, like disabling type checking on run by default.

But hey, that’s pragmatism for you, sometimes you just have to let go of ideals even if it hurts a bit.

I’m certainly looking forward to what happens next, even if more from the sidelines.


> But hey, that’s pragmatism for you, sometimes you just have to let go of ideals even if it hurts a bit.

But on the other hand, that's how we end up with a programming language landscape where every language is mostly the same, except for community conventions and very slightly different syntax.

I'd love it if more languages were strongly controlled like Clojure and the like, where there is a unified vision that is well kept across time. Not that Clojure is perfect, but it's probably the best example of a language where the community is welcome to suggest things, but unless the person in control approves it, it won't make it into the core language and instead will/could be implemented as a library. In contrast to Rust, which seems to base language additions/changes on popularity in the community.


JS is JS. It's not going to stop being JS. Deno is a slightly different syntax for the same thing. If you want a different language, step 1 is to pick a different language.


JavaScript in 2022 is JavaScript in 2022, but it won't be the same as the JavaScript we'll have in 2030, nor is it the same JavaScript we used in 2010. That's because JavaScript is not driven by a single person with a unified vision; it's driven by a committee that implements stuff based on popularity, quite literally.

I don't want a different language, I want a multitude of different languages. And not a multitude of different Fortran/C-like languages.


I’ve tried Smalltalk, Common Lisp/Clojure, and read about Forth over the past months. And I agree with you that there is more to explore than the standard set of features found in C-like languages (live programming, real macros, …)


I differ from this opinion. Why would you expect "node" people to switch to "deno" - because it's better? I wouldn't expect that. I would expect that to happen only if the "deno ecosystem" is better than the "node ecosystem". And an ecosystem is not built in a day or a few years. And this is a great move: help people switch from "node" to "deno" while they keep using the existing tools/libraries they rely on. And slowly, people can then switch to Deno standard/third-party modules.


It's a chicken-and-egg thing and reasonable people can disagree. And I happen to disagree with you. :)

I'm not familiar with Deno, but have skimmed a few posts about it here and on Reddit. In any case, my point of view here is general enough that it doesn't matter if it's literally about the Deno project or some other new language/runtime project.

When you say,

> And this is a great move: help people switch from "node" to "deno" while they keep using the existing tools/libraries they rely on. And slowly, people can then switch to Deno standard/third-party modules.

my main thought is that compromising whatever benefits Deno envisioned to "attract" Node developers actually gives Node developers LESS reason to switch.

I'm no psychologist, but I can't help but believe that part of the reason Rust got such a cult-like following (myself included) is because a bunch of us spent years writing C++ and then tried Rust and it wasn't smooth. Perhaps that's paradoxical, but for me, it really showed me how much safer and more robust my code could be with this new tool. I don't think that Rust would be as popular today if it had some kind of unsafely-call-C++-mode enabled by default.

It's a balance, obviously. If you want your language/runtime to be the best quality, you don't compromise much; if you want it to be popular, you make it easy to get in to. Often those two have some tension.

Just my two cents as someone who's a bit more on the idealist/academic side of things and is super disappointed by the compromises in languages like Kotlin and TypeScript.


I second this, having switched to Go from TypeScript and Python. Except almost everything is also smoother.


Meh, I feel like this is somewhat inevitable. People love talking about clean breaks and how NPM is evil and left-pad, but nobody wants to write their own packages. It's probably the biggest blocker for adoption for any language or tool. If there are no mature libraries for authentication on Deno, am I going to roll up my sleeves and reimplement JWTs, or am I going to sigh and switch to Node? For a lot of tools they're stuck with people who will do the former, and then finally, after enough libraries are built, they start getting the people who do the latter.

Because as much as there are loud voices on the internet decrying packages, most people are not so ideologically focused. They just want to get their code written. Packages help and so they use packages. Plus most of the loud voices neglect to offer any solution other than "packages are bad!!!"


Supporting TypeScript OOTB in any capacity was itself "appealing to the masses". Being able to use TS without the burden of setting it up was a big selling point that resulted in people giving Deno a try, particularly in the early days.

Search this transcript for "any mistakes so far": https://changelog.com/podcast/443


Yes, I agree with that. To me the difference between this and the current changes is that TS support seems to have been a decision made quite early.

Meanwhile, configuration and npm support were things that initially (pre-1.0 and at 1.0) were considered bad. Now, not so much.

I would be lying if I said that such a hard stance on these things wasn't what initially drew me to the project.


I don't understand how breaking every dependency in the ecosystem just to not have to setup a package.json or tsconfig.json file is worth it.


It's really not. Deno broke most of its promises, so for me personally, it's no longer worth switching over.


Setting up TypeScript with node is maybe one minute of work? I don't understand this argument.


> But hey, that’s pragmatism for you

If you'll permit me to be a cynic: this is VC funding for you. Deno took on a ton of funding and they need results. The barrier of NPM incompatibility simply can't be allowed to get in the way of them getting marketshare.


Incremental progress is progress.

The dreaded npm ecosystem is still full of value.

I tried Deno a few months ago, because I wanted to avoid the hassle of setting up a package.json, yarn lockfile, adding dependencies, etc., just to run a TS script that generates k8s manifest YAMLs via https://github.com/cdk8s-team/cdk8s-plus

But very quickly I realized it's not there yet.


I think it’s exciting.

Deno is better than nodejs but nodejs is “good enough”.

Competing against “good enough” is extremely hard.

If Deno becomes a better choice than node through npm compatibility then that's a huge win for Deno.


npm has a lot of issues, but I don't think it gets credit for the value it adds, given all the flak it gets. Everyone's darling Python is still garbage for figuring out packaging and deploying, even with the supposedly great new Poetry, compared to npm.


Decoupling running and type checking is quite popular nowadays, Vite does it as well. It can give a huge speed boost and allows you to ignore type errors while you're just messing around. Of course then you also need the discipline to eventually fix them, but there's probably a reason you went with TS instead of JS.
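As a minimal sketch of what that default means in practice (file name hypothetical): a script with a type error still runs, and the check is opt-in:

    // math.ts: has a type error, but after type stripping the JS still executes
    const answer: number = "forty-two";
    console.log(answer);

    // deno run math.ts          -> prints "forty-two" (no type checking by default)
    // deno check math.ts        -> reports the bad assignment
    // deno run --check math.ts  -> type checks first, then runs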


It is popular in newgen tooling but I feel it's more a function of the slowness of `tsc` than anything else. I get why `tsc` is slow, and have a lot of respect for the team and the constraints they work under, but I can't help feeling that if we were to get a faster type checker (potentially `stc` from the creator of `swc`, which also does this? https://github.com/dudykr/stc), this choice would be less popular. However, if I'm honest, I don't know if TypeScript's type system means that there are some natural constraints on how fast it can be validated.


I feel like part of the issue is not as much that typechecking is slow, but that at runtime type checking "doesn't matter". I think a lot of people would like to see the Stage 1 Type Annotations proposal [1] move forward where JS would allow and ignore type annotations at runtime and even "type stripping" disappears at runtime for TS files.

It doesn't matter how fast a type checker is at that point at runtime.

The only reason to bring back type checking at runtime would be if V8 et al ever also started using type hints as JIT hints and you want to triple check you are sending the right type hints and not accidentally stomping on your own runtime performance. (Which is intentionally not in that Stage 1 proposal mind you: no one is expecting runtime type hints in the near future.)

[1] https://github.com/tc39/proposal-type-annotations
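For illustration, roughly the shape of code the proposal targets (valid TypeScript today; under the proposal a JS engine would parse the annotations and simply discard them):

    // Tooling-only annotations: after type stripping (or, under the proposal,
    // inside the engine itself) nothing of them exists at runtime.
    interface User {
      id: number;
      name: string;
    }

    function greet(user: User): string {
      return `Hello, ${user.name}`;
    }

    console.log(greet({ id: 1, name: "Ada" }));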


If a pure Deno package gets you covered, do not use npm. This is my strategy.

I prefer Deno packages.


Do they really have a choice? Deno is not particularly faster than Node and Bun outperforms both, so Deno has to find a place somewhere between the two. Ecosystem ultimately matters more than speed, thus Deno is making the smart move, as much as I would like to move entirely beyond Node.


*Bun outperforms both on some metrics, which Deno has committed to matching/beating, neither of which may be the actual deciding factor in success since performance isn't everything


Type checking absolutely should not run during runtime. When I’m refactoring I need to be able to test that things are partly working as I go.

If I have to get an entire application’s types perfect before even running the code that’s a huge DX slowdown.

Types should pass, strictly, before merging.


> When I’m refactoring I need to be able to test that things are partly working as I go.

Personally speaking, isn't that exactly the value of having a type system in the first place? If it still type checks after the refactor it ideally* is still working. Unless of course some system boundaries changed or there is some dynamic component.

I suppose we both come from different philosophies here. I write out the types for a program first, then the behavior follows through. If I cannot properly determine the types I escape with dynamism.

You seem to dynamically write the system up and then determine the types, do I interpret that right?

The current setup might be a better experience then, but I think the default matters here.

In my little bubble I've encountered more libraries with broken typing since type checking has been reduced, so I perceive it as a net-negative. You end up having to explain that you need to take manual precautions to actually get type checking.

Of course this could just be due to the growing user-base and the higher probability of hitting a "wrongly typed" library, so it will remain to be seen how much impact it has.

I'm used to the --check flag by now :)

* How ideal depends on how expressive the type system is.


Worth clarifying that the change was to the default behavior. There was always a --no-check option to skip checking for those who wanted to


Really impressed with Deno's overall vision and execution. They are taking the slow and steady road. I remember everyone criticizing them for not supporting npm at the start, but I was sure at some point it had to happen. You can't live without the npm ecosystem if you are writing code in JavaScript.

I am still not using Deno as my main production runtime but at some point I might make the switch.

Personally, I'll use Deno if I can use it with Next.js, and with this npm development I can see Next.js support coming soon. Next.js and Deno are both focused on executing code on the edge, so it's just natural for it to happen.

Great work Deno team! You deserve all the credit!


I'm wondering if Vercel itself has interest in Deno, given their focus on the edge (and generally being at the forefront of JS tech). I hope they don't try and purchase it or anything, but first-party Next.js support for Deno would be awesome


if anything the alliances in the "battle for the edge" have already lined up - vercel with cloudflare workers, netlify with deno. it'd take a lot now to break those alliances


I don't really see the war analogy as useful... All of these companies want to be wherever the developers and the industry are. It's in their best interest to support (and be close to) whichever technologies prove valuable and/or popular. It doesn't have to be either/or unless there's prohibitive effort required to support both


not so much prohibitive effort as both sides have chosen an edge vendor as their blessed edge solution


Seems like Deno's answer to Next.js is Fresh. Wouldn't it be on Next.js to make themselves compatible with Deno? Either way, it doesn't seem like either party has too much incentive to make this work.


I am not sure if Fresh would be a direct competitor since it does not and will not[0] support client side routing. In every company I worked for in the last few years white flashes on page navigations were absolutely unacceptable.

I still like the framework, but it probably targets a more specific segment of the market. I think aleph.js[1] is more like next and then there is the esoteric ultra.js[2] which kind of tries to do something similar and be super bleeding edge.

[0] https://github.com/denoland/fresh/issues/403#issuecomment-11... [1] https://github.com/alephjs/aleph.js [2] https://ultrajs.dev/docs


Fresh is nice. So is Deno without npm. But it's always about the ecosystem. If Deno wants adoption they should invest in making Deno an option for new Next.js projects


> No node_modules folder by default (use --node-modules-dir for backwards compatibility). Modules are cached once in a special global directory.

This is definitely interesting. It will be nice not having to deal with that massive directory in each project.


Downside is you can't use patch-package

Even the --node-modules-dir flag just creates a symlink to your home directory cache so you can't do per-project package fixes (besides forking and rehosting the package)

I'm also not sure how/if it supports packages that require compiling binaries like sqlite or playwright without post-install hooks


No, the --node-modules-dir flag doesn't create a symlink to the home directory cache. It creates a copy in the node_modules folder (in the future it will use hardlinks to reduce space, there is an open issue). It's stored at node_modules/.deno/<package-name>@<version>/node_modules/<package-name> (that flag is heavily influenced by pnpm)

You can do a patch package by doing something like the following (and then you can move this into a `deno task` https://deno.land/manual@v1.28.0/tools/task_runner when launching your app to ensure it happens):

    deno cache --node-modules-dir main.ts
    deno run --allow-read=. --allow-write=. scripts/your_patch_script.ts
    deno run --node-modules-dir main.ts


This seems like a super edge case or antipattern anyway. I've worked with Node.js for several years and can't think of a time I've patched my local node_modules, because a) how would I ensure they're not overwritten in the next npm install, and b) how would I share those changes with the team


patch-package is a tool for this specific purpose

https://www.npmjs.com/package/patch-package

You commit your changes as a diff text file and add the patch-package command to your post-install hook so it runs after every install

My main use for it is fixing other packages' package.json exports field and Typescript definitions
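For reference, the usual wiring is a one-line postinstall script so the committed patches get re-applied on every install; a minimal package.json sketch (version illustrative):

    {
      "scripts": {
        "postinstall": "patch-package"
      },
      "devDependencies": {
        "patch-package": "^6.4.7"
      }
    }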


Interesting, I can definitely see this being particularly useful for typescript fixes.

Where should one put the text files in the code base? Do you create some root level "patch" directory?


patch-package automatically makes a `patches` directory at the top level of your project with `{package-name}+{package-version}.patch` files in it.


yes


One widely-used example is Angular. (It may have changed, I’m not entirely up-to-date.) The Angular Compatibility Compiler (ngcc) creates optimized code of Angular component libraries right next to the original files inside node_modules. It does this automatically as part of the Angular build process.

That’s also how I would share any patches: Just apply them when needed during the build process. NPM doesn’t care about what changes inside packages, so you can hack them all you want.


It's more common than you'd think. Consider the case of an app that consumes a component library. One is building a component in one project, and then consuming it in another project. `npm link` makes this trivial. While I totally prefer building things in isolation, there are often little edge cases that crop up once a library is added to an app. Sometimes it's easier just to consume a lib in a "real world scenario".


I can think of at least three times I've had to do this, across ~7 years of working with Node. It's an awful hack, but when you need it, you need it.


It's not uncommon but you most likely fork the package then "npm i https://github.com/account/forked_package"

A popular choice if you need some PRs that aren't (going to be) merged.


It's the same way every package manager has done this since the beginning of time.


Package managers' end goal was for the user to use the package, so it was good that packages were installed globally. npm's goals, as a tool for devs, include having projects self-contained and not interfering with each other, which is a pretty different goal, and thus having a local node_modules has multiple advantages over installing everything globally. Kinda how Docker or virtual machines work, but just for packages.


I was under the impression that Node (and the node_modules folder) predated NPM. So NPM was just going along with the concepts laid down by Node.


I heavily disagree with the very premise that OS package managers are solving the same problems as language-specific, development targeted dependency management solutions.


Language-specific package managers also often install globally, at least by default.


Really? I know pip does it by default if you don't use virtual envs, but that's related to how pip works - activating the venv basically swaps the pip/python executables and `site-packages` to the ones in the venv directory. I can't think of another I used where that's the case (none of cargo, go mod or npm/yarn/pnpm install globally by default). Admittedly, I did not really try others enough to notice how they work.

Still, the end-goal is not the same, IMHO. Outside exceptions (e.g. some developers not packaging their applications and making users install through `pip`/`npm install -g`, which I also really dislike), your typical end-user would not interact with language specific package managers. They're most typically used for development/deployment dependencies, not end-user installation.


Go's not technically global but the path of least resistance is to use it as if it is. You'll only notice the difference on an actual multi-user workstation, or if you take extra steps to get a per-project package directory.

Rubygems is usually global.

The granddaddy of them all, CPAN and its cpan install tool, defaults to installing globally, in practice, on most systems.

Pip, as you noted, does it.

I think Maven's typically user-scoped, so not really global, but also not project-scoped like npm. Similar to Go.

You can get around this in most or all cases—sometimes with options or config, sometimes with extra effort or some aliases or scripts, sometimes environment variables, sometimes with 3rd party tools—but the simplest or default mode is global installation, or at least project-global if not user-global in Go's or Maven's cases. And many of those solutions (e.g. rvm) still don't scope installation per-project by default, like npm does.

At the time npm got started, especially, it was definitely an outlier in its default scope. Some others have taken its approach since, and maybe a majority of total language-specific package managers even do, these days, but adjusted for actual use in the wild, I bet NPM's by far the most-used one that operates that way, and that most of the other popular ones scope more broadly—including, often, globally—by default. That is, a language-specific package manager one runs into in the real world, in 2022, usually isn't going to scope packages like NPM does without some extra effort. Global or per-user scoping (the latter being functionally identical on any single-user machine) are more likely.


Every system package manager. There are plenty of programming dependency managers that don't use a centralised store though.


Well except for NPM :D


Composer also?


except for rust which downloads the same package over and over again, even in the same project!


This sounds similar to PNPM: https://pnpm.io/. You still get a node_modules folder but it just has symlinks into the global cache. It was basically a drop-in replacement (minus the need for --shamefully-hoist on our install) and ended up being faster to install and build.


I am by no means a JavaScript person; I use it when it’s the right tool for job (rare). However, projects like Deno and Bun make me hopeful for the future of the JS ecosystem.

And while the ‘1.3M modules’ headline is more scary than exciting (for me), I’m glad they locked it behind a special “npm:” classifier and have their own way of dealing with node_modules so things stay explicit, while allowing people to swap over somewhat seamlessly.

I wonder what impact this will have on Deno and NPM long term.


What's your go-to language?


Even if you can't use Deno in your project today, it's hard not to be excited about the future that Deno is pushing towards (and dragging the Node ecosystem with it).

When ES6 came out but wasn't yet widely supported, it became pretty common to add a build step to transpile the backend code. This brought a whole bunch of issues along with it, but it was seen as a temporary evil worth the tradeoff because the new language features brought so many productivity improvements; it stuck around for so long that it became pretty standard. And when TypeScript became popular, it felt like we'd just never get rid of the complexity explosion that build systems bring. And then Deno came.

Just imagine the amount of config files you'll be able to delete once you no longer need to build your backend codebase.


As always, less complexity and less expressive power at a given level go hand in hand: Deno as it exists right now can’t work even with a relatively tame nonstandard approach to JSX such as that in Solid.js[1] (without essentially running a build step at startup), let alone a full language extension like Svelte[2] (there is a thing for that now[3], but it seems to be squeezing in a build system through a localhost server IIUC?).

[1] https://github.com/solidjs/solid/discussions/332

[2] https://github.com/sveltejs/svelte/issues/4431

[3] https://github.com/crewdevio/Snel


I can see merit in both sides of the argument on whether Deno should have added npm support. I lean towards the side that's excited about this release, especially for what's mentioned here: https://deno.com/blog/v1.28#security

  deno run npm:install-malware
     ┌ Deno requests write access to /usr/bin/.
     ├ Requested by `install-malware`
     ├ Run again with --allow-write to bypass this prompt.
     └ Allow? [y/n] (y = yes, allow; n = no, deny) >
This alone would entice me to start using Deno as a drop-in replacement for Node (I'm mostly running it to bundle front-end things).


> Building apps will be easier and _more secure than ever_

I assume it is because of the permissions feature of Deno. If so, to what extent does Deno provide security if I have a deep, deep [npm] dependency that, for instance, reads files, i.e. needs file system permission? Do I have to specify every single permission for each dependency? Does Deno have some kind of "dependencies file" that allows specifying dependencies' permissions?

See: https://news.ycombinator.com/item?id=30703817


As far as I know permissions are at a process level, not a module/dependency level. This means if something in your application needs to be able to read/write a file then all of your dependencies can as well.


I can see at https://deno.land/manual@v1.28.0/basics/permissions that I can set permissions through command-line flags.
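For example (paths hypothetical), permissions are granted per process on the command line, and code can also query or request them at runtime via the Deno.permissions API:

    // Grant up front (applies to the whole process, dependencies included):
    //   deno run --allow-read=./config --allow-net=api.example.com main.ts

    // Or check/prompt from code:
    const read = await Deno.permissions.query({ name: "read", path: "./config" });
    if (read.state !== "granted") {
      // Falls back to the interactive prompt when the flag wasn't passed
      await Deno.permissions.request({ name: "read", path: "./config" });
    }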


If you want to play with deno in the browser with LSP support and instant preview, no need to install anything.

Sign up on https://app.windmill.dev -> New Script -> Next; that's it, you can now play with Deno and get a feel for the language and auto npm imports.


How is this easier than actually using deno? It's a single executable. Download, write code in your editor of choice and run deno. No account sign up, no install, no friction.


Did you forget the /s? You're implying that

1. downloading an executable
2. opening an editor
3. writing and saving a file
4. running the executable

is somehow easier than opening a website, writing some code and pressing run?


You have to sign up for an account which means waiting for and then verifying an email. In addition to giving your mail up for spam and bullshit from a startup that I could care less about. Hard pass, I'll download an exe and run it.


> Is somehow easier than opening a website writing some code and pressing run?

Faster than opening website - no.

Faster than opening website, signing up, verifying email, saving password - absolutely.


The new npm import is a much required step in the right direction. The next step would be to allow namespacing of other / default repositories.

```
import namespace npm
import { chalk } from "chalk"
import { assert } from "test"

import namespace local
import { chalk } from "mychalk.ts"
import { assert } from "test"
// who needs extensions anyways?
```


a) They intentionally want to segregate NPM imports, because this is a stopgap, not the way forward they're trying to lay out for the ecosystem. The future Deno is aiming for is one where you don't have repositories (as we've known them), only web domains.

b) The above introduces non-standard JS syntax and semantics, which also goes against Deno's philosophy


b) that's a valid argument. so Deno.namespace("npm") would be the way

a) that's fine. In fact namespacing helps with segregation, because once there is some "npm for deno" one would just set the appropriate Deno.namespace("default") without having to touch a million import "npm:package_xyz".

Explicitly repeating full paths such as

import { createRequire } from "https://deno.land/std/node/module.ts";

over and over again is just an ugly antipattern.


My opinion is that imports are such a simple, rarely-modified, hard to mess up (and often automatically-managed) part of the code that it's much more valuable to have simplicity/explicitness over complex new mechanisms that save a few characters. But we can agree to disagree

Another thing that helps is that in Deno, it's common to have a deps.ts file that imports and then exports all the third-party libraries in use, for the rest of the project to reference. This is mainly done to make sure everything is using the same versions of everything, but it helps with brevity too. You could even approximate your own namespacing mechanism using this pattern
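A rough sketch of that convention (versions illustrative):

    // deps.ts: one place to pin and re-export third-party dependencies
    export { assert } from "https://deno.land/std@0.160.0/testing/asserts.ts";
    export { default as chalk } from "npm:chalk@5";

    // elsewhere in the project:
    // import { assert, chalk } from "./deps.ts";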


Deno already supports "import maps" for this. [1] There's one example in the linked article even.

You create an import map JSON file that maps human-friendly "short names" and "short paths" to full URLs (including `npm:package@version` URLs).

[1] https://deno.land/manual@v1.28.0/basics/modules/import_maps
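A minimal sketch (file name and version illustrative), say an import_map.json:

    {
      "imports": {
        "chalk": "npm:chalk@5"
      }
    }

Then `import chalk from "chalk";` resolves to the npm package when you run with `deno run --import-map=import_map.json main.ts` (or reference the map from your deno.json).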


Huh. I was about to complain that this breaks with web standards, but apparently it's being proposed as a standard feature: https://github.com/WICG/import-maps

Interesting!


How can it break a web standard when this server-side mess of different require(), import(), import "npm:", import ".mjs" does not directly translate to browser code anyways?

Interesting indeed that you can remap whole URLs:

{ "imports": { "https://www.unpkg.com/vue/dist/vue.runtime.esm.js": "/node_modules/vue/dist/vue.runtime.esm.js" } }

"… instead grab the one from the local server."

This will surely result in much power and confused human debuggers;)

Changing import maps will also be a prime target for some class of hacks.


> How can it break a web standard when this server-side mess of different require(), import(), import "npm:", import ".mjs" does not directly translate to browser code anyways?

require() and .mjs are specific to Node (Deno does away with them), import() is actually a web standard. Not sure how npm: fits into things, though if you're importing an NPM dependency directly you're almost certainly pulling in some non-web stuff anyway, so maybe they just decided that was a lost cause

But "web standards compatible" is one of Deno's stated goals. A whole lot of the Deno code out there (or at least a much larger portion than Node code) can be imported directly, with no preprocessinng, by browser scripts. That's a super exciting goal to work towards.


`npm:` fits the form of a URL/URI in the most basic syntax sense (`protocol:resourceviathatprotocol`, just as `http:` indicates a protocol). `npm` is not currently an IANA registered protocol [1], but it easily could be. There's nothing it would currently conflict with. Also, URLs with unregistered protocols are relatively common on the web today as URLs anyway. A lot of applications use unregistered protocols like `twitter:` or other `myappname:` for deep linking into mobile apps and PWAs. In most cases the user's operating system's registry of protocols wins over IANA's more generalized registry of protocols.

[1] https://www.iana.org/protocols#index_N


great, thanks!

not fully supported in WebStorm but the future is near


What was the other node alternative that was faster? I was going to try it out, but can't remember it. Deno seems too much of a change.


I think you mean Bun https://bun.sh

Please share any feedback jarred@oven.sh


Bun?[1]

Which is a bigger change than Deno, as Deno is basically node with some more stuff included (same engine, same lead developer, similar attitude towards the standard library, etc.), whereas Bun uses the JavaScriptCore engine from Safari.

[1]: https://bun.sh


Sorry, but this is nearly the exact opposite of the truth. The choice of underlying JS runtime only really affects performance. Bun is specifically designed to be fully Node-compatible (including the standard library, etc), while Deno is specifically designed to be a break that leaves Node's baggage behind (the current post is one of the pragmatic compromises they've made on that intent, but the intent is still there). Both runtimes add their own stuff on top of the above, but Bun is definitely the most "like Node" in terms of compatibility/familiarity/ease of adoption (and I say this as someone who prefers Deno).


Do you mean bun??

There's still a whole lot of confusion around which is actually faster. Both have published counter benchmarks to prove that they are faster than their rival.


In my experience, v8 has a better memory and garbage collection system, but JSC produces faster code. I don't know how that works in a server context though.


I really care more about compile times than speed in prod.


Bun is probably what you're thinking of.

https://bun.sh


bun


Unless I'm missing something, does the lack of an 'install' command (be it for non-npm or npm libraries so you can bundle the packages with your image) not mean it's detrimental to any deployment where you have either horizontal scaling or cold starts (eg. GCP Cloud Run)?


The `deno cache` command (ex. `deno cache main.ts` or `deno cache --node-modules-dir main.ts` if you want a node_modules folder) will ensure all npm packages used in the provided module are cached ahead of time. It acts similarly to a `npm install` if you need it.
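So for an image build or cold-start setup, a rough sketch (assuming a single entry point main.ts) is to warm the cache at build time so nothing is fetched on start:

    # at image build time: download and cache all dependencies of main.ts
    deno cache main.ts

    # at container start: dependencies are already cached, so startup skips the install step
    # (--cached-only makes deno error out instead of fetching if something was missed)
    deno run main.ts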

Also, at the moment, npm specifiers aren't supported with `deno compile` (https://deno.land/manual@v1.28.0/tools/compiler), but in the future that will be one way to have a self contained executable with everything ready to go.


To deploy you'd create a binary executable version of the whole project and deploy that. That's what Deno Deploy does, I think.

This way the execution is almost guaranteed to be the same and the edge network doesn't have to support the runtime at all, just allow binaries.

https://deno.land/manual/tools/compiler


Ah, there is this: https://deno.land/manual@v1.28.0/advanced/continuous_integra..., however I can't seem to find any specific page for a 'deno cache' CLI command aside from some loosely related pages that use the command.


"You can now import over 1.3 million npm modules in Deno".

Pragmatism prevails.


Edit: can't delete, but thanks for finding typo

https://www.npmjs.com/package/binaryen


The error is pretty much self explanatory. The npm package 'binaryan' does not exist.

https://www.npmjs.com/package/binaryan returns a 404.

The package you try to import must exist on npm.


Can I use Deno now with AWS from npm and everything just works?


Yes, `import { S3Client } from 'npm:@aws-sdk/client-s3@3.209.0'` works. Not every module works perfectly, but newer modules almost always do. If it's an ESM module, you shouldn't have any problems.
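For instance, a quick sketch of actually using it (region and credentials assumed to come from the usual AWS environment variables):

    import { S3Client, ListBucketsCommand } from "npm:@aws-sdk/client-s3@3.209.0";

    // e.g. run with: deno run -A main.ts
    const client = new S3Client({ region: "us-east-1" });
    const { Buckets } = await client.send(new ListBucketsCommand({}));
    console.log(Buckets?.map((b) => b.Name));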


How can one use a bazillion modules in production and not get hacked because of a dependency vulnerability? How are packages vetted?


You can control security to allow only certain IP addresses to be accessible, control which folders are readable or writable, decide if Deno can access hrtime (which could be abused to fingerprint you), and more:

https://deno.land/manual@v1.27.0/getting_started/permissions


1.3M modules...

Node developers be like: "Oh that function I just wrote is so beautiful, let's make a module!"


A function used by tens of thousands of projects, battle tested and unit tested, maintained by the community, and with many weird cases taken into account. It doesn't sound bad to me. The overhead is one folder and one JSON file.


left-pad


Yes it was a mistake from NPM to break the dependency tree. They fixed the issue relatively quickly.


The concern with left-pad isn't just that someone can break the build pipeline, it's that the author could just as easily have decided to push malicious code. By the time npm could react to that, your code base could already be compromised.

Each additional dependency you add is another vector for a supply chain attack. If you keep your dependencies to the large libraries (say, use lodash instead of a hundred tiny libraries), there are at least fewer people in your supply chain. But if you are depending on left-pad for its 10 lines of code (that still managed to have bugs), how many other random tiny dependencies did you bring in? How many individuals are in your supply chain? All it takes is for one of them to go rogue.

(Version pinning can help, but npm's default behavior is to accept minor version upgrades instead of pinning to the exact version that was live when you installed it. Lock files usually just get updated and recommitted without very much thought, so they don't help much either.)


I understand the concerns but it’s a risk most projects are willing to accept.

Automated builds should use ‘npm ci’ to use the exact versions.

If a developer goes crazy and pushes broken updates, it has been reverted and made the front page of most IT news websites in the past.

The supply chain of software relies on so many things. It's impossible for anyone to trust everything. I don't trust OpenSSL for security, the Linux kernel still doesn't have unit tests as far as I know, my Intel CPU is a black box with proprietary microcode that is known to allow network access to anything, so many tiny libs in C are half maintained, …

left-pad being offline for a few hours many years ago, or faker doing an infinite loop in one promptly removed release last year, are not keeping me from sleeping.



There is no issue with having each functionality being its own function or cli command. That's just good design.

The issue is when each function/command is a separate package that needs to be tracked separately and depends on 10 additional packages.

I would like to see the typical dependency tree of a linux distro, I guess it is shallower and has a smaller fanout than a JS application.


Judging by the shitstorm of left-pad, they might do one thing, but they don't do it well.


this is exactly why the unix philosophy is flawed.

this is not a flex.


Except that the philosophy is beautiful and extremely useful due to composability

It's those random lib authors who take the philosophy too literally that fail the community


> Except that the philosophy is beautiful and extremely useful due to composability

The philosophy is beautiful on the surface, which is exactly how far beauty extends.

Write some load-bearing software using a litany of software all connected by pipes throwing untyped text streams around to various handles. If you do this, and you do not experience problems due to composability, one of the following is true: 1) you aren't doing real work, 2) your software exists in an environment which never changes, which is very rare for most people (meaning that you have fixed your environment enough to wallpaper over problems with composability.)

statically compiled stuff and strongly typed data is the only way to longevity that is worth the time you need to put in to a system that must work.


The philosophy itself is solid. The implementation is another matter. The only reason to go monolithic is because of technical limitations or commercial and financial considerations IMO.


> The philosophy itself is solid.

For some things, yes. For ad-hoc stuff, composability is amazing. Once in awhile I need to use a wrench to fix my car, and I have a whole array of wrenches to accommodate most needs I have with that car. I would never drive the car with a wrench still in place holding something together, and that's what the Unix philosophy promotes: take all these small tools which each do their thing and create some larger "grander-purpose" software with them.

That works if you need to do a one-time (or at least infrequent) analysis of some raw text, but I have learned over and over to not rely on the passing of plain text between programs for anything that you need to work later on. Something always changes which breaks it. Always. Maybe not for a year, but some update happens which subtly changes the output of a command which breaks stuff down the line.

With typed data checking and compiled applications (the opposite of passing untyped text between individual tools) I can easily account for errors before they happen, and rely on tests which exercise these things, and gain significant performance improvements over piped single-purpose tools, in less time overall.


I fully agree about the problems and limitations with plain text and the lack of structure and types. It was the composability aspect of the philosophy I had in mind.


So what's the difference between Deno and Bun? Are they just drop-in node replacements?


Both Bun and Deno aim to be batteries included, shipping TypeScript support and a test runner etc. by default.

Deno initially aimed to diverge from Node and conform to the web platform as much as possible, i.e. use ES Modules and Web APIs over custom APIs. They've moved back towards Node over the last while to get more adoption. Deno seems to value developer experience, with the goal being an overall better Node. They also benchmark performance against Node and seem to be faster in some places but not others.

Bun broadly aims to be Node compatible, but they seem to try to stick to Web APIs too. Bun seems to value performance as a primary concern, with start-up time being an often-discussed metric; the value there is improvements to local developer toolchains or fast starts in edge environments. Finally, Bun uses the JavaScriptCore engine from WebKit, which it seems can be faster than V8 in some situations.

This is the perspective of a relatively detached observer who has played with both a little and kept up with their development somewhat but hasn't done a serious project with either.


Bun is an all-in-one JavaScript/TypeScript bundler, runtime, package manager, and transpiler focused on being fast and a drop-in replacement for Node. Much of it is written from scratch in Zig.

Bun also adds many runtime APIs like a builtin websocket server, FFI, Bun.mmap, SQLite, Bun.Transpiler and more.

`bun dev` lets you use bun's transpiler for frontend code

`bun install` is an npm client you can use with Node and it installs packages 20x - 100x faster than npm/yarn/pnpm

`bun run` lets you run package.json scripts really fast

(I work on Bun)


I literally only just realized that Deno is an anagram of Node...


'node'.split('').sort().join('') == 'deno'


Migration tool from NodeJs


support for next.js soon??


I feel conflicted about this.

It makes 100% sense for Deno to do this from a business and marketshare standpoint: they need those modules in order for people to make the kind of projects they're making with Node. But I was really hoping that Deno would be a reboot: flush out all the awful NPM modules out there you don't even know you have a dependency on and create a JS module ecosystem worth its salt. In some ways "1.3M new modules" is more scary than impressive.

But alas, here we are. We've got the JS ecosystem we deserve.


> flush out all the awful NPM modules out there you don't even know you have a dependency on and create a JS module ecosystem worth its salt

This kind of view is very prevalent in the web community, and in my opinion it's often the failing of the ecosystem. Good engineering takes time, there's almost no project out there that nailed everything on day one. It is through iteration and progressive improvement we achieve big things. Anyone with a crash course in programming can publish a library in a week. But it takes proficiency and hard work to maintain an interface over years or even decades. There is also no such thing as a perfect solution. Every solution makes tradeoffs and being able to adapt to tradeoffs relevant to your problem is a huge aspect of engineering.

Deno going the scorched earth way would mean pulling back the ecosystem by at least 10 years. There is no guarantee they will get things "perfect" this time around. In all likelihood, they eventually would've come up against the same set of constraints the original ecosystem did, make similar tradeoffs and there would be a conversation 10 years from now about needing a reboot.

The right way to do it is to embrace legacy and build solid tooling where the legacy gives you rough edges. Another big aspect of engineering, in my opinion, is not being fanatical about aesthetics. Function over form, every time. I'm glad Deno is taking this route; I was previously critical of their choice to fork core packages like dotenv, semver, base64 etc., but hopefully this will move the community towards consolidation rather than fragmentation.


I agree - Deno did this right. They built the runtime ignoring NPM existed so it wouldn't compromise design principles, but left the window back into that ecosystem open so they could reap the benefits of "backward" compatibility.

I do think if Node has a successor, it won't be a future version of Node - it will be Deno. There is too much that Deno does better to ignore once all the compatibility is there to migrate.


This is 22 years old, but still something I consider essential reading for developers.

https://www.joelonsoftware.com/2000/04/06/things-you-should-...

TLDR: Starting from scratch is rarely a good idea. (just read it though, it's not that long).

The fact is that there are lots of good npm packages that have had tons of time and effort put into them, and years of running in production to prove their worth. Yeah there are some dumb packages and micropackages seem like a mistake, but there's no reason to throw it all out.


If the new js module ecosystem really is better, then js developers would publish new modules there instead of npm, and eventually legacy modules on npm would be migrated over. I don't think that is the issue. You don't have to use NPM to use deno, it's just an option. But yes, NPM is scary.


Yes but people usually follow the path of least resistance. If modules are on NPM they'll stay on NPM if there's little incentive to rewrite as ESM isomorphic modules and publish for Deno.


yes, exactly. There needs to be more incentive


How would that happen?


by improving the js module ecosystem enough. I don't know if that's possible, it seems like more of a behavior change than technology.


You're mixing two different things:

1. The legacy language baggage of different syntaxes, bad Node APIs, non-standard import styles, etc

2. The existing ecosystem of actual packages and their capabilities

It's a good thing that Deno isn't a reboot of #2. Despite the recurring narrative on HN, the scale of the NPM ecosystem is a good thing. It's one of JavaScript's greatest strengths. I don't understand all the people complaining that they're given too many powerful and free packages to choose from. Whenever I've brought Deno up to people at work or otherwise, their major source of hesitancy for using it on real projects is (rightly) that it's lacking Node's vibrant package ecosystem: "Does it have a mature auth library? What about an AWS SDK? GraphQL server? Client? Redis library? What if we need something later that we don't know we need yet?"

As for #1- Deno is still a reboot, just a softer one. It still gives people lots of reasons to write new packages in the well-formed, web-standard-based way that Deno directly supports. NPM compatibility is a bridge to get us there gradually (because that ecosystem is so valuable, and can't be rebuilt overnight).

It was always a bold move to try and go scorched-earth, and especially with the new pressures from Bun, I think it was the right call to make this exact level of compromise.


Well, Deno by itself won't solve the supply chain attack problem. Crowd review might[0]. It's cute that npm/yarn/github/dependabot/renovate all warn if there's a security issue, but at that point it's a bit too late. (Better than nothing, definitely, a step in the right direction, but not a solution.)

Clean reboots are always hard and usually don't work out. Just look at the Py2-Py3 transition. Also there's a Perl6 story somewhere here. (Relevant xkcd[1], relevant c2 entry[2])

[0] https://github.com/crev-dev/cargo-crev

[1] https://xkcd.com/224/

[2] https://wiki.c2.com/?SecondSystemEffect


> hoping that Deno would be a reboot

That's exactly what they tried, by setting up a Deno-only registry.

But there is just way too much work and momentum happening in the NPM ecosystem. Many library maintainers wouldn't want to bother maintaining separate packages for a barely-used ecosystem, and users will be less likely to switch because they can't use the existing (huge!) ecosystem.

It's the same language after all, "just" a different standard library (which is very impactful for lots of server-side code) and packaging/distribution mechanisms.

This move was probably inevitable.


I like what Deno is doing, it also has a bunch of great features. The functionality of it is great and the execution is sharp.

However, my main gripe with Deno is that it's tied to one company and it won't solve issues that it itself doesn't have. As an example of this, a version of node's cluster module is not supported, so there is no way of running one deno process per CPU, which is very bad if you're hosting it yourself and want to utilize the full potential of your hardware.

It also means that if your app is bound to some CPU-heavy action, like generating a large Excel file or similar, your app will go down since no requests will be processed due to the single-threaded nature. Of course, such actions can be solved with a web worker, but what if you make a coding mistake which results in an unhandled exception? In this case the app would probably go down, and if the end user for example does the same action over and over again (which end users tend to do in frustration), the app will go down again and again.
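For reference, the worker approach looks roughly like this in Deno (file names hypothetical); the open question is still what happens when the code inside the worker throws:

    // worker.ts: does the CPU-heavy work off the main thread
    /// <reference lib="deno.worker" />
    self.onmessage = (e) => {
      const report = `generated ${e.data.rows} rows`; // stand-in for the heavy work
      self.postMessage(report);
    };

    // main.ts
    const worker = new Worker(new URL("./worker.ts", import.meta.url).href, { type: "module" });
    worker.onmessage = (e) => console.log(e.data);
    worker.onerror = (e) => {
      e.preventDefault(); // should keep an uncaught worker error from taking down the main process
      console.error("worker crashed:", e.message);
    };
    worker.postMessage({ rows: 100_000 });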

Deno as a company probably has little interest in solving this, as the solution is to run it on their paid service. You could of course have several servers hosting the app, but that gets expensive real quick, especially if you are a small shop. Another option is to run the entire app as multiple web worker processes, but then you need to spin up many processes that all have their own ports, which means adding a load balancer in front of them. This is kind of advanced and adds unnecessary complexity to the app IMO.

Also, if Deno the company fails, what then will happen to Deno the project?


You could solve the resource starvation issue by running multiple instances of your app and binding to a port using the SO_REUSEPORT option. This will allow multiple instances of your app to use the same port.

This option is also quite good for deployments as you can have instances stop reading from the port while client traffic is still being served from other instances on the port. This works well for HTTP requests, but less so for something like gRPC.


Just out of curiosity, how would you configure that in a systemd configuration file?


You use the setsockopt system call. This is an example in C:

    int sfd = socket(domain, socktype, 0);

    int optval = 1;
    setsockopt(sfd, SOL_SOCKET, SO_REUSEPORT, &optval, sizeof(optval));

    bind(sfd, (struct sockaddr *) &addr, addrlen);
In Go, you can use syscall.SetsockoptInt. Most languages have a way of setting this option. You have to create the socket yourself and pass it into your HTTP server in most cases, but it depends on the library.

Edit: oh sorry, you meant when systemd is opening the port for you. It looks like you can set ReusePort=yes in your configuration? https://www.freedesktop.org/software/systemd/man/systemd.soc...


why not just build multithreading into deno? Is this only a paid feature?


regarding multiprocess, I think solutions such as kubernetes and similar are the way to go (rolling restarts / updates, autoscaling, advanced load balancing etc) rather than having it managed within the runtime libs

> what if you make a coding mistake which renders an unhandled exception?

AFAIK unlike node, Deno's API is largely promise-based and therefore all unhandled exceptions should become unhandled rejections without crashing the process - so this should be less of an issue than it is with node (Of course, using older node libraries makes crashes more likely)
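For completeness, recent Deno versions also expose an unhandledrejection event, so a last-resort handler can decide what to do instead of accepting the default; a sketch:

    // Last-resort guard: log unhandled promise rejections instead of letting the process exit.
    globalThis.addEventListener("unhandledrejection", (e) => {
      console.error("unhandled rejection:", e.reason);
      e.preventDefault();
    });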


There is an ocean of distance between wanting to use all the cores on a CPU and spinning up (or paying for) the monster abstraction that is kubernetes.


Just as a point of full disclosure, I'm the author of recluster (https://github.com/doxout/recluster) and its accompanying modules (such as https://github.com/spion/sticky-listen) and I also have some experience with managing k8s by myself - so I have some experience with both approaches.


> AFAIK unlike node, Deno's API is largely promise-based and therefore all unhandled exceptions should become unhandled rejections withou crashing the process - so this should be less of an issue than it is with node

Perhaps you're right, I have not tried this out in Deno-land so I cannot say whether this actually applies to Deno the same way.

I just know that I got burned by this exact issue in node, and I think this can be a security issue that people usually don't really think about. If you know that some API is a node API, and you manage to find a bug that crashes the process, one could probably script it to trigger the crash over and over again. Even if it restarts quickly, it usually takes a second or so, and in that time one could bring the entire API down.

I am not aware of any instance where this has been used against some API in practice, but most other languages handle this much better than node does, and the JavaScript way of crashing on error is one of the things that made me question using JavaScript for backend services at all.

Regarding the kubernetes... no thanks. I would not go into that beehive unless forced. I didn't like the complexity increase of adding a load balancer and that is like a drop in the ocean of the complexity that is kubernetes.


k8s is not really that complex by default, despite all the complexity hype around it (if you look carefully you'll notice a significant number of the people saying this have a serverless SaaS to sell). You can run something simple such as microk8s on VMs and get better results more quickly than any manually managed process manager and/or load balancer.

You can, of course, make it more complex by piling a ton of stuff on top of it, and most cloud provider offerings do that. But its not required.


You can use PM2 with Deno


Somehow I feel like if you need to install Node to run Deno, you may as well just run Node from the beginning.


There was some discussion of using rayon for CPU heavy ops... I'm not sure what makes excel generation CPU-intensive and blocking? (I wonder if it can stream.)


Similar with MySQL. As long as the code is open sourced...I'm ok with it...heck MySQL managed to turn into MariaDB and yet many still stick MySQL.



