Deno 1.0 (deno.land)
2081 points by 0xedb on May 13, 2020 | 583 comments



> ... Deno is (and always will be) a single executable file. Like a web browser, it knows how to fetch external code. In Deno, a single file can define arbitrarily complex behavior without any other tooling.

> ...

> Also like browsers, code is executed in a secure sandbox by default. Scripts cannot access the hard drive, open network connections, or make any other potentially malicious actions without permission. The browser provides APIs for accessing cameras and microphones, but users must first give permission. Deno provides analogous behaviour in the terminal. The above example will fail unless the --allow-net command-line flag is provided.
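
For context, the example the post refers to is roughly its standard-library hello-world server; it uses top-level await and runs with `deno run --allow-net server.ts`:

  import { serve } from "https://deno.land/std@0.50.0/http/server.ts";

  const s = serve({ port: 8000 });
  console.log("http://localhost:8000/");
  for await (const req of s) {
    req.respond({ body: "Hello World\n" });
  }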

The Deno feature that seems to draw the most fire is dependency management. Some skeptics may be latching onto the first point without deeply considering the second.

Deno is just doing the same thing a browser does: in principle, there's nothing JavaScript running in sandboxed Deno can do that it couldn't do in a browser. So the security concerns seem a little over the top.

The one caveat is that once you open the sandbox on Deno, it appears you open it for all modules. But then again, that's what NPM users do all the time - by default.

As far as criticisms around module orchestration, ES modules take care of that as well. The dependency graph forms from local information without any extra file calling the shots.

This seems like an experiment worth trying at least.


See, the thing about the sandbox is that it's only going to be effective for very simple programs.

If you're building a real world application, especially a server application like in the example, you're probably going to want to listen on the network, do some db access and write logs.

For that you'd have to open up network and file access pretty much right off the bat. That combined with the 'download random code from any url and run it immediately', means it's going to be much less secure than the already not-that-secure NPM ecosystem.


> That combined with the 'download random code from any url

What protection does NPM actually give you?

Sure, they'll remove malware as they find it, but it is so trivially easy to publish packages and updates to NPM, there effectively is no security difference between an NPM module and a random URL. If you wouldn't feel comfortable cloning and executing random Github projects, then you shouldn't feel comfortable installing random NPM modules.

> and run it immediately

NPM packages also do this -- they can have install scripts that run as the current user, and have network access that allows them to fetch, compile, and execute random binaries off the Internet.

From a security point of view, Deno is just making it clear up-front that you are downloading random code snippets, so that programmers are less likely to make the mistake of trusting a largely unmoderated package repository to protect themselves from malware.

I lean towards calling that a reasonably big security win on its own, even without the other sandboxing features.


> What protection does NPM actually give you?

Dependency version pinning comes to mind. The main difference between this and a random URL is that at least you know that if the module gets bought by a third party, your services or build system won't auto update to some rando's version of the package. IIRC there have been cases when a version was replaced as well.

I think this could be fixed quite easily if one could add a hash and a size after the url, to force a check.


I was curious about this so I looked into it. Seems like deno allows for lock files (similar to package-lock.json for NPM) https://deno.land/manual/linking_to_external_code/integrity_...
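
If I'm reading that page right, the workflow is roughly this (exact flags may have changed since):

  # write lock.json with a hash for every remote dependency
  deno cache --lock=lock.json --lock-write src/deps.ts

  # later / in CI: re-fetch and fail if any hash no longer matches
  deno cache --reload --lock=lock.json src/deps.ts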


Yeah, basically sounds like they could implement it à la Content Security Policy in the browser and it would be well understood right off the bat.

Or similar to node_modules, have some way to pull your dependency graph & host locally — At least for enterprise-y adoption I imagine that people will want to have _their_ copy of the code and choose when to update it even if in theory the remote code is locked down.


That is what I figured too. People are rightly concerned about the security implications of this new paradigm of including package dependencies.

These concerns and the conversation around them are good and healthy. Give it some time. People will experiment with what works and over time best practices will emerge for the set of trade offs that people are willing to make.


Arguably you can get (even more reliable) version pinning by copying typescript from that random URL & storing it in your own S3 bucket. Sure, you have _some_ work to do, but it's not that much and you 100% control the code from there on.


Well, I suppose they do (or will) provide a self hosted version of the registry. Like npm does.


If you publish your module versions on IPFS that would provide a guarantee to your users the module versions do not change once published. But hashes are not very memorable as module names.


> If you publish your module versions on IPFS...

Well, using message digests, NPM or Yarn can pretty much guarantee content-addressable versions, too. You don't have to use IPFS or blockchains just because...


A single source of trust for the dependency transport.


> That combined with the 'download random code from any url and run it immediately', means it's going to be much less secure than the already not-that-secure NPM ecosystem.

What deno does is move package management away from the framework distribution. This is great - one thing I hate about node is that npm is default and you get only as much security as npm gives you. (You can switch the npm repo, but it's still the overwhelming favourite because it's officially bundled.)

Deno can eventually give you:

  import lib from 'verified-secure-packages.com'
  import lib from 'packages.cloudflare.com'
So you'll be able to pick a snippet repository based on your risk appetite.


But if lib itself imports from "unsecure-location.com", deno will access that location and fetch that file.


The idea of the above example is to show that a controlled distribution can be made that verifies all levels of imports if needed, which is very promising.


Both the network and disk access permissions are granular, which means you can allow-write only to your logs folder, and allow net access only to your DB's address.
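
Something along these lines (host and paths made up):

  deno run \
    --allow-net=db.internal:5432 \
    --allow-write=/var/log/myapp \
    server.ts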


So it's reimplemented chmod and iptables?


Typically, chmod and iptables are not used to restrict applications. Applications are restricted by virtual machines, containers, sandboxes, AppArmor profiles, SELinux policies…


There's a fairly long history of giving applications their own uid to run under which puts chmod and chown in control of filesystem operations the app is allowed to perform. "Typically" maybe not, but it's hardly unusual.

iptables + namespaces gives you the rest.


+ you can make a network namespace and have separate iptables just for that namespace/app, you can for example give the namespace/app a VPN connection without affecting the rest of the system. And other apps can join the namespace and communicate as if they had their own isolated network.

NodeJS is also working on policies (1), which allow you to assign permissions to single modules or files.

1) https://nodejs.org/api/policy.html


chmod/chown has been the de facto (if not de jure) method of securing LAMP stacks for as long as I have been alive. Not that I recommend taking the advice of a LAMP stack too seriously :)


If the de facto method refers to "chmod 777", I wouldn't call that securing ;-)

But indeed, if there is a separate user account for the application, then chmod can be used for some control to its access to files and directories.


A bit more like OpenBSD pledge() and unveil()


> For that you'd have to open up network and file access pretty much right off the bat.

For the network access I have an issue[0] open that asks to separate permissions to listen on the network from permission to dial. Also, along the way I want to have the ability to let the OS pick the port as an option.

Permissions are meant to work by whitelisting. So you wouldn't open access to the whole system just to talk to your DB, or to operate on some files.

[0] https://github.com/denoland/deno/issues/2705


Maybe this will develop into a standard of multi-process servers (real micro services you could say), where the permissions are only given to a slice of the application.


Sounds like privilege separation[1].

[1] https://en.wikipedia.org/wiki/Privilege_separation


That sounds like the Postfix architecture [1]

[1]: https://www.akadia.com/services/postfix_mta.html


Reinventing QNX will always be cutting edge.


QNX is hands down amazing! No car manufacturer could ever come close to having their in-house infotainment system being as snappy as QNX...which is why they gave up and switched to QNX! Fine print: Tesla not included.


Now that would indeed be an interesting way of building servers.


Sometimes it's ok to think "this project isn't for me" and just leave it be. The cynical-security-concern act is boring.


Contrary to the impression I seem to have given you, I'm actually super excited about Deno and am planning to write my next MVP app in it.

That means that I am actually a lot more invested in it, and if I want to put it in production, then I have to be concerned about things like this.

When somebody says they think X is broken, and they present a solution Y which they say is better, I am definitely entitled to ask why they think Y is better when I can't see the difference.


But you don't seem to be genuinely seeking answers, at least not in this thread. It does seem you're already convinced of the project's faults.

You're entitled to your opinion, of course. I had to read through the docs to understand their module system and intent. And I find it very exciting.


Security is literally the main selling point of this thing. Otherwise just use node.


It’s one of the selling points. One of the main points I took away was:

“We feel that the landscape of JavaScript and the surrounding software infrastructure has changed enough that it was worthwhile to simplify. We seek a fun and productive scripting environment that can be used for a wide range of tasks.”

Sounds intriguing to me. As a fan of starting projects off as simply as possible, I will certainly be tinkering with Deno.


There are a lot of selling points. To me, the main one is TypeScript with no build step.


  yarn global add ts-node prettier
  echo 'alias deno="ts-node"' >> ~/.zshrc
  echo 'alias deno-fmt="prettier --write"' >> ~/.zshrc
Deno provides a standard library, good defaults, top-level async/await, doesn't break browser compatibility, and has a better API for integrating with the runtime.

Internals are nicer, but that's true of anything without ugly legacy.

They are working to get node_modules to work in deno, so I am kind of worried it will be node v2 all over again.


Clearly they have never dealt with JavaScript build tools and npm. A complete nightmare.


Who hasn't? Isn't this precisely one of the pros of Deno?


Yes. That's why I said that.


Promise-based APIs sold me.


Strawman security questions without an understanding of the tool are not very useful.


For the use-case you describe, you're just going to need network access: no file access and no process-forking needed, which is a big attack-surface reduction.

Moreover I don't know how granular the network permission is, but if its implementation is smart, you could block almost all outbound network access except connections to your DB and the few APIs you may need to contact.


> means it's going to be much less secure than the already not-that-secure NPM ecosystem.

I have only the bare minimum of like, experience with nodejs. Would you mind fleshing out why that is so?


> For that you'd have to open up network and file access pretty much right off the bat.

I think that overall you're right, but it's worth noting that deno can restrict file system access to specific folders and can restrict read and write separately. It's plausible to me that you could have a web server that can only access specific folders.


I don't think running a public web server application is one of the envisioned use cases here. It looks like a tool for quickly and dirtily getting some job done. But I agree that to get something useful done, you probably need to open up a bunch of permissions, so you're still running arbitrary code on your machine.


It's always a good idea to run in a container, which limits the ports you can listen on, directories allowed for writing and reading, and can have its own firewall to limit outgoing connections.

If you don't need the firewall, you can just run in a chroot under a low-privilege user.

I mean, if you do otherwise, you are not following best practices and the voice of reason.


The manual looks pretty sketchy, but it seems you can limit file access to certain files or directories and that could be used to just give it access to the database and log files.


Looking at the flags, one can envision future updates providing flags for scoping to directories, PIDs, domain/IP ranges


I don't think that's very accurate. You really need to go watch the first Deno video made by Ryan Dahl at JSConf.


If I am building a real world application I'm going to vet the libraries I use.


Simple solution to the dependency management (spitballing): a directory of files where the filename is the module name. Each file is simply:

  <url><newline><size in bytes><newline><hash>
And then in an application:

  import { serve } from deno.http.server;
If you want nested levels so deno.X.X wouldn't be a ton of files you could possibly just do nested directories so deno/http/server would equate to deno.http.server.

Most people would want the option to do dependency management on a per-project basis as well. Simply allow a command-line parameter to provide one or more other directories to source from first (besides presumably the global one for your installation).

If we wanted the file to be generated automatically, maybe something like this:

  import { serve } from deno.http.server at "https://deno.land/std@0.50.0/http/server.ts";


Until someone thinks that it should follow redirects which probably leads to the same thing that got apt: https://justi.cz/security/2019/01/22/apt-rce.html

Not saying that makes it a bad idea, but importing/downloading trusted code over http(s) is not simple even if the protocol sorta is.


> This seems like an experiment worth trying at least.

Yup! I am really excited about Deno and curious about how popular it will be in a few years.


I guess I'm wondering why Deno is targeting V8 instead of Servo? Maybe I'm mistaken, but Servo [0] and Stylo [1] are both production-ready browser scripting and styling engines implemented in Rust.

[0] https://servo.org/

[1] https://wiki.mozilla.org/Quantum/Stylo


>Servo [0] and Stylo [1] are both production-ready browser scripting and styling engines implemented in Rust.

Servo is absolutely not production ready. A couple of particular pieces of Servo, such as Stylo and WebRender, can be considered production-ready, but not so much the project as a whole.


Servo uses Firefox's SpiderMonkey, which is written in C++, as its JavaScript implementation.


Servo is an experimental project designed to build and test components that can be integrated into Firefox. It relies on Gecko for JS.


SpiderMonkey, not Gecko.


If you're getting into Deno and want to keep up with new stuff from the ecosystem on a regular basis, we're now publishing https://denoweekly.com/ .. issue 2 just went out minutes after the 1.0 release. I've been doing JavaScript Weekly for 487 issues now, so this is not a flash in the pan or anything :-D

Of course, Deno has an official Twitter account as well at https://twitter.com/deno_land :-)


I suppose .land is the new .dev now ;)

Am curious how parallelism could be handled in the runtime. Besides exposing WebWorkers, would shared memory be a possibility? V8 looks like it's heading toward a portable WebAssembly SIMD accelerator.

>>> Promises all the way down

Async / await is a great pattern for render loops by resolving to continuous window.requestAnimationFrame calls. Here is a nifty example computing a Buddhabrot and updating new data quite smoothly:

http://www.albertlobo.com/fractals/async-await-requestanimat...


Root certificate not trusted for https://denoweekly.com/ on both chrome and firefox.


Maybe they fixed this in the last 2 hours, but it works for me (firefox, linux).


Weird, it still says that "Cisco Umbrella Root CA" is not trusted. Maybe it's only from certain countries.


I'm Canadian and in Canada for what it's worth. Clicking on the lock tells me that it was verified by lets encrypt. The root is "Digital Signature Trust Co." Common Name "DST Root CA X3".

Cisco sounds like a router might be running a MITM on you?

Edit: This looks to be confirmation that that root (or one by a very similar name) is used by a MITM tool:

https://docs.umbrella.com/deployment-umbrella/docs/rebrand-c...


Thanks for the diagnosis. Accessed from a different network and with no issues. And got the right certificate this time.


Same. FF/macOS


Hmm, interesting! Thanks for the report. I just ran it through Qualys SSL Labs and everything passed. (We got a B though because we still support TLS 1.1.)

It's a multi-domain Let's Encrypt X3 certificate and I believe most LE users will be in a similar boat now.


Keep up the good work, JS weekly is a wonderful resource.


Good to see you here!


right on top of it Peter! nice


Great :)


> TSC must be ported to Rust. If you're interested in collaborating on this problem, please get in touch.

This is a massive undertaking. TSC is a moving target. I occasionally contribute to it. It’s a fairly complex project. Even the checker + binder (which is the core of TS) is pretty complex.

One idea that comes to mind is to work with the TypeScript team so that tsc only uses a subset of JS that can be compiled down to WebAssembly, and have LLVM spit out a highly optimized binary. This would benefit not only Deno, but the rest of the internet.

TSC has done some great architectural changes in the past like doing mostly functional code, rather than lots of classes.

The target we should be aiming for is a powerful typed language like TypeScript that compiles very quickly to WebAssembly and can run in guaranteed sandbox environments.


There already exists an experimental compiler that takes a subset of TypeScript and compiles it to native[1]. It might be able to target wasm instead of asm.

Also: If I'm not entirely mistaken Microsoft initially planned to have a TypeScript-specific interpreter in Explorer. This also might indicate that something like that could be possible.

1: https://www.microsoft.com/en-us/research/publication/static-...


I wonder how possible it would be to just use this:

https://github.com/swc-project/swc

It's still not feature-complete, but there aren't any alternatives written in Rust that I know of.


SWC does not do any typechecking. It is equivalent to babel.



This does seem like a dangerous side-path unrelated to the Deno project's needs.

From the description, it doesn't sound like Deno needs the type information for V8 optimizations (I thought they had explored that, but I don't recall, and the description here is unclear), so maybe switch to more of a two-pass system: a simple-as-possible "type stripper" (like Babel's, perhaps) and leave tsc compilation for type checking as a separate background process. Maybe put it behind some sort of "production mode" flag, so that type errors stop debug runs, but in production you assume you can strip types without waiting for a full compile?

Maybe even writing a type stripper in Rust isn't a bad idea, but definitely trying to capture all of tsc's functionality in Rust seems like a fool's errand.


Typescript already has transpile-only mode that lets it run without performing those checks and just emit.

I use it with ts-node all the time for my docker images that require fast startup.

  node -r ts-node/register/transpile-only xyz.ts


v8 has the ability to snapshot a program just after it loads, but before it executes. If you snapshot after doing some kind of warmup, to trigger the right optimisations, you get something that should fire up ready to go, which is probably the main problem - the compiler being repeatedly invoked and parsed from javascript and compiled on the fly.


One problem with taking the v8 snapshot and using it as a binary executable is that it will probably be much slower than running v8 live. Although the startup time will be 10x faster, the runtime will be 10x slower.


The notes here mention that V8 snapshots also didn't provide the speed-up/optimization Deno was hoping for.


There is https://github.com/AssemblyScript/assemblyscript. It's not using llvm, but it's compiling a subset of typescript to webassembly.


I see the sass / node-sass fiasco all over again...


This is referring to lib-sass being in C?


The dependency management is highly questionable for me. Apart from the security concerns raised by others, I have huge concerns about availability.

In its current form, I'd never run Deno in production, because dependencies have to be loaded remotely. I understand they are fetched once and cached, but that will not help me if I'm spinning up additional servers on demand. What if the website of one of the packages I depend on goes down, just as I have a huge spike in traffic?

Say what you want about Node's dependency management, but at least I'm guaranteed reproducible builds, and the only SPOF is the NPM repositories, which I can easily get around by using one of the proxies.


Why can't you download all the packages you use actually with your source code? That's how software has been built for decades...

I'm a desktop developer so I understand I'm the dinosaur in the room but I've never understood why you would not cache all the component packages next to your own source code.

Since this is straightforward to do, I presume there is some tradeoff I've not thought about. Is it security? Do you want to get the latest packages automatically? But isn't that a security risk as well, as not all changes are improvements?


For Node, the main tradeoff is number and size of files. Usually the distribution of a node module (that which is downloaded into node_modules) contains the source, documentation, distribution, tests, etc. In my current project, it adds up to 500MB already.

They would do well to have an option to optimize dependencies for vendoring.


You're right. We call this "vendoring" your dependencies. And it's a good way to do things.


You can commit your node_modules folder into your repository if you'd like.


That is exactly what NPM does.


So build your own npm?


Hi! The response to your fears is in the announcement: "If you want to download dependencies alongside project code instead of using a global cache, use the $DENO_DIR env variable." Then, it will work like node_modules.
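
Something like this, I'd assume (directory name arbitrary):

  DENO_DIR=./deno_dir deno cache src/deps.ts
  # commit or ship ./deno_dir so fresh machines never hit the network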


Ah, in this case, I would then have to commit my dependencies into my VCS to maintain reproducible builds. I'm not sure I like that solution very much either. I've seen node_modules in multiple GBs, and I'm sure Deno's dependency sizes are going to be similar.


True, but that's what people using Go have been doing for years without complaining much, so I guess it works fine for most workloads.

And before npm fixed things after the left-pad incident, npm builds were not reproducible either (as demonstrated by said left-pad incident).


> True, but that's what people using Go have been doing for years without complaining much, so I guess it works fine for most workload.

I hate to break it to you but dependency management has been a massive issue in golang until the devs formally adopted go mod.

Only Google seemed okay with checking in their dependencies to version control. Everyone else was doing crazy hacks like https://labix.org/gopkg.in


Checking in dependencies to version control is the sane option. Then you can more easily see what's updated and track regressions. Some people like to refactor their code any time some syntax sugar is added to the language - often adding a few bugs while doing it, which is a PITA, but version control is still better than no version control.

You might ask: what about adding the OS to your SCM too, why not the full software stack? But you can generally draw a line between strong abstraction layers: Hardware | Kernel | OS | runtime | your app. Some modules do have strong abstraction layers, but others are just pure functions which you could just as well copy into your own repo.


It created a hugely fractured open source ecosystem as well.


The vendoring has never been the issue though.


I have only used Go once at work, and I actually dislike most of it (and dependency management was one of the annoying things with Go); nonetheless it has never been a show stopper, and there have been thousands of developers using it when vendoring was the only option.


Dependency management is one of the biggest complaints I have seen around Go - I don't think this is accurate.


I don't like it either, but it still works well enough for many people.


Go dependency management is quite good now with "go mod", plus your dependency tree isn't going to look anything like your typical JavaScript dependencies, otherwise you're doing it wrong..


> that's what people using Go have been doing for years without complaining

I haven't seen anyone commit vendor and not complain about it. But now you finally don't have to commit vendor for reproducible builds. All you need is a module proxy. The "all you need" is not really meant seriously of course.

And I personally prefer to not commit vendor and complain about it.


Go compiles to a static binary. It’s not downloading and running source on your production servers. Isn’t that the concern here?


That is one of the things I hate about go. Right up there with lack of generics and boilerplate error handling.


This hasn't been a thing in Go for a long time. Go dep and now go modules fix this.


You could use a separate git repository for the dependencies. That way you keep your core project repo tight and small and clean, but you still have your dependencies under version control. If that separate repo grows to a few GBs or more it doesn't really hurt anything.


In practice modules will be available from sources that will have similar reliability to npm: github.com, unpkg.com, cdn.pika.dev, jspm.io, etc.


Which then raises the question - how is it better than NPM? If there are going to be centralized repositories (like NPM), and if I have to download my dependencies into a $DENO_DIR (like NPM), and if I am then loading these dependencies from local files (like NPM), how is it any different to NPM? Except for being less secure by default?

This is starting to look like a case of being different just so you can say you're different.


NPM is a dependency management failure which is why you are ending up with hundreds of dependencies in the first place. It sounds like you want to reproduce that insanity in Deno. Deno is set up in such a way to dissuade you from the stupidity by default but allow it in very few steps if you cannot imagine a world without it.

In my opinion this is Deno’s biggest selling point.


> Deno is set up in such a way to dissuade you from the stupidity by default but allow it in very few steps if you cannot imagine a world without it.

Could you elaborate on this? Is it that Deno is against the whole 'small packages that do one thing well' principle and instead in favor of complete libaries? How exactly would it dissuade me from installing hundreds of dependencies?


The default design style for a Deno application is that the application becomes a single file, just like packages coming off Steam. This requires that dependencies are packaged into the application before it is distributed to others. The idea there is to include only what you need, deliberately, and to manage it as a remotely written extension of your application.


Having a single executable file makes distribution easier, but while I'm developing the app, I'll still have to manage all of its dependencies, right? How does Deno aid during development?

> The idea there is to include only what you need deliberately and it manage it as a remotely written extension of your application.

I have a node app, in which I deliberately only included the dependencies I need. The package.json lists exactly 8 dependencies. However, the node_modules folder already has 97 dependencies installed into it. The reason of course is that these are dependencies of dependencies of dependencies of dependencies.

Wouldn't Deno have this same issue? Are the dependencies also distributed in compiled form as a single file akin to windows DLLs?


it's better because there will be more choice.


I am always confused by deno folks. You can install from a git repository using yarn/npm.

How is that not "decentralisation"?

And if you are importing single files from a remote url, I would question your sanity.


> install from a git repository using yarn/npm

yep, that's basically the same. deno has the benefit of using the es module system like it is implemented in browsers.


Node supports node_modules, not npm. Anything can build the node_modules.


Doesn't this mean more opportunities to inject malicious code?


Only if you tell your application to retrieve from untrusted locations.


To solve your issue, you would do exactly how you do your node deployments: download the deps in a folder in CI, then deploy the whole build.


Except that now, the download deps in CI step can fail if one of hundreds of websites for my hundreds of dependencies goes down. If the main NPM repository goes down, I can switch to a mirror and all of my dependencies will be available again.


To be the rubber duck, if wiping the cache at each build is a risk to your CI, what could you do to keep your CI up?

1 - not wipe the cache folder at each build? It's easy and secure. Oh and your build will be faster.

2 - use a cached mirror of the deps you use? It's like 10min to put in place and is already used in companies that care about security and availability anyway.

3 - you have https://deno.land/x if you want to put all your eggs in the same npm basket


Yes, I think I'd probably settle for solution number 2.

I still don't understand how this is better than NPM, and how Deno solves the horrible dependency management of Node, but maybe if I actually build something with Deno I'll get some answers.


From the post:

> [With NPM] the mechanism for linking to external libraries is fundamentally centralized through the NPM repository, which is not inline with the ideals of the web.


> which is not inline with the ideals of the web

Subjective.

> Centralized currency exchanges and arbitration is not in line with the ideals of the web! - Cryptocurrency

Nek minute. Besides, let's get real here; they will just end up centralized on GitHub. How exactly is that situation much different from npm or any other language ecosystem's library directory being mirror-able?


The centralization of git on Github is completely different in nature from the centralization of Node packages on npm.

git does not require Github to be online to work, nor relies on Github existence for its functionality.


I'm talking about the centralization of software packages(Go, Deno) on GitHub as it applies to dependency resolution.


I'd highly recommend mirroring packages anyway. Obviously this isn't always necessary for small projects, but if you're building a product, the laws of the universe basically mandate that centralized package management will screw you over, usually at the worst possible time.


You answered your own question. Nothing stops you from using a mirror with deno too.


Which again brings me back to something I'm still not understanding - How is Deno's package management better than NPM if it is extremely similar to NPM, but slightly less secure?

I'm only asking because lots of people seem to be loving this new dependency management, so I'm pretty sure I'm missing something here.


We need to distinguish between npm, the service (https://www.npmjs.com/) and npm, the tool.

Deno has the functionality of npm, the tool, built-in.

The difference is that like Go, Deno imports the code directly from the source repository.

In practice it's going to be github.com (but can be gitlab or any code hosting that you, the author of Deno module, use).

NPM is an unnecessary layer that both Go and Deno have removed.

It's better because it's simpler for everyone involved.

In Go, I don't need to "publish" my library. People can just import the latest version or, if they want reproducibility, an explicit git revision. Compared to Go, publishing to npm is just unnecessary busy work.

I've seen JavaScript libraries where every other commit is related to publishing a new version to npm, littering the commit history.

In Go there's no need for package.json, which mostly replicates the information that was lost when publishing to npm (who's the author? what's the license? where's the actual source repository?).

As to this being insecure: we have over 10 years of experience in the Go ecosystem that shows that in practice it works just fine.


How do you list the dependency libraries if you don't have a package.json?

Do you manually install a list of libraries provided by the author's readme?


The simplest approach is to either import anything anywhere, or have a local module that imports external dependencies and then have your code import them via that local module.


The dependencies are imported in the source code of the package.


NPM, the tool, has had the ability to install directly from GitHub instead of npmjs.org for many, many years as well. No one really used it except as a workaround for unpublished fixes, because it has no other tangible benefits.


I like it because it's simpler. I know what happens when I import from a URL. I'd have a hard time whiteboarding exactly what happens when I `npm install`.


What happens?


My least favorite thing about importing from NPM is that I don't actually know what I'm importing. Sure, there might be a GitHub repository, but code is uploaded to NPM separately, and it is often minified. A malicious library owner could relatively easily inject some code before minifying, while still maintaining a clean-looking repo alongside the package.

Imports from URL would allow me to know exactly what I'm getting.


install from the repo then?

You can install a specific version from git via yarn/npm.

How do you trust a url more without reading the code?

What's going to stop the deno ecosystem from putting minified js files on CDNs and importing them?


It's decentralized.


Or use something like Nexus or Artifactory to host a private copy of dependencies.


I think the primary way to manage dependencies should be a local DIR, with the option to specify a URL.

The default in Deno is a questionable choice. Just don't fuck with what works. The default should be the safest behavior, with developers optionally enabling less safe behaviors.


Using a universally unique identifier like a URL is a good idea: this way, https://foo.com/foo and https://bar.com/foo are distinct and anyone who can register their own name gets a namespace, without relying on yet another centralized map of names->resources.

After all, the whole point of a URL is that it unambiguously identifies resources in a system-independent way.


No one is questioning the utility of URLs. Using URLs to specify dependencies right in the import statement is a horrible idea.


How is it any worse than using names from a namespace controlled by “npmjs.com”: if you’re concerned about build stability, you should be caching your deps on your build servers anyways.


I've never used npm or developed any javascript before but it sounds equally horrible.

Not decoupling the source of the package (i.e., the location of the repository whether it is on remote or local) and its usage in the language is a terrible idea.

  from foo import bar
  # foo should not be a URL. It should just be an identifier.
  # The location of the library should not be mangled up in the code base.

Are we gonna search replace URL strings in the entire codebase because the source changed? Can someone tell me what is the upside of this approach because I cannot see a single one but many downsides.


The whole idea of a URL is that it’s a standardized way of identifying resources in a universally unique fashion: if I call my utility library “utils”, I’m vulnerable to name collisions when my code is run in a context that puts someone else’s “utils” module ahead of mine on the search path. If my utility module is https://fwoar.co/utils then, as long as I control that domain, the import is unambiguous (especially if it includes a version or similar.).

The issue you bring up can be solved in several ways: for example, xml solves it by allowing you to define local aliases for a namespace in the document that’s being processed. Npm already sort of uses the package.json for this purpose: the main difference is that npmjs.com hosts a centralized registry of module names, rather than embedding the mapping of local aliases->url in the package.json


Allow me to provide an extremely relevant example (medium sized code base).

About 100 python files, each one approximately 500-1000 lines long.

Imagine in each one of these files, there are 10 unique imports. If they are URLs (with version encoded in the URL):

- How are you going to pin the dependencies?

- How do you know 100 files are using the same exact version of the library?

- How are you going to refactor dependency resolution or upgrades, maintenance, deprecation?

How will these problems be solved? Yes, I understand the benefits of the URL - its a unique identifier. You need an intermediate "look up" table to decouple the source from the codebase. That's usually requirements.txt, poetry.lock, pipenv.lock, etc.


The Deno docs recommend creating a deps.ts file for your project (and it could be shared among multiple projects), which exports all your dependencies. Then in your application code, instead of importing from the long and unwieldy external URL, import everything from deps.ts, e.g.:

    // deps.ts
    export {
      assert,
      assertEquals,
      assertStrContains,
    } from "https://deno.land/std/testing/asserts.ts";

And then, in your application code:

    import { assertEquals, runTests, test } from "./deps.ts";
https://deno.land/manual/linking_to_external_code#it-seems-u...


This was my first instinct about how I'd go about this as well. I actually do something similar when working with node modules from npm.

Let's say I needed a `leftpad` lib from npm - it would be imported and re-exported from `./lib/leftpad.js` and my codebase would import leftpad from `./lib`, not by its npm package name. If / when a better (faster, more secure, whatever) lib named `padleft` appears I would just import the other one in `./lib/leftpad.js` and be done. If it had incompatible API (say, reversed order of arguments) I would wrap it in a function that accepts the original order and calls padleft with the arguments reversed so I wouldn't have to refactor imports and calls in multiple places across the project.
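
A sketch of what I mean, using the hypothetical package names from above:

  // ./lib/leftpad.js - the only file that knows which package is behind the name
  import padleft from "padleft"; // hypothetical replacement with reversed argument order

  export default function leftpad(str, len, ch) {
    return padleft(len, str, ch); // adapt back to the signature the rest of the codebase expects
  }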


Yeah, this sort of "dependency injection" scheme is better than having random files depend on third party packages anyways: it centralizes your external dependencies and it makes it easier to run your browser code in node or vice-versa: just implement `lib-browser` and `lib-node` and then swap them out at startup.


I believe the long term solution to the issues you raised is import maps: https://github.com/WICG/import-maps

It's an upcoming feature on the browser standards track gaining a lot of traction (deno already supports it), and offers users a standardized way to maintain the decoupling that you mentioned, and allows users to refer to dependencies in the familiar bare identifier style that they're used to from node (i.e. `import * as _ from 'lodash'` instead of `import * as _ from 'https://www.npmjs.com/package/lodash'`).

I imagine tooling will emerge to help users manage & generate the import map for a project and install dependencies locally similar to how npm & yarn help users manage package-lock.json/yarn.lock and node_modules.
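
A rough sketch of how that could look (the lodash URL is just illustrative, and in Deno 1.0 import maps still sit behind a flag, something like `deno run --importmap=import_map.json --unstable main.ts`):

  // import_map.json
  {
    "imports": {
      "lodash": "https://unpkg.com/lodash-es@4.17.15/lodash.js"
    }
  }

  // main.ts
  import * as _ from "lodash";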


Yeah, I agree, but that intermediate lookup table (a) can be in code and (b) can involve mapping local package names to url package names.

One-off scripts would do `from https://example.com/package import bar`, and bigger projects could define a translation table (e.g. in __init__.py or similar) for the whole project.

Embedding this sort of metadata in the runtime environment has a lot of advantages too: it’s a lot easier to write scripts that query and report on the metadata if you can just say something like `import deps; print( deps.getversions(‘https://example.com/foo’)`

One of the best parts about web development is that, for quick POC-type code, I can include a script tag that points at unpkg.com or similar and just start using any arbitrary library.


That's exactly what Go does - it works fine


Good luck finding which of foo.com/foo or bar.com/foo is the foo module you want though…


Good luck finding which of google.com/search or bing.com/search is the search engine you want though


This is true actually, and that's why being the default search engine is so important that Google pays billions each year for it.


It could be a good idea if they were immutable, like IPFS links.


That might work for some projects, but can quickly blow up the size of the repo.

I don't think it is an unsolvable problem. For example, other solutions could be using a mirror proxy to get packages instead of fetching them directly from the source, or pre-populating the deno dir from an artifact store. It would be nice to have documentation on how to do those, though.


A better solution is something like https://vfsforgit.org/


That's not necessarily better. For one thing, it doesn't support Linux yet. For another, afaik, Azure DevOps is the only git hosting service that supports it.

Even if it was better supported, I wouldn't want to start using it just so I can include all my dependencies in git. Of course if you are using something like vfs for git anyway, then increasing the repo size is less of an issue. It still feels wrong to me though.


Yeah, I'm not really advocating the use of GVFS specifically, but what I am saying is that once you've lived in a world where all your dependencies are in your repo you won't want to go back, and that Git should improve their support for large repos (in addition to checking in all our dependencies, we should be able to check in all our static assets).


It's just a URL right? So could you not mirror the packages to your own server if you're so concerned, or better yet import from a local file? Nothing here seems to suggest that packages must be loaded from an external URL.


> or better yet import from a local file

And this is different from NPM how? Except that I've now lost all the tooling around NPM/Yarn.


It's different because it is much further removed from a centrally controlled dumpster fire. The JS, Node and NPM ecosystem is a pain on so many levels. Blindly trusting developers to follow semver by default. Leftpad. Build toolchains. The whole micro-depedency madness. Having a peek into your node_modules is like looking into the depths of hell.

Not saying Deno won't devolve into this sad state at some point. Maybe it already has. But it seems to try to combat some of the problems by being honest and pragmatic about dependencies, promoting minimal external tooling and removing some of the dangerous abstractions from NPM.

To me Deno seems like a desperately needed and well thought out reset button.

Semi-related rant over.


It's different because it doesn't rely on require() which is non-standard JavaScript.


Node v14 supports ESM modules and the import syntax, which is standard Javascript.


setTimeout is non-standard JavaScript too but I bet your code base has multiple instances of its usage.


Is it? It's on every browser I know.



Ah, makes sense, it's not part of the spec so it might be missing on other environments. Thank you.


Then you either vendor as others have said, or use the built in bundle tool to produce a single js file of all your code including dependencies.

https://deno.land/manual/tools/bundler
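
E.g. (output path is up to you):

  deno bundle src/main.ts dist/bundle.js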


Several responses to your concern but all of them seem to say "you can cache dependencies". How does Deno find dependencies ahead of runtime? Does Deno not offer dynamic imports at all?

If I have an application that uses, say, moment.js and want to import a specific locale, typically this is done using a dynamic import.


You’d deploy builds that include all the dependencies. This isn’t Node where you have to download all deps on your production server, they are built into your deployed files.


Ever used Go? Not that different.


Go supports "vendor" folders for storing dependencies locally after initial download. That combined with Go Modules means you can handle everything locally and (I believe) reproducible.


deno has the same support with $DENO_DIR and a lockfile.


Reproducible builds, sure. Security? that's a different story.

https://github.com/ChALkeR/notes/blob/master/Gathering-weak-...

- node ships with npm

- npm has a high number of dependencies

- npm does not implement good practices around authentication.

Can someone compromise npm itself? probably, according to that article.


How is this different from requiring npm to be up to install packages?


I like what Deno is selling. URL like import path is great, I don't know why people are dismissing it. It is easy to get up-and-running quickly.

Looks like my personal law/rule is in effect again: The harsher HN critics are, the more successful the product will be. I have no doubt Deno will be successful.


Your law is hilarious. I tend to check the comments before reading a post: if they say the idea is terrible, I know I should read it.


The GoLang-like URL import and dependency management are indeed an innovation in simplicity while simultaneously offering better compatibility with browser JavaScript.

Perhaps the HN-hate is not about simplified greenfield tech as much as it is about breaking established brownfield processes and modules.


My concern is what happens when popular-library.io goes down or gets hacked?

Or how about attack vectors like DNS poisoning? or government-based firewalls?

I know there's this[1], but somehow I still feel uneasy because the web is so fragile and ephemeral...

At the very least I would like to have the standard library offline...

[1] https://github.com/denoland/deno/blob/master/docs/linking_to...


> what happens when popular-library.io goes down or gets hacked?

What is anyone going to do about it? Anything has a chance of getting hacked or going down just when you need it, be it GitHub, npmjs.org...

Blaming the tool for not having a protection against DNS poisoning is a bit far fetched.


ultimately i guess it is about how/if deno caches its imports. with node.js/npm you have the exact same problems, just the source & sink occur at different places (package installation)


With Node.js you install the packages in a dev environment and test extensively, then push all the code, including the node_modules folder, to production. Running npm on the prod server is forbidden. At least in theory =)


You can always download the scripts and host them yourself, right?


Do you also share these concerns about golang? Isn’t it basically the same system?


Golang does have https://proxy.golang.org/, which is fairly recent, but yes this is absolutely a problem in Go.

See the "go-bindata" problem.


This seems to be the case, yes. It's like the critics unconsciously know it's better, and that is where their energy comes from.


More like "There can't be a better stuff than what I'm accustomed to and like" feel.


> It is easy to get up-and-running quickly.

Almost all successful, mainstream techs are like that. From a purely technical perspective, they are awful (or were awful at launch); they were just adopted because they were easy to use. When I say awful, I mean for professional use in high-impact environments: financial, healthcare, automotive, etc.

Examples: VB/VBA, Javascript, PHP, MySQL, Mongo, Docker, Node.

Few people would argue that except for ease of use and ubiquity, any of these techs were superior to their competitors at launch or even a few years after.

After a while what happens is that these techs become entrenched and more serious devs have to dig in and they generally make these techs bearable. See Javascript before V8 and after, as an example.

A big chunk of the HN crowd is high powered professional developers, people working for FAANGs and startups with interesting domains. It's only normal they criticize what they consider half-baked tech.


Forget the (reasonable) security and reliability concerns people have already brought up with regard to importing bare URLs. How about just the basic features of dealing with other people's code: how am I supposed to update packages? Do we write some separate tool (but not a package management tool!) that parses out import URLs, increments the semver, and... cURLs to see if a new version exists? Like if I am currently importing "https://whatever.com/blah@1.0.1", do I just poll to see if "1.0.2" exists? Maybe check for "2.0.0" too just in case? Is the expectation that I should be checking the blogs of all these packages myself for minor updates? Then, if you get past that, you have a huge N-line change where N is every file that references that package, and thus inlines the versioning information, instead of a simple and readable one-line diff that shows "- blah: 1.0.1, + blah: 1.0.2".


Another thing is that with package.json every dependency can say which versions of its dependencies it works with. This lets you update a dependency that is used by other dependencies and only have a single version (most up to date) of it. Some package managers also let you completely override a version that one of your dependencies uses, allowing you to force the use of a newer version.

With Deno, both of these use cases seem way harder to satisfy. None of your dependencies can say "hey, I work with versions 1.1 to 1.3 of this dependency", instead they link to a hardcoded URL. The best chance of overriding or updating anything is if Deno supports a way to shim dependencies, but even then you might need to manually analyze your dependency tree and shim a whole bunch of URLs of the same library. On top of that, as soon as you update anything, your shims might be out of date and you need to go through the whole process again. To make the whole process easier, Deno could compare checksums of the dependencies it downloads and through that it could show you a list of URLs that all return the same library, but this would be like the reverse of package.json: instead of centrally managing your dependencies, you look at your dependencies after they have been imported and then try to make sense of it.


> None of your dependencies can say "hey, I work with versions 1.1 to 1.3 of this dependency"

That's a real problem that needs to be solved.

Also, what happens when two lib A and lib B depend on different versions of lib C? Each have their own scoped instance of C?


The Deno solution is either:

* A deps.ts that handles all external dependencies and re-exports them

* Import maps

Neither of these really give you a way to do the equivalent of "npm update". But I almost never want to update all my packages at once.


You don’t like checking for security updates?


Lately I just merge GitHub's pull requests for that. ;)

I don't like running "npm update" to try and get security updates, though. npm packages aren't very rigorous about PATCH level changes.


Well, at least you seem to be using cargo for the rust parts.

Since this stuff is breaking and new anyway, it would be nice to see dependency resolution and a reasonable way to do historically reproducible builds (prod runs server 1.0.4, which is 5 years old; let's build that locally to do some bug fixing, using the same dependencies, gradually bringing it up to current 3.7.1...).

Sounds like the lifecycle management of deno projects will be about as much fun as php before package management. And about as reliable.


I prefer a unix approach where each tool does a single thing.

The runtime does runtime things and someone can build a package manager to do package manager things.

The benefit of having runtime + package management bundled into one is you have an opinionated and standard way of doing things - the downside is if the package manager stinks then the whole ecosystem stinks.


Just come up with a convention, like hosting a file called version.ts that lists all available versions. Brute forcing for available versions sounds dumb.


Yes, that is of course the point. There's an infinite number of possible later versions to check for. The suggestion to poll for new versions using cURL was sarcastic. These "conventions" you speak of actually get handled by... package managers! Otherwise everyone who hosts a package needs to "know" about magic files and make sure they produce them following a spec that isn't enforced by anyone and doesn't break immediately, but rather much later when someone tries to update. It's like everyone managing their own database row with an informally agreed-upon database schema.


Maybe a convention will arise, that you do all the imports in one file (basically like a `package.json` file) and import that from the rest of your code? It seems hackish to me but could work.


This is explicitly listed as best practice by Deno [1], but it doesn't handle the updating problem at all.

[1] https://deno.land/manual/linking_to_external_code#it-seems-u...


You have to stop thinking in terms of NPM where it takes 1000000000 packages to do anything. A Deno application is designed to be distributed as a single file. You can override the default behavior to have NPM like stupidity, but if that is really your goal why bother moving to Deno in the first place?


Forget 10000000 packages. Many languages often make use of 10s of packages. If I have several projects, each with around 10 packages, and no automated way to just check if all my projects’ respective dependencies have security updates that could be applied, it seems to go against the stated security goal.

Separately I’m not sure what is enforcing this “small dependency graph” aside from making it hard to import things I guess. I wouldn’t be surprised if you end up with the normal behavior of people coming up with cool things and other people importing them.


> and no automated way to just check if all my projects’ respective dependencies have security updates

Dependency management is a major cornerstone of any infosec program. There is more to that than just auto-installing a new dependency version.

> I’m not sure what is enforcing this “small dependency graph”

Because a large dependency graph is slow, insecure, and fragile.


> Dependency management is a major cornerstone of any infosec program. There is more to that than just auto-installing a new dependency version.

We seem to agree? I said check. It’s very useful to have something tell you what’s out of date and what the updates are.

> Because a large dependency graph is slow, insecure, and fragile.

I asked “what”, not “why”. What is enforcing this idea you have of how Deno will be used? I feel like you want it to not be used with lots of dependencies, thus aren’t accounting for how to handle them. However, just because that’s the desired way to use it doesn’t mean it will be used that way. Lots of dependencies may end up still becoming the norm, at which point you’ll wish you would have more clearly defined how it should be done instead of letting the first third party solution win (as ended up happening with npm).


Congratulations on the 1.0 release! I've been using Deno as my primary "hacking" runtime for several months now, I appreciate how quickly I can throw together a simple script and get something working. (It's even easier than ts-node, which I primarily used previously.)

I would love to see more focus in the future on the REPL in Deno. I still find myself trying things in the Node.js REPL for the autocomplete support. I'm excited to see how Deno can take advantage of native TypeScript support to make a REPL more productive: subtle type hinting, integrated tsdocs, and type-aware autocomplete (especially for a future pipeline operator).


Seconded, a Deno TS REPL would be amazing, but they probably have a few bigger fish to fry yet :)


> bigger fish to fry

> fish

I see what you did there, and I approve.


I evaluated replacing ts-node with Deno, but if I use -T and install ts-node globally, that seems equivalent to Deno to me.

I think stepping outside the npm ecosystem is going to be a bigger issue than people think.


Repl.it recently announced a Deno REPL https://repl.it/languages/deno


I really wish they had docker-compose / Terraform support. Just not sure at what point that becomes "free" hosting.


i wonder if it's conceivable to ever write typescript in a REPL


There are both ocaml and haskell repls, so it can be done with languages whose type systems are the focus. Not sure if there's anything specific about typescript that would make it hard, though.


Does anyone else see the import directly from URL as a larger security/reliability issue than the currently imperfect modules?

I'm sure I'm missing something obvious in that example, but that capability terrifies me.


I thought a lot about it, and it seems as secure as node_modules, because anybody can publish to npm anyway. You can even depend on your non-npm repo (GitHub, URLs...) from an npm-based package.

If you want to "feel" as safe, you have import maps in deno, which works like package.json.

Overall, I think Deno is more secure because it cuts out the middleman (npm), and you can make an npm mirror with low effort; a simple fork will do. This means you can not only precisely pin which code you want, but also make sure nobody knows you use those packages.

Take it with an open mind, a new "JSX" or async programming moment. People will hate it, then will start to see the value of this design down the road.


> I thought a lot about it, and it seems as secure as node_modules, because anybody can publish to npm anyway

npm installs aren't the same as installing from a random URL, because:

* NPM (the org) guarantees that published versions of packages are immutable, and will never change in future. This is definitely not true for a random URL.

* NPM (the tool) stores a hash of the package in your package-lock.json, and installing via `npm ci` (which enforces the lockfile and never updates it in any case) guarantees that the package you get matches that hash.

Downloading from a random URL can return anything, at the whims of the owner, or anybody else who can successfully mitm your traffic. Installing a package via npm is the same only the very first time you ever install it. Once you've done that, and you're happy that the version you're using is safe, you have clear guarantees on future behaviour.


My assumption would be that new men in the middle will arise, but this time, you can pick which one to use.


btw: https://github.com/denoland/deno/issues/1063

they know there is a bad mitm vector and won't fix it


This is why I think a content addressable store like IPFS would shine working with Deno


That solves this specific problem nicely, although AFAIK IPFS doesn't guarantee long-term availability of any content, right? If you depend on a package version that's not sufficiently popular, it could disappear, and then you're in major trouble.

It'd be interesting to look at ways to mitigate that by requiring anybody using a package version to rehost it for others (since they have a copy locally anyway, by definition). But then you're talking about some kind of IPFS server built into your package manager, which now needs to be always running, and this starts to get seriously complicated & practically challenging...


One advantage of having a centralized repository is that the maintainers of that repository have the ability to remove genuinely malicious changes (even if it's at the expense of breaking builds). Eliminating the middle man isn't always a great thing when one of the people on the end is acting maliciously.


I'm just thinking out loud here, but it seems to me that you could just make sure you're importing all your dependencies from trusted package repos, right? And since the URL for a package is right there in the `import` statement, it seems like it'd be pretty easy to lint for untrusted imports.
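Something like this could work (a rough sketch, not a real tool; the file name and allowlist are made up, and it only catches static string imports):

    // naive check: warn about imports whose URL isn't on an allowlist
    // (run with --allow-read so it can read the source file)
    const allowed = ["https://deno.land/"];
    const source = new TextDecoder().decode(await Deno.readFile("main.ts"));
    for (const match of source.matchAll(/from\s+"(https?:\/\/[^"]+)"/g)) {
      const url = match[1];
      if (!allowed.some((prefix) => url.startsWith(prefix))) {
        console.warn(`untrusted import: ${url}`);
      }
    }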

I don't detest NPM in the way that some people do, but I have always worried about the implications of the fact that nearly the entire community relies on their registry. If they ever fell over completely, they would have hamstrung a huge amount of the JS community.


It's basically the same "exposure" as importing a random npm, but it has the benefit of being explicit when you do it.

It's also exactly what the websites you visit do. ;)


> It's basically the same "exposure" as importing a random npm, but it has the benefit of being explicit when you do it.

This is definitely false. For all the problems with the NPM registry and the Node dependency situation, an NPM package at a specific version is not just at the whims of whatever happens to be at the other end of a URL at any given moment it's requested. This is a huge vulnerability that the Node/NPM world does not currently have.


That is a fair point. I don't think most people who use npms really pay much attention, though, and you're still just an npm update away from getting something unexpected (because really, who puts explicit versions in package.json?).

Deno does have lockfiles: https://deno.land/manual/linking_to_external_code/integrity_...

I prefer imports from URLs. And I loathe npm. I get why people would disagree though.
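To illustrate the pinning side of it (sketch only; the URLs are the std examples from elsewhere in the thread, and whether a host keeps old tags around is up to the host):

    // unpinned: you get whatever the URL serves today
    // import { serve } from "https://deno.land/std/http/server.ts";
    // pinned to a tagged release instead:
    import { serve } from "https://deno.land/std@0.50.0/http/server.ts";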


Deno has lock files and caches files locally on first import.


I'm not sure how a lock file would help in this scenario, unless you're also committing your cache to source control (like a lot of folks did in the bad old days of NPM). The local cache is great, but that doesn't prevent the content of those URLs changing for someone who doesn't have access to your cache.


yeah, but we regularly clear out our cache and lock files, so this doesn't really solve the issue, unless you're committing all of your packages


Why are you _regularly_ clearing lock files? If you're bypassing lock files you're going to have the exact same issue with npm or yarn or any other package manager that downloads from the internet.


Dunno about OP but I pin versions in package.json because it allows me to control the versions and upgrade major versions only when explicit and necessary, and rely only on the lock file to keep it the same between commit time and the production build.


That doesn’t actually work and gives you a false sense of reproducibility and stability. Sure, your top-level dependencies might not change without explicit changes to package.json, but every time you run npm install without a lock file, all transitive dependencies are re-resolved and can change.

Always commit your lock files, people.


What about the dependencies of your dependencies? You're gonna get burned when a breaking change gets introduced a few levels deeper than your package.json. Not everyone follows semver perfectly, and sometimes malicious code gets distributed as one of these transitive dependencies.


That's fine for one developer pushing to production from their own machine. But if you have a CI server and you're working with other people, you're going to want to know that everyone is working with the same modules.


What! Clearing lock files seems wild. How do you know you're getting the right code when you install dependencies?


For Deno the only issue is the first time when you do not have it cached. Deno compiles in all dependencies when building so the only point of failure is the machine you’re building on.

I don’t know the state of the art anymore, but I’m sure they have ways to make it easy to vendor deps in the repo.


> It's basically the same "exposure" as importing a random npm, but it has the benefit of being explicit when you do it.

I'm not sure how this works in detail here, but at least in NPM you got a chance to download packages, inspect them and fix the versions if so desired. Importantly, this gave you control over your transitive dependencies as well.

This seems more like the curl | bash school of package management.

Edit: This is explained in more detail at https://deno.land/manual/linking_to_external_code and indeed seems a lot more sane.

> It's also exactly what the websites you visit do. ;)

Well yes, and it causes huge problems there already - see the whole mess we have with trackers and page bloat.


The good thing about this is you can effectively build a registry service that provides the same level of trust that npm does, because at the end of the day that is the only differentiator in this scenario, as npm can just as well return malicious code.


Thanks for sharing that link. Seems much more sane, but not without issues. I'm sure this will continue to be iterated upon.

Even with all NPMs flaws, I do feel this is a bit of throwing the baby out with the bath water. Time will tell.


AFAIK there is no option to allow a website to read and write random files anywhere on my hard drive, period. At most a website can ask the user to select a file or offer one for downloading. In the future maybe it can be given a domain-specific folder.

That's not true here. If I'm running a web server, I'm going to need to give the app permission to read the files being served and access to the database. That's something that never happens in the browser.


The tldr is Deno also gives you a chance to download + inspect packages, and then lock dependencies. The mechanism for import is different, but the tooling is good.


Sure do. I wonder if they have a checksum mechanism like browsers do?

You can add an “integrity” attribute to script tags in the browser.

https://developer.mozilla.org/en-US/docs/Web/Security/Subres...


One advantage of urls is that you can link to a specific git sha, tag, or branch for a dependency, e.g. on github.


So exactly like existing tooling can already do, then?


Sure, I probably phrased that poorly -- it's not a unique advantage, but benefit of having URLs be the only way to link to dependencies versus a centralized, dominant package manager.


It's not just about the integrity. The url may very well provide what they claim to provide, so checksums would match, but it's the direct downloading and running of remote code that is terrifying.

This is pretty much like all the bash one-liners piping and executing a curl/wget download. I understand there are sandbox restrictions, but are the restrictions on a per dependency level, or on a program level?

If they are on a program level, they are essentially useless, since the first thing I'm going to do is break out of the sandbox to let my program do whatever it needs to do (read fs/network etc.). If it is on a per dependency level, then am I really expected to manage sandbox permissions for all of my projects dependencies?


If you're afraid of "direct" downloading and executing some of that code, then what do you think happens when you npm install/pip install a package? I'm very interested if you can expose a new attack vector that didn't exist with the previous solutions.


You can generate modules on the fly on the server that each require the next generated module, recursively, blowing up your disk space. If Deno stores those files uncompressed, you can generate modules full of comments/zeros so they compress very well for the attacker and eat a lot of space on the consumer's side.


Does Deno have some built in way to vendor / download the imports pre-execution? I don't want my production service to fail to launch because some random repo is offline.




You can also use the built in bundle command to bundle all of your dependencies and your code into a single, easily deployable file. https://deno.land/manual/tools/bundler.


Deno caches local copies and offers control over when to reload them. In terms of vendoring, you can simply download everything yourself and use local paths for imports.


How would this work with transitive dependencies? Sure I can control which parts I import myself, but how do I keep a vendored file from pulling in another URL a level deeper?


Unlike node, recommended deno practice is to check-in your dependencies to the VCS.

> Production software should always bundle its dependencies. In Deno this is done by checking the $DENO_DIR into your source control system, and specifying that path as the $DENO_DIR environmental variable at runtime.

https://deno.land/manual/linking_to_external_code


    du -hs node_modules
    
    1.7G node_modules


> In terms of vendoring, you can simply download everything yourself and use local paths for imports.

So I basically have to do manually, what NPM/yarn do for me already?


I do not speak for the project, but based on my understanding part of the point was to avoid the magic of npm.

You can use lock-files, bundles, and many other features that makes dependencies management easier.


Ah from that perspective I can see how this might appear to be better. Personally, I like the 'magic' of NPM (which to be honest I don't really think is all that magical, it's quite transparent what's happening behind the scenes). This 'magic' means I no longer have to write 200 line makefiles, so it definitely makes my life easier.


Some of that convenience will still be included; a couple of things that Deno will do differently from Node are that there is no standard index.* file to load, and import paths include the extension.
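Roughly, the difference looks like this (the local paths are hypothetical):

    // Node-style, resolved via node_modules / index files:
    // import { helper } from "./utils";
    // Deno wants the full path, extension included:
    import { helper } from "./utils/helper.ts";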


I assume you would just download the packages and serve them yourself.


especially since HTTPS is not enforced! https://github.com/denoland/deno/issues/1063


CMIIW, wouldn't enforced HTTPS mean you can't use intranet or localhost URLs?


you could use a flag to re-enable http :)


More than likely programming as a whole will get better because of this...

Do you trust this thing? Better off developing it yourself, or working with something you trust then.


Deno requires that you explicitly tell the process which permissions it has. I think it's much better than praying that a package hasn't gone rogue, like with Node. If you don't trust the remote script, run it without any permissions and capture the output. Using multiple processes with explicit permissions is much safer.
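A minimal sketch of what that looks like in practice (the script name and host are made up):

    // fetch_status.ts -- only needs network access to one host
    const res = await fetch("https://example.com/status");
    console.log(res.status);

Run it with a scoped grant, e.g. `deno run --allow-net=example.com fetch_status.ts`; any attempt to read files or reach other hosts is refused by the runtime.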


I'm wondering about the practicality of importing from URLs. I didn't see it addressed, but an import like this will be awfully hard to remember.

    import { serve } from "https://deno.land/std@0.50.0/http/server.ts";
Anyone know if there are alternatives or a plan for this aside from "use an IDE to remember it for you"?


The convention is to make a `deps.ts` and re-export what you need. Like this: https://deno.land/x/collections/deps.ts

I don't find versioned URLs much more difficult to work with than package@<version> though.
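Something along these lines (a sketch, reusing the std URL from the example above):

    // deps.ts -- the one place where versions live
    export { serve } from "https://deno.land/std@0.50.0/http/server.ts";

    // elsewhere (e.g. server.ts):
    //   import { serve } from "./deps.ts";

Bumping a dependency is then a one-line change in deps.ts rather than a find-and-replace across the codebase.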


i'm wondering if they'll end up adding a 'dependencies.json' to eliminate the boilerplate from 'deps.ts' and to simplify tooling. that'd be revolutionary! ;)

jokes aside, i wonder how import-via-url will impact tooling. having to parse arbitrary JS (or even run it, for dynamic imports?) seems like it'd make writing a "list all dependencies" tool much harder than a "dumb" JSON/TOML/whatever file would. though i guess Go does a similar thing, and afaik they're fine


Well they do have import maps! I think everyone likes shorthand package names.


You are not alone, this is very unsafe in my humble opinion.


How is it any different than how it works in the browser?


Does it also terrify you when code running in a browser does it?


The code running in my browser isn't a multi-tenant production server, with access to the filesystem and DBs.


Except that with Deno, everything IO-related is turned off by default and has to be explicitly granted when the process is started. It's the first bullet point on the landing page.

Here is the page with more detail. https://deno.land/manual/getting_started/permissions

It can even restrict access down to a specific directory or host. This is cool.

Whereas any NPM module can map your subnet, lift your .ssh directory, and yoink environment variables, willy-nilly.

It's happened before.


That still doesn't prevent imported modules from yoinking anything you did grant access to, though. For instance, if my service connects to a DB then `uuid` can slurp the contents.

It'd be nice to have some capability model where modules can only access things through handles passed to them, but probably infeasible for a project like this.
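A hand-rolled approximation is possible today, if libraries agree to take handles instead of reaching for Deno.* themselves (a sketch; all the names below are hypothetical):

    // The application constructs a narrow capability...
    interface FsCapability {
      readFile(path: string): Promise<Uint8Array>;
    }

    function makeFsCapability(allowedDir: string): FsCapability {
      return {
        readFile(path: string) {
          if (!path.startsWith(allowedDir)) {
            return Promise.reject(new Error(`access outside ${allowedDir} denied`));
          }
          return Deno.readFile(path);
        },
      };
    }

    // ...and hands it only to the modules it trusts with that directory:
    // await processLogs(makeFsCapability("/var/log/myapp/"));

Nothing enforces it at the runtime level, though; a malicious module can still call Deno.readFile directly if the process was started with --allow-read.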


You can actually run things as Workers in Deno and get some sandboxing abilities: https://github.com/denoland/deno/blob/master/docs/runtime/wo...


From the article: "Also like browsers, code is executed in a secure sandbox by default. Scripts cannot access the hard drive, open network connections, or make any other potentially malicious actions without permission."


That just means you have to run with the --allow-net, --allow-read, etc. flags. But you are using those when writing any nontrivial Deno app like a webserver anyway.

"web browsers already do this ;)" isn't a good comparison.


"But I have to turn all that stuff on" is also not a good comparison.

Actually, no Deno webserver I've written gets fs access. Some only get --allow-net.


I think that's the main selling point of deno, sandboxing.


For the uninitiated, worth noting that one of the names on this post, Ryan Dahl, was the original node.js developer.


Also for the uninitiated, fly.io is migrating to deno for their Serverless functions: https://news.ycombinator.com/item?id=22621926


Precisely the use case that Deno should be used for. Serverless functions are a great use case for Deno right now.


[flagged]


"Automatic asynchronicity[sic]"?! Node and all async io is cooperative multi-tasking, the opposite of automatic. The node event loop is a thin wrapper around the epoll() system-call (or one of its equivalents), and leaves the details of multi-tasking to the app developer. It's no wonder you weren't able to scale under node if you thought something was happening automatically for you!


Maybe he meant forced async, as potentially blocking DOM/Node operations are exposed as async/callback-based APIs, since that was the only multitasking model available (or it was, before workers)


Why do you say this?


It would be great if this line were at the top:

>> Deno is a new runtime for executing JavaScript and TypeScript outside of the web browser.


Thanks. Came looking for this post. Way too many web developers just assume that everyone follows every nook and cranny of their ecosystem.


> In Deno, sockets are still asynchronous, but receiving new data requires users to explicitly read()

Interesting. If I understand correctly, they're essentially using pull streams[0]/reactive streams[1]. I compiled a few resources on this topic when I was digging into it a while back[2]. I've found the mental model to be very elegant to work with when needing backpressure in asynchronous systems.

As for the dependencies-as-URLs, I don't mind it, and may prefer it. I've been experimenting with minimizing my dependencies lately, and vendoring them in git submodules. It's worked fairly well.

[0]: https://github.com/pull-stream/pull-stream

[1]: https://github.com/reactive-streams/reactive-streams-jvm

[2]: https://github.com/omnistreams/omnistreams-spec


I would call it: "They are essentially using the plain async conversions of regular system calls and Stream (e.g. known from Java/.Net) APIs"

All the reactive streams stuff was still push streams, just with some backpressure bolted on. The issue was that without async/await, you always end up with some kind of callback model, which again results in a push model.

Whereas with async/await you can just mostly model IO like any kind of synchronous IO.


When describing the input stream back-pressure problem:

> To mitigate this problem, a pause() method was added. This could solve the problem, but it required extra code; and since the flooding issue only presents itself when the process is very busy, many Node programs can be flooded with data. The result is a system with bad tail latency.

Are they saying that even with correct use of pause() there are still issues?


They're saying that since the issue doesn't present itself until it's too late, most code does not use pause().


I'm not sure this is referencing the js api, which will be based around promises like node. I believe this is referencing the rust implementation, which is built on top of tokio and rust's async/await.


Is this different from how streams work in most other languages, e.g. Java, Go, Python?


Most languages block by default, so backpressure is much easier to model: just don't read until you need more data and the sender will block.

But in JS the receiving end will just keep firing off data events until you pause it, or use something like reactive streams to request data as you're ready for it.

That's my understanding of the situation at least.
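In Deno terms, a rough sketch of the pull side (needs --allow-net; the port is arbitrary):

    const listener = Deno.listen({ port: 8080 });
    for await (const conn of listener) {
      const buf = new Uint8Array(1024);
      // nothing is pulled off the socket until read() is called;
      // it resolves with the number of bytes read, or null at EOF
      const n = await conn.read(buf);
      if (n !== null) {
        await conn.write(buf.subarray(0, n)); // echo back what arrived
      }
      conn.close();
    }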


Of all the problems with NPM, it being centralized is the last of them imo. Having experienced the mess that is go's decentralized dependency management, I'm not sure why anyone would want to replicate it.

Putting aside security, availability and mutability is a massive problem, anyone can stop hosting their module, or worse, change an existing published module at any time.

Why not take some inspiration from maven central, and run a central repo that actually provides some validation on the quality of and consistency of published artifacts.


> Why not take some inspiration from maven central, and run a central repo that actually provides some validation on the quality of and consistency of published artifacts.

Because then it would be hard to have a new hot framework every day.


Deno has the best of both worlds, as I see it.

Users who need centralization / QA from their packaging ecosystem can always opt in to a centralized repository.


Maven is an excellent model to emulate, IMO as well.


So it's an alternative to Node? Sounds incredibly cool, honestly. This:

> Supports TypeScript out of the box.

Seems like a small thing, but it has me interested all on its own. It's a huge pain to set up a whole build process just for TypeScript (which is generally the only preprocessing you need outside of the browser).


You'll like https://www.npmjs.com/package/ts-node - it allows zero processing use of typescript


Word of warning though - ts-node can be excruciatingly slow. We recently switched a project from using ts-node in our dev environment to compiling with tsc and running with node, and shaved around 5 minutes from our startup time.


Well, that answers something I was wondering about recently.

I noticed in a couple of popular TypeScript (+React fullstack) boilerplates, that they were using ts-node to run the server in production.

Unlike babel-node, there's no mention in the documentation to avoid using it in production - but I figured there'd be performance impact, since it's transpiling on the fly (I suppose just once per require).


You had 5 minutes startup time?!


More than one second and people will test their code less often. Five minutes and people start relying on the type-checker. Like with the chicken-and-egg problem: what came first, the type-checker or the need to have a type-checker?


Zero processing is a little generous; for things like ES Module support (and interop), ts-node can be a struggle to get going.


As in .mjs files? Personally, I just want to import my own .js / .ts files and external npm packages which works well - https://github.com/gunn/covid-19-scripts/blob/023579e1cf/get...

The catch for me is that it's probably not suitable for a server in production, as mentioned elsewhere in this thread.


It doesn't really help that much with TypeScript-native NPM packages being published as JS plus .d.ts (TypeScript definitions) instead of the original source, does it?


Yes it's great! My deno projects are really simple directories, `deno run server.ts` is nice and tidy.


One question.

> The browser provides APIs for accessing cameras and microphones, but users must first give permission. Deno provides analogous behaviour in the terminal.

I read this and I started looking around for the camera API or maybe for the Audio API. And the thing is that I can't seem to find anything about it. I can't see anything about it in "The Manual" or in the API reference.

Then I thought that there may not be documentation because it just mimics the browser's API. Ok, but... there must be some command-line flag to give permission to it, right? Can't find it either. Maybe "it was just an example; there's no media API just yet"?

But then I set out to find available command-line flags in general. And I can't find those either. There's this [0] but is that all? There's just --allow-net, --allow-read and --allow-write? Or is there some other place where the available permission flags are listed?

[0] https://deno.land/manual/getting_started/permissions


I don't see it listed in the docs, but try running `deno run -h` to see the command line help. It should produce output like:

    -A, --allow-all                    Allow all permissions
        --allow-env                    Allow environment access
        --allow-hrtime                 Allow high resolution time measurement
        --allow-net=<allow-net>        Allow network access
        --allow-plugin                 Allow loading plugins
        --allow-read=<allow-read>      Allow file system read access
        --allow-run                    Allow running subprocesses
        --allow-write=<allow-write>    Allow file system write access
etc


Thanks.


Deno does not provide API to Video & Audio. It was just an analogy.


I think they could've used an analogy using a feature they do actually support, then.


Would it still be an analogy though?


Confirming that there is no media API in Deno.


Thanks.


> Internally Deno uses Microsoft's TypeScript compiler to check types and produce JavaScript. Compared to the time it takes V8 to parse JavaScript, it is very slow.

> Early on in the project we had hoped that "V8 Snapshots" would provide significant improvements here. Snapshots have certainly helped but it's still unsatisfyingly slow. We certainly think there are improvements that can be done here on top of the existing TypeScript compiler, but it's clear to us that ultimately the type checking needs to be implemented in Rust.

> This will be a massive undertaking and will not happen any time soon; but it would provide order of magnitude performance improvements in a critical path experienced by developers. TSC must be ported to Rust. If you're interested in collaborating on this problem, please get in touch.


Deno really shouldn't run TypeScript files directly. Not only is TypeScript too slow for this, it receives far too many breaking changes.

How will Deno decide when to upgrade its TypeScript compiler version? Will Deno have to have breaking changes every three months or so?

Also, Deno appears to allow importing TypeScript files with .ts extensions while tsc doesn't. This alone means the same code won't run in Deno and compile with tsc.

JavaScript at Stage 3 and above is a stable compile target, and Deno should just run that.


I would rather have Clojurescript than Typescript, but absolutely, less is more, and it feels like a bad idea to tie implementation of this runtime to another dependency.

My love of the C++, Java, C#, ad nauseum, lineage of Enterprisey OOP languages turned to hate about 15 years ago, though, so take my assessment with a grain of salt.

“C with classes” seemed like a good idea, we were told back in the 80s in school, since garbage collection is impractical. Er, yeah. Worse is better...


GC is impractical in many domains.

And in the 80s it was even more so due to CPU and RAM performance.


I’m a huge TS fan and I agree. Deno is just running tsc for me it seems. So far I don’t see much advantage. It also potentially ties Deno to current trends. TS is pretty dang popular, but what if something else comes along and scoops it?


Note that Deno started out as a TypeScript runtime; it was its tagline, and they added the JavaScript part later.

> TS is pretty dang popular, but what if something else comes along and scoops it?

Earth is rather small; type-inferencing a dynamic language is really hard. There is simply no other language or team that has achieved anything like TS. It's very unlikely that anyone else will do it for JS again. You can just look at the amount of work in TS already.

The only thing that can "scoop it" is another language entirely taking off and leaving JS developers behind. In that case the point becomes moot; it can happen to any runtime for any given language.


I read in a comment somewhere else in this post that you can just point deno at JS files to skip the TS compiler. I agree it's probably not a good idea to couple the Typescript version with the runner.


It's just compiling TypeScript into JS when you import it, and uses tsc to do it. It can load JS just fine as well if you want to use Babel or similar.


I realize that it loads JS.

The issue is that the TypeScript it loads today, it might not load in the future.


My workaround is to use // @ts-ignore above the import, but this is clunky. I just tried Deno and found that TypeScript doesn't support .ts extensions in import paths yet.


A note on Deno and the safety of its V8 bindings:

> All of the V8 source code is distributed in the crate itself. Finally rusty_v8 attempts to be a safe interface. It's not yet 100% safe, but we're getting close. Being able to interact with a VM as complex as V8 in a safe way is quite amazing and has allowed us to discover many difficult bugs in Deno itself.

We'll also have to wait and see who will be the first to have a production-ready JavaScript engine written in Rust that could replace V8, if you're talking about safety.

Even with the "safe" bindings, you will still get the same C and C++ vulnerabilities found in V8 that will also be present in Deno.


For this sort of problem, Rust's safety guarantees don't buy you all that much. If you're generating machine code and implementing a GC, that's not the sort of thing that Rust's type system can prove correct.

That said, a rust JS runtime would be amazing just because it's easier to integrate into rust projects.


Well, sure, but a big chunk of the bugs in JRE were in the library, not the code generation or GC.


Speaking of: Neon, a system for writing Node libraries in Rust is pretty cool:

https://neon-bindings.com/


Not sure exactly how much it buys you but I suspect there’s still room for research especially since Mozilla themselves has interest in a Rust JS JIT. I am curious to see if HolyJIT goes anywhere and what kinds of safety improvements it could potentially offer.


Deno plans to not expose V8 internals, unlike node. I do not know where the V8 vulnerabilities came from, but it should decrease the attack surface.


But we have to admit, that most people will be happy enough with a Node.js with browser APIs.

TypeScript is just an extra.


esbuild can compile typescript a couple orders of magnitude faster: https://github.com/evanw/esbuild/issues/81#issuecomment-6250...

though i think it just strips the typescript annotations. but this may be workable if during dev there's some kind of guarantee that the typechecking has already been done.


Massive undertaking is a big understatement. It would be better to not tie into TS/Flow and let one of the two biggest tech companies implement the best, fastest typechecker.


What is the ongoing performance cost of using the official TypeScript compiler for long-running applications? Or is this primarily a concern of startup time for scripts and short-lived programs like CLIs?


AFAIK, tsc is just a compiler, not a runtime. Once you compiled, you're just executing javascript so it only impacts startup time.

ts-node is a runtime able to run TypeScript. It is definitely much much slower at execution, not just at startup time. It's useful for hacking around, I use it as a REPL but even for a dev environment it's faster to use tsc's incremental compilation with a file watcher, and execute the resulting JS


It's purely in the compilation step, V8 executes javascript.


>> Internally Deno uses Microsoft's TypeScript compiler to check types and produce JavaScript. Compared to the time it takes V8 to parse JavaScript, it is very slow.

>> We certainly think there are improvements that can be done here on top of the existing TypeScript compiler, but it's clear to us that ultimately the type checking needs to be implemented in Rust.

Funny, I was just talking about something like this in an earlier TypeScript discussion today. I was saying that I don't understand why Microsoft doesn't have their own native TypeScript runtime engine by now.

I have nothing against TypeScript as a language but I hate the TypeScript transpiler; it should be packaged as an engine, not as a transpiler. I just hate having to debug mangled transpiled code and dealing with sourcemaps. I want to be able to run and debug my own code, not some alternative mangled version of it.

I think Deno is a promising project in the sense that they seem to understand the drawbacks of transpilation and are actually trying to provide TypeScript support which feels native. Everyone else seems to be ignoring this problem including the creators of TypeScript.


> I don't understand why Microsoft doesn't have their own native TypeScript runtime engine by now

what would they do with it? ship it in Edge? great, now some small fraction of users can run TS natively. but most can't, so everyone would still transpile to JS anyway...

it could work if Google did it, but i don't think MS has enough market share to have an influence here


It would work for server-side tools (like Deno). For example, cloud functions on various cloud platforms often support TS.


yeah but i don't think MS uses TS-on-the-server enough or has enough Azure TS users to justify putting in the effort


They have ChakraCore (which is JS), but the thing about a "TypeScript runtime" is it's not that different from a JS runtime. TypeScript is a superset of JavaScript, so it already has to do the JS runtime work. V8 is pretty good at that.


>I just hate having to debug mangled transpiled code and dealing with sourcemaps. I want to be able to run and debug my own code, not some alternative mangled version of it.

Look into ts-node. It lets me run and debug the TypeScript itself before transpiling it.


ts-node still transpiles under the hood, just as this does.


A number of things come to mind:

- How do you update dependencies? They are URLs spread across many files, which can be anywhere... Do I have to find and replace every import statement?

- In some enterprise environments, we use mirroring of package distributors (Nexus, jfrog, etc.). This gives us the ability to audit what packages are being imported, create a cache of packages so that a single delete or unpublish (like good old leftpad) won't break all applications, etc. Since package locations are hard-coded in the package files, it becomes challenging to do this.

The argument that Deno is mimicking a browser is thin. Browsers are client-side; they don't access databases or live inside your firewall. Server applications are much more sensitive to security issues than browsers.


A lot of comments here can be answered by reading the documentation.

https://deno.land/manual/linking_to_external_code#faq


I briefly looked over this project when this link first popped up and didn't think much of it, but then I was surprised to see this huge surge in votes. I don't do much in the JavaScript and related world - can someone explain what in particular about this project has generated such interest? Even after reading the top comments, I feel like I'm missing the bigger picture.


>>I dont do much in the javascript and related world - can someone explain what in particular about this project has generated such interest?

The majority of the interest lies in the fact that Ryan Dahl[1] was the original creator of Node.JS[2], which is currently a very popular Javascript runtime and web backend.

Dahl released the initial version of Node.JS in 2009[3]. After a decade of experience working on Node.JS and growing the community, Dahl decided to create an alternative Javascript (and Typescript) runtime in mid/late 2018 called Deno which is now v1.0.

[1] https://en.wikipedia.org/wiki/Ryan_Dahl

[2] https://nodejs.org/en/

[3] https://en.wikipedia.org/wiki/Node.js


> After a decade of experience working on Node.JS and growing the community, Dahl decided to create an alternative Javascript (and Typescript) runtime in mid/late 2018 called Deno which is now v1.0.

Just to nit-pick: Ryan Dahl had stepped away from node and the surrounding ecosystem in 2012. So for the vast majority of node's lifetime, he hasn't really been involved. Node in 2012 was a very different project.

(That doesn't take away from his observations or from the things that deno does differently. Just trying to reduce confusion.)


I think the goal was to eradicate JavaScript (Deno was originally TypeScript only, but in order to run fast, TS first needs to compile to JavaScript). After buying GitHub and NPM, moving people over from Node.JS to Deno would be the final blow to the JavaScript community.


I don't think that's correct. From what I remember of watching his talk[0], Ryan is a fan of JavaScript. TypeScript gives optional typing, so you can still write normal JavaScript anyway. I don't think there was ever a plan to enforce types in Deno.

[0] https://www.youtube.com/watch?v=M3BM9TB-8yA


Congrats to the Deno team and community on reaching 1.0!

We've launched Deno support on repl.it here (in beta): https://repl.it/languages/deno for folks to try out.


This is super effing awesome.

You know what would be extra amazing? Incorporating (at some point in the future) all of the very smart stuff that the unison people have been doing. That would be _tremendous_ for code re-use and guarantees about remote packages. What I wouldn't give for a JS vm that could do all that. . . .

* https://www.unisonweb.org/docs/tour


I like that Deno prefers URLs as module specifiers. Need immutable/content-addressed dependency graphs? Publish and consume modules via IPFS.


Maybe I’m missing something, but the reason the concept behind Unison is so neat is that identifying libraries by AST allows for the kind of code reuse that Joe Armstrong was talking about. It’s not about URLs as identifiers; that’s superficial.

What’s the relevance of IPFS?


This is exciting news indeed. Deno hitting 1.0 still makes the ecosystem surrounding it quite young, but it's an important milestone and I'm excited to see what comes of it in the future.


For context:

Deno's lead is Ryan Dahl, the creator of Node.

In the past, Dahl's expressed regrets over choices he made early on in Node's development, and on the direction Node has gone since he left the project many years ago.

He bravely presented on this topic at jsconf eu, 2018: https://www.youtube.com/watch?v=M3BM9TB-8yA, it's a fantastic talk.

The last 10 minutes are a pitch for what a "better Node" would look like, if he were to start from scratch in 2018. The end result of that train of thought is this project, which we should probably think of as bourne-again Node.

Deno's development is (presumably) strongly influenced by his experience and frustrations with Node's shortcomings in both performance and developer UX.


And now, he'll make a whole different set of good and bad choices to potentially regret later.


such is the curse of having the audacity to develop software :)


Sure, but there's a tendency to start over when the development gets hard to maintain or support instead of just fixing the mistakes.

This really feels like the fundamental response in the js world and why we see much churn.


But isn't this an inherent problem in software development that is really super hard to not do?

But isn't this an inherent problem in software development that is really super hard to avoid? Let's say you are deciding how to make node when it was first conceived and how it would work. You've made decisions about how the thing fundamentally works. Then, after using it and developing for it for many years, and after having millions of critical software projects dependent on it, you slowly start to see the shortcomings of the software that you could only see at this stage. The problem is that these shortcomings come from a false assumption or solution to the fundamental problems you had to solve when making node. Now, the only way to fix node is to change how it fundamentally works. But if you do that, millions of users' code will break. So, do you require everyone now to fix their broken code and potentially piss everyone off? Or do you give people a choice? Stick with node if you aren't noticing any of those fundamental problems you found, or switch to the new thing on your own time? It's a question of how to affect the least number of people in the least negative way.


I think the answer is a break in backwards compatibility on a major version release.

People in JS/frontend world are willing to drop the world for the latest new thing, I see this as less jarring than, say, Python 2 -> Python 3.


I find it amusing that you use the example of Python 2 -> Python 3, a breaking change in a widely used language, that has famously been very difficult and long for organisations to deal with.

Compare that with javascript which has never had a breaking change. On top of that Typescript is a backwards compatible superset of javascript.

More to the point, Ryan has a humble explanation of what regrets he has about Node.js[1], why they exist and in some cases why there isn't an easy fix.

The point that I assume you're making, that sometimes it is better to spend significant energy to fix something, rather than throwing the baby out with the bath water, is a good one. However I'd suggest this is not one of those cases.

[1] https://www.youtube.com/watch?v=M3BM9TB-8yA


> I find it amusing that you use the example of Python 2 -> Python 3, a breaking change in a widely used language, that has famously been very difficult and long for organisations to deal with.

Why is that amusing? I specifically chose that example for that exact reason. I was highlighting the difference in the audience and use case.

> However I'd suggest this is not one of those cases.

I don't see the argument that supports that, either in the post or your reply.

The thing is, I can see Beepboo 1.0 being announced in 2025 to address the things that went wrong with deno. Because there will be design mistakes. And at what point do you say 'oh too many people rely on this software to fix this, I have to start over'?

Couple this with a very real trend-chasing and resume pushing in frontend dev and I'm starting to understand why people are so cynical about this stuff.

Typescript is something more palatable to me because it wasn't throwing the baby out with the bathwater.


> Why is that amusing? I specifically...

My apologies I misread.

> ... there's a tendency to start over when the development gets hard to maintain or support instead of just fixing the mistakes.

The thought that Node.js should have been 'fixed' instead of creating Deno is where I disagree. At a glance I can see a few reasons:

- Node.js maintainers + community may not even think there is something to be fixed (see various discussions in this thread about module resolutions)

- Politics, death by committee, inertia

- Effectively a dependency with npm registry (although not technically)

- Lack of backwards compatibility with changes (e.g. module resolution)

> The thing is, I can see Beepboo 1.0 being announced in 2025

Node.js was initially released in 2009 so it's probably fairer to suggest Beepboo 1.0 will be released in 2030. And yes, if it improved on Deno and solved inherent problems that couldn't be solved internally, I would wholeheartedly cheer it along.

I think it's also worth mentioning that Node.js is at a level of stability and maturity that people who plan to and have already built on it, aren't left abandoned.


> People in JS/frontend world are willing to drop the world for the latest new thing

Except there aren't any breaking changes in JavaScript, are there? Even in Node if anything is deprecated that is done over time in many years.


When other people go ahead and build billion dollar companies on top of your development you can't "just fix the mistakes"


Why not? Billion dollar companies were built on flash, on asp, on Perl ...

Software changes, languages change. Billion dollar companies adapt or migrate.


Not necessarily. I prefer trying to improve what we have verses making a new thing every time we have a relatively minor disagreement, even if that disagreement is with our past self.

EDIT: I'd like to add that clearly he is able to spend his time as he wishes.


I generally agree. Although, some of these are pretty "opinionated" breaking changes (promises all the way down, package manager changes). It would be hard to convince the whole node community that these upgrades are worth the risk forking in a python 2/3-esque way.

Forcing TS is a change node could adopt in the next major version if everybody agreed, but the node community might be too big and diverse at this point, to make such an opinionated switch.


afaik modifying node doesn't make sense here – it'd be so incompatible that it'd effectively be "a new thing" anyway


Good that it’s called Deno, not Done!


> The last 10 minutes are a pitch for what a "better Node" would look like, if he were to start from scratch in 2018.

Is it fair to say (a managed) deno is what Cloudflare Workers is? If not, what would be key differences between them?


PM on part of Cloudflare Workers and someone who was in physical attendance for this talk here.

They're not really directly comparable other than "A JavaScript runtime built on top of V8."

Workers doesn't support TS directly, though you can compile TS to JS and run it, of course. (My team maintains a worker and this is what we do, and it works well)

Deno has its own APIs, as does Workers. Worker's main API is the Service Worker API, Deno's is not.

Workers is focused on V8 isolates as a means of achieving multi-tenancy. I don’t believe Deno does anything specific here.

Deno is mostly implemented in Rust, the Workers runtime is written in C++.

Deno is open source, Workers is not.

Workers is being used at scale in production, Deno just launched its 1.0.

I am very excited to see what happens with Deno. :) Fun history: I had been dreaming about doing "Chakra Core + Tokio" a few years back, but didn't find the time. I'm skeptical of the dependency approach Deno is taking, we'll see what happens!


> Workers is focused on V8 isolates as a means of achieving multi-tenancy. I don’t believe Deno does anything specific here.

Deno implements the web worker API, which launches different isolates. You could implement something kind of like CF Workers in pure TypeScript, but probably not replicate your resource enforcement.

It's also a pretty good Rust v8 implementation. Before we (fly.io) abandoned our JS runtime we were rebuilding it with that.

Ironically, we also tried Chakra Core + Tokio. It sure would be nice to have a not-google JS engine.


Thanks for confirming, that’s what I meant by “specific”, I was guessing they implemented the spec. It’s just a very big focus of Workers, and I don’t think it’s a focus of Deno. Not good or bad, just a difference.

Ah interesting! Did you abandon it for reasons other than “deno exists”? Would love to hear more about how it turned out, good or bad.


We tossed it because people needed to do much heavier compute than we expected and we realized running arbitrary executables was more useful. It wasn't technically bad, just wrong for our customers.

Deno’s existence gives me hope we can bring back the JS API, I'd love to have nginx-as-a-TypeScript-API.


> which we should probably think of as bourne-again node.

Or maybe as Perl 6


This project looks really cool to me.

I'm glad the link to the video is there, because my intuition about pronouncing the name was incorrect.

It looks like it could be "deeno" OR "denno", and I was pushed to the former by the presence of the dinosaur graphic. Isn't that old-fashioned dinosaur on the Flintstones pronounced deeno? Anywho... good name overall: short, no collisions (i think?), and no strong baseline associations.

And only right this instant am I realizing that this is an anagram of node... oh my god i'm slow. Now it's a great name.


What video? Mind linking directly?


I think he meant Ryan Dahl's talk: Design Mistakes in Node https://www.youtube.com/watch?v=M3BM9TB-8yA


- "Rust has its own promise-like abstraction, called Futures. Through the "op" abstraction, Deno makes it easy to bind Rust future-based APIs into JavaScript promises."

That's exciting! I wonder if that binding is public for end users to use, or if it's an internal design


This exists for wasm too; it’s super easy with wasm-bindgen. I wrote a program with multiple layers, worked flawlessly. (Promise in a future in a promise created by Rust and passed back to JS)


Given that all API calls in JS have to go through the interpreter, couldn't unsafe API calls have an implicit parameter added that requires a per-package token, with the API accessed through an interface requested by the package that pins that interface to that token? Additionally, you could allow package authors to set their token as a cryptographic public key from a package manifest, so they could grant access to the API to sub-packages within their own ecosystem. If you passed the API interfacing object to a package outside of your control, the calling code's token wouldn't match the API interfacing object's expected token and would kick a security access exception. This would be completely transparent to the package author. With such a system, access to unsafe APIs could be granted to only a shortlist of packages.

Something like this is already done with only allowing access to certain APIs if they are called from certain types of event handlers, until the callstack of the event handler is left.


The fundamental problem of dependency management, as anyone who ever kept bookmarks knows, is that the external sources might go away at any time. It's not a problem unique to Deno or Go. Take Java for example, its standard dependency management tool is Maven, which relies on external repositories of artefacts to function. From time to time, a repository might go down, or it might be moved to another address, or a library might be migrated to another repository. Whenever this happens, your build might break. The solution is to proxy all requests at the organizational level through your own server which keeps a mirror of any package that has been referenced once. That way, if it become unavailable externally, then you can still use it internally. You can also override packages with patched versions if they misbehave.

I think that Deno should define a standard dependency fetching protocol, and allow proxying transparently at a native level. If it does, that kind of dependency management will be fine.


> A hello-world Deno HTTP server does about 25k requests per second with a max latency of 1.3 milliseconds. A comparable Node program does 34k requests per second with a rather erratic max latency between 2 and 300 milliseconds.

Under-mentioned feature. I may migrate my personal site to Deno just for that latency drop.


Yeah it seems like the p99 latency of node, which isn’t great, doesn’t get mentioned much.


if you require modules by URL, how does it make sure that the URL always contains the same library?

I've read the docs and it says it caches on the initial execution and doesn't update unless it's forced to update, but what happens when you, for example, publish a Deno module to GitHub, someone else downloads it and runs it, and it turns out the URL contains a completely different library at the time they run it?


This is something that we'll further work out in future versions. For now you can use a lockfile: https://deno.land/manual/linking_to_external_code/integrity_...


> Deno can store and check module subresource integrity for modules using a small JSON file. Use the --lock=lock.json to enable and specify lock file checking. To update or create a lock use --lock=lock.json --lock-write.

HALLELUJAH that there is a clear, simple separation between (a) when you expect a lock file to be checked to guarantee integrity and (b) when you want it to be generated. The complete insanity that was npm shrinkwrap and lockfiles for years, summed up in this stackoverflow post https://stackoverflow.com/questions/45022048/why-does-npm-in... , always baffled me in that it seemed like it could have so easily been avoided by being explicit about when you're writing a lockfile vs. when you're using it.

That said, why not be even MORE explicit about it, i.e. "--use-lock=lock.json" vs. "--write-lock=lock.json"?


This looks really promising.

However, it still seems risky to me in case a library is not available. Is there a central repository planned? Or are you expected to vendor everything and ship your project with dependencies included?


Instead of referring to the version could you refer to the hash? Then a simple integrity checker could confirm file changes.


The import from url feature is drawing a lot of initial skepticism - but I think once you look closer at it there are even more things to be wary of.

The initial thing is the fear "whoa, that is hella insecure!", which, I agree, NPM has basically the same problem... although, with the NPM + Git system, you can (1) fork a version and use that fork (which you can do with this too, just host a different URL), or (2) freeze the version you are using and theoretically only update after checking stuff out (I guess you would have to do the same by hosting a different URL, which seems like a more difficult process).

Also NPM basically makes a local cache of the files you will be importing, I guess the first time you run your program it must get the files and then caching keeps them from updating until the resource updates? Maybe I'm missing something and it isn't like that but if it is like that I find that weird because you are arbitrarily adding extra performance overhead to parts of your system as code loads and sees it needs to get a resource that has updated? I guess what would end up happening is versioning would be in the url and an expires header set a long time in the future, like you might do with static assets now, but Deno doesn't require that, it will be an outgrowth of this import from url system. Will there be situations where someone has done the caching poorly, or set the header badly or whatever and you are loading a resource too often? I would expect so, anyway I guess there will have to be testing of Deno's caching https://www.mnot.net/blog/2017/03/16/browser-caching

Finally, in NPM's system if a module I'm importing is a little bit weird I can always just go into the folder quickly and start debugging and maybe fixing the code, and then when I've got things working the way I think they should with changes do an actual fork pull-request, or just fork and use my fork of the code, or if it turns out as it often does that I've misunderstood something revert my changes to their code and go fix my code instead. I can of course still achieve the same effect with Deno but the workflow would have to be different and I think would end up being more convoluted, at least in the beginning.

These are the initial worries that I get when seeing import from some url. I suppose someone has already thought these things through and I'm unnecessarily worried so if that someone is you (the reader of this long post) you can maybe assuage my worries.


Well, it has a huge security hole that ry won't fix: https://github.com/denoland/deno/issues/1063


I would like to write a long damn with an exclamation point at the end but the margins of my screen are insufficient to hold it.


What is the reason for making a JavaScript runtime based on browser APIs that cannot also be a browser?

Or in other words, wouldn't it have been easier and better to make an optionally headless version of the Servo browser with additional native APIs and some enhancements like being able to run JavaScript directly in addition to HTML?

The choice made means that Deno can't be used, at least directly, to make desktop applications, and also doesn't have all the features that browsers have for free, like multiprocess and multiple sandboxes, network traffic inspector, local storage, GPU compute, etc.


The JavaScript engine has a much smaller code surface area than an entire browser rendering engine.


This is a very interesting idea - basically the new JVM is a headless browser. Fascinating to think of the possibilities.


I feel explicit location imports are a step backwards. The Node ecosystem is moving toward workspaces and Yarn Plug'n'Play style manifest lookups for deps. De-coupling the import from where the dependency is physically satisfied seems desirable, and it is a common next-level pattern in project and dependency management across language ecosystems.

Sure, I guess you could shim it somehow like is happening in the node ecosystem but then.. I'm not sure what the stock situation brings to the table TBH.


As a functional developer who doesn't care for TypeScript in any way, it's frustrating to see Deno has it "built-in".

As a node developer writing pure non-compiled JavaScript, my turnaround from changing code to seeing results is extremely fast. It takes milliseconds for me to run brand new code in my terminal.

If I make an index.ts file simply adding two numbers together (with no TypeScript), there is a 1 to 2 second delay while it "compiles" my "TypeScript". (Again I don't write TypeScript so I don't need this feature). I should note, subsequent runs will be faster, but as I'm developing I don't want delays between my runs.

Since it's not even native TypeScript - it just embeds tsc, which builds and caches the compiled code - why not let TypeScript developers add their own build pipeline the same way they always have?

Anyway, I'm very interested in the fact it's built in Rust and seems to be faster than node after compilation in many aspects. Congratulations to the Deno guys and thanks for all the effort.

[edit] Correcting myself. I was wrong. As @jon889 informed me, you can actually run a `.js` file and it will skip compilation. Again I remain that the TypeScript compiler should not have been included, but the fact I don't have to use it completely mitigates my main concern.


> As a node developer writing pure non-compiled JavaScript, my turnaround from changing code to seeing results is extremely fast. It takes milliseconds for me to run brand new code in my terminal.

Deno runs JavaScript faster than node:

   echo "console.log(1 + 2)" > not-ts.js
   time deno run not-ts.js
   3
   0.01s user 0.01s system 89% cpu 0.025 total
   time node not-ts.js
   3
   0.03s user 0.08s system 53% cpu 0.203 total
> Since it's not even native TypeScript - it just embeds tsc, which builds and caches the compiled code - why not let TypeScript developers add their own build pipeline the same way they always have?

You can!


> Deno runs JavaScript faster than node

Yup, and that's really cool and exciting. My issue isn't with runtime timing. That's not what costs me money. Developer time is what costs money, and every change having to "recompile" my already raw JavaScript costs way more than saving 0.02s at runtime.


Can't you just pass it a .js file and it will skip the TypeScript compiling completely?


That's not gonna help if you're using any part of the standard library though (or presumably most 3rd-party modules), since it's written in TS, right?


I'd assume you can just use regular node modules in that case?


I assume their TS parts will be pre-compiled, as they should be.


Haha, yup. Thats much better. Thanks!


Haha that's what I was trying to demonstrate!


Since typescript has no runtime, it doesn't really need to be compiled, you just strip away all the type annotations. There are tools like SWC[1] that can process TS in milliseconds if you want to skip the type checking.

[1] https://swc-project.github.io/
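To make that concrete, a transpile-only pass just erases the annotations (a hand-written illustration, not SWC's actual output):

    // TypeScript in:
    function add(x: number, y: number): number {
      return x + y;
    }

    // JavaScript out - the annotations are simply dropped, no analysis needed:
    // function add(x, y) {
    //   return x + y;
    // }

(A few TS features such as enums and namespaces do emit real code, but plain annotations are pure erasure.)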


I'm a functional enthusiast and I find it much more enjoyable in TS. There is a lot of cool and interesting functional theory that only relates to types. Having the good type system that TS provides has also saved me so much time debugging. It's invaluable.


Any tips for how to work more functional programming into my TS code? I'm trying to learn more functional techniques.


I have found it good to learn about FP in general and then practice applying that to TS.


Don't ever think of trying a compiled language like C++ or Rust, then :)


I won't :P

But in all seriousness, Go compiles like lightning for me. But yeah, I think everyone has their own needs when they choose their tools. I choose Node for its speed at development and runtime.


Dunno why you’re downvoted. I’m in the same camp. Compilation speed is an important factor in choosing a language.


It's the inclusion of TypeScript by default that allows it to have a much nicer developer experience compared to Node/NPM.

And as others have noted: there's no downsides for those who wish to simply use straight JavaScript.


Just a random thought, but if they want speed they could consider making a TypeScript JIT instead of TypeScript -> JavaScript. Typed languages can run much faster than untyped ones. Maybe that's impossible with current TypeScript given that anything can be changed at runtime (e.g. 30 minutes into running something, code does Math.floor = ... etc.), but maybe some kind of escape analysis could help, or a push for some new keyword to mark code as safe to optimize.
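To illustrate the problem (a contrived sketch, but it's the kind of thing that blocks ahead-of-time assumptions):

    function area(r: number): number {
      return Math.floor(Math.PI * r * r); // a typed compiler would love to inline this
    }

    // ...much later at runtime, some library does this:
    (Math as any).floor = (x: number) => {
      console.log("floor called with", x);
      return x | 0;
    };

    // every assumption baked into a compiled `area` is now wrong
    console.log(area(2));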


Building a new JIT for a dynamic language, if it is supposed to be competitive with JS compilers, is a project several orders of magnitude bigger than building a runtime like this.

TypeScript's design is very much that of a typing layer over a dynamic language. Its unsoundness and various loopholes to escape typing mean it's not much help for typed compilation.


Microsoft _do_ have the research project Static TypeScript that does try to do this with a subset: https://www.microsoft.com/en-us/research/publication/static-...

But your point stands - I think this subset will be incompatible with how many people (including myself) write TypeScript.


Won’t Meteor.js make this project extinct?


Clever ;)


I predict cdnjs/jsdelivr-style CDNs will explode in use for Deno packages, since they can make bandwidth and security guarantees about what's actually returned from the URL, and can accommodate versioning in the path. That seems to be what this dep system is built for, unless I'm missing something.


If you need a server framework for Deno, please take a moment to check out Pogo. It's got a powerful and friendly API. It also has the best documentation and plenty of tests.

https://github.com/sholladay/pogo


What exactly does Deno do? I still can't figure it out: is it a secure version of Node, if so, how?


It’s an alternative to Node. If Node is Photoshop, then Deno is Pixelmator. They both take in similar kinds of input and make similar things, but have very different approaches to how the tooling is built and used.


Good explanation, thanks!


Using TypeScript makes it safer. Also, I believe we can trust the authors, because they wrote Node (or were heavily involved at the beginning).



Well, there is a known MITM vuln in Deno by design and the team refuses to fix it soooooo

REF: https://github.com/denoland/deno/issues/1063


Lockfiles should be used for production code, as per https://deno.land/manual/linking_to_external_code/integrity_...


Can checksums/hashes be specified directly in the source file?

EDIT: I mean hashes of dependencies.

That is important for single-file scripts, if this is meant to be useful as a bash replacement for scripting.

Having to download two separate files for a script and execute it with special arguments already adds too much friction to that scenario.


>Can checksums/hashes be specified directly in the source file?

That would defeat their point actually :D A malicious attacker on the network could inject any script by replacing modules that are downloaded over http.


How? Say I have `script.ts`. That file exists locally on my computer and the code inside it is trusted (say it was downloaded from a trusted github project via https). It contains

    import { dependency } from 'http://whatever.url/@1.0.3'
        with hash '6f09aa686a6263f9e992'
or something like that. If an attacker replaces stuff during transfer of the dependency, then the hash won't match (assuming a collision resistant hash function). This can be transitive if the dependency also has hashes of its dependencies directly inside it (and a "only allow that" mode could enforce it).

The additional benefit is that you also don't need to trust the server not to swap out the code behind a URL you have previously verified. So it's like a lockfile, but without needing to download a separate file (because you also want verification on the first run) or pass a command-line argument on launch - which makes it a much more viable replacement for bash scripts (in fact, that's why I care about this ability; I wouldn't mind https-only). Am I missing something here?


Oh, I did not get what you meant ^^ well, there are still other issues than integrity with not using https.


Yeah, I'll edit the original comment to clarify that I mean hashes of dependencies.


There are a lot of issues with using a non-secure protocol to do anything over the internet; someone actually summarized them on the Deno issue tracker.

According to them, confidentiality is also at risk. Someone could also send you garbage that would pollute Deno's memory until it explodes.


That is not enough at all, and there are other attacks! I can't believe that in 2020 some people still need to have it explained why not enforcing https is a terrible thing!

For instance, will a lockfile prevent someone from eavesdropping on the download of a module over http? If so, please kindly tell me how!


https prevents MITM but doesn't prevent the modules being backdoored or otherwise altered at the source.

I would prefer https-only, sure, but it doesn't buy you very much security.


Well, disabling http by default is basically "Internet 101" here.

I don't want to write a full lecture on how many attacks are possible when people don't use https. It has been common knowledge for well over a decade.


That's really just a HTTP problem though, isn't it? If you use HTTP, you're exposing yourself to MITM attacks; that's on you.

Is there any possible way to face this vulnerability without either 1) linking to a resource over HTTP or 2) loading a resource from someone else who linked to another resource over HTTP?

Case one is the dev's fault for doing it; case two is also the dev's fault for not even checking their dependencies.

As infrastructure grows, there will be tools that either extend the environment to block/log HTTP stuff, or catch HTTP URIs in static analysis. Both of these, especially in combination, would be more than enough.


Well, most of Deno's marketing is that it is safe by default. In 2020, not enforcing a secure protocol to share source code is a no-go. I really don't get your point here, defending something that hasn't made any sense since the end of the last decade.

Knowingly leaving this kind of thing in the codebase totally invalidates Deno's claim to be secure. There is no possible discussion in 2020 about https not having to be enforced. People who think otherwise should not be allowed near a computer, for their own good.


Would you say linux is insecure because a user can download an arbitrary shell script and run it?

I know it's not an identical problem, but it does demonstrate that we probably agree that the onus is on the user to assess the risk of any arbitrary code they run on their machine, including the risk associated with the transport they use to obtain that code.

Funnily enough I actually agree with you that I would prefer to prevent http imports by default. However doing so won't make importing a library secure, and conversely allowing it doesn't mean it is insecure.

As an aside, I noticed you have posted the same one line message about the risk of a MITM attack with http imports 4 times in this thread. You might find it more helpful to contribute to the discussion by explaining why you think that.


> Would you say linux is insecure because a user can download an arbitrary shell script and run it?

Linux is not branded as a "secure thing", right? Here Deno is building its marketing on something inaccurate.


Most people describe Linux as a much safer OS even though windows puts more restrictions on running code from the internet (to the extent of marking downloaded files as potentially dangerous and asking if you really want to execute them). I would totally understand if HTTP(not s) was used by default at any point, but by writing a URI starting with `http://` into the file, the programmer is actively telling the program to download that file and use HTTP for that. Secure by default doesn't mean preventing the programmer from doing insecure things.


Let me put it another way:

Browsers have benefited from decades of innovation to mitigate the security issues of executing JavaScript.

CORS headers are the latest of these innovations. Deno allows you to fetch code as a browser would, without providing any of the safety browsers have - mostly because it would not make sense for a runtime to do that.

Deno is not a browser but takes on the risks of a browser. Running a Deno install is about as safe as browsing the internet with Windows XP without SP2 and Internet Explorer below 6.

Also, importing a module over https does not mean this module won't import anything using http. Should you review the code of all imported modules? This is virtually impossible.

Deno must disable http by default and provide a flag to re-enable it. This is factually a security issue in Deno.


> Should you review the code of all imported modules? This is virtually impossible.

I wouldn't be surprised if this was exactly the direction that Deno was trying to move towards. Fewer direct dependencies with some amount of transitive trust.

I.e. "[Deno] has a set of reviewed (audited) standard modules"

> Windows XP without SP2 and Internet Explorer below 6

I get the point you're trying to make with this hyperbole but browsers still let you view http pages (by default).

> Deno must disable http by defaulkt and provide a flag to re-enable it. This is factually a security issue in Deno.

Again I agree with your idea about disabling by default but there is another perspective (and I think Ryan deserves some empathy).


At this point, there is clearly a vuln in a tool that brands itself as secure and positions itself in opposition to another project.

The marketing around Deno has been built on that, and it makes no sense to reach 1.0.0 with such a big security issue unhandled.

Also, this part is even more frightening https://github.com/denoland/deno/issues/1064#issuecomment-43....

At this point, it is clear that Deno is lying for marketing reason by calling itself secure.

Of course Ryan deserves empathy, and so does Bert. But in the meantime, during their talks at major conferences, they have trolled another project a lot. The maintainer of that other project now gets weekly/daily pings from Deno supporters trolling them.

Deno's culture seems to be big on trolling atm; a CoC could have fixed it, but the (B)DFL has decided otherwise.


It seems like this is not simply about the decision of whether to allow http by default and security of dependencies.

I'm not familiar with the surrounding politics and don't particularly want to be involved, but I appreciate the explanation.


I am a long-time Node.js user. My question about Deno is "is there any indication that this project will have a different trajectory from the node.js project (also started by some of the same people as Deno), or should I expect Deno to also follow a 10 year arc?"

I think evolution & starting over can both be appropriate steps forward, I don't question the Deno project, but to have the exact same people who created Node.js say "OK here's the new new JS hotness," I think "how can we expect this time to be different" is a reasonable question. Or perhaps these ecosystems are not meant to last/be supported more than 10ish years?


Are imports async by default in Deno? That seems to be how it works given that you can import from a URL. This is interesting since async imports are currently a part of JavaScript(or at least a proposal?) by using the `import()` function.


There are no "async" or "sync" imports in JS; there are static and dynamic imports. An implementation is free to spend as much time as it likes between parsing a file to get the static imports and actually running the file, doing whatever it likes in between - fetching URLs, reading the file system, etc. Dynamic imports return a promise, but the implementation is free to resolve that promise immediately (e.g. `return Promise.resolve(loadSync(module))`).
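To make the distinction concrete (plain module semantics, nothing Deno-specific; the URL is just an example):

    // Static import: resolved before this module's body runs, however long
    // the implementation takes to fetch it (disk, network, or cache).
    import { join } from "https://deno.land/std@0.50.0/path/mod.ts";

    // Dynamic import: always hands back a promise, even if the module is
    // already cached and could be resolved immediately.
    import("https://deno.land/std@0.50.0/path/mod.ts").then((path) => {
      console.log(join("a", "b"), path.join("a", "b"));
    });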


I suppose that makes sense since loading a module from a file on a disk isn't all that different from loading a file over a TCP connection.


Is there some deterministic way of measuring JS execution time?

Explanation for why I want that:

I've started making an RTS game - think StarCraft, but you can program your units.

Currently I'm trying to decide what language to expose to players. The two main requirements are that it's secure (I'll be running player A's code on player B's computer) and that it has a deterministic method of counting execution time (so that player A and player B's computer can both make the same decision on whether or not a script took too much time).

I'd appreciate any hints towards other languages I should look at as well.


I guess it would depend on the JS engine. If the player is the one writing JS, and you want a scaled down environment (since it's not an app in the browser), probably making your own JS parser/engine could be the way to go? You'd want the JS engine to be inside the game engine somehow.


> and you want a scaled down environment

This is definitely a requirement, a very scaled down one since I don't want any interaction with the outside world apart from the game engine (for determinism purposes).

> probably making your own JS parser/engine could be the way to go

That sounds like a lot of work to do well. If I had infinite time I suppose re-implementing a common language would be the best way forwards, but I don't, especially for a hobby project.


Yeah, I can't comment much on how long a JS engine would take. You could start with V8 I guess, but I would think using an existing JS engine inside a game engine might be tricky because they seem complex. JS the language is easy to create a parser for, and that might be what you really want to be custom, in order to do other things in between running the JS code. Maybe try looking for stripped-down JS engines that have source code to see how hard it is. I'm also assuming you will code this game in C++ for performance, but maybe a JS/HTML game could leverage V8 from the browser, if that was your initial thinking.


I'd be delighted to use V8... if I could figure out a way to get a deterministic estimate of the runtime of scripts (e.g. in languages that compile to a bytecode I could add a "bytecodes executed counter" easily enough).

I'm coding the game in Rust, but I really don't care what the language is coded in, linking in a language runtime isn't a problem.
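For what it's worth, one way to sidestep needing engine support is to meter at the API boundary instead of counting bytecodes: charge each script a fixed, deterministic budget of "operations" per tick, spent on every call into the game API, and cut it off when the budget runs out. A rough sketch (all names hypothetical):

    class Budget {
      constructor(private remaining: number) {}
      charge(cost: number): void {
        this.remaining -= cost;
        if (this.remaining < 0) throw new Error("script exceeded its per-tick budget");
      }
    }

    // The only surface player code ever touches; every call has a fixed cost,
    // so both machines agree on exactly when a script has "taken too long".
    class UnitApi {
      constructor(private budget: Budget) {}
      move(dx: number, dy: number): void {
        this.budget.charge(1);
        // ...apply the move to the game state...
      }
      scan(): string[] {
        this.budget.charge(5);
        return []; // ...ids of visible enemies...
      }
    }

    const budget = new Budget(1_000);
    const api = new UnitApi(budget);
    // runPlayerScript(api); // player code only ever sees `api`

The obvious hole is a busy loop that never calls the API, so you'd still want a coarse wall-clock watchdog (non-deterministic, but only as a kill switch) or real instruction counting for that case.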


Lua is frequently used for interactions with game engines and programming basic logic. Maybe see if there's a Lua engine in whatever language your game engine will be written with?


I'm writing in rust, but am happy to deal with linking whatever runtime will work best for this.

I have security concerns about all the lua runtimes I've seen. They (very understandably) do not seem to be particularly battle tested against malicious use, and they have a history of security issues when I check their bugs list. Otherwise I'd love to use lua.


Some compatibility with npm modules would have been a better choice for adoption.


Even if this new thing has slightly nicer syntax or usage of promises or whatever, does that alone justify its existence? Learning all this new tooling and ecosystem all over again? Dividing the web development ecosystem even more? I really don't see how the problem it's solving is big enough for us to care. I mean - enough already. In 8 years I'm sure Ryan Dahl can come up with an even better runtime written in Grust; will we then all be migrating our codebases from Node -> Deno -> Dynamite?


Also you can get most of the benefits of async/await without all the downsides of promises: https://www.npmjs.com/package/casync


Question:

    deno run https://deno.land/std/examples/welcome.ts
This works - it downloads the code from that URL, then compiles and runs it.

But if you visit https://deno.land/std/examples/welcome.ts in your browser you get back HTML, not raw code.

Anyone know how this works? Is deno.land a special case or is there some Accept header cleverness or something going on?


Was waiting for the first person to point out that what you get when you visit a url is not guaranteed to be the exact same on a subsequent visit.

Not seeing how url-based package management is safer when a package host can use a server that sends a special payload to certain requester ips, headers, cookies or referrer.

Until there are firm guarantees around what you get from a url, a trust-able third party is needed, even if just as an option.


Figured this out myself:

    $ curl 'https://deno.land/std/examples/welcome.ts'
    console.log("Welcome to Deno ");
So this is an Accept header trick. If I do this instead:

    $ curl 'https://deno.land/std/examples/welcome.ts' -H 'Accept: text/html'
I get back the full HTML version of the page.


Some people for the last decade: wouldn't it be great if node had a comprehensive, idiomatically coherent, thoroughly typed standard library? Look, TypeScript is right here!

No-one, at all, for the love of jeebus: do another node-like thing, but make it in Rust on top of C++ (or is it the other way around?), somewhat similar but generally incompatible, with a whole different set of APIs, a bunch of new tools and yet another package/module system.


I don't understand how Promises help with backpressure.


So I can import using the URL "https://somekind.com/ofpackage.ts", that's great but what if that random endpoint changes?

The benefit of NPM is that every package version is immutable...

Also, first-class TypeScript support is great, but they are just using tsc internally... isn't that directly tying a TypeScript version to the Deno version?


I'm probably just one more person among the hundreds who are missing something here, but: if the big thing about Deno is that you can fetch dependencies directly from the source with, for example, a GitHub repo URL, instead of going through a package registry like npm where the package is decoupled from the source, why not simply do this already by editing package.json and swapping the npm identifier for a URL to the repo?

You can already do decentralisation with npm by simply not using `npm i some-package --save` and instead modifying the package.json yourself.

Now the problem with deno is that if I import 5 packages (locked to a specific version) across 50 files and I want to update them to a new version, am I really gonna need to go into every single file and change the URLs??

That's just insane, I get the whole idea of decentralisation but why can't we concentrate all packages into a single file??

Sure, this is all good for distribution because you won't have to run `npm i` every time you want to run your program. But more often than not, I'm the one who's going to use my program, and I don't mind running `npm i` beforehand and feeling safe that I have all my packages downloaded and ready to go, instead of worrying that my program will fail at run-time because the URL for one of my packages is down.
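(For what it's worth, the convention the Deno manual suggests for exactly this is to funnel every external URL through a single re-exporting module, conventionally called deps.ts, so a version bump is still a one-file change - the URL below is just an example:)

    // deps.ts - the only file in the project that mentions remote URLs
    export { serve } from "https://deno.land/std@0.50.0/http/server.ts";

    // everywhere else:
    // import { serve } from "./deps.ts";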


> We believe JavaScript is the natural choice for dynamic language tooling; whether in a browser environment or as standalone processes.

I disagree with this assessment. Lua is still far superior as an embed-able scripting language/runtime. I suspect the preference for JavaScript is mostly due to the Deno developers' familiarity and preference.


> Lua is far superior

I disagree with this assessment.


LuaJIT is nearly 3x faster than V8 JavaScript, the last time I checked. Just an anecdotal DDG search [0]. Maybe V8 has gotten faster since then.

[0] https://duckduckgo.com/?t=ffab&q=is+luajit+faster+than+v8+ja...


I think a big part of the point is to build a js runtime for existing js code and libraries.

Probably it will not replace node soon, but it is a possible outcome.

Also, didn't LuaJIT have the problem that it would never move to more recent versions of Lua? I remember that some years ago there was some talk about this.


Lua also has a smaller footprint I think.


It has a worse standard library than Javascript. And its static typing solutions don't compare to Typescript.


Funnily enough the state of art in typed lua is typescript: https://github.com/TypeScriptToLua/TypeScriptToLua


It depends what you mean by "superior". JavaScript (well v8) is far safer as a scripting runtime. And there are a ton of existing JS libs for solving problems. AND there's TypeScript, which is pretty great.


I suspect it is more that the whole world is familiar with it.


> TSC must be ported to Rust. If you're interested in collaborating on this problem, please get in touch.

Have you talked with the TS people to see if maybe there are some bottlenecks that could be rewritten in Rust rather than rewriting everything? Even though I would love to have TSC in Rust, it seems like it would be a huge amount of work.


> We seek a fun and productive scripting environment that can be used for a wide range of tasks.

So Lua, but Javascript? Could be neat


I think Deno is fine, and I think the project is interesting, but I also think those who want to use Typescript should do so, and stop pretending they're doing anything with Javascript. If I wanted to deal with repetitive code like "function add(x: number, y: number): number {...}" I'd just use Java. There are reasons I love JavaScript, and not dealing with types is one of them. When is Typescript going to have its own version of the V8 engine and just leave JavaScript out of it? Honestly, it harkens back to the classic Microsoft Embrace, Extend, Extinguish strategy that I thought we left behind in the early 2000s...

I like that Deno is resetting some of the underlying assumptions that Dahl made in Node. But I think he's throwing the baby out with the bathwater. I actually like and prefer an explicit node_modules directory. Grabbing everything I need, and then being able to zip it up as needed to save or share is convenient and easy. Not sure why so much effort was made to avoid this module pattern. I really think using URLs is going to be a maintenance nightmare. Instead of changing one entry in package.json (great for testing new code or tweaking something locally), I have to find/replace all the imports? Seems very odd, unless I'm missing something big. Also, not a fan of the hidden .deno cache directory, but whatever.

Having to send along a shell script in order to even start an app is another problem. "Here's a JavaScript file which will do some task. In order to run it, you need to make sure you include these 10 command-line flags for security purposes. To make sure you don't forget them, I've included various shell scripts you can run depending on your platform." That's not improving anything.


> Instead of changing one entry in package.json (great for testing new code or tweaking something locally), I have to find/replace all the imports?

This is my biggest beef. When building large-scale applications there are many more files with imports than in a typical large-scale Golang project that uses the URL import scheme. It's just going to be a really annoying thing to deal with. ATM I see Deno being great as a replacement for nodejs scripts and smaller-scale projects. But for massive projects I don't see the gain.


How much time have you spent with TypeScript? I'm sincerely asking, because the answer is either "not much" or you didn't do a lot of research. You've misunderstood (or chosen not to investigate) some very fundamental things about TypeScript.

> function add(x: number, y: number): number {...}

You don't have to do this in TypeScript. In fact, you can use TypeScript without transpiling your code and without creating any types yourself.

Why would you do that? Because a lot of open-source libraries now come with types included, so you get type-checking for those APIs essentially for free.

> I'd just use Java

Java has a nominal type system. If you want a new type, you have to create it.

TypeScript has structural typing with type inference. That means you "just code" and the compiler will pick up your types as you go. If you write inconsistent code, it'll warn you. If you want to refactor a type, you now have enough metadata in your code to do that confidently.
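A tiny made-up example of what that looks like in practice - no hand-written types, and the compiler still catches the inconsistency:

    const user = { name: "Ada", age: 36 };        // inferred as { name: string; age: number }

    function greet(person: { name: string }) {    // any shape with a string `name` is fine
      return `Hello, ${person.name}`;
    }

    greet(user);              // OK: `user` structurally satisfies { name: string }
    greet({ title: "Dr." });  // compile error: property 'name' is missing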

> When is Typescript going to have its own version of the V8 engine and just leave JavaScript out of it?

Microsoft has had its own V8 competitor since around the time TypeScript was released[1]. They literally can't leave JavaScript out of TypeScript because every JavaScript program is a valid TypeScript program. TypeScript is a superset of Javascript. Other than the type system, it does not introduce any new features that are not on the EcmaScript roadmap.

> harkens back to the classic Microsoft Embrace, Extend, Extinguish strategy

Uh, no, it doesn't, because they have (A) open-sourced all of it, and (B) they haven't extinguished anything. Even if they did, there are massive numbers of people, including Google, who use TypeScript in production and would keep it going. There is no practical way to extinguish anymore.

In fact, all of their behavior (e.g. with the .NET ecosystem) suggests that Microsoft doesn't do those things anymore.

1. https://en.wikipedia.org/wiki/Chakra_(JavaScript_engine)


> I also think those who want to use Typescript should do so, and stop pretending they're doing anything with Javascript.

It's hardly constructive to claim that people using Typescript are "pretending" to do something and should stop.

People use Typescript because they've decided that Javascript with static typing is useful.


> If I wanted to deal with repetitive code like "function add(x: number, y: number): number {...}" I'd just use Java.

I don’t even like TS, but it’s pretty clear I can discount your opinion because you don’t even know what you’re talking about.


> Grabbing everything I need, and then being able to zip it up as needed to save or share is convenient and easy

Isn't there a risk of cross platform incompatibility with some of the packages that compile on install?


I have two questions: 1) How to port a node package to deno? 2) How to keep both a node and deno packages in the same repo?

Clearly without packages we are not going to use deno. Also I don't want to switch from well-proven packages like express/koa to an new unknown http server just because it supports deno.


There are some manual things you can do to ensure portability from Node to Deno:

* Separate the Node APIs from the rest of your application logic as much as possible, as those follow different, unrelated API conventions.

* Convert your application from using require to using ES6 modules. This won't work if you have more than a few highly trivial dependencies, as Node requires picking one way of importing files, and that one way extends to your dependencies as well.

* Node uses callbacks for asynchronous logic where Deno uses promises, so ensure all callback functions are passed by reference, so that the difference is an API difference instead of a logic difference.
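A rough sketch of what the last two points look like in practice (the legacy function below is a stand-in for any Node-style callback API, not a real module):

    type Callback<T> = (err: Error | null, value?: T) => void;

    // Pretend Node-style API: takes a callback.
    function legacyReadConfig(path: string, cb: Callback<string>): void {
      setTimeout(() => cb(null, `{"path":"${path}"}`), 0);
    }

    // One small promise wrapper is the only place that knows about callbacks.
    function readConfig(path: string): Promise<string> {
      return new Promise((resolve, reject) => {
        legacyReadConfig(path, (err, value) => (err ? reject(err) : resolve(value!)));
      });
    }

    // Application logic stays ES-module- and promise-based, so it is portable.
    const raw = await readConfig("config.json");
    console.log(JSON.parse(raw));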


Deno is a new thing. It's not node, and not having the entire npm ecosystem already around it can be seen as a blessing.

By the way a new http server doesn't "support" deno any more than Express "supports" node -- it's the other way around.


    for await (const req of serve({ port: 8000 })) {
      req.respond({ body: "Hello World\n" });
    }

It's not the first time I've seen JS iterators used and abused this way, and every time I feel that JS and its users are ready for full-blown monads and do notation instead.


First-class TypeScript support is the thing that excites me the most about this. It's so annoying when you want to navigate TypeScript code from a dependency in Node and it kicks you to a TypeScript-definition file instead of the actual TypeScript source.


I would really like a testing framework like AVA on Deno: the same amazing API separating parallel and serial tests and running them on multiple threads. I will also need npm compatibility soon. Once these are in, why would I ever write code in node.js?


Finally we can use it on production, great work everyone involved in making this happen!


For anyone that's interested, here's a list of projects I'm interested in or use with new major versions in 2020: http://maker.sh/2020


I tried it briefly on a side project a couple of months ago. I like some of the ideas in it, but my editor tooling (which is TypeScript-ready but not Deno-ready) wasn't ready for it. I might revisit in a year or two.


I don't want to be overly critical of a very well-intentioned and worthy effort, but it is looking less and less to me like Ryan and the rest of the Deno team understand what problems node.js developers care about. Specifically:

1. The sandboxing via --allow-net, --allow-fs, etc is very odd to me. I don't recall hearing about a lot of security issues that could have been prevented by this. I mean, I suppose this won't hurt anyone, but it's certainly never something that I've wanted. I expect most people will simply white-list most access by default in all new projects.

2. deno bundle: Initially, when I heard about this, I was excited. The need to bundle code via webpack and similar systems is one of the worst things about front end browser development. It introduces tons of complexity and makes debugging much harder. So I thought, wow, maybe deno is going to just give us a first-class solution to this problem that will just work. Nope. They are only concerned with bundling code such that it can be run elsewhere by deno. A friend put it thus: "we don't want bundlers anymore, so we're making deno bundle, the one true bundler!" "does it do any of the stuff that webpack does?" "mostly no, because it's not meant for that" Read the comments on https://github.com/denoland/deno/issues/2475 to get a sense of what I'm complaining about.

3. URL Imports. Packages in node.js are stupendously easy to use. Making me import from a URL instead of typing `yarn add foo` and `import * as foo from 'foo'` does not make my life any easier. It certainly makes the implementation easier, and I'm glad to see the last of node_modules and path-climbing module resolution. But the URL-based system offers no equivalent that I can see to lock files and automatic upgrades (e.g. `yarn upgrade`), which are two crucial features. This is something that node developers need and want, and if Deno doesn't provide it, 3rd parties will. My worry is that we'll end up with competing third-party solutions and a fractured ecosystem, due to Deno's failure of leadership here.

4. Stack traces: The single worst thing about using node is the frequency with which you get utterly useless stack traces that tell you nothing about the true source of the error. It was easier to debug C++ programs 20 years ago than it is to debug node.js apps today. But as far as I can tell, deno isn't making any efforts to improve this. That's ok, they don't have to fix every problem. But it does make me worry that they don't know what node.js developers actually care about, if this wasn't on their "must fix" list.

5. Typescript by default. I really like typescript. I mostly want to use it instead of javascript these days. But tsc is slow. Slooooooow. Really really slow. I am worried that most deno programs are going to be very slow to start up in practice. (Before you say "caching": 99% of the time that you want to start a program, it's because you just changed the source code. Caching can only do so much in this case.) And I am extremely skeptical of the plan to build a new typescript compiler in rust.

I'll conclude with praising a few things that I am very glad to see: No more node_modules, Web APIs, and a promise-based stdlib.


"It's important to understand that Deno is not a fork of Node - it's a completely new implementation. "

New implementation of WHAT? Is this a JS Engine, or a wrapper around V8? Which version?

Guys - whatever it is - I love it (!!!), but this is a very long and wordy article that does not summarize the most relevant things well, and spends long explanations on issues that I think are a little too specific.

"First Class Typescript" - does this mean just 'transpilation at runtime'? Or does it mean something more hardcore? What does it mean at all?

This could be 1/2 as long and 2x more explanatory and precise.

Finally - it all looks super cool - but why specifically am I interested in this over things like Node? Speed, security?

Not to take away from the great work.


This is interesting. Can you ship the Deno runtime with your runtime code? That's the thing that kills me about Javascript; having to assume the consumer has a runtime setup.


Compiling to a single binary is on the roadmap, but was dropped from 1.0


Great, thx, I'll definitely be keeping an eye on this project!


Could take that one step further and run apps from URLs like the browser does, e.g. deno https://my.app


Is anyone else excited about a standard library out of the box?


I do a lot of django. Does anyone know if it can handle backpressure as described in the deno announcement? Or is that typically seen only in a microservices context?


Is the name Deno a play on shifting noDe over two spaces?


Yes. Maybe the next framework will be called "Done".


I want that hoodie but $100 is a little steep for me...


This sentence, five paragraphs below, says what this is about:

> Deno is a new runtime for executing JavaScript and TypeScript outside of the web browser.


Pet peeve: programming sites behind Cloudflare protection with email address protection enabled, because we regularly have things with at signs in them that aren’t email addresses, which it mangles, ruining my standard no-JS experience:

  import { serve } from "https://deno.land/[email protected]/http/server.ts";
I mean, std@0.50.0 is a syntactically valid email address, but… protecting email addresses on IP addresses? Why, Cloudflare, why?


As someone not from a js background, when would I use this? is it a drop in replacement for node with TS support? or something else


Deno will likely be well suited for network oriented server-side programs. Deno is not a drop-in replacement for node. It is an alternative JS runtime.


I think I'll use it for personal system scripts, as JS is the language I am most comfortable in.


I view it as a replacement for node that requires less overall handholding and ceremony.


No. Without some kind of signature check (SHA-256 or SHA-512 or something more modern), just say no. It's tempting though :)


This is fantastic. I would love to see vscode, coc, kde, gnome and other applications that use node extensions, to adopt Deno.


Basically better node by creator of node?


Yes


The key is, can it re-use all those npm modules? otherwise the ecosystem is too large to take on these days.


Does anyone know if there's an RSS feed somewhere for posts like these from the Deno project?


Which major projects do you think will use Deno in the future, and which won't?

React/React Native/Expo?


Soooooo yea. I'm excited.

(I've been using deno for ~6 months but I'm happy we hit 1.0!!!)


Waiting for Denodify so I don't have to do anything to switch from node.js


>JavaScript is the most widely used dynamic language

Wouldn't that actually be python?


There are way more browsers in use than Python runtimes. ;)


What makes you think Python is more widely used than Javascript?


Seems like a good fit for bundling and/or containerization of js-apps.


The main thing that would prevent me from investing in this stack is that the runtime has been designed with TypeScript in mind. In a few years, if the TypeScript thing blows over, you're left with a runtime and conventions skewed toward statically-typed programming patterns.


As someone who works with TypeScript extensively, TypeScript patterns are basically Javascript patterns - although perhaps with slightly fewer runtime type checks.


I used to dislike typescript, until I was hired to write it. It has warts, but the benefits outweigh them, and you aren't forced to use the parts you may not find beneficial. I think it's a good way to prototype a typed version of javascript, at the least.


Would like to see a comparison between node and deno using http2 and TLS.


This is fantastic news!


I'm drawn to the simple beauty of the deno.land website.


Installed it and ran the deno REPL; hit TAB twice and there is nothing. The node REPL gives you auto-complete, but there seems to be none for deno. Not beginner friendly, unfortunately.


This is pretty fantastic. Excited to try it.


Nice. Seems Deno is an anagram of Node.


In my mind, it came across as Deno(de) :)


I remember node stayed at 0.11 or some sub-0.* number for the longest time, with barely progressing version numbers. Has anything changed? Just wondering.


IMHO those were the golden years of developing with node.


> it's clear to us that ultimately the type checking needs to be implemented in Rust

I hope this results in a faster typescript compiler.


ELI5: why is this so heavily upvoted?


NODE -> * swap * -> DENO


Deno = no de, sorted ascending


Its improved security model


Added to the resume already


404 not found


http://deno.land itself (without the trailing `/v1`) seems to work. It's "a secure runtime for JavaScript and TypeScript".


It hadn't been deployed yet; it's up now.


Hi, I've been thinking for a while that I want something like Deno that has something like WireGuard built in to it.

1) I make a public / private key pair.

2) I log on to a DNS-like system, and create an account for myself, upload my public key.

3) I start up a Deno server, giving it my private key, and I give it the address of the DNS. Deno uses my private key to inform the DNS that it exists and is serving traffic. (Much like any Dynamic DNS server.)

4) My friend does the same - makes a public / private key pair, finds a DNS he likes, uploads his public key, makes a Deno server and registers it with the DNS.

5) My friend and I trade DNS+public keys. It's a URL to the DNS that has the public key in it.

6) I go to some administrator page on my Deno, it looks a lot like this [1]. I create a new Connection, and plug in the DNS+public key of my friend. He does the same on his server.

7) On my admin page, I pick an app that I want to use to communicate with my friend. I can see a list of my Connections, and I just check off a box that the app has permission to communicate with that friend.

8) This app that I've picked, it only sees the world through an incredibly narrow API. It's like WireGuard. Deno handles all of the communication and permissions, much like Sandstorm.io. The first thing I want is just queued-up JSON messaging. The app can send and receive JSON messages. I envision that I want it to be in messaging mode (rather than "streaming online"), so it has something like ZeroMQ under the hood to pub/sub, client/server, all that. But roughly it has a bi-directional pipe for sending and receiving JSON and can queue up messages when the other server is offline.

9) I now have a sandboxed app that can't talk to the rest of the world, has its own private VPN-like connection to another process. All traffic is encrypted like WireGuard. The DNS aspect lets my friend move his server if the IP changes.

I feel like if some project kind of like Sandstorm, kind of like Dynamic DNS, kind of like WireGuard, and kind of like Deno all got together, they would make a beautiful thing.

When I think about how I wish Diaspora worked, as a way to replace Facebook, this is it. I wish it was built on top of something like this. This is the atomic unit of distributed, yet safe, that I wish internet apps were made out of. I imagine that I would be setting up hosting of the Deno server nodes for my friends and family, since I'm their tech support guy. But they'd be in charge of their own connections and apps in their nodes.

So, HN, time for you to tell me that it's 1) a terrible idea, 2) already exists, 3) why don't I just make it myself. Right? =)

[1] https://alpha.sandstorm.io/demo


You can run wireguard in a Linux network namespace, then start the app in that namespace. You can also use Systemd or Docker.


Awesome!


Nice, I deno about this.


FWIW I tried to install it on: a) Windows 10 via PowerShell - success, it took 1 min; b) Linux (WSL1) with curl - failed after 2 min; c) Linux (WSL1) via cargo - failed after 10 min with this jewel:

    error[E0433]: failed to resolve: use of undeclared type or module `proc_macro`
      --> /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/swc_ecma_visit_macros-0.1.0/src/lib.rs:25:20
       |
    25 | pub fn define(tts: proc_macro::TokenStream) -> proc_macro::TokenStream {
       |                    ^^^^^^^^^^ use of undeclared type or module `proc_macro`

    error[E0433]: failed to resolve: use of undeclared type or module `proc_macro`
      --> /home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/swc_ecma_visit_macros-0.1.0/src/lib.rs:25:48
       |
    25 | pub fn define(tts: proc_macro::TokenStream) -> proc_macro::TokenStream {
       |                                                ^^^^^^^^^^ use of undeclared type or module `proc_macro`

    error: aborting due to 2 previous errors

    For more information about this error, try `rustc --explain E0433`.
    error: could not compile `swc_ecma_visit_macros`.
    warning: build failed, waiting for other jobs to finish...
    error: failed to compile `deno v1.0.0`, intermediate artifacts can be found at `/tmp/cargo-installFIkSV2`

    Caused by: build failed


"Internally Deno uses Microsoft's TypeScript compiler to check types and produce JavaScript. Compared to the time it takes V8 to parse JavaScript, it is very slow"

TypeScript is way too slow for what it does and that's ridiculous for a language that rejects useful features because it doesn't want to "stray too far" from ECMAScript.

In my view, typechecking doesn't belong in the "compiler" part of TypeScript. It should stay in the IDE. The compiler should just strip type annotations and do other trivial transformations.


Here we go again guys https://xkcd.com/927/


Why are we still treating dynamic languages as "scripting languages" as if you can't do the same thing one does in Go?


So at this fine grain of semantic detail, making strong statements about differing labels feels a little silly, but I have always thought that the difference is dynamic refers to typing, whereas scripting is usually a reference to the language being interpreted rather than compiled. Dynamically typed languages can certainly be compiled (I think Clojure qualifies here?).


And in that case, what do we make of QuickJS, which compiles JavaScript to bytecode?


What does this comment even mean? I can’t make heads or tails of it.


And the very first line of sample code is adding a dependency on someone else's code.

Reminds me of https://npm.anvaka.com/#/view/2d/react-native

Good luck, though. Everyone lives in a different (mental) tribe I suppose...


The very first line of sample code is linking to Deno standard modules - basically the standard library for Deno, which aims to be feature-complete enough to cut down on the need for third-party dependencies.


So, I've never seen this before, but I'm 99.9% sure that's just part of the standard library being imported...


The claim "TypeScript is an extension of the JavaScript language that allows users to optionally provide type information" is misleading. TypeScript is not an extension of JavaScript (not all JavaScript code is valid TS code). TypeScript is a subset of JavaScript plus mild type checking. For more details, a nice article: https://medium.jonasbandi.net/here-is-why-you-might-not-want...


This is such a weird point to me, maybe pedantry at best. Typescript is, to 99% of people, just Javascript with types. With few exceptions, TS compiles 1:1 to JS.

That blog article just looks like an intro for beginners who don't quite know what static typing entails. Yes, static typing is different from runtime checks, that's why it's static.


The devil is in the detail


Do you have any example of valid JavaScript that is not valid TypeScript?


For example, accessing static methods differs from vanilla JavaScript. See https://stackoverflow.com/questions/33864167/accessing-stati...
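A small example of the kind of thing that trips the checker - it runs as plain JavaScript (it just logs undefined), but is a TypeScript compile error:

    class Counter {
      static increment() {}
    }

    const c = new Counter();
    console.log(c.increment); // undefined in plain JS, but a TS error:
                              // Property 'increment' does not exist on type 'Counter'.
                              // Did you mean to access the static member 'Counter.increment' instead?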


I guess I'm wondering why Deno is targeting V8 instead of Servo? Maybe I'm mistaken, but Servo [0] and Stylo [1] are both production-ready browser scripting and styling engines implemented in Rust.

[0] https://servo.org/

[1] https://wiki.mozilla.org/Quantum/Stylo


You’ve made a category error; Servo is not a JavaScript engine.


> Servo is a modern, high-performance browser engine designed for both application and embedded use.

What does that mean? "browser engine"

Does it execute JavaScript code?

Edit:

I bought your Rust book on Amazon, supposed to be delivered this Friday. I can't wait!


Servo puts all the parts of a browser together. SpiderMonkey is what Servo uses to execute JavaScript. That’s shared with Firefox.

Ah cool! I hope you like it! :)


I won't be using Deno. For starters, I like javascript, not typescript, and I think Ryan was considering the idea of making Deno TypeScript only at one time. That is enough for me to never consider it alone.

I think it falls into the same category as Koa (TJs successor to Express) and the upcoming "Rome" project. Someone creates a successful open-source project and then at some point in the future decides to make a better one with questionable tangible benefit and various downsides of its own.

The downsides I can see of Deno so far are that it is slower than node, it is not compatible with any server-side node code because it has rewritten the standard library, and it is inspired by and based on the Go standard library, seemingly using Go idioms and so on.

Its benefits are questionable, vague, and not really proven.


Hi,

I'm the co-founder of https://deno.services . We would like to make Deno the first language in history that comes with its own infrastructure.


Is there much point asking for signups when the only info there is is one sentence and a screenshot?



clojure.jar comes with tools.deps. It needs jvm as well though.


I think Deno is a mistake at this point... Yes, we thank Ryan Dahl for his huge contribution of integrating libuv and V8 outside of a browser to create node.js, but a huge ecosystem of developers has built a lot of work on top of that since. I think this de-facto fork of node.js will only serve to create fragmentation and confuse product owners about which platform to use.


If we all followed your rationale, we would all still be programming in COBOL. Accept the pain of experimentation and growing up.


Sometimes splitting a community is worth it, but it shouldn't be a decision made lightly. And Ryan Dahl should consider that his words bear weight, as the original creator of the platform. I think it's a fair issue for critique.



