CommonJS is hurting JavaScript (deno.com)
147 points by srwhittaker on June 30, 2023 | 118 comments



Would anyone be interested in an article about the crusade to move JS to ESM? I've been considering writing one, here's a preview:

Sindresorhus wrote a gist "Pure ESM modules"[0] and converted all his modules to Pure ESM, breaking anyone who attempted to `require` the latest versions of his code; he later locked the thread to prevent people from complaining. node-fetch released a pure ESM version a year ago that is ~10x less popular than the CommonJS version[1]. These changes broke a lot of code and cost developers many hours figuring out how to make their projects compatible with Pure ESM modules (or deciding to ignore them and use old CommonJS versions)--not to mention the tons of pointless drama in GitHub issues.

Meanwhile, Node.js TSC member Matteo Collina advocated a moderate approach dependent on where your module will be run [2]. So the crusade is led not by the Church, but by a handful of zealots dedicated to establishing ESM supremacy for unclear reasons (note how Sindresorhus' gist lacks any justification and how weak TFA's justifications are). It's kind of like the Python 2 to 3 move, except with even less rationale and not driven by the core devs.

0 - https://gist.github.com/sindresorhus/a39789f98801d908bbc7ff3...

1 - https://www.npmjs.com/package/node-fetch?activeTab=versions

2 - https://github.com/nodejs/node/issues/33954#issuecomment-924...


ESM was a part of es2015. It's been 8 years that we've had to tangle with both cjs & esm. It's been absolutely awful for everyone.

This crusade is nowhere near zealous nor righteous enough against the infidels & non-believers.

But it also hasn't been effective enough at supporting/supplying the crusade either.

Matteo's statement was that Node hasn't stabilized their loader support, so tools have a hard time migrating to esm. Imo it's a pity ecmascript never stabilized a module registry; when esm 1.0 shipped most people thought it would happen, and it's long felt like a bait & switch. But it wasn't a feature browsers needed or really wanted, so that unfulfillment was unsurprising. Anyhow, IMO Matteo is making a technical point that it's still hard to finish the move, which is a different spin IMO than an "advocated a moderate approach".

Given the hurt we legitimately experience, I really wish Node and/or WinterCG or someone would prioritize figuring out & implementing whatever needs to go into a module registry/loader. And then beg the big tool chains that need this stuff to expedite their migrations, pretty pretty please.


> ESM was a part of es2015. It's been 8 years

Okay, but let's not resist inconveniencing ourselves with the facts.

ESM got browser support in 2017 and stable Node.js support in 2020.


> It's been absolutely awful for everyone.

It only became awful for me when people started publishing pure ESM packages; `npm i node-fetch` suddenly began resulting in broken scripts and I had to learn why. Prior to that, I happily used CJS outside of the browser and what I suppose is ESM in the browser (the `import` syntax provided by bundlers).

> a different spin IMO than an "advocated a moderate approach".

He said, "If your module target the Browser only, go for ESM. If your module target Node.js only, go for CJS."

This is moderate compared to the "Pure ESM" approach. The fetch API is built into browsers so I don't see why anyone would use `node-fetch` outside of Node.js, and yet the maintainers of `node-fetch` went Pure ESM anyway. Also that GitHub issue is titled "When will CommonJS modules (require) be deprecated and removed?" and his response was "There is no plan to deprecate CommonJS"[0].

0 - https://github.com/nodejs/node/issues/33954#issuecomment-776...


I mean, I took Sindre's move to ESM-only publishing as his own choice as the author and maintainer of his own OSS packages. In the spirit of OSS, he doesn't need to justify the choice, it's good enough that as the author he's making it. The previous CJS versions are still published, and anyone can fork the modules if they want to take on the maintenance burden of the CJS version.

That preference has had an impact on the Node ecosystem at large, given how prolific an OSS contributor Sindre is, but IMO that influence has been earned by the large body of work he's contributed.


And yet his older CJS versions get weekly downloads that are magnitudes higher than the newer ESM versions.

I'm working on greenfield projects that leadership is still insisting we avoid ESM-only packages for. The move that Sindre (and others like wooorm) made has not been well-received.


> And yet his older CJS versions get weekly downloads that are magnitudes higher than the newer ESM versions.

You write this as if that mattered...

Should he only work on stuff that gets more downloads? For what reason? He works on what he wants to work on; it's amazing and he is lucky to be able to do it.

> The move that Sindre (and others like wooorm) made has not been well-received

It's normal to be sad to have lost someone that was working on something you needed, but anything else is just entitlement.


> You write this as if that mattered... Should he only work on stuff that gets more downloads?

It was a statement of fact. You appear to be drawing conclusions that were never hinted at nor implied. It's tiresome.

> It's normal to be sad to have lost someone that was working on something you needed, but anything else is just entitlement.

How and why are you applying entitlement and emotion to a documented statement of fact? Do you need to see links such as [1] to view that as fact? It's one of a myriad. Take your asinine analysis and commentary elsewhere, please.

[1] https://github.com/sindresorhus/meta/discussions/15


Many authors don't care whether 5 dudes use their stuff or the rest of the world does, especially when taking zero monetary value out of it.


I'm one of them, as I'm a prolific open source contributor and maintainer. To argue that some high-profile authors don't get any monetary value from their public work is to be simply naive and uninformed. The two I mentioned have made careers on that work and derive the majority of their income from it.


OK so anyone disagreeing via their clicks on the down button, please provide a rebuttal. Have yet to see one.


This is an irresponsible approach and this package should be forked by more responsible people. The person who made this change should be avoided in the future.

I'm all for ESM, but alienating your own users is stupid. Create node-fetch-esm and support both versions until CommonJS popularity is low enough for it to be dropped.


That's all well and good, but what about those of us that already have a mountain of Jest tests with mocks that aren't supported in ESM mode? https://github.com/jestjs/jest/issues/9430

I've definitely worked around my fair share of CommonJS issues but until ESM "just works" I'm slightly pained by how aggressive the tone of this article is.


That’s mostly because of how Jest absolutely brutalizes the Require stack. It’s a problem with Jest AND CommonJS, not ES Modules.

In fact it’s a very illustrative example of why CommonJS should be taken out back and shot.


It's a problem with Jest and CommonJS only in so far as it's a problem with mocking in general: native ESM imports are read-only. You can roll your own dependency injection or use tools that parse/rewrite your imports at runtime, but this is hardly a good solution. I think the parent's comment still stands: there's a big gap remaining.
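
To make the read-only point concrete, a tiny sketch (module and function names are made up; think of these as two separate test files) of why the usual CJS-era patching trick stops working under native ESM:

    // ESM: imported bindings are read-only views onto the module
    import { sendEmail } from './mailer.js';
    sendEmail = () => {};            // throws -- you cannot reassign an import

    // CJS: the exports object is just an object, so tests can patch it
    const mailer = require('./mailer.js');
    mailer.sendEmail = () => {};     // fine; this is what stub/mock helpers lean on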


Are you saying I just shouldn't have access to that part of the runtime? I've also done my fair share of `require` hacking, and it's led to things like hot reload and lazy loading for Node backends. There's a lot of value in being able to mess with that part of the stack.
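
In case it's useful to anyone reading along, a minimal sketch of that kind of `require` hacking (file paths are illustrative): dropping a module from `require.cache` so the next `require()` re-reads it from disk.

    const fs = require('fs');

    function loadHandlers() {
      // evict the cached copy; the next require() re-evaluates the file
      delete require.cache[require.resolve('./handlers')];
      return require('./handlers');
    }

    let handlers = loadHandlers();

    fs.watch(require.resolve('./handlers'), () => {
      handlers = loadHandlers();     // crude hot reload for a Node backend
    });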


I really wish this sort of hackerly mojo was available in the browser too.

Being able to hot reload & instrument modules is power we should have. Will some people greatly mess it up & create hard-to-maintain software? Oh yes. But let not the bottom of the curve be an excuse to clip off the top half of the bell curve. Ad Astra.

As esm was being specced out in es2015, there was an expectation there would be follow on work for a registry, where modules could be managed by the runtime.

That never happened. Heck, 2023 and we still don't have actually modular modules in web workers: there's no import-map support or spec! It's issue #2 in import maps: https://github.com/WICG/import-maps/issues/2


> As esm was being specced out in es2015, there was an expectation there would be follow on work for a registry, where modules could be managed by the runtime.

I've seen routing architectures that support HMR & runtime management of modules use an Object with methods that call `import('module-name')`. Would dynamic imports be sufficient to support the use case you described?
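
For reference, roughly the shape I've seen, as a sketch (specifier names are made up; the query-string trick is the usual workaround for ESM's module cache not being purgeable):

    const registry = {
      cache: new Map(),
      async load(name) {
        if (!this.cache.has(name)) {
          this.cache.set(name, await import(name));   // e.g. import('module-name')
        }
        return this.cache.get(name);
      },
      async reload(name) {
        // there is no ESM equivalent of `delete require.cache[...]`, so a
        // cache-busting query produces a fresh module instance instead
        const fresh = await import(`${name}?v=${Date.now()}`);
        this.cache.set(name, fresh);
        return fresh;
      },
    };

    // usage: const { render } = await registry.load('./routes/home.js');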


This is exactly my beef. The testability with commonjs is great, being able to easily mock dependencies is a huge boon for testing. I'd have no issue using only ESM if it had feature parity, or at least some way to achieve dynamic loading.


Unfortunately, migrations are necessary when major releases occur. At some point, the argument to cling onto old tech which holds back the entire ecosystem needs to be deprioritized. Years have passed already so it should not be a surprise when the ecosystem moves on.

Re mocks, an ecosystem should not be held back solely due to an arcane edge case. The apps that use test doubles can be rearchitected to support test doubles to the programmers' satisfaction. Most people do not use test doubles & the benefits of ESM & not having to deal with CJS outweigh the downsides of losing some convenience in mocking modules.

This is something that will not be popular with clingers to CJS, but it is something that will benefit the wider ecosystem. The vocal minority which is holding back the ecosystem should start making plans to migrate because it is in the process of happening right now...

So I think the general resentment is having to do extra work to support CJS which is not standard across all JS platforms...and some legacy libraries are still written in CJS requiring an interop. So it would be great to not have to do this extra work to support the legacy CJS on Node.js when every other JS platform is using ESM. In this case, one person's convenience comes at a cost to everyone else & at some point, that one person is going to have to suck it up & do the work to support his use case...just as everyone else had to do the work to support the legacy CJS for years now.


I did not perceive an aggressive tone from the article, I think you are projecting your annoyance with the Jest project and your architecture choices.


I can't tell you your perception is wrong, but I generally don't unload ornate rhetoric like "insidious saboteur" or verbs that relate to strong imagery like "rip out", "bury" when writing about code unless I intend for my tone to be aggressive.

My code works fine and the article isn't doing itself any favors by mocking me (heh) for thinking so.


Use vitest.


We're trying, we've had compatibility issues with node-canvas and our older version of graphql-js


Jest mocking is one of the worst anti-features ever invented.

We would be much better off if it had never been created.


Didn't see mention of Browserify and other bundlers after it that made CommonJS the de facto standard for client/browser libraries as well.

I think the biggest miss was not making mixed mode (default) for Node do it the way webpack/babel, etc did it by default in terms of interop. I get they wanted to make it more implicit to call cjs from esm, in the end it just inhibits conversion of existing libraries as dependencies are now a bigger hurdle.

Frankly, I like the Deno way of things better. I find it annoying, to say the least, that the TypeScript team won't consider allowing you to import with a .ts(x) extension and convert to .js(x) as part of the build process... no, you must omit the extension.

I've been using the import/export syntax since well before it was standardized via babeljs, these days I kind of want to remove webpack/babel from my pipelines altogether and mostly just rely on esbuild. I've also been using/following rome.tools development, having switched over several projects from eslint already, and will probably start with their bundler when it's ready.

I think there's a way to go with tree shaking and static analysis in that direction to reduce load. I also would not mind seeing the efforts to treat TS extensions as comments in the JS engines in that it would be easier to serve up straight TS/JS without bundling/minifying. I'm not sure we'll ever see a return to that in practical terms.

In the end, it's evolving. I'd also like to see Cloudflare, Deno and others come together on a more common server interface so that you can target multiple clouds with a single codebase. I don't know how well that would ever work out at this point though. There's aspects that I definitely like to all of them.


> I think the biggest miss was not making mixed mode (default) for Node do it the way webpack/babel, etc did it by default in terms of interop. I get they wanted to make it more implicit to call cjs from esm, in the end it just inhibits conversion of existing libraries as dependencies are now a bigger hurdle.

Huge huge agreement.

I forget the specifics but there was some super tiny corner case around maybe default exports that could potentially create ambiguity & that spawned a multi-year bellyaching around doing anything at all for interop. What Node got was incredibly hard fought for against much resistance to interop.

But the final compromises made everything so much more painful for everyone. So many esm projects but oh look a .eslintrc.cjs, how unsurprising & sad.

It's extra maddening because node had a wonderful just-works (except for that tiny tiny tiny corner case) interop via @standard-things/esm, which seamlessly let the two worlds interoperate. It'd been around for years before node started shipping support, it was no-ceremony, just-works bidirectional interoperability, and it took basically no effort or thought from the developer's point of view to use. It sucked seeing us walk back from great, mired by frivolous over-concern for an obscure corner case.

https://github.com/standard-things/esm



Does anyone have a detailed understanding of why CommonJS (and its async incarnation, AMD) were not adopted by browsers?

I do much like the `import` syntax personally and it's a little cleaner to read, but CommonJS and AMD were the undisputed winners of the module format until ES Modules were born. Not that I have a problem with ES Modules, I don't, however I am interested in what was so insufficient about the preceding formats that we couldn't have standardized on them

EDIT: I know about the deal with CommonJS being synchronous. That isn't per se an issue I don't think, esp. because AMD built on top of CommonJS primitives, and with minimal refactoring CommonJS code could be used in the browser when defined this way if asynchronicity is a must. Generally, what I "imagine" browsers doing with CommonJS is making the `require` calls async in the background (i.e. not visible to developers) so they can resolve the modules then parse the code. This isn't terribly different from how import statements work today.

I'm wondering why we didn't undertake the work to just improve the existing format, more or less.

EDIT 2: I'm interested from a historical perspective. I think ESM is the right choice and 100% the future.


I can speak to the one listed in the article: "difficult to tree-shake, which can remove unused modules and minimize bundle size."

This is because of a much deeper issue: static analysis is highly complex with the near-free-for-all that is CommonJS require & module.exports syntax. ES Modules is stricter and much easier to statically deal with.

At a high level, why? You can throw just about anything into a module.exports statement, and the syntax to "require" it also has a lot of leeway. You can actually see the code for this in the Node codebase--module resolution is handled in javascript @ /lib/internal/modules/cjs/loader.js vs /lib/internal/modules/esm (heads up, both approaches are a Lot to grok)

Understand that with the CJS approach, you can dynamically export modules at runtime under whatever name you wish, with whatever value you want, which may even include dynamic require statements themselves. Nightmare for static analysis.
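
A small invented example of CommonJS that is perfectly legal but hopeless to analyze statically:

    // both the export name and the require target only exist at runtime
    const flavor = process.env.FLAVOR || 'basic';

    module.exports['create_' + flavor] = function () {
      return require('./impl/' + flavor);
    };

    if (process.env.WITH_EXTRAS) {
      Object.assign(module.exports, require('./extras'));
    }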

It makes a lot more sense if you try it for yourself. Build a module resolution algorithm including: determining all the imports, all the files those imports are from, mixing with 3rd party and local imports, and building that chain recursively.

You can do it, but the edge cases surrounding CommonJS make it super difficult. I'd go so far as to say it's basically impossible to get 100% success in all the desired scenarios without directly invoking the code.


While I agree the dynamic nature of CommonJS would be problematic, there were successful projects around treeshaking commonjs[0] that worked really well.

I think dynamic imports have some of the same footguns here, to be honest. Can't deny ESM is easier to statically analyze though, that much appears to be true across the board based on available evidence.

[0]: https://github.com/indutny/webpack-common-shake


To be honest, I think the AMD incarnation is a complete non-starter. It’s just such a funky, weird little thing that only makes sense because it’s a compatibility shim. Nobody wants to directly author AMD, and someone shipping a JS implementation wants to ship features that people will use directly.

I mean, I guess people will directly write AMD modules, and make modules using some giant script that uses cat, but the future of JavaScript lies with making each source file a valid, correct piece of JavaScript. When each source file is valid and correct, and doesn’t need to be preprocessed in order to work, your tooling will work a lot better.

The browser authors know you can’t un-ship JavaScript features. ES6 import/export is damn good stuff, and people in the browser aren’t saddled with some weird compatibility shim like AMD.

The adoption of ES6 modules in the client-side landscape has far outstripped its adoption in Node.js. I honestly can’t wait for require() to die, in both its cjs and AMD variations. The tooling support for ES6 modules is miles better.


+this ... AMD was just weird and clunky to use in practice... CJS bundlers were much easier to grasp by comparison, and when browserify (and those that followed) came out, it was kind of a no-brainer at the time. If ESM were finalized maybe even 2 years earlier, we'd probably be using that already for everything. I think the Node team choosing to make esm/cjs interop more difficult than what babel had been doing slowed down the switch. I get the reasons why, I just don't agree with the approach in the end. I think if they made the interop good, and declared after Node v#, it would be esm only, that would have worked out better for everyone. The risk being a kind of Python 2->3 paralysis. If the interop was good for a few versions, I don't think the friction would have been that bad. Then after 2-3 years, the switch could have been much cleaner.


Yeah. The interop between ESM and CJS in Node is beyond horrible. My sense is that the developers are trying to iron out some differences so ESM can behave according to spec and CJS can also behave according to spec. These have to be problems with edge cases, right?

I don’t quite understand why you need the .cjs/.mjs stuff, either. You can tell the difference between an ES2015 module and CommonJS module after parsing, with the one exception of modules that have no imports and exports (which should be rare). What is the holdup, then? What’s breaking?


CommonJS requires invoking the code before the modules can be resolved, versus the ESModule syntax with "import" can be parsed out of the code separately (from the AST because it is a keyword). No invocation required.

I don't know if that's the entire story -- probably not -- but I do know that is one major differentiator for things like generating import-graphs and performing tree shaking.

(you can still do like `import('foo' + someVar)` which will only invoke dynamically at runtime, so I'm not sure how that case is dealt with)


> (you can still do like `import('foo' + someVar)` which will only invoke dynamically at runtime, so I'm not sure how that case is dealt with)

That case is dealt with more like a `fetch('foo' + someVar).then(r => eval(r.text()))` or similar (but of course it is not just an eval; it instead returns the exports of the module).

Dynamic imports and static ones behave very differently and static analysis generally ignores dynamic imports IIRC.

You also need to treat dynamic imports as async including everything that comes with that (error checking, awaiting, etc.)
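
For example (the specifier and the `activate` export are made up):

    const name = 'dark-mode';                        // only known at runtime
    try {
      const plugin = await import('./plugins/' + name + '.js');
      plugin.activate();                             // exports are only known after the await
    } catch (err) {
      console.error('plugin failed to load', err);   // rejection has to be handled explicitly
    }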


`import(...)` returns a `Promise`, so it can resolve in the future after the file is parsed and compiled.


From what I remember, reading the conversations over the years...the issue was twofold:

1. Because `require()` is "just a magic function", it can't be statically analyzed by a JS runtime prior to actually running the code. This leads to limitations with regards to tree-shaking and other optimizations.

2. The last point leads to the even bigger (and probably "deal-breaker") reason for the change, the desire to fetch packages from URL sources. Since the syntax cannot be parsed efficiently, runtimes like Deno and Bun would have a much harder time fetching resources from URLs prior to running the code. The idea here, IIRC, was to eliminate the install step, the need for centralization on a single package manager and registry, and a general "non-Web" approach to the idea of packages and modules in JS.

I believe the `import` syntax was chosen to allow transitions away from `require()`, so that your programs wouldn't just stop working if ESM was enabled.


Your first point is absolutely spot-on but I am curious as to how much treeshaking was on the minds of masses at the time. The tooling of that era didn't really have any good support for tree shaking even for non-AMD includes and it was quite experimental tech (as in, I don't think it was a decision making factor for the majority of the tools on the scene).

The second point actually isn't strictly valid. I've written my own "all-in-one" async custom loader [0] that can require() CommonJS/AMD includes, regular "add a script tag" includes w/out any exports, or even css stylesheets all asynchronously, with asynchronous dependency trees for each async dependency in turn. You can define in the HTML source code a "source map" that maps each dependency name to a specific URL, so that you don't need knowledge of the filesystem tree to load dependencies.

Ideally, this source map can be generated via the tooling you use to compile the code (e.g. `tsc` is aware of the path to each dependency) but I haven't written my own tool to generate the require path to url map.

[0]: https://github.com/mqudsi/loader


Why can't implementations tree-shake through require(x) where x can be determined statically and warn where x cannot?


the import syntax makes it possible for the browser to start loading dependencies as soon as the module has been parsed, before it finishes being compiled. the require function would force the browser to download the script being depended on then and there, and since it's not async, the browser would need to pause script execution while the module is being loaded.


I don’t have an answer, and this is kind of superficial, but one thing I felt about the two was that import statements feel like compilation instructions. “Statically link this.” While Commonjs was a runtime function “synchronously acquire and parse this.”

I’m going to guess the good faith answer really involves some version of “CommonJS has some shortcomings and we didn’t want to confusingly write mostly-same syntax so we designed something new based on ideas from numerous languages.”


> and its async incarnation, AMD

A bare-bones implementation of AMD could be put together with less than a kilobyte of JavaScript (this is what we used at Mozilla for a minute circa 2012). Meanwhile, the ECMAScript folks were working on ES6, which was going to have a module system. Why would the browser build in support for a highly-opinionated system that you could implement yourself so trivially, all while a TC39-blessed standard was in the works?

> what I "imagine" browsers doing with CommonJS is making the `require` calls async in the background (i.e. not visible to developers) so they can resolve the modules then parse the code

That's not possible. You need to run the code to know what's being required: if I call `require('./' + getModuleName())`, you don't know what's being required until `getModuleName()` is evaluated. So you actually need to start running the JS. You need to pause execution of the code calling `require()` (a la `alert()`), and then you can download and parse the required module. When the file is downloaded, you can parse and execute the imported module. Each file would need to be downloaded/parsed/executed _synchronously_ in the order that each `require()` happens in: it's only async in so far as the JS pauses execution and picks up later.

> This isn't terribly different from how import statements work today.

Not so. You can find and resolve `import` statements (note: not `import()` calls, though these return Promises) without executing a JS file. You can parse the imports out of a file in one pass and fetch/parse/repeat for each import in the dependency tree before anything starts executing. Since "native" imports are static and declarative, you can resolve all of them without ever executing any code. And any dynamic imports return promises that the programmer needs to explicitly handle the behavior of at runtime.

> just improve the existing format

1. You'd have to kill dynamic imports (passing anything other than a string literal to `require()`), which would be impossible to do without breaking compatibility and couldn't be polyfilled.

2. AMD allowed a callback syntax for `require()` (it came out years before promises), which is cumbersome. Adding promises later would be challenging and leave technical debt.
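
For readers who never wrote it, callback-style AMD looked roughly like this (module names are just examples), which is part of why bolting promises on later would have been awkward:

    require(['jquery', 'app/view'], function ($, view) {
      // dependencies only exist inside this callback, and the array order
      // has to be kept in sync with the parameter list by hand
      view.render($('#root'));
    });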


> Adding promises later would be challenging and leave technical debt.

I wrote an "aio loader" many years ago that can load (in the browser) AMD/CommonJS/node or just "include this script in your html" dependencies that asynchronously loads dependencies (and their own dependencies) with support for use via plain `require()` without callbacks, `require(foo, foo => {})` callback support, and even dynamic async loading (`var App = await requireAsync("foo")`).

I never published it publicly (it's just ticking away on our production sites) but I was motivated to push it to GitHub just now [0].

[0]: https://github.com/mqudsi/loader


FWIW they did it[0]. Alameda is promise-based AMD from the same folks who were key to AMD's success back when it was so successful

[0]: https://github.com/requirejs/alameda


Alameda does a great job of explaining why you can't just slap promises on something: it breaks the semantics of what `require()` returns. `require('foo')` returns foo, maybe. `require(['foo'])` returns `Promise<[foo]>`.


That's correct. For my own library (see sibling comment above) I started off with a single `require()` entry point that can be used to load a dependency, load a dependency and invoke a callback, or asynchronously load a dependency (e.g. return a promise) but then changed it to two separate functions (everyone's favorite `require()` plus an async version very cleverly named `requireAsync()`).


> Does anyone have a detailed understanding of why CommonJS (and its async incarnation, AMD) were not adopted by browsers?

I only started my career in earnest in 2012, but even then compatibility with old versions of IE was a major point, due to their high market share.

IE6 was officially retired in 2014, but even then it still accounted for 4.2% of the traffic:

https://www.computerworld.com/article/2488448/ie6--retired-b...

Then there were IE8-11, but it was IE6 which lingered way past its welcome, considering it was originally released in 2001.


> That isn't per se an issue I don't think, esp. because AMD built on top of CommonJS primitives, and with minimal refactoring CommonJS code could be used in the browser when defined this way if asynchronicity is a must.

This existed: the UMD module format was the turducken you got if you built modules to work both as AMD and CommonJS at the same time. AMD wrappers, async require, and a bunch of boilerplate to determine if the module was being loaded by an AMD loader or in a CommonJS environment (or worst of all, a CommonJS environment with AMD loader primitives).

It was a lot of ugly boilerplate. I don't think I ever saw a project intentionally write UMD modules by hand. I do recall some Typescript projects that distributed as UMD modules for a while, because that was boilerplate Typescript was always good at streamlining.
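
For anyone who never had the pleasure, the classic UMD wrapper looked roughly like this (dependency and global names are placeholders):

    (function (root, factory) {
      if (typeof define === 'function' && define.amd) {
        define(['dep'], factory);                       // AMD loader present
      } else if (typeof module === 'object' && module.exports) {
        module.exports = factory(require('dep'));       // CommonJS / Node
      } else {
        root.myLib = factory(root.dep);                 // plain browser global
      }
    })(typeof self !== 'undefined' ? self : this, function (dep) {
      return { /* module body */ };
    });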

> I do much like the `import` syntax personally and it's a little cleaner to read, but CommonJS and AMD were the undisputed winners of the module format until ES Modules were born. Not that I have a problem with ES Modules, I don't, however I am interested in what was so insufficient about the preceding formats that we couldn't have standardized on them

I think it is absolutely the syntax that needed standardizing. AMD was always a hack for module loading using available browser tech as best as it could and screaming for better syntax. There was so much pain every time working with AMD in making sure that define() wrappers were correct and the list of dependencies correctly matched the names and order of those as parameters of the module's function wrapper. AMD was always in desperate need of an import syntax. (One of the reasons Typescript was built was to provide such an import syntax ahead of ESM standardization. It's why I started using Typescript in the 0.x wilds.)

In many ways ESM was always the natural improvement of the AMD format. One of the things that hung up browser standardization at various stages was debate about how compatible to be with AMD. There were multiple attempts and a lot of debate around "Loader APIs" that could be extension points to directly interface classic AMD loaders such as Require.js with the browser's. Had one of those Loader APIs made the final cut it likely would have been possible to "natively" import legacy AMD directly from ESM.

Loader APIs lost to a number of factors including complexity and I think also the irony that CommonJS won the "bundler war" while those debates were going on. I think it must have seemed that the writing was on the wall that AMD compatibility was no longer that useful and Loader APIs were never going to be great for CommonJS compatibility (again, because of those synchronous assumptions that doomed CommonJS to always be the nemesis of browser modules).

(The dying compromises of the "Loader APIs" tangents is what eventually delivered importmaps.)

AMD compatibility without "Loader APIs" is basically impossible. Even though Require.JS was quite dominant, it was never the only loader, and part of its dominance was it was an extremely configurable loader with tons of plugins. There wasn't an "AMD loader standard" that browsers could emulate.

I generally do think that ESM is what we got trying to fix the syntax needs of AMD and clean up and actually standardize the AMD loader situation. In the end it didn't end up backwards compatible with AMD like it tried to do, but from my impression it certainly tried and that was unfortunately part of why ESM standardization was so slow and what led to such a larger mess of CommonJS modules in the wild in the time that took.


Today I came across a dilemma. I wanted to somehow stub or fake Azure's Service Bus. There's no official way to do it. So, I came up with an idea to overwrite its implementation with my own, by mapping @azure/service-bus to my own class in tsconfig.json, no luck, because the framework that I use is using Webpack under the hood, so I would need to create a plugin, rule, whatever, to make it work, but I didn't want to write a specific configuration just for Webpack.

Instead, I had another idea: to import @azure/service-bus dynamically, using ES dynamic `import`, in place of static imports. But, since I'm using Node.JS, I have to set the package.json type to module, so I can use top-level `await`, with dynamic `import` to import, with the help of a ternary, the correct implementation on the fly.

I had an extremely bad experience trying to convert a CommonJS project to use ES Modules before. So I did not go through with the plan.

Finally, after spending some time trying to not use CommonJS I gave up, and in place of dynamic import I used the "good" ol' `require()`, ending up with something like this:

    const { ServiceBus } = (isDev ? require('./my-service-bus') : require('@azure/service-bus')) as typeof import('@azure/service-bus');

And that was that.

Project maintainers have to make ES Modules practical before it's pretty.


I wouldn't go as far as saying CommonJS is hurting JavaScript, but its current status in the node ecosystem is definitely painful. I mean the default remains CommonJS, so using esmodules is awkward on 'native' node, but if you use the most popular bundling systems, the default becomes esmodules. I won't pretend I know which is better between the two, but the decision should be made to either make it optional or deprecate it if esmodules is really the Best Module System™. The current in-between is far from ideal, that's for sure.


Shrug

I think there should be more praise on these guys for what they accomplished given the state of JavaScript when they started. They saw a problem and came up with a solution. Was it perfect? No, but it's not this abominable creation.

Much like John Resig's work on jQuery nudged JavaScript forward, so did the work on CommonJS/Node.


Agreed -- the article actually acknowledges this point, but the clickbait title is not very generous.

CJS was doing just fine in Node.js for nearly a decade before ESM came along and made everything more difficult by shoving browser constraints into a server-side runtime. ESM may be the right direction for the whole ecosystem in the long run, but it's a little backwards to say the perfectly good incumbent system is "hurting" the language because everyone who invested in it doesn't want to go through the pain of migrating to a new fashionable system that is worse in many ways.


Quoting from the article:

"In 2009, CommonJS was exactly what JavaScript needed. The group took a tough problem and forced through a solution that continues to be used millions of times a day.

But with ESM as the standard and the focus shifting towards cloud primitives — the edge, browsers, and serverless compute — and CommonJS simply doesn’t cut it. ESM is a better solution for developers, as they can write browser-compliant code — and for users who get a better end experience."


This is what confused me in there (in the sense that the author seems to "get it" as to why CommonJS is still around). All of this ESM stuff has only (relatively) recently started to take shape. To say CommonJS is "hurting" JavaScript though seems overly-reactive. Technological evolution takes time and deep consideration to not create future messes. It's not the most comfortable but we're in the "messy middle" of moving from one to the other.


The problem with Javascript modules = Everyone will be forced to run a full web server to do anything (to handle CORS restrictions).

We all lose the ability to simply have a local index.html file, and have it Just Work (TM):

  <script src="script.js"></script>
This ability is amazing for demos, fast iteration, onboarding new devs and developing without a ton of layers of js ecosystem machinery.

Deno doesn't care about retaining this level of developer experience because Deno is marketing its own runtime / build step / ecosystem.


If you are same origin, you do not require CORS to use type=module. Also type=module works on HTML pages served from file:// (file:///.../index.html).


I just tried this and got "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at file:///home/**.js. (Reason: CORS request not http)" in firefox


It is not possible to do same origin from a file system. Everything served from a file system is considered to be different origins. This drives me bonkers.


Because browser makers only have respect for the bourgeoisie and clear contempt for the proletariat.


Is CommonJS even a thing for javascript running in web browsers? Moving from CommonJS to modules shouldn't affect anything running in a browser.


Sure it is. I've used it via require.js. There were other ways too.


Using require.js is your choice. CommonJS often isn't a choice.


> <script src="/js/anything.js" ></script>

Shouldn't that be a relative path for it to actually work as you intend?


It just needs to be browser resolvable, assuming this element is meant to run in a browser. So absolute paths do work. For libraries, you may want to use import maps or absolute urls (similar to Deno). There's also esm.sh and jspm.org for modules that are able to be relatively easily converted.

I think it may be necessary/prudent to get some level of JSX support into browsers, much like the ts as comments efforts. Not sure how that will/would land. I was a pretty big fan of E4X, and had a prototype similar to React about a decade before it. In the end, who knows.


I think this is funny, since esm being "native" on browsers doesn't really matter until you can convince devs they don't actually need to use a bundler. So long as you're using a bundler, the browser's runtime doesn't really matter - you're using the runtime the bundler presents and emulates on the browser. Native ESM has proven to be quite painful in ecosystems that _don't_ rely on the presence of a bundler to patch over problems, precisely because of the issues of interoping with, I don't know, _literally any existing code_.

I can't think of a concrete benefit to a developer that ESM brings (just pain, but maybe I'm biased by what I'm exposed to). Probably why it's so slow to be adopted.


I've seen some great development environments where all of development/debugging is unbundled ESM directly in the browser. It's going to take a lot of momentum shift to swing the "bundler pendulum" back away from "always" to "as needed" (or even, shockingly, "never"), but I think it is going to happen. HTTP/2+ really does make "never" a bigger possibility than ever before, especially in cases like MPAs (and SSR SPAs that don't mind slower hydration outside certain fast paths).

Also, even some cases with bundlers, some of the modern bundlers (esbuild and swc) are still directly bundling to ESM now as the target. Lazy-loading boundaries and common/shared-code boundaries are just ESM imports and there's no "runtime emulation" there, just native browser loading at that point. They are just taking "small" ESM modules and making bigger ones.


I, too, thought http/2+ would encourage unbundled js, but unfortunately in a world where people are used to whole-program minification and dynamic app slicing, I don't think we'll _ever_ move away from bundlers. The build step is here to stay for most serious projects.

ESM may very well be the module system designed for a world that'll actually never exist, and will mostly just be an ill defined compilation target. But hey, maybe the next web module system will do better - those wasm working group people are working hard on their module system - and it's intended as a compilation target from the start, so shortcomings in it can be patched over by tools from the start :)


I do think it is "just" a matter of time/momentum. There's definitely a lot of cargo culting around bundlers ("these bundlers were gifted to us from the gods!", "these were my father's bundlers, I simply must use them!") and there's a lot of developers who "love" scaffolding and boiler plate that are just going to blindly do whatever `create-react-app` or `ng new` or whatever scaffolder of the day briefly masticates then vomits out for their baby bird mouths. Those things (culture, scaffolders) all move slowly (for lots of good reasons, too) and just take time and a few revolutionaries.

HTTP/1 still isn't going anywhere, anytime soon, especially in developer tools (because for better and worse HTTP/2+ requires TLS and developer tools are tougher to build with TLS), so it's still hard to combat the cultural assertions that "bundlers are best practice" and "bundlers are required for performance". But that too is something that shifts slowly over time. There are decades of momentum behind HTTP/1.x.

There's also still so much momentum behind frameworks like React, Angular, plenty more that spent so much time in CommonJS that they still haven't yet sorted out their ESM builds. That's also something getting better with time. Especially now that Node LTS covers enough that you can take an "ESM ONLY (in Node)" approach with confidence.

As I mentioned, I've had the pleasure of working with some projects that were 100% unbundled ESM in development and then spot-bundled (often with esbuild) into highly specific sub-bundles just for Production, where it yielded notable performance improvements. The world of ESM-first is nice when you give it a chance. It will take a bit longer to get to "ESM ONLY, EVERY WHERE", but it still seems to be a matter of time/momentum rather than a problem with ESM.


I think it's still largely prudent to use bundler tools. I think the biggest issue with not going to ESM syntax comes down to static analysis and tree shaking. It's so much better with proper ESM, and will reduce overhead for the browsers. The bundlers definitely paper over various issues. Being able to import non-js resources like styles, json and other references is useful, to say the least. I don't think this will ever be really practical for direct browser use short of some generational compute and network improvements.

That said, we aren't that far off. Many sites are spewing several MB of JS on load, and it's relatively well performing even on modest phones these days. At least relative to '90s dialup, where page loads were routinely measured at close to 15s. Things are absolutely snappy (mostly). I think the biggest hurdle today is sheer entropy. React+MUI+Redux is imo pretty great, and getting to something similar in pure JS would take a lot of effort. Not insurmountable, but significant. There's still a new framework of the month nearly every month in the JS space.

Getting movement is hard. It'll take time and persistence.


> I really think that the needs of server side code are different enough than the needs of client side code that we’re better off drawing from Python and Ruby than from Dojo and jQuery.

This sentence sounds ok until Python and Ruby are held as the apparent gold standard of server development. That's not really the case, I think?


I think the historical context is necessary. At the time, Rails was a thing and python was taking server space from Java, PHP, etc. They weren't looking to those languages as standards, but rather as languages that were likely to have similar problems.


Thinking back to when CommonJS was implemented, absolutely. I don't think you'd want to call Spring or ASP the gold standards.


You know, now that I think about it, ASP might actually be the first example of a server side JS programming environment, kinda sorta. Nobody really did it, but you could use JScript instead of VBScript.


There was the Netscape JS stuff, but I don't think it was nearly as popular as Classic ASP by 2000. I wrote a lot of ASP in JScript and was able to reuse validation libraries, etc. for common inputs/forms, which was nice at the time. I think the difficult points were that COM iterators felt kind of alien in JScript, and developing COM controls at the time was awkward and a real pain to debug/diagnose even in VS.

Not to mention, very little in the space of Classic ASP was open-source, free or even anything not insanely expensive from what I recall at the time. What was in the box was pretty much just JS and VBS, and most of what you could piece together was sluggish as all hell. Cool, you can use the spell checker from MS-Word... damn, three users tried to use it at the same time on the server. etc.

I have some fond and nightmare memories from those days. I think in terms of before jQuery (though scriptaculous, prototype and others were cool), after jQuery and after Browserify and 6to5/Babel. The ESM transition is much, much slower going.


Remember this was 10+ years ago. Many of the contemporary gold standards weren't fully a thing yet.


what are the contemporary gold standards?


Compared to JS it's hard to throw a rock and not hit something better than it...

But I can confirm deploying Python or Ruby apps is generally a bigger PITA.


There's an explosion of module types.

Node has cjs and esm, Browser has legacy (create sub-objects in the global or window object) and esm.

I tried to write a polyglot, but was not successful. https://stackoverflow.com/questions/48396968/72314371 proposes a clever polyglot exploiting that await is parsed differently at the top level depending on whether you're in esm or not. However this doesn't help, because import and export keywords always fail hard (not catchable by try-catch) and eval (yuck!) doesn't help because inside eval you are legacy.

So you have to bundle if you want to provide for everybody... (shrugs)


I agree that CJS needs to go, and the sooner the better. However it's slightly irritating hearing this presented on the Deno blog. Deno is no better when it comes to causing issues for library authors who want to provide cross-runtime compatible ESM modules. They bundle their own fork of TS which lags behind TS stable, no support for important things like `typesVersions`, no way to test your library with newer (or older) TS versions. You're better off authoring ESM libraries for Node, then telling Deno users to use `npm:` imports.


I dislike CommonJS for a different reason: the questionable "fs" module. This module mixes file I/O with file system accesses, which is confusing. More importantly, it does not provide line reading in a synchronous way. I sometimes work with multiple files simultaneously and need to control which file to read. It is awkward to do that with async APIs. Influenced by CommonJS, Dart is similar. This ruins large text file processing. Most other languages can read a file line by line in a synchronous way.
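
A rough sketch of the kind of workaround this forces, built on the low-level sync APIs (chunk size and encoding are arbitrary choices, and multi-byte characters split across chunk boundaries aren't handled):

    const fs = require('fs');

    function* linesSync(path) {
      const fd = fs.openSync(path, 'r');
      const buf = Buffer.alloc(64 * 1024);
      try {
        let leftover = '';
        let bytes;
        while ((bytes = fs.readSync(fd, buf, 0, buf.length, null)) > 0) {
          const parts = (leftover + buf.toString('utf8', 0, bytes)).split('\n');
          leftover = parts.pop();          // last piece may be an incomplete line
          yield* parts;
        }
        if (leftover) yield leftover;
      } finally {
        fs.closeSync(fd);
      }
    }

    // Interleaving two files line by line is trivial with sync generators,
    // which is exactly what gets awkward with async-only APIs:
    // const a = linesSync('a.txt'), b = linesSync('b.txt');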


Trying to reuse the same code as written on both server and client is a bad idea.

As a language, JavaScript may support loading from file system or web. But, one of its use cases must not be code reuse on both client and server.

If I am to write a project in this way, what would it look like?

Assume I write some server code in a server project that I want to use in the client as well. How would I access that code from the client? I need to make it accessible over the internet. But that code is in a project that has other code as well, some of which I do not want to be accessible over the internet. What do I do? I will write rules in my web server to only allow access to specific files. Here and beyond, this process becomes complex and will only get worse.


Yep. I've started removing CommonJS support from my modules. I'm tired of wrangling JS development tools. ESM is the present, let's use it.


My opinion is that a moderate approach is required, maybe a new solution needs to be discovered.

I think ESM modules are most of the time desirable for developers of small to mid size packages. These developers want to have their packages used both on the server and in the browser.

On the other side heavy duty packages which are able to generate production loads (see Fastify) should use CommonJS. In my opinion production loads should not have importable elements, stay private and rather just run and get the job done. At the same time it is really not that important for such loads to have good loading times for the micro packages used.

This is debatable, but yet another aspect to consider: server side production loads must use CommonJS.


I just realized that programmers of production loads can quickly switch to ESM. Just perform a global replace of 'require' with 'import'.

What's actually preventing this from happening is the fact that not all official packages are ESM ready.

So the solution might be actually very easy. Set a deadline for the implementation of ESM and send frequent notifications to developers.

After the deadline all those running production loads will have to switch to ESM syntax when updating. This seems simple and organized.


The problem was never CommonJS. The problem was never ESM.

The problem has always been, and continues to be, stubborn backward compatibility. Node is making the same mistake that Microsoft made with Windows, that Apple did not make with OSX - rather than letting go of a system that's been outgrown and forcing the userbase to grow, they cling to the old way and the old API, allowing it to ferment.

This is the fault of the team working on Node and leadership making poor choices, including TC39.


The ESM-only packages like node-fetch are such a pain to work with in Node in a CommonJs project.

I think this forced ESM is causing a lot of vulnerabilities in code bases due to how painful converting projects to ESM is. I am busy enough trying to get tests to pass for some new version of a package that went ESM-only.

I wish they would publish cjs and esm for packages that target Node. Fine, go ESM-only for browser-only packages.


the benefits of esm are not compelling enough to rewrite everything. Browser-native module loading is a niche use case which can never be as performant as bundling (even if per-request overhead is minimized in http2, a chain of dependencies will lead to excess round trips).


Bundling works better with ES6 modules. Compare the experience of using, say, browserify to rollup/esbuild. Plus, with ES6 modules, you can do bundling for your release builds, and do your development work directly with the original source files.


> the benefits of esm are not compelling enough to rewrite everything.

It's rare that standards don't beat out non-standards in adoption.

ESM is the standard; there is only one way this ends, and it's not in favour of CJS.


> the benefits of esm are not compelling enough to rewrite everything.

Your "everything" somehow doesn't account for stuff that wasn't written in NodeJS's standards-incompatible way to begin with.


It hurts, it is an absolute pain in the a... It is really time that this old tech dies and everybody converts their projects to ESM.


I hope that more modules transitioning to ESM will eliminate the practice of exporting a single function with other functions attached as properties. Drives me crazy
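
What's meant by that pattern, with made-up names, next to the flat named exports that replace it:

    // the CJS-era shape: one callable with extras bolted onto it
    function request(url) { /* ... */ }
    request.get = function (url) { /* ... */ };
    request.post = function (url, body) { /* ... */ };
    module.exports = request;

    // the ESM shape: plain named exports, friendlier to tree shaking and tooling
    // export function request(url) { /* ... */ }
    // export function get(url) { /* ... */ }
    // export function post(url, body) { /* ... */ }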


I also want to hurt Javascript.


My workaround is to use Typescript.


Wait, is ESModule something like

    async function esmodule(commonjsModule) { return new Promise((res, rej) => res(require(commonjsModule))) }

?


I’ve managed several open source projects through their transition from CommonJS to ES modules and the least interesting part was the server-side of the equation. More often, the most exciting, and most excruciating aspect is wrangling TypeScript to emit browser friendly module paths. Anything short of relative paths everywhere won’t cut it, and TypeScript really doesn’t like to emit file extensions when TSX is involved.

There’s also some unsolved mysteries surrounding the path resolution defined by a package.json file, but at least there’s now a proper way to have a package use project root relative imports. Things usually go well until you get back to the browser, which now needs an Import Map to bridge the two worlds. I still haven’t figured out how to wean off NPM either since all the magic compiling CDNs use its namespace to create browser friendly bundles…sort of where we started from.

There’s a few foot guns on bundles now too, like deduping React so hooks work, along with some surprises about modules being stateful. And while Deno is pushing the dream forward, I can’t help but feel they compromised the vision too far for Node compatibility. At this rate, I could see Node v30 being a merge of the two projects.

I’m honestly happy it’s all coming along. It seems like this is JavaScript’s Python 3 moment, where everyone has to rewrite code to slightly new paradigm for the next generation of apps to fully appreciate. I’m most thankful for async imports operating like ordinary Promises!


I'm about to make the transition for alasql. Any chance you can share a link to a repo you feel did this well?


> CommonJS is hurting JavaScript

That's just not good enough, it needs to strangle, crush and bury javascript...


Clickbait headline


[flagged]


Every time this is mentioned, the person is heavily downvoted.

But if you were here in the 2000, that's exactly what happened.

In fact, this very article starts with "JavaScript, the undisputed king of web development, is being sabotaged — not by a rival language or a revolutionary new technology, but by its own baggage from the past.". That's the first sentence.

Nobody wanted to work with JS, it was used to make snowflakes on your home page on Xmas at best.

Then things like gmail happened (using GWT!), and the web became the most awesome platform in the world. By then it was too late, JS was the de facto language of the coolest tech of human history.

It was such a trap Google spent billions and hired the best minds out there to make chrome, with a VM that could execute it fast enough, so the company could make its vision happen.

At the time, JS was a nuisance and an embarrassment, something we had to deal with. The "good parts" was the bible because it's a language you could only use if you ignore half of it.

Case in point, the most popular JS techs created after that existed precisely to avoid writing ES5. Like coffeescript, babel, webpack, etc.

JS is a decent language now, because the community invested so much into it. Provided you use tooling. A lot of it.

But that's not how it started. We were forced to use it, and had to spend considerable resources to make up for how bad it was.

I've yet to meet anybody who knows Python/Ruby/C#/Java/Rust/Go and JS who chose JS outside of the web. Because there, we have the choice.

Electron, unfortunately, might take a chunk of that choice, because once again, the browser platform is that great.


The most telling part of this comment is that you omitted PHP.

And the parent comment mentioning VBScript is the other half. VBScript was equally terrible.

Imagine a world where every interpreted runtime used Lisp. Or UCSD Pascal.

People would be here saying "this was forced on us!" And wishing that browsers ran MIPS assembler they could target with their new language of choice.


> I've yet to meet anybody who knows Python/Ruby/C#/Java/Rust/Go and JS who chose JS outside of the web. Because there, we have the choice.

Pleased to meet you! Isomorphic code bases are a case to use a single language, which before WASM, is Javascript & Typescript. I may prefer to use something different with WASM, which has even more interoperability than JS, becoming a bigger part of the ecosystem, but so far it would be Assemblyscript to more easily migrate my existing codebase. Assemblyscript has a Garbage Collector, so I may choose Rust or Zig if I don't want the GC. Rust probably has the most momentum, so despite the complexities, verbosity, & loss of productivity working with Rust, that may be a better choice when working with WASM. I would prefer Zig over Rust & Assemblyscript over both in cases of app development...but breadth & depth of the ecosystem matters.

I wouldn't touch PHP, Python, or Ruby if I had the choice of using Typescript instead...as PHP, Python, & Ruby lack a decent static type system.

> Nobody wanted to work with JS, it was used to make snowflakes on your home page on Xmas at best.

I disagree. Javascript is a prototype based language, which provided opportunities for composition over inheritance. JS supported closures from its beginning while other popular languages did not (well, Ruby had limited support for closures). Inheritance & other OO concepts now considered harmful were prevalent in Java, C#, Ruby, Python, etc.

Sure, we could have used LISP instead, but for some reason LISP did not gain traction. JS was the closest programming language to LISP that had popular appeal.

I suggest that JS changed the general programming ecosystem in profoundly positive ways & I, for one, preferred developing in JS over the other popular alternatives when I could.

To further bolster the case for JS, Typescript was built as an extension on top of JS, proving you can have the dynamic prototype based runtime with a powerful static type system.


I'd willingly work on it nowadays, Node is really fast. Deno is really fast too, and TypeScript makes it pretty amazing to work with.

I think the ecosystem needs love again, because outside of Fastify I feel like server side JavaScript that isn't targeting a Lambda or Cloudflare Worker type environment is a little sparse on good options.

That said, Fastify is pretty amazing, and if you use graphql the main GQL servers are usually Apollo (JS based) or GraphQL Yoga (Also JS based), and they're really good.

One area that intensive JavaScript needs work on though is Machine Learning / Scientific computing. `number`, `BigInt` and `Math` aren't enough here. We need more built-ins added to the language around this (particularly around `Math`) IMO


> But if you were here in the 2000, that's exactly what happened.

That was 23 years ago.

It's an entire generation. A baby born after that point could now be working as a fully fledged adult.

> I've yet to met anybody who knows Python/Ruby/C#/Java/Rujst/Go and JS to chose JS outside of the web. Because there, we have the choice.

I would. Typescript is easily the best language for anything from app development to small projects and has been for years now. If you're bringing up Go/Rust, then TS has been incredibly good in that timeframe. Before that there was Python+Django and Ruby/RoR back when we did SSR, and Java applets or Flash before that.


>> But if you were here in the 2000, that's exactly what happened.

> That was 23 years ago.

> It's an entire generation. A baby born after that point could now be working as a fully fledged adult.

And yet, it's no less true.

Regarding typescript, that's just, like, your opinion man. I mean, I think typescript is pretty good, especially given that it's tied to the design choices of javascript, but objectively the best language? That's way too far.


> I've yet to meet anybody who knows Python/Ruby/C#/Java/Rust/Go and JS who chose JS outside of the web. Because there, we have the choice.

You probably haven't met many people. For backend you can choose any language, yet a lot of people (with knowledge of many languages) choose JS/TS. I'm one of those.

I enjoy TS more than python.


You’re not wrong at all.

But the parent post complaining that JS exists at all is not really productive to a discussion about what affects JS today.

The “JS should have been something else” ship sailed a VERY long time ago.

It’s like people who go into C++ threads and complain it wasn’t rust from the start. That’s nice. But rust wasn’t there at the time.

It’s not useful for discussion.


> Heck, if MSFT had made VBScript available without license in browsers other than IE I think JavaScript would be an obscure footnote in history.

And we’d be moaning about how bad VBScript is, and how we wish another language had won.


It's almost a complete certainty that if Netscape had used VBScript, the entire web browser concept would not exist due to legal action or being undermined technically. Don't you remember the dirty tricks they used with IE6? IE6 almost entirely cratered the web until MSFT unwittingly introduced XMLHttpRequest in an epic self-own.

In any case, WASM pretty much nullifies any problems with JavaScript going forward.


I agree that we should have another web programming option besides JavaScript.

However, the big boss still says "use JavaScript, period", and the tech economy dictates that supply side has the upper hand. The users are simple peasants who have no say in the matter. And frankly, none of them really care.

Unless Google execs start pushing an alternative on Chrome, we're stuck with what we have.


Microsoft made TypeScript instead, which has taken over the industry.


Which has taken over some parts of the industry. I work for a Fortune 100 company and Typescript isn't used widely at my company.


Not really an alternative to JavaScript, just an add-on.


Probably, but that didn't happen and here we are. JavaScript and the DOM are the language and API to manipulate page content, be it directly or as a compilation target.

(Me, I use them as a compilation target, so arguments like the ones in this article are about as interesting to me as arguments about whether the security rings in the next iteration of CPUs are appropriate for modeling proper access constraints).


No, ESM spec writers are hurting productive developers. I literally never think about require or import, because they're not problems I have.

But now, I get to fight importing dependencies with cancerous ESM design and think about language basics, while spec writers revert us all to CS 101 students trying to figure out how to do elementary things.

Which isn't going to change any of my mature code, either, I'm going to wrap the entire script in:

    (async () => {
    })();
and then:

    const { default: foo } = await import('bar');
So, has anything meaningful changed? No. I don't need to tree shake on the server, and if you're tree shaking on the client-side, you've already screwed up, and you're too inexperienced to realize it.


I hate hearing this pirate corporation language about JavaScript. We know you want it to be C++. The rest of us need it to be BASIC.


C++ is overrated. Use the crab language, instead.

/s




