This comment is gonna be bikeshedding, and definitely not a popular opinion in today's "modern" JS community, but here it goes anyways...
I don't understand why TC39 had to define new syntax, new semantics, and new restrictions for modules in JS land when the JS community had already created solutions for modules in JS. People were using modules in JS before they knew that "loaders" or "static import/exports" were a thing that would land in ES6. And frankly, what the JS community made for itself was a lot saner than what is coming natively to JS, and even more so compared to the native solutions for module loading in some other languages (compare Python, for example). I don't think any JS programmer wanted modules natively in JS. And with HTTP2, they would have just stopped bundling, and a newer async implementation of `require` would have been conjured.
And the claim that "tree-shaking" etc. is possible only with the static syntax is blatantly incorrect. It is entirely possible to eliminate unused code the CommonJS/AMD/UMD way with a slightly more intelligent bundler. And the new "syntax" they created is just a variant of destructuring. You can achieve nearly the same thing with ES6 destructuring and `require`.
The existing solutions that the JS community had created were insufficient.
CommonJS is anything but sufficient - the synchronous loading and node-style resolution don't work well over a network, and the scoping doesn't work out of the box in a browser. I wish node had just discovered and used AMD-style modules.
AMD is pretty great for what it is, IMO. It's minimal, it works (without transpilation!), and it's pretty easy to understand how and why given the language and environment it's defined for. I'm kind of sad that it "lost" to CommonJS. Still, the syntax (even though given JS, it really just falls out) isn't preferable, and it's still imperative, which makes tooling more difficult.
Modules give a better syntax than either CommonJS or AMD, with asynchronous loading that works over a network, and imports and exports that are declarative so that tools are easy to write.
Tree-shaking and static analysis are only possible with CommonJS/AMD if you strictly limit them to declarative-style use: imports at the top-level only, exports as a literal, etc. Deviate from that and tools trip up. In that case, it's better to have a new, clear, declarative-only syntax.
> and the scoping doesn't work out-of-the-box in a browser
sorry, didn't get you on this one. Example?
> Tree-shaking and static analysis are only possible with CommonJS/AMD if you strictly limit them to declarative-style use: imports at the top-level only, exports as a literal, etc. Deviate from that and tools trip up. In that case, it's better to have a new, clear, declarative-only syntax.
Agreed with the former part, but I am not sure how the new syntax is better and clearer. In fact, the new syntax imposes limitations. The old, regular way would just not optimize in some cases; you wouldn't lose anything or any optimization.
Further, the "top-level" only is a bit of an itch. I want to be able to do this:
if (DEBUG) {
import 'something';
}
which is a common use case and something the builder can easily remove at build time. But ES treats that as a syntax error. The "old" way doesn't have this limitation. Surely the builder's grammar could be extended to support this with the new syntax, but it just feels more magical and weird.
You/someone might very well have an answer for that - honest question, not mocking :)
Replace your debug versions of libraries with stubs that contain no-ops. So in your code, just do:
import 'something';
And then create separate versions of 'something.js' for debug and release, where your release version just has stubs:
export function debugPrint(msg) {}
While the debug version might use console.log or winston or whatever.
If you compile your JS and the compiler provides tree-shaking, the calls will get optimized out entirely. Even if you don't, you can deploy the no-ops to prod and it'll use minimal bandwidth and no CPU time. And the best part is that it keeps the syntax of your code very clean; you don't have to litter your code with "if (DEBUG)", you just have it do what it needs to and those calls will equal doing nothing in prod.
(Not OP) Usually when I am using this technique it's for some 3rd party library that provides a useful debugging tool. To use your suggestions I'd need to create a wrapper that calls through to the 3rd party library in dev. Sounds like a huge pain.
You should be able to use Loader config for some of this: With the right plugin to your Loader you could do things similar to AOP/injection in debug/dev with a suitably complex Loader or set of Loader plugins.
Arguably things like this are part of why standardizing the Loader APIs has been so complex. But eventually it may pay off by letting you do some nice things very simply, in a way that is configurable for different environments and also transparent from your configuration, however complicated your libraries and plugins may be set up. (Whereas dynamic rewriting in memory, dynamic loading, and lots of scattered if() blocks can feel much more opaque.)
You'll be able to do that with System.import, kind of. It's async and returns a promise, which is sensible, but it makes the code a bit more complicated.
The ECMAScript wiki appears to be down but this polyfill has a good overview:
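A sketch of what that usage might look like - the registry and module name below are made up for illustration; a real loader resolves and fetches the module itself:

```javascript
// Mock stand-in for the proposed loader API: System.import(name) resolves
// to the module's namespace object instead of returning it synchronously.
const registry = {
  'debug-tools': { setup: () => 'debug tools ready' },
};
const System = {
  // A real loader would fetch this over the network.
  import: (name) => Promise.resolve(registry[name]),
};

const DEBUG = true;
if (DEBUG) {
  // The conditional import from upthread, now legal because it's a call:
  System.import('debug-tools').then((mod) => {
    console.log(mod.setup());
  });
}
```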
If you look at how browserify/webpack work, you'll see that AMD is the same thing, except that you're writing the module wrappers yourself manually, while browserify and webpack do it for you.
oh man, you're confused. AMD was originally a CommonJS compiler in its very early days. The guy who wrote it invented his own standard (AMD) way later than CommonJS.
How am I confused? AMD works in the browser, CommonJS doesn't. It's very simple, and tools that convert CommonJS to a format that works in the browser only prove that it doesn't.
The history of the formats does not change the facts of how they operate today.
At the very least, a first class syntax shows a different commitment level. Building a syntax into ES2015 states a definite, intended, and considered effort to deal with modules as a key part of the language moving forward.
An async implementation of `require` would still be subject to most of the same debate and complications as the "Loader" (how are module names converted to URL paths; how are URL paths deduplicated so they're not downloaded or run more than once; ...). You can't assume Node-like semantics in the browser (where walking the remote file system is far more costly), and even amongst AMD solutions you'll find variations in how options are configured and how the different loaders work.
TC39 getting the syntax into ES2015 was as much a signal to WHATWG to stop kicking the "Loader" debate can down the road and actually make decisions and get things standardized as anything else. It's easy to believe that without that commitment of dedicated syntax in the language, the browsers would be a lot less incentivized, we'd still be having the Loader debate in 10 years, we'd still have increasingly subtle differences between AMD loaders and CommonJS loaders, and that nice new "async implementation of require" would continue to be just one option among many, with twelve different libraries implementing it in slightly different ways.
Personally, I think the ES2015 import/export syntax is much more aesthetically pleasing than AMD, and nicer to work with than CommonJS, but I can appreciate that there are plenty of aesthetic opinions on the subject.
It's not possible to eliminate unused code safely with CommonJS. `require` is entirely dynamic and can be used in arbitrary ways. It's impossible to analyse all such uses correctly; after all, JS is Turing complete, so you'd have to solve the halting problem first! In practice this issue doesn't come up much, but some modules do use dynamic loading.
The same applies to module.exports: the final set of exports from a module can only be determined by executing that module. While you can spot common patterns, ultimately JS is, again, Turing complete, so the code has to be executed. In practice this is frequently an issue, as modules often use sophisticated logic to build up their exports.
So it's impossible to statically analyse all such dynamic modules, which is why the ES6 module system is static. This is a big win as it allows tree-shaking to be done with 100% safety.
Also note that the ES6 import syntax opens up future possibilities which are not possible by destructuring require, for example type annotations look set to come to JS and destructuring can't be used to import a type, because types are not values.
Considering that anything more complex than a primitive in JS is implemented as a prototype chain attached to a first-class-function constructor, I'd say that, in JS, types are absolutely values.
> I don't think any JS programmer wanted modules natively in JS
Seriously? Code organisation and globals are massive pain points; pre-CommonJS/AMD/UMD we needed both boilerplate and strict conventions; post-, well, they're all dependencies, and they all need toolchains. Why should I not want to remove dependencies with something that is syntactically and practically extremely simple? I guess there may be a few JS programmers rubbing their hands at the thought of another bundler and another dependency to add to the toolchain (but it's better this time! promise! just install another 200 NPM modules and off we tootle into JS bliss!), I dunno?
Well, the biggest issue with using `require` (CommonJS) is that it is by definition synchronous. That works okay server side when files are locally accessible (not best practice, though), but it gets really difficult with network resources. JS modules are supposed to be a way to get the advantages of asynchronous module loading (as in require.js) without all of the callback hell.
Again, that is just an issue with the current implementation of the CommonJS require. You can, for example, have a require implementation which runs your code inside a `GeneratorFunction` instead of a regular `Function` and uses `yield require(...)` instead of just `require(...)`, where `require` returns a promise and is async, together with something like `co` or bluebird's `coroutine`. There is also AMD, which is by definition async and gives you the same static exports/imports as ES6, save for the new syntax.
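A minimal sketch of that generator-driven async require (everything here is hypothetical: `asyncRequire` is mocked with an in-memory table, and `run` is a stripped-down version of what `co` or bluebird's `coroutine` provide in full):

```javascript
// Hypothetical async require: returns a promise for the module instead of
// loading it synchronously. In a browser it would fetch over the network.
function asyncRequire(name) {
  const modules = { math: { double: (n) => n * 2 } }; // mock registry
  return Promise.resolve(modules[name]);
}

// Minimal coroutine runner: drives a generator, resuming it with each
// resolved value, so `yield` reads like a blocking require.
function run(genFn) {
  const gen = genFn();
  function step(value) {
    const { value: next, done } = gen.next(value);
    return done ? Promise.resolve(next) : Promise.resolve(next).then(step);
  }
  return step();
}

run(function* () {
  const math = yield asyncRequire('math'); // reads like a sync require
  return math.double(21);
}).then((result) => console.log(result)); // 42
```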
It really is just that JS land hacked something together and it worked great. Had the community gone to the same lengths as the people on the TC, we would have had much better everything, but it is a problem already solved well enough that people don't care much.
> And with HTTP2, they would have just stopped bundling and a newer async implementation of `require` would have been conjured.
async and dynamic import can always be added later, if demand is great enough for them. Simpler is better to start off with, and require still works just fine (and extensions to require can easily be built on existing primitives).
> And the new "syntax" that they created is just a variant of destructuring.
If `someRuntimeVariable` is statically evaluable, then some compilers (UglifyJS, for instance) support limited inline evaluation at compile time, which could be enhanced by community effort.
If not, then it won't :) But you'd still get the same benefits when requiring things with plain string literals, and additionally you'd have the freedom to require dynamic stuff at the cost of fewer optimizations. You don't lose anything.
Nope. If we're talking about tree-shaking (which we should be, as it's the entire point of having static modules), you lose everything. That's because `require(someRuntimeVariable)` can refer to any module, so there's no way to exclude any of the node_modules, or any portion of any of them.
So all code in all modules must be kept. No tree-shaking for you.
If you don't find JavaScript modules confusing, you should. I work on a module loader [1] and I find it quite confusing. To clarify we have 3 things:
1. import and export syntax, this is defined in ES2015. It's just syntax, it doesn't explain how modules should be loaded.
2. The Loader spec (http://whatwg.github.io/loader/), which defines a full loader including hooks to do just about anything you could possibly need. This allows you to do really crazy things like define modules dynamically and import them dynamically (which the static import/export syntax doesn't allow). We use this extensively in StealJS (or at least an older version of it) and it's pretty powerful.
3. <script type=module> is what this article is referring to. It defines a tag that allows you to import modules in the browser. You can't use a regular script tag; modules are different - they are strict mode by default, for example.
If you read the GitHub issue, there was some discussion of whether <script type=module> should use the Loader spec. It probably should. But the Loader spec isn't done, so... reading the spec and seeing things like the module map [2], it sounds like <script type=module> defines a miniature loader.
I don't blame them for doing this; Loader has taken too long and people are clamoring for something. The Loader spec has a weird history: it was close to being done in ES2015, but after it was ejected it sat for a long time without much work. Then around last summer it was heavily worked on and got pretty far. But now, as it sits, there have been no commits in the last two and a half months [3].
At this point I wouldn't be surprised to see Loader go away. It's probably too controversial and too big to ever get done. I imagine the web will just have the <script type=module> loader and Node will do its own thing. This is going to keep writing portable JavaScript as annoying as it's always been.
One detail that's often lost in the discussion (that you might have been hinting at) is that without a loader modules can only be referred to by URL. Node-style name resolution won't work at all.
This means that modules will either still have to be compiled with something like browserify that resolves modules names into URLs ahead of time (a solution I don't particularly like), or modules will just have to use URLs to import other modules, leading to a convention of assuming that modules are all siblings so that you import like this:
import * as jquery from '../jquery/jquery.js';
Despite the extra verbosity, I prefer this since it requires that modules are flat and de-duplicated and doesn't require any resolution step at all.
Modules are not flat in reality though, any non-trivial sized app likely has multiple versions of the same module being used. I don't see any reason to think that's going to change in the future, extreme modularity is the norm in JavaScript these days.
It may be the norm right now (since npm came around), but that's no reason to excuse it for what it is: a mess. A mess we should be able to do without.
If we can have an official module-implementation which somehow forces developers to get a grip on their dependencies (which is the exact opposite of what "modern" js developers are doing), that can only be a good thing.
Every language and platform ever has had serious problems with multiple library versions. Anyone who has used an OS without a package manager knows this. I won't claim that npm doesn't also have problems, but at least its problems are different. Humanity can survive the existence of more than one paradigm for dealing with versioned libraries.
> any non-trivial sized app likely has multiple versions of the same module being used
Which I think is a very bad idea in general, but especially when shipping bytes to the browser.
IMO, one of the key responsibilities of package managers is to find a single solution to the version constraints in a dependency graph, and allow the user to resolve conflicts. Just in general, multiple incompatible versions of a library in an application is asking for hard-to-find bugs if those versions ever interact. With web apps though, it's just too easy to create unnecessarily bloated script bundles without even knowing it.
I'm not sure how a user can possibly resolve conflicts when 2 libraries are using 2 incompatible versions of some other library. Those need the versions they depend on.
You might not want multiple versions of Angular in your script bundle, but multiple versions of small modules like an "extend" don't matter much.
But whether it's good or bad isn't really relevant, if you think that people are going to throw out the npm workflow because it doesn't fit into script type=module's semantics I think you're going to be disappointed. More likely they'll just not use script type=module (except perhaps after bundled).
Oftentimes packages have far too narrow version constraints (usually from unnecessarily raising the lower bound), and/or a technically breaking change in a dependency doesn't actually break the depender. In these cases the user can override the version, test, and move on.
In more severe cases, like lodash 3->4, it is tough, but a user can try to downgrade the direct dependencies at the root of the paths that lead to the newer version.
The best antidote is for packages to maintain compatibility through consecutive major versions with decent deprecation rules, and to update and test their dependencies' upper bounds regularly.
I'd rather have each module have its own dependencies as subdirectories, sort of like node_modules. Everyone resolves paths from cwd.
If someone wants to share the same jquery across subdirs, I'm sure http redirects could be configured. Or alternately some sort of corporate versioning convention that developers can expect to have with absolute paths. Certainly better than ../jquery.
>> This allows you to do really crazy things like define modules dynamically
In Python there's a library called "sh" and it has a novel interface:
from sh import ls, pwd, my_unique_command
print ls('/dir')
...
I wondered (not enough to actually check) if something similar could be done in JS. Although, due to the async nature of such a call, it might not end up looking so familiar in JS in practice, I guess.
This is huge. Finalization of the module loader spec is usually what's cited as the last remaining blocker for full implementation of modules in V8 [1]. Combined with HTTP2 being available nearly everywhere [2] and native ES6 support soon reaching most mainstream browsers [3], it means that web developers will soon be able to drop a large portion of the current JS stack, including spriting, Webpack, and Babel.
Re [3]: Nice, it seems that both Blink and Webkit are nearing 100% support for ES6. Safari Technology Preview already has 98%, so next major version of Safari (presumably both on desktop and on mobile) will have it too. So this autumn we can expect to have ES6 in every browser that matters!
Node 6.0 is also supposed to come out in late April [1], targeting v8 5.0, which IIRC corresponds to Chrome 50 on the compatibility table. Not quite as nice as the Chrome 52 release, but still covers most of the ES6 features I actually use.
FF48 and Edge 14 don't do so badly either. If you drill down into their missing test cases, it's largely features that few people care about.
Note that this table doesn't test all parts of ES6, obviously. It's not that hard to have a feature listed as "supported" in this table while not actually following the spec very well at all.
In practice, browsers usually try to not do that sort of thing, of course. But note the double-caveat there. ;)
> So this autumn we can expect to have ES6 in every browser that matters!
I wish I could afford to ignore Internet Explorer... if Microsoft leaves Windows 7 to rot with IE 11, corporate users are going to keep holding back the web for a few more years.
1. A system that converts <script type=module> to HTML imports so that loading and ordering is consistent across JS and HTML imports. The converter will run in `polytool build` (polytool is the Polymer CLI in development) and in `polytool serve` (the dev server).
2a. Chrome's implementation of <script type=module> will work correctly with HTML imports, so that HTML imports wait on any JS modules and their transitive dependencies before signaling ready.
2b. The HTML imports polyfill will wait for JS modules to load (this might require no work).
3. HTML imports itself is being re-imagined as a layer on top of the JS module loader, so that HTML is just a different file type that can be loaded and participate in the dependency graph. This idea of "HTML Modules" has positive feedback from other browser vendors so far.
There are many ways to load and execute JavaScript: statically or dynamically adding a script tag to the HTML, HTML imports, manual fetch + eval, concatenation of JavaScript files into one file, in the future maybe the import syntax, and now <script type=module>?
Do we really want to add more ways? Complexity is an enemy of security, and this will add more confusion. Maybe we should first disable/remove some ways to load and execute JavaScript.
We should also think about whether what we really need is a client-side solution, or a better tool/system to transform scripts on the server side.
The why should be clearly answered before thinking in terms of the how.
Sometimes having the bigger picture in your head helps.
Like JSLint or "use strict"; didn't remove JavaScript possibilities, it encouraged us to not write insane JavaScript.
They should consider making a document that lists all the possibilities for organizing code into modules and explains why some are better than others, then in the same document explain the benefits of the new syntax.
> I don't understand why TC39 had to define a new syntax, new semantics, and new restrictions for modules in JS land when the JS community had already created solutions for modules in JS. People were using modules in JS before they knew that "loader" or "static import/exports" were a thing to be landed in ES6 in future. And frankly, what the JS community made for themselves was a lot more saner than what is coming natively to JS, and even more so compared to the native solutions for module loading in some other languages (compare to Python for example). I don't think any JS programmer wanted modules natively in JS. And with HTTP2, they would have just stopped bundling and a newer async implementation of `require` would have been conjured.

> And all this "tree-shaking" etc. that is claimed to be possible only with the static syntax is blatantly incorrect. It is entirely possible to eliminate unused code with the commonjs/amd/umd way with a slightly more intelligent bundler. And the new "syntax" that they created is just a variant of destructuring. You can achieve nearly the same with ES6 destructuring and `require`.
sigh