> Also, the default target configuration is "es2016," and modern browsers only support up to "es2015."
I had to recheck this was indeed an article from 2023, because this part surprised me greatly. To my understanding, es2016 only added Array.prototype.includes, the exponentiation operator (**), and preventing generator functions from being constructed. Even the slowest adopters already had these in 2017. Is the author assuming IE11 support?
This cheatsheet would have you publish dist/*.{js,d.ts}. Presumably you would use "files":["dist"] in package.json to exclude sources from being published.
The OP recommends additionally packaging src/*.ts along with source maps and declaration maps.
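For what it's worth, a minimal sketch of what that setup might look like (file layout assumed):

```jsonc
// tsconfig.json (excerpt) — emit .js, .d.ts, and both kinds of maps into dist/
{
  "compilerOptions": {
    "outDir": "dist",
    "declaration": true,
    "declarationMap": true, // .d.ts.map files, so "go to definition" lands in the .ts sources
    "sourceMap": true
  }
}
```

```jsonc
// package.json (excerpt) — whitelist what gets published; adding "src"
// ships the original .ts files that the maps point back to
{
  "files": ["dist", "src"]
}
```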
I poke around in node_modules with some regularity (often in combination with the debugger) and it’s always nice to find actual source files, not just source maps.
Browser support for ES2015 is about 95% [0], while most ES2022 features sit at around 90% [1]. You can find the individual benchmarks on compat-table [2].
Some, but not all, of these are transpilable/polyfillable (refer to compat-table).
When we talk about "modern browsers", we're not talking about usage percentage; we're talking about browsers released in the past few years. If you need to support browsers older than that, great, but we don't call those modern browsers; that's legacy support.
Most modern browsers are evergreen, so targeting es2022 would be fine for most users. The exception is Safari, which is slower to incorporate new features and doesn't roll out its updates nearly as quickly as Chromium-based browsers or Firefox, but even Safari has had es2016 support since 2016.
Sure, but the reason why browser compatibility is being measured is to quantify that notion of "modern".
You can then use that to make your own decisions; if you're working on a high-tech video streaming platform for kids you can probably safely ignore the 5%, but if you're providing information on behalf of the government you might need to support much more than that.
Also, Safari greatly picked up the pace a few years ago, and honestly it’s hard to categorize them as behind anymore. They lead in several areas and have caught up in most.
Is there a reason why npm packages shouldn't target the highest target that an LTS Node.js supports? On the front end, those who need lower targets will run their own transpilation anyway, down to whatever target they want. I'm about to release a few simple npm packages and I'm considering ES2017 as the target (since it includes the last major syntax feature, async/await). I'm guessing the only catch is that the end user of the library needs to be aware of any polyfills that have to be supplied. I asked a question [0] about this a while back but it got no responses.
To clarify - the question is: if you have a project that targets ES2017+, why would you want to use a package written in ES2020 that has been transpiled down to e.g. ES2016, using non-native async/await?
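For what it's worth, the library-side setting is just the compiler target; a minimal sketch, assuming a plain tsc build:

```jsonc
// tsconfig.json (excerpt) — emit native async/await instead of
// a downleveled state machine
{
  "compilerOptions": {
    "target": "ES2017",
    "lib": ["ES2017"]
  }
}
```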
Some people write ES2015 code without any ES2016+ transpilers or polyfills (for both good and bad reasons). They might also be using different transpilers with varying settings (such as TypeScript's downlevelIteration, which is off by default), and your package would then show different behaviour. And many setups don't transpile anything inside node_modules.
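The downlevelIteration case makes the divergence concrete: with an ES5 target and the flag off, for...of over a string walks UTF-16 code units rather than code points. A minimal sketch:

```ts
// With a native ES2015+ target (or downlevelIteration enabled), this
// logs "a" then "😜"; downleveled to ES5 without the flag, it logs "a"
// plus the two unpaired surrogate halves of the emoji.
for (const ch of "a😜") {
  console.log(ch);
}
```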
`tsc` is extremely slow to transpile TypeScript sources into JavaScript. It's invaluable for typechecking, but for transpilation there are much better solutions like SWC or tsup [1] (which nicely wraps esbuild and Rollup).
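For simple packages, tsup doesn't even need a config file; something like:

```sh
# transpile and bundle with esbuild, producing both ESM and CJS output;
# --dts also generates .d.ts declaration files
npx tsup src/index.ts --format esm,cjs --dts
```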
I like TypeScript enums generally; my only complaint is that they are kept in the final build as big objects, especially when you never actually enumerate them. I use a replace function with terser to swap them for inlined constants and make my final build smaller.
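For context, this is the object a plain enum compiles to, and the `const enum` variant that inlines instead (with the caveat that const enums don't play well with isolatedModules / transpile-only pipelines):

```ts
enum Direction { Up, Down }
// emitted JS (roughly):
//   var Direction;
//   (function (Direction) {
//     Direction[Direction["Up"] = 0] = "Up";
//     Direction[Direction["Down"] = 1] = "Down";
//   })(Direction || (Direction = {}));

const enum Axis { X, Y }
const a = Axis.X; // emitted JS: var a = 0 /* Axis.X */; — no object at all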
> `tsc` is extremely slow to transpile TypeScript sources into JavaScript
That’s because its main job is not transpiling to JS; as a compiler, it actually checks the correctness of the code, which is probably the main reason TypeScript exists in the first place.
You can actually run tsc without emitting JS at all in particular configurations.
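e.g.:

```sh
# typecheck only; no JS output at all
npx tsc --noEmit
```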
I know this is a "me" thing, but I've come around to the view that producing bits in a different order (e.g. classic "compilation" of one language into another) is merely a useful side effect of compilers. What I love them for now is acting as a complete test suite with "100%" (some will argue this) coverage of types, in the languages where that matters: you called functions with what you thought you did, you're getting back what you think you are, you're doing operations on things that actually support them, etc.
I'm wondering: is there a tool that converts TypeScript into JavaScript with JSDoc type annotations, preserving the API docs? Then you wouldn't need to ship anything else, and the application could minify if it chooses.
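I don't know of a dedicated TS-to-JSDoc converter, but for reference the reverse workflow already exists: tsc can typecheck JSDoc-annotated JS (checkJs) and even emit .d.ts from the annotations (allowJs + declaration). The target format would look something like:

```js
// @ts-check
/**
 * Clamp a number into [min, max].
 * @param {number} value
 * @param {number} min
 * @param {number} max
 * @returns {number}
 */
function clamp(value, min, max) {
  return Math.min(Math.max(value, min), max);
}
```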
While the article is technically correct, it sidesteps the most common issue with 'pure' TypeScript libraries: you still need a bundler if you have multiple .ts files.
Once you enter that territory you realize how much more complex everything becomes.
I don't see how that changes the problem that OP describes.
TS module resolution looks almost like ESM, but neither the syntax nor the semantics are the same.
And of course you need a compiler to transpile TS even if you wish to ignore the typings (often wrapped in a bundler like Vite, which in turn wraps SWC or esbuild... because tsc is not as practical for large projects).
Really, having dealt with this kind of problem on Friday, it can make you go crazy.
E.g. having to deal with CJS-specific settings for some tool in a project using TS and exporting to ESM JS...
So explain to me again why I have to change my entire toolchain simply to consume a library? If I consume a wasm library written in Rust, do I suddenly have to add Rust to my build script? What is the point of bytecode if people are going to repeat the same build over and over again anyway?
TIL that types should be listed first in the export map thanks to that tool. I don't think it bothered anyone consuming my packages, but better be spec compliant and update them all. Thanks for sharing!
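For anyone else updating theirs, the shape the tooling expects is roughly this (paths assumed):

```jsonc
// package.json (excerpt) — "types" must come first in each conditions
// block so resolvers that scan top-to-bottom (like TypeScript) pick it up
{
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    }
  }
}
```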
Anecdotally, ESM has been an absolute nightmare. Packages that aren’t compatible without code changes, having to add extensions to every single import unless we specify a flag (that is now deprecated), challenges with import maps and module resolution, the list goes on.
So we just use CommonJS with project references and call it a day. Everything works and we don’t need to change any of our code. The way ESM was introduced into Node has been the dumbest decision. And they’re doubling down on it. Excited for Bun to be compatible with my project so that I never have to deal with any of this again.
We recently switched to ESM and it wasn't that painful. OK, you have to put .js on imports, but it's quite easy to search-and-replace that with a regex.
Any modules that didn't work (e.g. mocha) were switched over to more modern equivalents (vitest). This is probably a good thing.
We also export both ESM and CommonJS builds in our packages, by first transpiling to ESM and then using Vite to convert to CommonJS. It works really well and lets us use our packages in both environments.
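To make the .js-extension point concrete: under `"moduleResolution": "node16"`/`"nodenext"`, relative imports in .ts files have to spell out the extension of the emitted file:

```ts
// the source file on disk is ./helper.ts, but the specifier names
// the compiled output
import { helper } from "./helper.js";
```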
I've been doing Node for 10+ years and it was the Python 2 vs 3 moment for me.
I started using Rust for quick and dirty programs, web services, small frontend apps - and I noticed:
- slower to prototype simple things compared to node CJS
- equal or faster when compared to TypeScript
- faster to get complex things right
Also, if one decides to publish a dual ESM + CJS package, then depending on the user's tooling one might encounter the Dual Package Hazard [1], which can be really hard to trace.
The problem is that you can import CommonJS modules from ESM but not the other way around. For stepci (https://stepci.com) we have chosen not to support ESM for this very reason. We want the library to “just work” for all our users.
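A sketch of the asymmetry, with hypothetical package names (and noting that on the Node versions current as of this thread, require() of an ESM-only package throws):

```js
// esm-consumer.mjs — importing CJS from ESM just works:
import cjsLib from "cjs-only-package";
console.log(cjsLib);
```

```js
// cjs-consumer.cjs — requiring an ESM-only package throws ERR_REQUIRE_ESM:
// const esmLib = require("esm-only-package");
// the only escape hatch is an async dynamic import:
import("esm-only-package").then((esmLib) => console.log(esmLib));
```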
Maybe an aversion to bundlers exists, but I think tsup and Vite are great and simple. We have Turbopack on the horizon too.
For small projects I don’t even need a configuration file for tsup to target both formats. For larger more complex projects vite configs tend to be under 50 LoC.
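For reference, a small Vite library-mode config that covers the dual-format case (entry path assumed):

```ts
// vite.config.ts
import { defineConfig } from "vite";

export default defineConfig({
  build: {
    lib: {
      entry: "src/index.ts",
      formats: ["es", "cjs"], // emit both ESM and CommonJS builds
      fileName: "index",
    },
  },
});
```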
Perhaps I’m missing something, but when I’ve done this in the past I get weird package resolution because my index.js isn’t in the root (`npm publish` publishes the root and doesn’t support alternate paths, to my knowledge) - in the end people have to `import x from "mylib/dist"`.
I got round this with some funky step where I copy the package.json into the dist folder and rewrite some paths. Is there a better way to do this?
You can use subpath exports as paulddraper suggested, or point the `main` key in your `package.json` at whatever file should be loaded on `import X from 'Y'`.
It doesn't have to be `index.js` in the root of your package; that's just the fallback when no `main` or subpath exports exist in `package.json`.
It should work as you expect if you set “main” to “dist/index.js” in your package.json. If you tried that and it still failed, something else strange must have been going on. Your workaround sounds fine, but ideally you shouldn’t need it, yeah.
You missed a step in this article’s package.json. The “main” (or “module” for ESM) property defines the entry point. I’ve published dozens of packages and they all point to dist/index.js with zero issues.
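i.e., something like this (paths assumed):

```jsonc
// package.json (excerpt)
{
  "main": "dist/index.js",
  "types": "dist/index.d.ts",
  "files": ["dist"]
}
```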
> Will it work with ES Modules (ESM) and CommonJS (CJS) style imports?
It's so frustrating that in 2023 this is still a concern. It's probably a bad idea to adopt the latest JS language standards before they are implemented in the runtime environment. Transpiling sucks.
The problem is there isn't one runtime env, there are many.
On top of that, being implemented in at least two different environments is a prerequisite to get finalized as a standard per the tc39 process.
I agree with you when it comes to things that aren't far along - the decorator goat rodeo is a good example - but on the whole, transpiling is essentially a part of the process now.
Maybe not related, but if an npm package has conditional exports, what's the current best way to patch its package.json? I want to replace its conditional exports with my own.
If you use Yarn, there’s the `yarn patch` command [1], which lets you maintain patches for your dependencies. Even though I try to upstream patches wherever possible, sometimes you just want to apply a quick patch and move on, especially if the dependency is poorly maintained or even worse, deeply nested in your dependency hierarchy. I use `yarn patch` regularly, it’s one of the main reasons why I moved to Yarn in the first place.
If you’re not using Yarn, there seems to be a similar thing on npm, `patch-package`. [2] I never had to use that though.
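In case it helps, the patch-package flow is similarly simple (package name hypothetical):

```sh
# 1. edit the dependency directly in node_modules/some-dep
# 2. record the diff as a patch file under patches/
npx patch-package some-dep
# 3. re-apply on every install via a postinstall script:
#    "scripts": { "postinstall": "patch-package" }
```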
I'm developing an npm package with just plain JavaScript modules. No more JS fatigue. I won't ever go back to TypeScript and JS compilers; modern JavaScript is capable enough to stop all this madness.