Hacker News
TypeScript NPM Packages Done Right (liblab.com)
102 points by chatgidipi on Sept 24, 2023 | 78 comments



Also, the default target configuration is "es2016," and modern browsers only support up to "es2015."

I had to recheck this was indeed an article from 2023, because this part surprised me greatly. To my understanding, es2016 only added Array.prototype.includes, the exponentiation operator (**), and preventing generator functions from being constructed. Even the slowest adopters already had these in 2017. Is the author assuming IE11 support?
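For context, the entirety of ES2016's user-visible additions fits in two lines:

```typescript
// ES2016 added exactly two user-visible features:
console.log([1, 2, 3].includes(2)); // Array.prototype.includes -> true
console.log(2 ** 10);               // exponentiation operator -> 1024
```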


No idea. Matt Pocock has a much better cheat sheet for tsconfig configurations and his is set as es2022.

https://twitter.com/mattpocockuk/status/1701619240686485799



This cheatsheet would have you publish dist/*.{js,d.ts}. Presumably you would use "files":["dist"] in package.json to exclude sources from being published.

The OP recommends additionally packaging src/*.ts along with source maps and declaration maps.
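Concretely (paths hypothetical), the OP's variant would amount to something like this in package.json, with sourceMap and declarationMap enabled in tsconfig:

```json
{
  "files": ["dist", "src"],
  "main": "./dist/index.js",
  "types": "./dist/index.d.ts"
}
```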

What's actually the best practice here?


I poke around in node_modules with some regularity (often in combination with the debugger) and it’s always nice to find actual source files, not just source maps.


Browser support for ES2015 is about 95% [0], while most ES2022 features sit at around 90% [1]. You can find the individual benchmarks on compat-table [2].

Some, but not all of these, are transpilable/polyfillable (refer to compat-table).

Edit: Those numbers are weighted by global usage.

[0] https://caniuse.com/es6

[1] https://caniuse.com/?feats=mdn-javascript_builtins_array_at,...

[2] https://kangax.github.io/compat-table/es2016plus/


When we talk about "modern browsers", we're not talking about usage percentage, we're talking about browsers released in the past few years. If you need to support browsers older than that, that's great, but we don't call those modern browsers, that's legacy support.

Most modern browsers are evergreen, so targeting es2022 would be fine for most users. The exception is Safari, which is slower to incorporate new features and doesn't roll out its updates nearly as quickly as Chromium-based browsers or Firefox do, but even Safari has had es2016 support since 2016.


Sure, but the reason why browser compatibility is being measured is to quantify that notion of "modern".

You can then use that to make your own decisions; if you're working on a high-tech video streaming platform for kids you can probably safely ignore the 5%, but if you're providing information on behalf of the government you might need to support much more than that.


Also Safari greatly picked up pace a few years ago and honestly it’s hard to categorize them as behind anymore. They lead in several areas, and have caught up in most.


I prefer to leave lib empty, it’s always caused me problems.


Yeah, I stopped reading after that. A browser that only "supports up to es2015" is not a "modern browser".


Yes, even ES2019 features have been supported by most mainstream browsers since late 2018, and by all of them by 2020.


Is there a reason why npm packages should not target the highest possible target that an LTS Node.js supports? On the front-end those that need lower targets will run their own transpiling anyway to whatever target they want. I'm about to release a few simple npm packages and I'm considering ES2017 as the target (since it has the last major feature - async / await). I'm guessing it's just that the end-user of the library would need to be aware of the polyfills that need to be supplied. I have asked a question [0] about it a while back but it did not get any responses.

To clarify - the point is, if you have a project that can target ES2017, why would you want to use a package written in ES2020 that is transpiled down to, e.g., ES2016, with non-native async / await.
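If you did settle on ES2017, the relevant tsconfig fragment is minimal:

```json
{
  "compilerOptions": {
    "target": "ES2017"
  }
}
```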

[0] https://stackoverflow.com/questions/76558023/whts-the-ideal-...


Some people write ES2015 code without any ES2016+ transpilers or polyfills (for both good and bad reasons). They might also be using different transpilers with varying settings (such as TypeScript's downlevelIteration, which is off by default), and your package would then behave differently. And many setups don't transpile anything inside node_modules.


Those are some good points, thanks.


`tsc` is extremely slow to transpile TypeScript sources into JavaScript. It's invaluable for typechecking, but for transpilation there are much better solutions like SWC or tsup [1] (which nicely wraps esbuild and Rollup).

[1] https://tsup.egoist.dev/


> much better solutions like SWC

Which doesn't support const enum correctly and god knows what else without erroring.


Esbuild works fine and is also much faster than TSC.

Also TS enums are flawed: https://blog.logrocket.com/why-typescript-enums-suck/


Not that I care, but from your article:

> Enums in TypeScript are a very useful addition to the JavaScript language when used properly.

So, just like any language feature.


> enums flawed

I like TypeScript enums generally; my only complaint is that they are kept in the final build as big objects, especially in the scenario where you don't enumerate them. I use a replace function with terser to replace them with inlined constants to make my final build smaller.


That's what `const enum` is for!
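A quick sketch of what `const enum` buys you (names hypothetical):

```typescript
// `const enum` members are inlined by tsc at each use site,
// so no enum object survives into the emitted JavaScript.
const enum LogLevel {
  Debug = 0,
  Info = 1,
}
const level = LogLevel.Info; // compiles down to the literal 1
console.log(level);
```

The flip side: `const enum` is rejected under isolatedModules, which is exactly why single-file transpilers like SWC and esbuild have trouble with it.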


> `tsc` is extremely slow to transpile TypeScript sources into JavaScript

That’s because its main job is not transpiling to JS, but as a compiler it actually checks the correctness of the code, which is probably the main reason typescript exists in the first place.

You can actually run tsc without emitting JS at all in particular configurations.
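For example, a type-check-only setup is just `tsc --noEmit`, or equivalently in tsconfig.json:

```json
{
  "compilerOptions": {
    "noEmit": true
  }
}
```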


I know this is a "me" thing, but I've come to the point of view that making bits in a different order (eg: classic "compilation" of one language into another) is a very useful side effect of compilers. My main love of them now is as a complete suite of tests with "100%" (some will argue this) code coverage of types, in the languages where that matters. You called functions with what you thought you did, you're getting back what you think you are, you're doing operations on things that actually support them, etc.


Why do you need it to be fast?


To go from .ts -> .js -> execution more quickly. Am I misunderstanding something?


Is this a troll question or do you test and publish your code only every Friday?


I'm wondering if there's a tool that converts TypeScript into JavaScript that has JsDoc type annotations and preserves API docs? Then you wouldn't need to include anything else. Let the application minify if they choose.


https://github.com/angular/tsickle kind of does that. Though it's kind of deprecated and more specifically aimed at the Closure Compiler.


Does the typescript language server even look at jsdoc comments in libraries?


Yes it does


GPT 4 is great at converting TypeScript to JSDoc


As a one-time conversion, maybe, but it would be expensive and unreliable as part of a toolchain.


While the article is technically correct, it avoids the most common issue with 'pure' TypeScript libraries: you still need a bundler if you have multiple .ts files. Once you enter that territory you realize how much more complex everything becomes.


You don't necessarily need a bundler. You could use index.ts files to re-export all the exports of your source files.
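As a sketch of the layout (module and export names hypothetical), the barrel file just re-exports the public surface:

```typescript
// src/index.ts - a "barrel" that re-exports the package's public API
export { createClient } from "./client";
export { parseConfig } from "./config";
export type { ClientOptions } from "./client";
```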


I don't see how that changes the problem that OP describes.

TS module resolution looks almost like ESM, but neither the syntax nor the semantics are the same.

And of course you need a compiler to transpile TS, even if you wish to ignore the typings (often wrapped in a bundler like Vite, which in turn wraps SWC or esbuild... because tsc is not as practical for large projects).

Really, having dealt with this kind of problem on Friday, it can make you go crazy.

E.g. having to deal with CJS-specific settings for some tool in a project using TS and exporting to ESM JS...


You don't need a bundler with multiple .ts files any more than with multiple .js files - which is that you don't "need" one at all.

Libraries especially don't and should be published to npm unbundled. Bundling is purely an application concern.


So consumers are forced to use a bundler/ts just because some random dependency decided they're too lazy to deal with a bundler?

> Bundling is purely an application concern

by this logic all compiled languages' repos should just be source, no precompiled binaries.


Yes. Always include source and let the user/client decide. Including a bundled version is OK, but it should only be there for convenience.


so explain to me again why I have to change my entire toolchain for simply consuming a library? What if I consume a wasm library written in Rust, suddenly I have to add rust into my build script? What is the point of bytecode if people are going to repeat the same build over and over again anyway?


It's unfortunate that JavaScript (and TypeScript) is not bytecode.


It's the same in that js is the standard runtime, and typescript is something that compiles to it, that's NOT used in all codebases



TIL that types should be listed first in the export map thanks to that tool. I don't think it bothered anyone consuming my packages, but better be spec compliant and update them all. Thanks for sharing!

https://publint.dev/rules#exports_types_should_be_first
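The rule boils down to ordering the conditions so "types" comes before "import"/"require"/"default", since resolvers pick the first matching condition:

```json
{
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    }
  }
}
```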


Also attw [1]. I run both tools in CI to ensure package correctness.

[1] https://github.com/arethetypeswrong/arethetypeswrong.github....


"module": "commonjs",

Nope.


Anecdotally, ESM has been an absolute nightmare. Packages that aren’t compatible without code changes, having to add extensions to every single import unless we specify a flag (that is now deprecated), challenges with import maps and module resolution, the list goes on.

So we just use CommonJS with project references and call it a day. Everything works and we don’t need to change any of our code. The way ESM was introduced into Node has been the dumbest decision. And they’re doubling down on it. Excited for Bun to be compatible with my project so that I never have to deal with any of this again.


We recently switched to ESM and it wasn't that painful. OK, you have to put .js on imports, but it's quite easy to search-and-replace that with a regex.
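The search-and-replace can be a single regex pass over each source file - a rough sketch that ignores some edge cases (e.g. directory imports):

```typescript
// Append ".js" to extensionless relative import specifiers.
// Rough sketch: leaves imports that already end in ".js" alone,
// but does not handle directory imports or other extensions.
function addJsExtensions(source: string): string {
  return source.replace(
    /(from\s+["'])(\.\.?\/[^"']+?)(["'])/g,
    (_m, pre, spec, post) =>
      spec.endsWith(".js") ? `${pre}${spec}${post}` : `${pre}${spec}.js${post}`,
  );
}

console.log(addJsExtensions(`import { x } from "./utils";`));
// import { x } from "./utils.js";
```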

Any modules that didn't work (e.g. mocha) were switched over to more modern equivalents (vitest). This is probably a good thing.

We also export ESM and CommonJS source in our packages by first transpiling to ESM then using vite to convert to CommonJS. It works really well and allows us to use our packages in both environments.


I've been doing Node for 10+ years and this was the Python 2-vs-3 moment for me.

I started using Rust for quick and dirty programs, web services, small frontend apps - and I noticed:

- slower to prototype simple things compared to Node CJS

- equal or faster when compared to TypeScript

- faster to get complex things right


Also, if one decides to publish a dual ESM + CJS package, then depending on the user's tooling, one might encounter the Dual Package Hazard [1], which can be really hard to trace.

[1] https://nodejs.org/api/packages.html#dual-commonjses-module-...


I understand the whole CommonJS Vs imports situation.

I have been that vocal minority that keeps complaining about CommonJS support to library authors. But I have given it up for the greater good.

We have a standard now; CommonJS needs to go. Why keep pushing non-standard stuff in a 2023 article?


Yes, when working with Angular, you'll see build warnings about CommonJS dependencies.

> ▲ Module 'some-library' used by 'src/app/some.component.ts' is not ESM

> CommonJS or AMD dependencies can cause optimization bailouts.

> For more information see: https://angular.io/guide/build#configuring-commonjs-dependen...


The problem is that you can import CommonJS modules in ESM but not the other way around. For stepci (https://stepci.com) we have chosen not to support ESM for this very reason. We want the library to "just work" for all our users.


You cannot import CommonJS from standard JS modules in an environment that doesn't support CommonJS, like... all browsers.

Everything should be published as standard modules at this point.


If you're using TypeScript, you have a build step anyway, so you can easily use ESM.


Remember we’re using Node. I don’t think the TypeScript compiler supports importing ESM into CommonJS files.

Basically we will have to move the library to ESM and then use a bundler like esbuild to compile it to CJS as well for compatibility. It’s a mess
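One escape hatch worth knowing: a CommonJS file can't require() an ES module (ERR_REQUIRE_ESM), but it can load one with dynamic import(), since that resolves asynchronously. A sketch, using node:path merely as a stand-in for an ESM-only dependency:

```typescript
// Even inside a CommonJS module, dynamic import() can load an ES module,
// because it returns a promise instead of resolving synchronously.
async function loadEsmDependency() {
  // "node:path" stands in for an ESM-only package here
  const path = await import("node:path");
  return path.join("a", "b");
}

loadEsmDependency().then((p) => console.log(p)); // "a/b" on POSIX
```

Note that if tsc is configured with "module": "commonjs", it rewrites dynamic import() into a require() wrapper, so this trick only helps when the emitted import() survives compilation.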


Try using the tsx package for running Node in situations like this. It magically “just works”

https://www.npmjs.com/package/tsx


> It’s a mess

I think everyone agrees about this.

What people can’t agree about is how to fix it.


I haven’t really struggled too much here.

Maybe an aversion to bundlers exists, but I think tsup and vite are great and simple. We have turbopack on the horizon too.

For small projects I don’t even need a configuration file for tsup to target both formats. For larger more complex projects vite configs tend to be under 50 LoC.


Exactly.


Perhaps I’m missing something, but when I’ve done this in the past, I get weird package resolution because my index.js isn’t in the root (`npm publish` publishes the root and doesn’t support alternate paths to my knowledge) - in the end people have to `import x from “mylib/dist”`.

I got round this with some funky step where I copy the package.json into the dist folder and rewrite some paths. Is there a better way to do this?


You can specify those exports in package.json.

  {
    "exports": "./dist/index.js"
  }
or

  {
    "exports": {
      ".": "./dist/index.js",
      "./*": "./dist/*.js"
    }
  }
https://nodejs.org/api/packages.html#subpath-exports

Many packages do this.


EDIT: main and module properties are precursors to exports. exports is the recommended replacement.
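A common layout, for reference: keep main/module for older resolvers and let exports take precedence in modern ones (paths hypothetical):

```json
{
  "main": "./dist/index.cjs",
  "module": "./dist/index.mjs",
  "types": "./dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    }
  }
}
```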


Note that Jest before v27, and some other runtimes, don't support export maps.


Jest and every other major runtime/bundler has supported exports for a couple years.

I.e. if you're releasing a package today, you probably don't have to worry about support.


A package I help maintain was released about a month ago, and we've had bug reports related to this: https://github.com/openai/openai-node/issues/304#issuecommen...

We simply chose to mark Jest 27 as unsupported in this case, but it did cost some debugging time.


You can use subpath exports like paulddraper suggested or in your `main` key in `package.json` you just point to whatever file you want to be served when `import X from 'Y'`.

It doesn't have to be `index.js` in the root of your package. That's the fallback if no `main` or subpath exports exist in `package.json`


It should work as you expect if you set “main” to “dist/index.js” in your package.json. If you tried that, some other strange thing must have been going wrong. Your workaround sounds fine but ideally you shouldn’t need it, yeah.


You missed a step in this article’s package.json. The “main” (or “module” for ESM) property defines the entry point. I’ve published dozens of packages and they all point to dist/index.js with zero issues.


Yup, we have a script which does exactly the same, copies to the main folder and then pushes.


I’ve experienced this as well.

For some projects I just configure publishConfig.directory directly at the dist folder.

While this may not look as clean, it is a lot easier for me to manage everything that’s published under a single directory.
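For reference, that setting looks like this (note it's honored by some package managers, e.g. pnpm, rather than by plain `npm publish`):

```json
{
  "publishConfig": {
    "directory": "dist"
  }
}
```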


Also an interesting read on how to publish a beta version: https://kevinkreuzer.medium.com/publishing-a-beta-or-alpha-v...

So that you can test without doing version increments that aren't required.


> Will it work with ES Modules (ESM) and CommonJS (CJS) style imports?

It's so frustrating that in 2023 this is still a concern. It's probably a bad idea to adopt the latest JS language standards before they are implemented in the runtime environment. Transpiling sucks.


The problem is there isn't one runtime env, there are many.

On top of that, being implemented in at least two different environments is a prerequisite to get finalized as a standard per the tc39 process.

I agree with you when it comes to things that aren't far along- the decorator goat rodeo is a good example, but on the whole transpiling is essentially a part of the process now.


Maybe not related, but if an npm package has conditional exports, what's the current best way to patch that package.json? I want to replace its conditional exports with mine.


If you use Yarn, there’s the `yarn patch` command [1], which lets you maintain patches for your dependencies. Even though I try to upstream patches wherever possible, sometimes you just want to apply a quick patch and move on, especially if the dependency is poorly maintained or even worse, deeply nested in your dependency hierarchy. I use `yarn patch` regularly, it’s one of the main reasons why I moved to Yarn in the first place.

If you’re not using Yarn, there seems to be a similar thing on npm, `patch-package`. [2] I never had to use that though.

[1]: https://yarnpkg.com/cli/patch

[2]: https://www.npmjs.com/package/patch-package


I still use Babel to do the transpiling, because tsc doesn’t support plugins, such as babel-plugin-styled-components.


I'm developing an npm package with just plain JavaScript modules. No more JS fatigue. I won't ever go back to TypeScript and JS compilers; modern JavaScript is capable enough to stop all this madness.


Then at least use JSDoc. Type definitions greatly help the users of your package.



