The Node ecosystem still has tooling problems (maxleiter.com)
84 points by MaxLeiter on July 30, 2022 | hide | past | favorite | 120 comments



This is really short. I think my complaints, compiled into an article, would probably fill one five or ten times as long.

There’s just so much shit. Everyone made their own system. There are .js, .ts, .mjs, and .cjs extensions. There’s tsc and jest and npx and create-react-app. These are just the tip of the iceberg. If I try to update the build system to use newer versions of the software, all sorts of stuff will break. The really shocking part, for me, is that I feel like it’s easier to set up my build system for C than for JS.


Yes, there is a lot of shit, but stick to ES2015+ and you'll be mostly fine. No reason to continue with the .cjs/.mjs crap. Either set "type": "module" or use Deno (the superior option) and don't look back. If others still want to use CommonJS, that's their problem. Yeah, a lot of modules won't work out of the box because they're CommonJS, and vice versa, but screw CommonJS. That, and goofy junkware like Gulp and Grunt, which are overcomplicated, change their APIs frequently, and don't really do anything better than a Makefile. Ok, Gulp might be helpful if you're running Windows, but I pity those who still do web development on Windows without WSL2.
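For reference, opting in is a one-line change in package.json (a minimal sketch; the package name is a placeholder):

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "type": "module",
  "main": "index.js"
}
```

With "type": "module" set, Node treats .js files in the package as ES modules, so `import`/`export` work natively without .mjs extensions.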


Gulp and Grunt? Are you a time traveler from 2012?



That is freaking bananas. Thanks for sharing.


Gulp is practical for inserting extra behavior if you have a multi-step build process and are allergic to Webpack. Just write a gulpfile and spawn whatever needs spawning: it's like glorified bash that's already in JS, so you don't need to manage more languages.


I worked on a project using Gulp and Grunt as recently as 2020.

This was a product the company I was working for inherited from a merger. They bought the company for this product.

It was a nightmare of bad software practices. Still using Gulp in 2020 was only the tip of the iceberg.


At least (I hope) you weren't still using Bower.


Look at the source code of the web.dev site :-) It's Eleventy with gulp for asset management.

https://github.com/GoogleChrome/web.dev


Both still work fine to this day. I don't know what the exact (dis)advantages of the competing toolsets are, but if you know Gulp and Grunt then I don't see why you wouldn't use them.


Was helping an org migrate off of CloudFoundry and Grunt gave us no end of trouble. In the end, we wound up building with Node 8(!) and then praying that everything would still work with a 12.x runtime.


Unless one is in need of piping files (and streams) in a complex and prolific way, npm scripts or a makefile can supplant Gulp/Grunt with much less overhead.
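As a sketch of what that looks like, a few npm scripts in package.json cover most common Gulp/Grunt jobs (the script names, paths, and the use of esbuild and eslint here are illustrative, and assume those tools are installed):

```json
{
  "scripts": {
    "clean": "rm -rf dist",
    "lint": "eslint src",
    "build": "esbuild src/index.js --bundle --minify --outfile=dist/app.js"
  }
}
```

Each runs with `npm run <name>`, with no task-runner framework in between.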


Yeah Grunt is pretty ancient but Gulp is still used in many places.


Gulp is really useful when your workflow falls outside what webpack, esbuild & co. expect.


2012 === bad. 2022 === awesome!


Problem is that a ton of random stuff breaks when you "type": "module". Unexpected stuff, like Prettier.


Oh no, whatever will the world do without Prettier turning nicely formatted JS into a pile of unreadable shit because line 218 was 71 characters long instead of 69.


I recently had a PR where Prettier added two extra whitespace characters and the linter then complained about them. And these were in the same build process. The resolution was for me to change my code…


I reflected recently that, for years, I've carried a previously unverbalised and deep-seated expectation that JavaScript Will (Eventually) Be Replaced.

The language contains so many lurking horrors, accreted over the years, that it might even seem irrational to assume otherwise.

I suspect this sense of ephemerality then pervades the entire extended ecosystem, including all JS derivatives, reducing everyone's time horizon to maybe the next eighteen months at best. Thus the roulette wheel of broken tools keeps spinning.


I think these arguments often miss the “for who” both in the producer and consumer sense.

If you have a team or teams of engineers, having one or two people deal with all the dev-ops stuff makes a lot of problems go away.

But if you have a large C++ project for example, you probably need more than a couple people focusing on build and target dev-ops work, or at least a lot more of their time.

It’s a lot easier to get a frontend person to help out with the backend dev-ops than it is to hire a new C++ guru and wait a bunch of time to catch up on the nuances of your build systems/org/whatever.

Anecdotally, I see the people who balk at Node are usually enterprise Java devs, backend Python devs, or junior Go/Rust enthusiasts.

At the end of the day, all these languages can interoperate with all the others in a bunch of different ways, and an organization’s ability to engineer and maintain systems is mostly orthogonal to its choices of “driver” technologies.


Anecdotally, I've met a ton of people who balk at Node for backend stuff. I've been at two companies where it was actually prohibited.

> At the end of the day, all these languages can interoperate with all the others in a bunch of different ways, and an organization’s ability to engineer and maintain systems is mostly orthogonal to its choices of “driver” technologies.

I think people say that in an interest to keep the peace, but the assertion doesn't stand up to close scrutiny. Poor technological choices kill companies, or at the very least, make them perform more poorly. Choosing the right technology is a core skill for technology companies.

The problem is that each company is a goddamn unique snowflake. Company X has a bunch of front-end developers and spends most of their money on headcount. Company Y has a bunch of back-end developers and spends most of their money on infrastructure. The "best" language for a project could be R, Python, Go, Rust, C++, TypeScript, C#, Java, or something else. Picking the wrong language can sometimes be outright disastrous compared to picking a good language. But most of the time, there's no clear "best" language.

Even though there's no clear "best" language, these people balking at Node may have a point.


This isn’t merely about tools. They’re one piece of a much larger puzzle. The fragmented chaos of the Node ecosystem arising from that aforementioned ephemerality ensures that eighteen months is roughly the duration before bitrot of dependencies is corrosive enough to have folks talking about rewrites. In four decades driving keyboards in various capacities, I’ve never seen any language experience dependency rot in production apps so rapidly as JavaScript and its offspring.

Aside: any shop operating under the assumption that devops is a function you assign to someone has already failed devops 101.


I genuinely don’t believe you’re coming at this from a business stakeholder POV, which is fair for HN. But if you have to advocate for a devops org outside of some massive global scale or as a small % of revenue, you’re doing something wrong.

Unfortunately a proven playbook for struggling devops teams is to just fire all the devops and infra folks, which seems to help with platform stability, recruiting, and velocity year over year.


Your opening assumption is false. I start and run businesses.

Any reference to “devops team” is also automatically failing at devops. It’s not a job, a task, or a team. I don’t spare much time for folks encumbered by a silo mindset. It’s the antithesis of a service/product-team approach, and (to the actual point) does nothing to oppose the myopia I’m accusing the JS ecosystem of.


I do almost exclusively backend and DX development on Node and it's a joy. None of the frontend world's problems.

There is value in an ecosystem that lets you do it "your way," rather than enforcing convention. It's not for everybody though.


Yeah. We have both Scala and Node (TypeScript) backend services and honestly the latter are much nicer to work on.


Theoretically, if you just used a runtime that didn't depend on code being transpiled and transformed a million times over (e.g. Deno), you could just run your damn code. It would even work in the browser, if you used `import`/`export` syntax instead of `require`/`module.exports`.

I don't buy the argument that the existence of separate binaries for the TypeScript compiler, test runner, packaging system, etc. is somehow unbearably confusing; I would almost prefer that over certain other languages where one magic command does it all. Although if you like that, you could try `bun`.


I could write an article solely about npm and just the swear words would take up more space than this article.


It’s beyond repair. The future is Rust, Go, and Deno.


The Node ecosystem doesn't have a tooling problem. Frontend development has had a tooling problem since the introduction of React, Angular, and TypeScript. One might argue it goes back as far as CoffeeScript. But can we stop boosting these "OMG this was too hard" posts?

TypeScript (tsc) should display a hideable message on invocation that states: "TypeScript isn't straightforward. It's a vast ecosystem all its own that accommodates several million developers' individual needs. It can be overwhelming. You don't have to use it."

I won't opine on the dumpster fire that is ESM support in Node - that's a completely separate topic and head-through-wall session.


What's the trouble with TypeScript, exactly? The setup is relatively straightforward (at least, for JavaScript tooling). Copy/paste a file or two and you're off to the races.

In comparison any frontend framework is much more difficult to learn. Typescript itself is just Javascript with types and the fancy parts of the type system are rarely ever required.


I'm in agreement with you. But I'm also an old salt who's been using it for nearly 8 years. I completely understand how someone unfamiliar with the language and its nuances may be overwhelmed by the configuration options and the effects they have on compilation. The type system does take time to learn, especially for those who didn't come up on a strongly (or hell, even loosely) typed language.

I also think there's enough resources out there for someone, who isn't in a hurry, to grok TS enough to put together a project that will compile. But skills and comprehension aren't uniform - I've seen many a junior fresh out of bootcamp look at a tsconfig.json file like it was voodoo. That's why I think it's fair to say "Hey, you don't have to use this."
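For what it's worth, a tsconfig.json that produces a compiling project doesn't need much. A minimal sketch (directory names are illustrative):

```json
{
  "compilerOptions": {
    "target": "ES2019",
    "module": "commonjs",
    "strict": true,
    "outDir": "dist"
  },
  "include": ["src"]
}
```

Everything beyond this is refinement; the dozens of other compilerOptions can be left at their defaults until a specific need arises.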


One big advantage of TypeScript with respect to the configuration options is that you can look at the output JS and see what the options are producing (with a low barrier to entry: the output is easily readable JS, aside from some of the older targets plus the async keyword).

The type system is super flexible and I agree it takes time to learn, but you can get a lot of value out of the basics from day one.

I've not actually tried this, but I suspect that even with the most lenient configuration and no type annotations you'd still get value over plain JS through type inference and the flagging of obvious bugs.
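A rough illustration of that: even a fully unannotated file gets inferred types, so obvious misuse is flagged at compile time (the variable names here are made up).

```typescript
// No annotations anywhere: the compiler infers everything.
const answer = 42;                 // inferred as number
const parts = "a,b,c".split(","); // inferred as string[]

// Uncommenting either line is a compile error, caught before runtime:
// answer.toUpperCase();  // 'toUpperCase' does not exist on type 'number'
// parts.puhs("d");       // typo flagged: no such method on string[]

console.log(typeof answer, parts.length);
```

Plain JS would happily ship both commented-out bugs to the user's browser.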


> many a junior fresh out of bootcamp look at a tsconfig.json file like it was voodoo.

It's their job to learn the technology of the day, and it's your job to teach them... Everything is voodoo when you're fresh.


As someone who loves TypeScript, sourcemaps are the problem. It’s extremely hard to get them everywhere. I can get them in about 90% of places, but inevitably there are a few spots where they don’t work.


Typescript + Node + ESM is definitely a problem that's not _completely_ unrelated. I do agree that Node setups are usually a lot easier to work with than frontend, though.


I have lost whole consecutive weeks to WebPack. It made me wish I could go back to something wholesome and healthy, like template metaprogramming in C++.

"I'm making progress, boss. The error output from the build is smaller than the input source code now."


Rant: could we please stop referring to a collection of <X> as an ecosystem? Node has a collection or environment of tools, frameworks and libraries to support it. An ecosystem is something living, a biological system, typically with a set of symbiotic properties within it.

The trend is the same in business. Stripe no longer has business partners; they have an “ecosystem.” The same with banks that provide wholesale banking services through a set of retail businesses - they are no longer wholesale banks but “ecosystem players”. This kind of jargon really dilutes the meaning of words and makes it hard to understand what is really meant.

Rant over, sorry.

Apart from my rant, it’s funny to see Node and JavaScript doing what almost killed Perl 20 years ago. There are so many ways of doing things, so many bespoke frameworks and packages (doing OO in Perl, anyone?), so many odd dialects that it fragmented the language and people got frustrated. The same is happening to JS right now, although with some important differences. There is hardly a fragmentation of usage via formal “dialects” in JS; rather, the problem is in the sprawl of libraries and “infrastructure” services, many of which are solved problems (e.g. Gulp reinventing what Make is really good at; Yarn reinventing what npm is supposed to do, etc.). The supposed “ecosystem” is fragmenting day by day from its inherent complexity, and it’s exactly what caused web developers to flee from Perl to simpler, more consistent alternatives.


An ecosystem is exactly what it is though, a rich and messy landscape of individuals and groups competing for habitat and resources with finite inputs. Like a biological system, there's a lot of similar forms with subtle differences trying to find or invent a niche, with many constantly evolving or going extinct. I mean literally you have individual organisms (programmers and projects or companies) competing for fitness, with a finite input of dollars and time, striving to reproduce and gain widespread dominance. If that's not an ecosystem, I don't know what is.

Perl was the first language I seriously learned. It wasn't really an ecosystem. More like moss slowly growing on an old tree. It was so niche as to be endemic.

In contrast, Javascript is a bustling ecosystem. It has a genome that is constantly evolving, extremely rapidly, not unlike how pop and rock music have evolved through the decades. It's very much a living, breathing system, and a fascinating case study of group dynamics and organisms trying to optimize for some local maximum while still being a part of a bigger system in the abstract.

Organic life is beautiful and messy because there was no predefined "best answer" and it's all just information vs information, algorithms and processors trying to find their place in a vast, cold world, scavenging what little energy the sun happens to shine their way. Javascript really isn't so different. It's utterly chaotic BECAUSE it's alive.


I’m doing Perl for a living and I don’t see it. Perl died because it sucks as a programming language. It’s good as a scripting language but that’s it. Perl was created before some widespread modern concepts existed, so the terminology and implementation are all wrong. Also, tons of Perl libraries go way too far and implement new keywords (well, not really, but it’s just like it) and modify the structure of the language. Error handling sucks, debugging sucks, tooling sucks, CPAN sucks both as a tool and as a package repo, even logging with Carp sucks; wtf kind of level is "croak" or "confess"? Also, each minor version of Perl breaks everything: with JSON::encode we had issues with scalars not properly encoded as numbers between two versions.

Perl is a dumpster fire for real programming, that’s why it died.

I also do Typescript for a living, and on the contrary, the ecosystem (yes) keeps improving. I use Parcel 2 on my projects and it’s amazing. I’ll try Vite someday but it’s not a priority because Parcel works.

For your other examples, well yarn is less relevant today but it helped drastically improve npm.

As for Make, do you really think it’s a perfect solution? I use it, but its syntax is opaque, it doesn’t support env files easily, and it’s not even really cross-platform because of GNU make vs. BSD make. It’s dishonest to think there’s no reason to reinvent Make.


It's a perfectly apt metaphor which is clear and useful. Your rant seems unfounded.


An ecosystem isn't alive; it's a collection of independent species that have emergent behaviour.


By that line of thought, is life alive? It's just a collection of interdependent cells and molecules with emergent behavior. Seems like an ecosystem is alive collectively, in the same way that an individual is alive, just on a different scale of organization.


> Node is a problem

Fixed it for you.

The Node ecosystem embodies mediocrity. There is no design, no testing, no stability, no benchmarks, no RFCs, no plan to produce a coherent product. Instead, you're given a half-baked set of features that are abandoned within 8-12 months. Not-invented-here syndrome runs wild, and over-engineering any solution is seen as a mark of accomplishment.

Simple things, like displaying static text, now require 25mb of js files for whatever framework the author is currently playing around with as a hobby. I miss the days of the web where you could just load a page in under 25kb.


> I miss the days of the web where you could just load a page in under 25kb.

Ah, yes, the old web where there was "no design, no testing, no stability, no benchmarks".

Don't get me wrong. I liked the old days when content was just HTML, but when it comes to building applications, the old ways were just terrible. You could push JS files without even checking that it's valid syntax.


Our Node docker images are 600-1000 MB for a relatively simple website (backend and frontend). 25 MB would be so good and lightweight.


Unless I'm massively misunderstanding your setup, that Docker image contains an entire OS and runtime, not just your website. Reasonable people can disagree about whether that entire paradigm is sensible, but having chosen to use Docker, complaining about image size in this way seems an odd choice. More to the point, most of what is in the image will have little to do with Node or its build systems.


The NodeJS alpine base image is only 40 MB.


Then you _npm install_ something, and BOOOM! 600MB !


You can use multi-stage builds [1] so your result image would be as small as possible by just copying the compiled JS from the build stage.
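A sketch of that approach (the stage layout, Node version, and paths are illustrative, not prescriptive):

```dockerfile
# Build stage: install everything and produce the bundle
FROM node:18-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build            # e.g. a bundler emitting dist/

# Runtime stage: production deps and the bundle only
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=build /app/dist ./dist
CMD ["node", "dist/server.js"]
```

The build stage's hundreds of megabytes of devDependencies never reach the final image; only the compiled output and runtime dependencies are copied over.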

[1]: https://docs.docker.com/develop/develop-images/multistage-bu...


You still need to download hundreds of megabytes of packages, unpack them into a gigabyte of storage, and "compile" it all into a final form.

Also, you need to keep those huge images around if you want any reasonable caching. Otherwise builds will take forever.


Compiled JavaScript?


Bundled. With esbuild, for example.


My bad, I meant "bundled".


That sounds like you are bundling the whole node_modules directory. Have you looked into Alpine-based images and vendoring JS dependencies into a bundle? 1 GB seems too large even for Node.js.


Old man yells at clouds?

There's an overabundance of interesting and great software in JS, and yet we fixate on cherry-picked bad examples of things one doesn't like.


Tooling in the JS ecosystem (especially frontend) is very frustrating. It's just layers on top of layers with a weak foundation and a whole set of annoying incompatibilities between them.


The transition to ES6 modules made this much worse. Before that, for an npm package you could basically just call tsc on your entry point and then type `npm publish`. Now, with the ES6 situation, various dependencies and common packages demand you upgrade to ES6, and customers also ask for import syntax so their builds can do “tree shaking”. I maintain an Emscripten-based TypeScript library, and the Emscripten wrapper code is allergic to ES6 imports: everything breaks if you use them, since it removes various Node globals. I wasted hours trying to figure out a workaround, or a fix in Emscripten itself, before giving up.


It’s funny: for the past few months I’ve been looking at industrial-strength build systems like Bazel (Google's language-independent build tool), just to get an idea of how things work in that world.

One of the things happening in that community at the moment was their second major attempt to bring the JavaScript/TypeScript and Node ecosystem onboard as a semi-official language. It has been a four-year effort so far, and they had to make a bunch of compromises along the way, as you can see in the design section of the project's README: https://github.com/aspect-build/rules_js

Watching the walls they would continually run into, where the answer would essentially be “Node is very unlike other language runtimes”, was illuminating, and I suddenly started to understand why projects like Deno want to leave Node behind as a fundamentally unfixable mess and start again with a bit more planning this time around to avoid all of those issues.


'Twas all fine until folks had to come up with useless TypeScript and modules and webpack churn and ES2015+ bloat. It's not like we didn't warn them against turning JavaScript into Java; for some of us, getting away from Java bloat was the entire point of Node.js. Well, that, and because CommonJS was portable, or used to be at least. If you want a typed language, just use one, rather than foobaring JavaScript or Python into something they just aren't: a fool's errand.


Since JavaScript is the only way to interact with the DOM in a web browser, you literally cannot use anything other than JavaScript, which makes your point about using other languages quite hard to follow.


WASM ?


WASM came out way later than TypeScript and Webpack did.

Also, no direct DOM access. You still need a Javascript proxy for that.


> useless typescript

The only reason I started and continue to use JS is because of TypeScript. There is no world in which I'd use pure JS over TS.


Different folks, different strokes, I guess: there's no world in which I'd use an MS-proprietary language over a broadly implemented one, let alone one that changes all the time and has a Turing-complete type system. So I don't need to complain about bloat and churn, which is the whole point of this thread.


It's open source, not proprietary at all. Suit yourself, I don't have time to deal with runtime errors when they could easily be caught at compile time. I don't care which company makes something as long as it's a good technology. This is also why I use Copilot without any of the alarm that some people on HN and elsewhere display about it.


> It's open source, not proprietary at all

"Open source" and "proprietary" are not antonyms. Something can be both open source and proprietary, like XUL. TypeScript is one of them. (So are the NodeJS APIs, for that matter.)


Java’s ecosystem is in far better shape; its build tools actually work, so I really don’t get your point. If anything, the JS ecosystem could learn quite a few things from Java.


The problems start with frameworks that require custom compilers and non-standard module systems.

If you use vanilla JS in standard modules most of this goes away.
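As a rough illustration of how little that setup needs (the file and element names here are made up):

```html
<!-- index.html: no bundler, no transpiler, served as-is -->
<!doctype html>
<div id="root"></div>
<script type="module">
  // Standard ES module scripts run natively in every modern browser.
  document.querySelector("#root").textContent = "Hello from vanilla JS";
</script>
```

The `type="module"` attribute also gives you `import` statements for splitting code across files, with no build step involved.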


Well, then you need to use vanilla JS, which seems to be widely regarded as not great “at scale”. I certainly will never go back to JS from TS. Fearless refactoring is worth slogging through all the JSON config soup.


Fearless refactoring? What about tests?

Sure TS helps, but tests are “better”


Tests and static typing are not mutually exclusive. Best is to have both.


I'm pretty happy with just vanilla JavaScript.

I don't do typescript, I don't use frameworks (unless you consider express to be a framework).

I use vanilla web components on the front end, if I need reactivity and frequent re-renders I might sprinkle in a bit of lit-html, depending on the component.

Debugging and development are brain dead simple, just sticking to the standards.

I had a new developer come into my code base and he said "I couldn't believe it at first, you just write something and it does it... Wow!".

He was so used to piles of frameworks and build systems and stuff, he'd never been in a code base that just... ran.


I just checked my most recent frontend project that uses Typescript as much as possible. Here's how it works:

- src/*.ts: compiled with esbuild in webpack

- tests/*.test.ts: compiled with babel

- jest.config.ts: compiled with ts-node

- webpack.config.ts: compiled with ts-node

In an ideal world, one tool should do everything but I doubt that will ever happen


I'm curious: Why use webpack with esbuild in a "most recent frontend project"?

Not so long ago at $DAYJOB we even migrated a big legacy project from webpack to esbuild in no time.


I've used webpack for several years now and have written custom plugins to accomplish what I need. esbuild-loader gives great TypeScript performance, almost on par with Vite, so I don't see the need to learn a new build system.

Also, there's no official Vue 3 support for esbuild, so that's a problem for some of my projects.
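For context, wiring esbuild into webpack takes only a few lines (a sketch, assuming esbuild-loader is installed; the entry path and target are illustrative):

```javascript
// webpack.config.js (sketch)
module.exports = {
  entry: "./src/index.ts",
  module: {
    rules: [
      {
        test: /\.ts$/,
        loader: "esbuild-loader",
        // esbuild transpiles each file quickly; full type checking
        // stays with tsc, run separately if desired.
        options: { loader: "ts", target: "es2017" },
      },
    ],
  },
  resolve: { extensions: [".ts", ".js"] },
};
```

This keeps the existing webpack plugin ecosystem while delegating the slow TS-to-JS transform to esbuild.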


I use pnpm and SvelteKit and have none of these problems. Everything just works, and it's rare that anything related to tooling takes longer than 1 second to accomplish.

Being on OSX helps a lot with the “just works” part though.


It’s still early days, but I have been working on this…

There are scaffolding tools that help configure all the JavaScript frontend stuff you need, but the problem is you run them once and you can’t ever run them again to change/add stuff.

So I built Confgen which is sort of like create-react-app except it’s idempotent:

https://github.com/erikpukinskis/confgen

It’s very alpha, but I would love to get ideas/ bug reports on GitHub.

It’s also currently Vite-only, but I’m open to a possible webpack/babel mode in the future.


Keep in mind that most of these are just rants about the developer experience, which is sacrificed in exchange for faster iteration and deployment in the majority of business use cases.

As a Node.js developer myself, I've noticed that some of the best solutions are Node.js plus another language.

Examples

uWebSockets.js uses some C++ code. It is faster and has a better API than Express, Fastify, hapi, or Koa, and has built-in WebSocket support.

esbuild uses Go code. It is faster than webpack and Babel for bundling and minifying JavaScript code.


Understandable frustrations, but without a comparison to other major languages in the same domain it's a bit moot.

Is it just the Node ecosystem that has these types of problems?


Node succeeded by “simply” being plain JS on V8 with a really small standard library. Easy! No nonsense! Everyone knows this already!

Unfortunately that left everything else up to the community to solve: establishing a “good” standard library, building, debugging, packaging, bundling, dealing with the Node/Browser duality in a meaningful way, creating usable network frameworks, UI frameworks and surely much more.

And they’ve all ended up reinventing that 100 times (like the curse of Lisp) in 200 different ways, all recursively based on Node, while standing on the shoulders of non-giants trying to solve one of the other problems mentioned above, possibly standing on the shoulders of someone else who has already tried to solve what you are trying to solve now.

Calling it a clusterfuck probably isn’t sufficient, yet at the same time it seems to work, so it’s simultaneously a sort of modern day miracle.

Disclaimer: have set up quite a few node build-pipelines.


What I hate about npm (the online database of packages) is that they never forced a clear distinction between packages meant to be used only in the backend with Node, packages meant to be used only on the frontend inside a browser, and packages that can be used in both. Just having a flag for that would be useful, simple, and would solve a lot of headaches.


The beauty is that those distinctions don't exist. There are no limitations. There are also no guardrails or training wheels.


In what case would an npm package for talking to an external database be actually useful for a frontend project? I mean, yes, if you're only ever running the tool inside a LAN where that access exists, I suppose it's conceivable... And plenty of frontend packages that rely on storing state in the browser's window object (Redux etc.) would presumably be a bad idea to use for most backend projects, but no doubt somebody's found a way to make it work. I suppose I'm not disagreeing with you, but the GP kinda has a point too.


A CouchDB package, perhaps, since that DB supports access via HTTP. Or look at any of the over 1,000 of sindresorhus' packages, a majority of which are meant to be used in both environments. I've seen plenty of backend services using state trees and MobX-style packages.


I don't think that Java would be different if one wanted to replicate all the features the author felt were necessary.

I'll speak for Maven, as it's the most popular build tool and the most straightforward. With Gradle it would be worse.

To build two configurations, you ideally want at least two modules. So multi-module projects. A weak point for Maven. Doable, but with lots of quirks.

Last time I checked, to publish a Java library you would need to GPG sign it. This fact alone is serious stuff. Pushing a library to an internal repository is easier.

Tooling in Maven is terrible. Plugins are archaic; some of them saw their last commits 10 years ago. I don't even know which tools for autoformatting Java code are popular nowadays. I think most people just use IntelliJ IDEA to format before commit, with a shared style or something like that. I think it's good enough.

A GitHub bot to update dependencies and deploying a demo site: I don't know, I never did that, but I can't imagine it being easier with Java either.

So Java tooling has its share of quirks.

The most sane tooling I ever saw was for Go, as long as you're ready to adjust your expectations and requirements to the happy path.

What's unique to Node is the huge number of tooling attempts. Java has Ant, Maven, Gradle. I don't think I've ever heard of anything else. Bazel, maybe. Well, maybe Makefiles for some greybeards. The JavaScript folks just can't stop inventing new tools, and this fragments the ecosystem and user base. I don't know why that is. I've never heard of an alternative Rust build system, for example.


> So multi-module projects. Weak point for maven

But not for gradle.

> Last time I checked, to publish Java library, you would need to GPG sign it. This fact alone is a serious stuff

Why is it a problem? This is partly the reason why the Java ecosystem wasn't hit by all the malicious-package fiascos.

> Github bot to update dependencies and deploying demo site

Never felt a need for this, but contrary to your misrepresentation, even Maven plugins have a healthy ecosystem, and as part of the 3rd-biggest language, I would honestly be surprised if no such plugin existed.


You know that Maven has profiles, which are the tool to use when you need different configurations, no?


In my experience, Node is worse than any of Java, Rust, or Haskell.

I think a couple of the reasons for that are that it's untyped and not compiled. Both of those mean that basic correctness checking ends up happening on the user's machine, rather than the developer's.


> not compiled

Ironic that you say this, since the whole blog post is complaining about setting up a compiler pipeline to target NodeJS.


That fits in perfectly with what I was saying, I just wasn't precise enough.

The issue is that Node.js source code is not compiled to a target language like native code or bytecode. As such, there's no package repository from which you can retrieve compiled packages - you're forced to deal with the same build process as the developer of the package. If that weren't the case, the compiler pipeline that the post is complaining about would be eliminated for package users.

In fact, the author of this package could have avoided some of his pain by publishing a library with the compiled JS code, which would eliminate all the config of higher layers. But the JS ecosystem doesn't support making that distinction, there's no source package repo vs. compiled package repo (afaik - I don't use it more than I have to, because it all sucks.)


Note the npm convention of shipping prebuilt modules as a dist/x.min.js or x.dist.js (usually UMD, as the ES6 world still mostly expects you to use a build tool, though e.g. Vue at one point at least had prebuilt ES6 modules too).
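As a sketch of that convention (the package name and file paths here are hypothetical), a library's package.json can point consumers at prebuilt bundles so they never touch the author's compiler pipeline:

```json
{
  "name": "my-lib",
  "main": "dist/my-lib.cjs.js",
  "module": "dist/my-lib.esm.js",
  "unpkg": "dist/my-lib.min.js",
  "files": ["dist"]
}
```

"main" serves CommonJS consumers, "module" lets bundlers pick the ES module build, and "files" keeps the published tarball to just the compiled output.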


I think you misunderstand... you don't have to worry about building packages at all (excluding some edge cases like native C++ extensions) when you install stuff from npm.


Ironically the existence of NodeJS is somewhat related to Haskell, as Stephen Diehl mentions in [1]:

> As a funny historical quirk, back in 2011 there was an interview with Ryan Dahl, the creator of NodeJS, who mentioned that the perceived difficulty in writing a new IO manager for GHC was a factor in the development of a new language called NodeJS. When asked why he chose Javascript, for the project, he replied:

>> Originally I didn’t. I had several failed private projects doing the same on C, Lua, and Haskell. Haskell is pretty ideal but I’m not smart enough to hack the GHC.

The interview itself is in [2].

[1]: https://www.stephendiehl.com/posts/decade.html [2]: https://www.bizjournals.com/boston/inno/stories/news/2011/01...


It's a fundamental issue with JS and the DOM. It's understandable that there's a need to manipulate elements on the webpage, but the issue is that with JS you can build the webpage programmatically from the ground up, yet you're doing it by essentially concatenating text into HTML/CSS.

Even worse, JS can actually modify itself. You can take defined functions with well-known, common behavior and overwrite them with arbitrary ones.

The equivalent in a low-level language like C would be writing your code as half C, half custom macros that are #defined at random places in the libraries you pulled in and that change core C constructs, and instead of regular C code you'd be concatenating strings of assembly instructions that would then be written to a file and run.

No other language that I know of allows for this.


my big gripe with npm is that global packages overshadow local packages... I can't understand when that would be a good default; it might make sense as an opt-in, but not as the default behavior. I can't think of another package manager that does this.

oh that and installing thousands of packages in the project root instead of a sensible place that could be shared per version, like Bundler and RubyGems do.


The globally installed packages only take precedence over local packages when you run the binary name from the command line. To get that to stop, you'd have to add the local node_modules/.bin to the PATH. Additionally, if you run npx/yarn/pnpx in a package script, it will use the local binary.

> oh that and installing thousands of packages in the project root instead of a sensible place that could be shared per version like bundler and rubygems.

Look into pnpm. It does precisely this in a way that's compatible with node.

Honestly the information is out there to address most people's gripes.


> Honestly the information is out there to address most people's gripes.

Exactly, but not discoverable. I came to know of esbuild, vite, pnpm etc.. only because of reddit and HN.


You discovered it because of reddit and HN. Others have dev.to, SO, Twitter, etc. Most devs today have to be mobile and inquisitive. It's not really a field for the complacent.


You can't do a package cache, because nasal demons erupt whenever you use a package that's not precisely equal and all packages use their own copies of all their dependencies.

Sane package managers don't try to solve for this problem; they make it clear when you're using bad software.


Bundler does this just fine. It installs every single exact dependency in a central location and then sets the relevant paths to them before running Ruby. It works very well. Some of the people behind Bundler went on to write Cargo, Rust's much-loved package manager. I'm not sure where you got the idea that other systems haven't done this better.


That's exactly the behavior that makes npm unsuitable as a package manager. Some resources are, in fact, globals. You should have a consistent enough API to use those through code that is imported once and shared, rather than by importing a dozen versions of the API, one for each dependency you have.


I'd be interested to hear if anyone is using a jsconfig.json and JSDoc comments to do TS typechecking. I found it really easy to set up, and I got a lot out of the extra checks, but the practice never really took off with the team of JS devs.

https://code.visualstudio.com/docs/languages/jsconfig


How you gonna use it on a 3rd party module?


I saw the author was using vite, they have a mode to help you build a library that would mitigate their woes. Check it out!

https://vitejs.dev/guide/build.html#library-mode


I'm using library mode


So what is the "recommended" way to build libraries using TypeScript? I've heard that tsdx https://tsdx.io/ is quite good


I've started using plain old CDN-served React/jQuery/Lodash to do everything, with no build system, and daring my employer to fire me. So far so good.


The node ecosystem is just a giant problem on top of problems.


Node has been superseded by Deno, right? How about that?


I doubt anyone is interested, but I'm building some tooling for Node and npm in Jenkins.

Just standard stuff: nvm, nodenv, and npm/npx wrappers.


I think projen may solve the configuration and tooling woes. At least I hope it will.


I’ve seen enough of these config wranglers come and go (and built/maintain a few myself) to know that the mismatch between user needs and project templates cannot be bridged.

To be happy with a tool like this, you need to use the vanilla output. If you need anything more you’ll be in a world of hurt. But unfortunately often real world apps begin to need something more.

Perhaps this time will be different? I hope so but I’m not optimistic.


The major difference with projen is that configurations are not set-and-forget. They can be updated by a normal dependency update. So as tools evolve and APIs get deprecated, the projen component maintainer can adjust the output of the config, which updates your project config with one command. And as a community we can all pitch in to keep these components up to date, rather than every individual trying to figure out for themselves the right combo of options to make something work.


Thanks for talking about it, I just discovered it.


You're describing all command line tools and config files.

Maybe people will discover GUIs one day, it's only been 30 years.


That's not even remotely true. For example, Rust has an excellent command line build pipeline experience.

That has to do both with design decisions - a typed, compiled language has a big edge here - and differences in how the respective ecosystems were designed and developed.

As for GUIs, I'll paraphrase an old quote I saw on Usenet once: if you need to point at something to communicate, you're at the cognitive level of a preverbal child.


hope that last bit was very much in jest, otherwise "someone on Usenet is wrong."

visual stuff (vs. text) is about massive parallelism and compression. see Rudy Arnheim for example. or Tufte. or video games. or excel. or a good "foreign" film. or VR once it doesn't suck.

https://www.ucpress.edu/book/9780520242265/visual-thinking


Even google languages (Go, Dart) and modern Java, no matter how people here hate them, have excellent tooling.


"Everything old is new again"



