Modern SPAs without bundlers, CDNs, or Node.js (kofi.sexy)
315 points by kofigumbs on Feb 17, 2023 | 166 comments



Sigh... Another post like this. Sure, it works well enough for the example shown in the article, but it won't work well beyond this. Bundlers, package managers and other tools were created to address problems people saw in medium-large projects for websites with a lot of traffic. And as soon as the project has dozens of files and even just two or three external libraries, this becomes unmaintainable. Not to mention minifiers/TypeScript/Babel etc., which are important for development or distribution.

(Did I mention that import maps only landed in Firefox 3 months ago?)

People think they are smart and come up with ideas nobody thought of before. "This seems so simple, why don't people use this?" I'm sorry, that's not the case here. I have been doing web development as a hobby and professionally for well over a decade and have seen too much of this.


"Sure, it works well enough for the example shown in the article, but won't work well beyond this."

Sigh...the entire bloody point of the author is to use this for light projects, to scale the sophistication of the setup gracefully. With very obvious benefits: easy to understand, not linked to any setup so it works forever, interoperable and transferable.

Typescript isn't universally important for development. It has its advantages for large projects with many developers, but it's in no way essential. Most web projects don't use Typescript, and the very idea that it's a must-have is only a few years old and quite opinionated.

Likewise, Babel isn't important either. You don't need to use some futuristic JS feature on your simple project, you can just stick to well supported ones, thus not needing Babel.

"I have been doing web development as a hobby and professionally for well over a decade and have seen too much of this."

I've been doing web development since 1996, so that makes for 27 years, should such an obnoxious statement make anyone's opinion more important. If you need to piss on the idea of somebody scaling a web architecture proportionally to actual needs, you should reconsider who is the "smart" one.


I think the criticism is exactly of that ability to scale though. This setup is inefficient but manageable at the individual scale, but it's going to get more complicated as soon as anyone else is involved, or if the project lives longer than six months and needs to be updated, or if you want to add any dependency more complicated than a single file.

The other side of it is it's not really clear what the author is gaining here. They're still reliant on NPM package structures, and they're still pulling in third party dependencies. They're just doing it in an overly complicated way.

In contrast, this is what I'd do to get to the same point the author does in this example: (I'm going off the top of my head here, and I don't have a computer to check this, but it should just work.)

    mkdir myproject/ && cd myproject/
    npm init # although I think just doing echo '{}' > package.json is enough here
    npm install -D vite
    npm install solid-js
    # write some actual code
    $EDITOR index.html
    npx vite
And now you've got basically everything that the author has, but you didn't need to write your own shell script to download modules, you don't need to do all the import mapping yourself, and it's much easier to graduate from this simple setup if you ever do need to.
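If you want the "write some actual code" step spelled out too, a minimal index.html plus entry module for that setup might look like this (again off the top of my head, using Solid's tagged-template entry point since there's no JSX configured):

    <!-- index.html -->
    <div id="app"></div>
    <script type="module" src="/main.js"></script>

    // main.js -- Solid without JSX, via its tagged-template entry point
    import { render } from "solid-js/web";
    import html from "solid-js/html";

    render(() => html`<h1>Hello from Solid</h1>`, document.getElementById("app"));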


My objection is that the author is quite clearly describing this as a bottom-up, small scale approach. Criticizing it as an inadequate approach for something much larger is unwarranted, as nowhere was the claim made that this scales up infinitely.

What is gained? No build step, as the article also clearly mentions. Your alternative approach goes against the goal of the author. Which is to not have a tool chain. "You can also do this by installing these 17 tools and 30 dependencies" is quite missing the point.


I don't disagree that it's a small-scale approach; it's just the language of "bottom up" and "scaling gracefully" that I disagree with. The setup in the article will work only at very small scales, and then you'll probably have to rewrite most of it. The setup I provided works at very small scales as well (arguably better), but can be progressively updated as needed (but only as needed - for example, there's no JSX support for now, but it could be added later if it became necessary).

And in a situation like this, I don't know that I understand the value in avoiding a build step simply for the sake of being able to say you don't have one. This isn't one of those situations where you're saving time by removing the build step because it's not actually building anything here, just rewriting a few paths and adding essentially the same import map. It's probably even quicker overall because it does the reload for you.

Likewise, you're overstating what tools are actually necessary here. It's not "17 tools and 30 dependencies", it's two: one (NPM) to manage dependencies, and the other (vite) to provide the dev server and set the imports up correctly. Everything apart from that is up to you to install as you wish.


>Which is to not have a tool chain.

In my opinion that boat sailed the moment he implemented `download-package`.
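For those who haven't read TFA: the script boils down to roughly this shape (my paraphrase from memory, not the author's exact code; it fetches package tarballs straight from the npm registry):

    download-package() {
        set -eo pipefail
        local name="$1"
        local version="$2"
        local tarball="$name-$version.tgz"
        mkdir -p "node_modules/$name"
        # registry tarball URL for unscoped packages
        curl -fsSL "https://registry.npmjs.org/$name/-/$tarball" > "$tarball"
        # package contents live under a top-level package/ directory
        tar -xzf "$tarball" --strip-components=1 -C "node_modules/$name"
        rm "$tarball"
    }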


> Likewise, Babel isn't important either. You don't need to use some futuristic JS feature on your simple project, you can just stick to well supported ones, thus not needing Babel.

I would say even many moderately complex sites/apps have no need for [shiny new feature X]. These days new JS features tend more towards nice-to-have than they do essential. Back in the days when jQuery was ubiquitous it was a different story but things have improved quite a bit.


Agree, Babel is bullshit in most cases these days. It's for people wanting to use some syntactic sugar that they just learned about 3 days ago.


I think a lot of Babel usage is for people who haven't kept up with caniuse statistics and don't realize that the "low water mark", the "well supported baseline", has moved to something very close to ES2021 or more recent. To be fair, Babel is explicitly targeting some of those developers that prefer ignorance, don't want the mental overhead of actually knowing what browsers support, and like having a "just use the presets!" attitude. I think some of them would be surprised how extremely little most of the standard presets do in 2023 because there are increasingly few things to downlevel.

I have especially long been of the strong opinion that you never need both Babel and Typescript. There's nothing that Typescript supports that it can't downlevel itself, and tslib is a much tinier runtime dependency (if you choose to have Typescript import it rather than embed it in place) than any of Babel's runtimes (including their core-js dependencies), even when a bundler is tree shaking most of it out today (because caniuse statistics say yagni).

It still amuses me that, for instance, the create-react-app "best practice" in its Typescript template is to use Babel for type stripping and downleveling and delegate the Typescript compiler only to type checking. No wonder people find modern pipelines bloated. To be fair, I get why CRA likes the conceptual simplicity of both of their core templates using the same Babel presets and pipelines, given that to them Typescript is still the opt-in "extra".

(Also, I have no problem with esbuild doing my Typescript type stripping and running tsc as a separate pass in esbuild pipelines, but esbuild isn't pretending to do a lot of unnecessary downleveling in 2023 and adds no further runtime beyond the parts of tslib it bundles for you. It's mostly just Babel's extreme complexity that I find redundant and unnecessary and its labyrinth of presets that don't actually do much in 2023 but give the impression that they do.)


ES6 arrow functions are a good example of this. They have been around for a long time and work in most browsers, but people learned they need Babel to transpile them out for Internet Explorer. People dropped IE support, but the knowledge "we need Babel for browser compatibility" remained an entrenched fact.


Yeah, it's relevant to remember that arrow functions were defined in ES2015 (eight years ago) and like I said most browsers people are using are caught up through at least ES2021. The nice thing about the ES2015 year-based nomenclature (as opposed to the people still calling it "ES6") is that you can do a quick reasonable first approximation of caniuse statistics off hand with "is that standard at least two years old?" Arrow functions are old.


Here's another thing: if you want to grow from this exact setup, use Deno. It has support for import maps and doesn't require a bundler or a separate compilation step for TypeScript.

https://deno.land/manual@v1.30.3/basics/import_maps

Maybe add aleph too (which is similar to nextjs)

https://alephjs.org/

Deno won't require nearly as much tooling as nodejs, but it still has tooling for the cases you need it.
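For example, the import map can live directly in deno.json (I believe since 1.30; a sketch, with the vendor path just as an example):

    {
      "imports": {
        "solid-js": "./vendor/solid-js/dist/solid.js"
      }
    }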


Why scale “gracefully” though? The boilerplate of bundlers and package managers isn’t difficult or time consuming in practice. Sure it’s a lot of code and bloat but it isn’t actually hard.


I’ve been doing it professionally for 20 years, and I agree. Modern front end development with TypeScript, Vite, Preact, and even the much maligned npm are so much better than anything that preceded them for building complex front end applications.

I like having a build step for code that I write. I build C, I build Go, I build C#. I don’t ship debug builds of those things.

I don’t know why I’d want to forego all of the niceties that a modern TS build brings and also ship what is essentially a debug build of JS to the browser.

Sure, for a really simple web page, I’ll just slap a script together. But for anything real, hell yes, give me a good build mechanism.


I think the difference lies in how pleasant the tooling/build process is. In the case of C, Go, C#, and other non-web languages, there are comparatively few gripes — the most common being “building takes a long time” with Rust and Swift for example, which is a fairly mild problem. The negative parts of the experience for these languages more often than not get fixed.

With JS, it seems like many of the negatives are much more sticky and refuse to go away. Not sure why. Perhaps it’s related to how that ecosystem has a tendency towards cyclical reinvention of the upper layers while foundational/structural issues get considerably less attention, and so they get swept under the rug more often than they’re fixed, leading them to rear their heads every so often to remind everybody of their existence.


> I think the difference lies in how pleasant the tooling/build process is.

> With JS, it seems like many of the negatives are much more sticky and refuse to go away. Not sure why.

Spot on. I think there is a quality/cultural problem around the javascript ecosystem, but I'm also not sure why it persists. If I had to hazard a guess, I suspect that it has to do with what javascript programmers are usually trying to build. Those things are mostly visual, and the audience cannot see the quality or lack thereof easily, and outputs do not often live long. These incentives steer people to create fast and not worry so much about the soundness, sustainability, or longevity of their solutions.

I made a comment to a colleague recently that javascript tooling is a 'labyrinth of complexity'. That is my main gripe with all of the 'stuff' that comes with the front end. It all feels overcomplicated and thrown together.


While you may like having a build step, it is not necessary and some people don't like it. I've been building commercial data warehouse apps for >30 years and am very happy that our apps don't have a build step. Current app is a multi-million dollar Fortune 500 client app. No build step.


I don't mind those things.

I do mind that they install 2000 packages, though.


> $ echo '{}' > package.json && npm i --save-dev typescript vite preact

> added 17 packages, and audited 18 packages in 1s

You're off by two orders of magnitude. I get not wanting a lot of transitive dependencies, but you can get a modern toolchain without having a lot of transitive dependencies.


And you can go even smaller if you're fine with a slightly less user friendly (but still modern, and also very fast) build tool that's used at the core of Vite:

> echo '{}' > package.json && npm i --save-dev typescript esbuild preact

> added 4 packages, and audited 5 packages in 2s


Webpack is the real offender, and while Vite is cool it isn't really a replacement for what's been the standard for a decade or so, and it looks like it requires a semi-modern browser.

I'll probably try it out some time for a home project, but for anything serious it's a no go.


Comparing building a golang project with "building" a frontend project is pretty laughable. Frontend "builds" are atrocious.


Perhaps, but JS tooling is waaaay better than it used to be.

Vite and Turbopack are way easier to use than Webpack, and faster than the next best thing, Parcel.

npm supports lockfiles. Yarn goes further in that you no longer need a million files in node_modules; or pnpm uses hard links to prevent duplication.

That said, typechecking TypeScript is unfortunately slow. esbuild (as used by Vite) can strip out type signatures quickly, but can't check them. ReScript is very fast, but is a very different language from TypeScript or JavaScript, namely ML with braces.


What a silly thing to say.


I have never ever in my lifetime seen a front end application that was enjoyable to use.


Is there a reason you decided to come into a Javascript thread just to tell everyone how much you dislike Javascript?


The person I was responding to wrote "Modern front end development with TypeScript, Vite, Preact, and even the much maligned npm are so much better than anything that preceded them for building complex front end applications." which implies that modern front end is an improvement over what previously existed.

That person then wrote "But for anything real, hell yes, give me a good build mechanism." which implies that he creates real, high quality applications.

But my experience is that all front end web applications are all sub-par, so it doesn't really matter if a SPA uses a bundler or not, it doesn't affect quality when it is all shit anyway. So more power to the guy who decides to not use the bundler.

In the end, I agree with Alan Kay, the web was created by a bunch of amateurs


The web targets every platform. How ambitious is that? Of course it brings some challenges.


Figma and Discord come to mind as being exceptionally well implemented front end applications. They're even more HN-friendly because they both use Rust for performance-critical code.


The author is pretty clear that their intention is to use this as a simple first step to creating an app without the hassles of bundlers, they don't seem to be advocating that one would create "medium-large" projects like this. The word "incremental" is mentioned in the first sentence.

Nor does their message come across as "why don't people do this" to me.

> Until recently though, I wasn’t sure how to make this step feel incremental.

Seems like they genuinely just found a neat way to do a thing and wanted to share it.


It’s cool to see what modern browsers can do natively now. Using babel/webpack probably obscures that for most developers.


"I like cars, they serve my use case exactly".

"Sigh, another car user. What are you going to do when you need to transport 50 humans at the same time?"


> People think they are smart and come up with ideas nobody thought of before.

I think you're projecting your thoughts about some other thing on to this article... this one is a "this is how I like to do things and I found a new thing that helps do it that way" post.

The tone of the article is really not like what you're suggesting.


The last couple days there's been a lot of negativity towards the build step, but it's like everyone's forgotten the historical context that made the build step so popular to begin with.

I'll give credit to the post about the fragility of certain build tools long-term in the npm ecosystem, but I don't think that's a reason to shrug off build tools period. We just need better, more stable ones (and I think we're starting to finally get there).


The historical context is web development became popular and developers from other disciplines moved over and brought their practices from those other areas of programming with them. We compile apps for the desktop, we should compile apps for the web too. They brought their complexity with them and forced it onto web development instead of stopping to consider if that was a good idea. Now we have developers who never knew how things were before transpilers were created and now think that this is how it has to be and that this is the best way.


Have you worked with old, medium to large sites filled with jQuery and custom ad-hoc scripts? In the past year I helped maintain a site exactly like this - it's a huge pain. So many problems that should've been caught sooner are noticed far later, leading to long feedback loops that cost time and money.

Sure, I would agree that the current JS ecosystem has its issues, but work can happen at either compile/design time, or at run time. Shifting work from one step to another has trade-offs that may or may not matter for a given team or product.

For small hobby projects? The scale makes these problems pretty insignificant, so handling more at runtime ain't so bad. But the number of possible problems multiplies as your codebase increases in size, and by solving these problems sooner, you spend less time overall dealing with them.


I still remember why I moved to Typescript.

Typescript was still a 0.x "public preview" and I was knee-deep in a codebase that was a massive amount of jQuery scattered across mostly inline script blocks in a huge number of ASP.NET templates, many of which were "components" included by other templates (meaning who knows how many final inline DOM-blocking script blocks per final output page). I had helped invest a ton of effort into an AMD infrastructure to start to modularize all those inline script blocks, but the conversion was pretty slow going (AMD modules were a pain to write by hand) even though the clear performance win of it helped keep it a clear priority project. Typescript even in 0.x made all of that so much better and easier, if for no other reason than that it made writing and managing AMDs so much simpler (and type checking helped code quality so much as a bonus).
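(For anyone who never had to write one by hand, an AMD module looked roughly like this; the module IDs and helper here are made up:)

    // AMD: declare dependencies up front, receive them in a factory callback
    define(['jquery', 'lib/util'], function ($, util) {
        function init() {
            $('#app').text(util.greeting())
        }
        // whatever the factory returns is the module's export
        return { init: init }
    })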

A lot of compile time tooling didn't fall out of the sky because they were solutions looking for problems. They solved real problems at the time. It's amazing to think how few developers today remember "the bad jQuery days" and "server-side template inline script block hell" and AMDs (good riddance, though they too solved important problems in their time). I don't think they can understand how much "our compile times are slow" is a better problem to have. I'm not sure if I envy some of them having missed some of the worst of those past problems.


> We compile apps for the desktop, we should compile apps for the web too.

I don't think this is true at all. The key reason you see build steps are to solve problems that you don't even have to worry about in desktop apps. If you have an app on the web, its bundle size needs to be small so that it downloads over the internet in a few hundred ms or so. So you get a minification step. You want a good type system, you end up with a build step. You want broad compatibility with older browsers? You end up with polyfills. Need to do some asset processing? You end up with some scripts that have to run at build time.

Build steps aren't some arbitrary complexity people make them out to be. They solve performance and compatibility issues, which impact real users in the real world. Running full-blown desktop apps over HTTP comes with a different set of problems compared to any local native app.

Of course most of this isn't important for small side projects. But for many, side projects are a way to learn about different technologies they might end up using for larger projects or in a professional context.


Cargo-culting is responsible for the most popular things in development. Not the actual benefits.


> but it's like everyone's forgotten the historical context that made the build step so popular to begin with.

Maybe the reasons are not as strong as it seems?


I’ve been doing web development for nearly 25 years. It isn’t unmaintainable. How do you think we built large websites in the days before compiling/transpiling was a thing?


>How do you think we built large websites in the days before compiling/transpiling was a thing?

You don't really compare web development 25 years ago and now, do you? Back then the most complex front-end was two forms. People also wrote operating systems in ed, but I don't think anyone would consider giving up their IDE/editor for ed just because people could build complex software with it in the past.


> Then the most complex front-end was having two forms

You have no clue what you're talking about. In the late nineties we had just about every bell and whistle working that I see in websites today, hardware limitations permitting. Some of it did not work well because client browsers were slow but it was all there: full interactivity through Java applets, VRML, media control API etc. Websites in the nineties were not two forms and a table tag. My masters project was embedding a voice recognition engine for interactive web search.

Yeah we didn't do SPAs and instead implemented interactivity locally. That was a good thing that you threw in the garbage for a fad.


It seems odd to bring in Java applets and similar technologies to the discussion when the original point was that web development shouldn't need a build step. And isn't the whole point of SPAs to implement local interactivity? I.e. treat the client as a separate application (just like you would with an applet) and move session logic out of the server?

If anything, I think it's remarkable how far we've graduated from that point: many components and features that we used to have to build by hand are now directly implemented in the browser; Javascript is significantly easier and nicer to use, even without having to apply a build step (even just splitting scripts into separate modules is now supported natively!); and features that previously required potentially insecure plugins to be enabled are now controlled by the browser, giving users much more ability to control what their browser is doing or not doing. Even if you do want more complex development with build steps, with tools like Parcel and Vite, that's usually pretty simple at this point (certainly simpler than the last few Java builds that I've seen).

It seems like you're complaining that everything is worse now because it's possible to have stupid amounts of complexity these days, but it's also a lot easier to have no complexity at all.


>full interactivity through Java applets

Not sure we're even on the same page if your argument for "sites were as complex back in the good old days" is embedding a literally distinct program in the site, which quite ironically also happens to include its own complicated tooling and building.

>instead implemented interactivity locally

Umm, what do you think SPAs are?


Man I cannot wait for bundlers and all of that garbage to go the way of the dodo bird.


Man I cannot wait for compilers and linkers and all that garbage to go the way of the dodo bird.


You don’t, and never did, need compilers and linkers for JavaScript. There’s even “script” right in the name.


Sure, you can run unoptimized JavaScript in your browser. At least you can smugly tell your users about your elitist belief when they mention that your scripts don't run on Safari, because you've mistakenly used an API that was only available in Chrome and you didn't ship a polyfill, or when your code is larger than it has to be because it's not minified.


Sure, you strictly do not need many things that make your site smaller and load faster.


> People think they are smart and come up with ideas nobody thought of before. "This seems so simple, why don't people use this?"

So, in your opinion, how do things improve then? People see how painful some tools are and so they try to improve things. Sometimes it works, sometimes it doesn't.


But can't you see? Overengineered blog / todo list clearly shows that JavaScript frameworks are totally useless and we should all be using jQuery instead. My own custom framework is totally up to the task and we shouldn't be trusting <big corp> as the maintainer of common frameworks. The browser is a bad tool anyway and we should all go back to using text only web pages.


Not everything must scale to stay relevant.


> People think they are smart


To go quite a bit off topic.

The author says that he starts with a single html file and splits it or incrementally adds stuff when needed.

That course of action has proven to be a really bad idea on every non trivial web project I worked on.

Mostly for teamwork and maintainability reasons.

E.g. there is no clear project structure, the next dev will not understand stuff and will do things differently. And welcome to chaotic legacy code hell. It's like PHP all over again, but in JS.

I usually work for customers who to some extent know what they want. I choose an appropriate tech stack for that use case. The team can use that stack's conventions to develop the project. There is no need to slowly grow; everything can be done in parallel, and thanks to the conventions the stuff sticks together in the end.

This sort of work flow has a tendency to just pile incredible amounts of code in files thousands of lines long. That is going to be unmaintainable in no time.

Everyone claims they don't do that but they all lie (yes you too).

Also everyone has a chance to end up maintaining the resulting amorphous blob (could be you again).

This lacks tooling like hot reload, spell checkers, linting and so on. You will need that anyway so why don't you start with it?

Also you need tooling for deployment, like config files. Tests, CI and whatnot...

Lack of documentation. Since this follows your personal style, your colleagues need to follow it. A framework provides documentation for that; working like this, the docs need to be written by the devs. Although they may claim otherwise, they usually don't.


You are optimizing for entirely different use cases.

There are a whole bunch of devs who work alone or in (very) small teams. For this type of work it’s really more of a hindrance to have an imposed structure and tooling.

We want to get to our goals as efficiently as possible. Minimal abstractions, guidelines, tooling, indirection, magic, surprises and general overhead are in order. We don’t want to struggle with questions like “how to do this in X”. We already know how to do it, so we just do it.

As time goes on and LOC get merged we find ways to add sensible structure and compress our code.


There's another comment that talks about medium to large project, and it's the same thing, it's a different use case. I don't care about medium or large projects, I need to add a quick semi-dynamic element to an incredible basic web app.

In my case, setting up a React or Angular project is going to be more work than just writing the whole thing in vanilla JavaScript. This has the added benefit that I actually understand what's going on.

The modern JavaScript frameworks are great for large projects, but it feels like learning Django in order to make an API that goes "pong" when you do a get request. I get why this happens. Vue looked great 8 years ago, small, simple and easy to get started with. Then it grew to support more and more use cases, and larger projects and now we need a new tiny framework. Or we can accept that JavaScript has come a long way and that many of us might not need a framework.


No, proper structure and tooling (especially for HMR) are important productivity boosters for solo developers as well. A single file scales to about 1k LoC; it’s only viable for toys and tiny projects, and beyond 2k-3k LoC you’ll hate yourself every time you need to change something (been there). And an ad hoc shell script in place of npm/yarn/whatever proper package manager, which is way easier than said script? I hope that’s for show and not actually used.

Btw, this direct module import approach means you’ll be shipping a lot of unused code to users.

Edit: If you only need a tiny amount of progressive enhancement on top of static HTML: forget about this import map and ad hoc npm script business, use petite-vue (single script, 6KB, add a script tag like good old jQuery).
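Usage is roughly this (from memory, so double-check the petite-vue README; the CDN URL can be swapped for a local copy):

    <script src="https://unpkg.com/petite-vue" defer init></script>

    <!-- v-scope marks the region petite-vue controls -->
    <div v-scope="{ count: 0 }">
      {{ count }}
      <button @click="count++">inc</button>
    </div>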


> There are a whole bunch of devs who work alone or in (very) small teams.

And you still need to reimplement a bunch of stuff if you don't use oss packages. And the next maintainer will curse your family for generations because of your own custom framework.


Imo being "smart" like this is reinventing the wheel and doesn't give you any productivity gains.


It's not reinventing anything. It's just avoiding complexity until necessary.


Writing custom shell scripts to avoid complexity usually achieves the opposite result. I can Google npm errors, but not why a custom shell script fails.

Second, anyone reading the repo is now expected to understand not 1 but 2 languages.

Finally, the shell script performs basic tasks that npm does out of the box.

Because of above reasons I think this solution is needlessly complex.


As someone who does this too: it depends. If you take time out every now and then to completely refactor your code base it can actually be surprisingly effective. I've done exactly that on my last project and I'm pretty happy with the end result, you can have a look for yourself:

https://gitlab.com/jmattheij/pianojacq/-/tree/master/js

This project will likely never be finished, there are always nice new things to add or requests from people, there is no commercial pressure because it is a hobby project and I don't have a boss to answer to. And even if such refactoring operations take me two weeks or more (this one I did while I was mostly just working on a laptop without access to a keyboard so it was sometimes tricky to ensure that nothing broke) in the end it is worth it to me because I am also paying the price for maintaining the code and if it is messy then I would stop working on it. Project dependencies are the absolute minimum that I could get away with.

The project moves forward in fits and starts, sometimes I work on it for weeks on end and sometimes it is dormant for months. In a commercial setting or in a much larger team I don't think this approach would work.


> If you take time out every now and then to completely refactor your code base it can actually be surprisingly effective.

So much this!

Current project I'm on, I own the entire front end for a major modernisation (AKA rewrite) of a legacy application and we are working in 4 week "sprints". I'm giving myself two days every month just to refactor the code I wrote that month.

How you plan to structure a project never works out, you always find better ways as you go and as the objective and priorities change (they always do).

On another tangent, before pivoting into software development I used to be a mechanical/industrial engineer. The parallels between coding and CAD are enormous. With CAD you also need to be spending 10% of your time "refactoring" your model. It's almost exactly the same from a maintainability perspective and leaving the model with good hygiene for the next person to work on.


> How you plan to structure a project never works out, you always find better ways as you go and as the objective and priorities change (they always do).

I always joke I write everything three times. The first time to get a feel for the space, the second time because I think I now understand it and then again when I finally really understand it. Most of the times these three look absolutely nothing like each other.


I think that even extends over longer time periods to projects as a whole and an evolving team. Sticking to my current project which has a nearly 20 year legacy and looks nothing like what it was when it started:

V1: Perl CGI, HTML forms, very little js

V2: Perl CGI, JQuery, Ajax (done badly with js code generated by Perl code)

V3: Perl backend rest API (will be "public"). TypeScript+Vue front end.

Each time a transition has happened as the application features have outgrown the architecture, and more has been learnt about how people use the product and therefore what it needs to do.

In this case it has now grown to the point that Node and build tools are required.


Neat, is this a public project? You make me curious what it is that you are hacking away at!


Avoiding Googleable backlinks, if you search for "integrated antibody sequence and structure tool" you will find it.

I'm the first "product" person without a bioinformatics background to work on it. The new version isn't out, but is a significant rewrite with an aim to massively extend its functionality in future. (Current "public" version isn't even the current commercial version, and very 90s in style!)

Backend is a very large Perl codebase implementing a significant amount of algorithms from academic research.

It's been awesome to work on, my ideal sort of project where I can bring a technical product focus and learn about interesting technology and science at the same time.


And super useful too. Wow. Thank you, I will definitely have a look.


I looked through your code a bit, and there are a couple of things I'd flag in a code review (e.g. use of non-local variables). Ironically these are things that are a lot harder to do, and easier to spot, if proper modules are used.


I'm sure you would, and if you looked at earlier versions of that same code you'd see mountains more of those.

I'd love for everything to be perfectly pure though and side effect free. The pressure is always on to improve it further. As for code review, feel free and open tickets for whatever you spot and hopefully one day I'll get around to it. Better yet: submit pull requests.


There is no clear project structure if you don't write down what the expected project structure is. Which you really need even with a framework, because the frameworks only provide really rough high level structure.

While I loathe files thousands of lines long, I also loathe the endless chasing of things from file to file that tends to be the result of applying big frameworks to tiny projects.


Really? It has been well over a decade since I cared about file length or number of files. Any number of code editors has search, multi-file search, and "go to definition". I've got one personal project that is a three.js character creator, where my js is one file of somewhere north of 200K lines. Who cares? I sure don't. If another were to join the project, sure, I'd break it up into smaller files so others can work on separate bits - but that's the only value of separate files anymore. Nobody prints code anymore, and if they do it is code fragments and not one's entire program.


I find it a lot easier to reason about logical chunks of code that are clearly divided. I rarely want to see just a specific function, but a cohesive unit that can be read in sequence and make sense. You can do that in a single file too, but in my experience, large files tends to lead to code being spread out without a narrative determining order because people (myself included) are rarely disciplined enough. Files act as a convenient clear grouping to me. On the other extreme, when people insist on splitting everything up into the smallest little unit, it drives me entirely nuts for the same reason. I want to be able to read through the code without jumping back and forth all the time.

But when it's just you it's all down to preference, so do what works for you.


I get where you're coming from, but I have on occasion wanted a fairly simple one-trick-pony sort of SPA. Think a GNU userland tool, not Word.

Thinking outside the node-vue-kitchen sink sort of development box is a good idea. Maybe you don't have a grand idea, maybe you just have a kinda neat idea. Punch that thing out fast and light. You can always grow it bigger, but maybe you don't need to.


I also sometimes enjoy this approach of starting from absolutely nothing.

Instead of taking the path of starting with DOM manipulation and then going to a framework as necessary, I've kept really trying to make raw web components work, but kept finding that I wanted just a little bit more.

I managed to get the more I wanted -- sensible template interpolation with event binding -- boiled down to a tag function in 481 bytes / 12 lines of (dense) source code, which I feel like is small enough that you can copy/paste it around and not feel too bad about it. It's here if anyone cares to look: https://github.com/dchester/yhtml


An even simpler bit of sugar over the DOM that I embed whenever I need just a little bit of JS:

  function $E(t,p,c) {
      const el=document.createElement(t)  // t: tag name
      Object.assign(el,p)                 // p: properties (e.g. onclick, src)
      el.append(...c)                     // c: array of children (nodes or strings)
      return el
  }
Usage:

    const button = $E('button', {onclick: () => alert('click!')}, [
        $E('img', {src: '/assets/icon-lightbulb.png'}, []),
        'Ding!',
    ])
(With thanks to 'goranmoomin: https://news.ycombinator.com/item?id=23590750)


Wow! What a coincidence! I literally have something extremely similar which I called `$e`. I think one crucial thing you did not mention is that this conforms to the signature of jsxFactory, meaning if you run some kind of transpiler that supports jsx, you can literally do:

parent.append(<p>Hello</p>);

(Mine has slightly more functionality, such as allowing styles to be passed in as an object, auto-concatenating arrays of strings for class names, etc.)
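For anyone wanting to try this: the wiring is just two compiler options in tsconfig.json (a sketch; note the factory is called as `$e(tag, props, ...children)` with variadic children, so the `$E` above, which takes an array, would need a tiny wrapper):

    {
      "compilerOptions": {
        "jsx": "react",
        "jsxFactory": "$e"
      }
    }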


Joining this thread to say that I, too, have written a very similar function and also use jsxFactory to have JSX support in personal projects. I find that using it along with an extremely simple implementation of a kind of state listener[0] produces something really nice for small projects.

It's a bit like a jquery for the '20s.

[0] https://github.com/curlywurlycraig/vdom-util/blob/master/src...


Oh, didn’t expect a reference to me at all, was thinking that the function was very similar to mine :)

That function is now in my must-haves in my new Django side projects; I usually overuse them before I finally move to a JSX-based UI library. It’s great for simple DOM manipulation… and for me it seems to create a semi-simple migration path in the case the code gets complex.


Import maps strike me as a huge win for making prototypes, or things where you control what exactly you're importing (eg from a private registry instead of NPM), but throwing out the bundler, and more importantly the tree-shaking and minification steps of bundling, will result in a massive amount of unused code being delivered to the user. You'll be relying on maintainers putting minified, unbloated assets in their packages and ... well ... NPM is gonna NPM.


I think the issue might be the name "tree shaking". People from outside of the JavaScript world just don't understand what it means, and how important it is. Maybe we should have just called it "dead code elimination". (I think there is some subtle difference but my memory is failing me.)


IIRC tree shaking is a form of dead code elimination. Something like: dead code elimination is the what and tree shaking is the how.


This is where ES Modules come into their own and why importmaps are very much an ESM thing, predominantly. ESM was built with "natural" tree shaking in mind in a browser context. Only URLs that are actually imported in a module are loaded. (importmaps are not prefetch maps.) Modules themselves are loaded in a such a way that unused exports in them may be lazy-JITted "weak references" and easily garbage collected (a form of tree-shaking).

You get better "natural" tree-shaking from that library of lots of small little ESM modules, you don't need to rely on maintainers building their own minified bundles.

The obvious trade-off, of course, is that HTTP 1.0 wasn't optimized well for lots of little files and even HTTP 1.1 servers haven't always been best configured for connection reuse and pipelining. Bundling is still sometimes useful for a number of reasons (whether or not minification matters or compile-time treeshaking makes a noticeable difference from "natural" runtime treeshaking). Of course, all the browsers that support importmaps and script type="module" all support HTTP 2 and most support HTTP 3 and that trade-off dynamic shifts again with those protocols in play (the old "rules" that you must bundle for performance stop being "rules" and everything becomes a lot more complex and needs "on the ground" performance eyeballs).


> you'll be relying on maintainers ..

Our team (at a Fortune 500 co) is pretty agile and forward-looking (compared to corporate IT), but we're still not allowed to use anything from NPM-like repos without permission, and even then it has to get a code review and be pulled into our local repo. Personally, I agree with the CTO's belief that going from NPM directly to production is sort of insane.


On the same topic, I wrote some articles a while ago. Here is my own approach for making a VanillaJS SPA without using any bundlers or frameworks https://rishav-sharan.com/#/making-a-spa-in-vanilla-js

Note that this blog right now converts markdown posts into html at runtime, which is definitely not an efficient way of doing things. But considering that I have just a handful of people landing there, I really haven't tried to update the approach.


I really enjoyed reading that, it's always a pleasure to get a window into the thought process of other developers as they evolve their ideas. Just curious, did you consider using event bubbling so you can declare event handlers right away rather than doing them in the after render section? This is currently the approach I am taking, and I'm just wondering if there are some pros and cons here.


Clever and minimal approach!

I wonder whether you could turn this into a static renderer by skipping the `after_render` as a separate step that only happens in the browser. You could even rename it `hydrate` if you want to use fancy modern JS lingo. The benefit would be to show content on initial page load without JS enabled as well as letting crawlers index your site (some still don't run JS).


A common problem I run into is wrt routing, and I see you've built a hash-based routing system. Are you worried about overriding the expected functionality of the URL hash?


Just wanted to say I enjoyed reading this. Thanks for sharing.


Nice work, and nice writing too!


It warms my heart to see frontend web dev maturing like this, it really does. Kudos to the author for putting this out there, that shell script is nifty.

One trick I've used is to `npm install` my dependencies as normal but bundle them with esbuild, while also generating a typescript definition file. This gives me one file I can import in vanilla JS but also lets me check my JS with typescript.
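Roughly like this ("some-lib" is a placeholder; esbuild doesn't emit .d.ts files itself, so the definition file comes from a separate step, e.g. tsc or dts-bundle-generator):

    # bundle one dependency into a single ESM file importable from vanilla JS
    npx esbuild node_modules/some-lib/index.js --bundle --format=esm \
        --outfile=vendor/some-lib.js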


Me too. I left Web development when node and npm were learning to crawl, and given how things went in the WinRT and Android world, I ended up refocusing on the Web.

So when I got back I couldn't believe how bad the experience became versus SSR frameworks, and gladly focused on backend and devops stuff instead.

It is refreshing to see this starting to happen.


> So when I got back I couldn't believe how bad the experience became

Like frogs in a slowly boiling pot, most full time js devs have no idea the shit they put up with. IMHO modern js frameworks suffer from complexity rot; they are so over engineered it is a sad joke.


If I really needed to go without a bundler, my search would start and end with Vue. Other popular js frameworks (even some of newer ones) are heavily tied to a bundler. And why use less popular alternatives when vue comes with official router, state mgmt and devtools?


That has been my experience -- I've been able to leverage Vue and avoid bundlers, node, npm and all the other bloated nonsense.


I think Vue.js is an excellent choice for this. They recommend bundlers and the default import doesn't come with the template compiler but the template compiler is fast enough to run when loading the page, and the vue template syntax looks clean inside a template string IMO, especially if you take care to minimize logic in templates.


Vue really deserves as serious a crack as React got; imo it's just better made. And Vite is so good I think it's earned it.


I just don’t get the whole adding another DSL instead of using JSX or something.


JSX suffers from the same problem that early PHP did. Because you have the full power of JS at your hands, you need some very good linting rules if you don't want to end up with a mess combining presentation and functionality. By having a hard split between the two you're forced to build declarative templates.

Most React projects I've jumped into are really hard to understand regarding which template elements come from which pieces of code (partly due to JSX, partly due to design patterns which seem counter-productive). I've never had this issue with a Vue project.


JSX is quite a bit better than early PHP, but yeah, it has the same issue. I like both JSX and Vue templates, though (I prefer JSX with Vue or Preact).

I think JSX and Vue templates both are discovered, not invented, and have staying power because of that. JSX is neat because it is a really simple hybrid of two languages and is fully composable. Vue templates are neat because unlike so many other HTML templating languages, the templating is inside (X)HTML attributes, and they don't go too nuts with them like Angular did. It took time for Vue to support pretty much everything, but now it does. The nested <template> tag can be used to accomplish the same things as JSX fragments. The IDE tooling adds syntax highlighting and autocomplete to the JavaScript expressions. Even though it's HTML5, Vue went ahead and supported XML-style self-closing tags.


This is a balanced viewpoint, thank you for sharing! I probably have been burned a few too many times with legacy applications without good separation of concerns in the rendering layer to ever really prefer JSX to a strongly-separated templating language, but in the right hands they can definitely be well-written and -composed.

In a way JSX feels like GOTO statements.


The problem with writing a shell script to fetch and download NPM packages is that it won't do so in parallel. I'm sure that's fine if you only depend on one or two packages, but dependency explosion is practically a feature of the NPM ecosystem, and that'll stretch the script runtime. Not to mention fetching packages that you already downloaded, and no compatibility with outside tooling like Dependabot for automatic updates.


You could easily use something like GNU Parallel:

https://www.gnu.org/software/parallel/


A different way to think about this is that most of the dependency hell relates to dependencies of your build tools. Not to dependencies of your actual application. And then of course some of those tools might be node.js tools and others might be things written in rust, go, or other languages. So, you need multiple package managers to deal with all of that.

A neat way to organize that mess is to just use docker for tools and use maybe a simple Makefile to call those tools. Build your docker container once, push it to a docker registry and use it many times. Also nice for CI/CD.

I did that for my website which is a simple static site that I generate with some bash scripts, pandoc, and other tools from markdown files. The scripts run in Docker. I don't use node.js for this but it doesn't matter. It's just another set of tools. The only tools I need on a laptop are docker and make to build this website.
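The day-to-day interface ends up being something like this (image and script names are placeholders):

    # build the tools image once
    docker build -t mysite-tools:latest tools/
    # then run the actual site build through it
    docker run --rm -v "$(pwd):/work" -w /work mysite-tools:latest ./build.sh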

The issue with npm is that it pretends that everything happens inside of npm/node.js. Just npm this or that. It's a package manager. It's a build tool. And it's a way to run tools that you install via the package manager. And then almost as an afterthought there are a few run-time dependencies that it minifies into your actual application as part of some convoluted bundling toolchain (typically installed via npm).

The actual size of those run-time dependencies is surprisingly small for most applications. And of course with modern web browsers and HTTP/2 & 3, loading them as one blob of minified js is not necessarily even the best way to pull in dependencies anyway. You can load a lot of this stuff asynchronously these days.


Not only that, properly managing dependencies would become a huge problem.


Woah, importmap is a game changer! I didn't know about it until reading this, thanks!
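For anyone else seeing it for the first time: it's just a JSON block in your HTML telling the browser where bare import specifiers resolve (a sketch using the article's solid-js path):

    <script type="importmap">
    {
      "imports": {
        "solid-js": "/node_modules/solid-js/dist/solid.js"
      }
    }
    </script>
    <script type="module">
      import { createSignal } from "solid-js"  // resolved via the map above
      const [count, setCount] = createSignal(0)
      console.log(count())
    </script>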


Not yet available on Safari, but still cool!

https://developer.mozilla.org/en-US/docs/Web/HTML/Element/sc...


https://github.com/guybedford/es-module-shims has a polyfill.

(But it is fairly large: 53KB raw, 15KB gzipped, 32KB minified, 11KB minified+gzipped. It’s providing a lot of likely-unnecessary functionality. I’d prefer a stripped-down and reworked polyfill that can also be lazily-loaded, controlled by a snippet of at most a few hundred bytes that you can drop into the document, only loading the polyfill in the uncommon case that it’s needed—like how five years ago as part of modernising some of the code of Fastmail’s webmail, I had it fetch and execute core-js before loading the rest iff !Object.values (choosing that as a convenient baseline—see https://www.fastmail.com/blog/using-the-modern-web-platform/... for more details), so that the cost to new browsers of supporting old browsers was a single trivial branch, and maybe fifty bytes in added payload. I dislike polyfills slowing down browsers that don’t need them, by adding undue weight or extra requests.)
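Something in that direction is easier today with the script feature-detection API (a sketch; the /vendor/ path is a placeholder, and browsers too old to have HTMLScriptElement.supports also lack import maps, so they take the polyfill branch):

    <script>
      // fetch es-module-shims only when native import maps are missing
      if (!('supports' in HTMLScriptElement) ||
          !HTMLScriptElement.supports('importmap')) {
        var s = document.createElement('script')
        s.async = true
        s.src = '/vendor/es-module-shims.js'
        document.head.appendChild(s)
      }
    </script>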


There's also https://github.com/systemjs/systemjs if you want more of a ponyfill approach. FWIW bundlers also don't use the browser's functionality to load modules...

It's a bit late to figure out, but I see caveats with dynamic imports in es-module-shims but not SystemJS.


That's going to change when Safari 16.4 comes out - currently in beta: https://developer.apple.com/documentation/safari-release-not...



No Safari support seems like a significant hurdle.


Safari support is enabled with https://github.com/guybedford/es-module-shims


I’m the post author, and I use this same polyfill. I even install it with the same NPM download method I wrote about.


What does this have to do with SPAs? If you're doing this, you're precluded from using any framework of substance or templates. You can only use packages which are fully compatible with ESM. You're forced to use only systems which require you to model your DOM with function calls, or which parse your templates at runtime.

This is the sort of thing to do for lightweight projects that have a little bit of code. But SPAs are necessarily not that.


Is this the right place to shill a similar thing? (I've used preact without bundlers for a while; also fast-enough typescript without bundlers.)

UPD:

- Preact template: https://github.com/wizzard0/naked-preact

- node/browser TypeScript loader: https://github.com/wizzard0/t348-loader


Thanks for sharing! :D


k k updated the comment) preact is bundled but intentionally non-minified so it's easier to debug. ts incoming, gonna clean it up a bit and write a README

UPD you recognize a software developer a mile away by his awesome estimation skills >_> published the typescript loader too!


What’s the benefit of the custom download-package script over npm install? Why would you want to roll a custom solution here?


Didn't get the idea either. The end part of the post is like a seed for a future custom package manager or something.


NPM will automatically install dependencies and run post-install scripts. Those are normally a convenience, but I can understand why someone would want to skip that.


Still. I'm all for avoiding npm when I can get away with it, but for TFA's case I would have just used the `--ignore-scripts` flag.


As the project becomes more than a simple example and requires more third-party libraries, this is just unmaintainable.


I prefer to use vite - you get handling of imports without needing a separate bundler step, but you get hot reload as well. I find it is a nicer experience, YMMV but it is good to have alternative options:

  FROM node:19.1.0
  WORKDIR /project/vite_project
  RUN npm init -y
  RUN npm install react react-dom
  RUN npm install -g esbuild
  RUN npm install vite
  # "npm run dev" needs a dev script to exist; --host exposes it outside the container
  RUN npm pkg set scripts.dev="vite --host --port 8081"
  EXPOSE 8081
  CMD ["npm","run","dev"]
The react install isn't normally there if I start light - but it shows that the path to throwing in a framework is smooth. Typically combined with:

  #!/usr/bin/env bash
  docker build -t vite_play -f Dockerfile.vite emp/
Obviously there is something of a bundler step happening in the background, but it is fast enough (and implicit) so it doesn't get in the way of rapid prototyping.


I agree, while it’s nice to go back to basics, I am more concerned with the end result, and vite works really nicely to achieve this. Any real project is going to use CI at some point so what’s the issue with having a build step in your pipeline?


If you use normal includes instead of imports, and put an object (or better, a closure) at the top as a namespace, your SPA will work with any browser, and you don't force your users to throw out their phones and laptops every 2-5 years.


You could also just use a polyfill (es-module-shims) to add support for older browsers.


Maybe you are right, and modules (even this fancy new module syntax) are not worth fighting against anymore.

I don't really trust polyfills tho.


could you make a code example?


I was thinking of including with

    <script src="mod.js"> </script>
just how you include (or included in the past) jquery or mathjax. I manage my namespaces like

    function Zebras() {

      let l;  // local
      let e;  // private; exposed via setter and getter

      const gete =  () => {return e}   // const, so these don't leak into global scope
      const sete = (x) => {e = x}

      function lfunc() {console.log("local function")}
      function efunc() {console.log("exported function")}

      // exports :
      this.efunc = efunc
      this.gete  = gete
      this.sete  = sete
    }
and import it

    <script src="mod.js"> </script>

    <script> 
 
      zr = new Zebras()

      zr.sete(4)
      console.log( zr.gete() ) // 4

      zr.efunc()
      // zr.lfunc() <- not working
 
    </script>
You can export objects or functions trivially. I don't think you can expose primitives this way.

I think this method has been working for ~20 years, and will still work 20 years from now, when you can't even get your bundlers to run. But if you have to change it frequently, ES modules are probably easier to automate, test, and handle with bundlers.


I'm all for simple development environments and processes for small projects and teams, but refusing to use nodejs/npm while writing a bash script to do the same work is next-level crazy.


While Safari doesn’t support importmap, it’s possible (and not too hard) to use them as a progressive enhancement so that it’ll still work, and safari users will get the benefit with 16.4. I did a writeup: https://qubyte.codes/blog/progressively-enhanced-caching-of-...


Firefox only started supporting import maps in v108 which was released 3 months ago. Previously you would need to use an experimental flag to enable it.


I usually create a folder containing src/index.js and public/index.html, then install react-scripts and add 'start' and 'build' scripts in my package.json.

  mkdir -p myproject/{public,src} && cd myproject  
  touch src/index.js public/index.html  
  npm init -y && npm i -D react-scripts  
  # add to package.json scripts:  
  # "start": "react-scripts start",  
  # "build": "react-scripts build",  
  npm start  
Now you've got a hot-reloading development server that can handle (and transpile) ES6 / TypeScript, import JS/CSS with 'import', etc. If you want more control over webpack you can 'eject' to be able to edit the webpack configs. Note that you don't need to use React to use react-scripts. I just want to get up to speed fast, without doing a 4hr webpack deep dive.

Hot reloading definitely saves me some extra RSI.


react-scripts is one of the most bloated pieces of front end horror I’ve ever worked with (and the amount of abstraction on top of it that aims to hide that fact only makes it worse), their webpack config reads as a silly joke. All you have to do is to eject and you see how bad it is.

Esbuild + a script in package.json and you’re done for modern front end. Maybe run a tailwind watcher.


> without bundlers, CDNs, or __NodeJS__

> When I need to add a dependency, I invoke download-package and then declare it in the import map. I like this setup because it feels like I’m only using the bits of the NodeJS ecosystem

That's using node/npm but with extra steps...

Anyway, I kinda like the importmap thing. It would have been better if he didn't reference solidjs, but that's life.


I appreciate the principled avoidance of over-tooling and dependencies. importmap is neat!

edit: In the same vein; cross-linking to "Writing JavaScript without a build system (jvns.ca)" https://news.ycombinator.com/item?id=34825676


Yeah, two really similar posts in a short time


This has been happening on HN a lot nowadays.

Tons of instances where there are two closely related posts on the front page at the same time. Wonder if one post is inspired by the other, or if they are usually independent.


well sure, client side reliable ES means we can drop all this compiled and packed stuff

but bold move cotton

nice to see there is a polyfill

I’ll think about it for my next side project, but I’ll probably just do something I can add to my React + Node portfolio on github because recruiters and employers won’t know about this until 2027



No support in Safari, unfortunately.


"I typically start <...> using a file:// URL."

"Ideally, I’d just grab the framework files, import them from my JavaScript, and then carry on with my file:// URL."

Does this work with file:// URLs at all?

I see in the article that paths in import maps start with /. So, the author must be using a local server, otherwise browsers will block loading the modules by CORS policy. Is there something I'm not getting here?
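(For what it's worth, I sidestep this while testing by spinning up a throwaway static server rather than using file://, e.g.:)

    python3 -m http.server 8000   # serves the current directory at http://localhost:8000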


> Some frameworks have a hosted CDN option, but I’m similarly uncomfortable accepting that infrastructural dependency. Ideally, I’d just grab the framework files, import them from my JavaScript, and then carry on with my file:// URL.

I mean, you can literally just download the files to a folder on your computer. It's not like it stops working just because it's not hosted on a javascript cdn


That's what I've been doing for decades, but in current browsers `import` doesn't work with file urls, and whatever you download will need that


geez. how did everything get so crappy


I was recently playing around with a similar approach using Alpine.js and it was really fun just hacking on an html file. Didn't know about importmap, but I'm excited to try it out. There's a time and a place for bundlers. But if you're just spending a lazy weekend on a toy project for fun, this is a really satisfying environment for prototyping


The problem with import maps is that external map files are not widely supported yet, and debugging is a nightmare: they work fine for simple cases, but for anything that handles prefixes or slightly more complex setups it becomes basically poking in the dark. Instead I still use either re-export files or service worker route rewriting.


Neat idea. My initial concern with this approach is that dependency versions are nowhere to be had, which feeds my second concern: a build/CI system would struggle, and deployment of this approach would have to be done manually, barring building out custom CI/CD to accommodate it.


The developer experience just sucks compared with a regular project with a build, hot reload, etc…

We have also tried this approach and I still think that it might be the future. But apart from using it on tiny projects and experiments, the DX is just subpar compared to any modern Vite-based “build-full” stack.


>Later in the project, I could opt into more of what NodeJS has to offer

Haven't used NodeJS in a while and my current projects have no build step. But I'd like to hear some opinions on what specific threshold of complexity would lead one to add NodeJS.


I like Astro for this reason. It feels light and modular compared to the other frameworks.


I hadn't worked with unix/linux much so I had to ask ChatGPT what programming language/scripting language the second code block was written in.

"The code you provided is a shell script, commonly used in Unix-based operating systems. It uses the "set" command to set shell options, "-eo pipefail" means that the script will exit immediately if any command fails and that it will propagate any error code through a pipeline."

"The script then declares four local variables using the "local" command, which are used to construct the name and location of a file that will be downloaded using the "curl" command. The script creates a directory using "mkdir -p" and then downloads a file from a remote URL and saves it to a local file using the ">" operator."


You can check https://generator.jspm.io for generating these import maps against CDNs.


> "solid-js": "/node_modules/solid-js/dist/solid.js",

Wouldn't this require node_modules to be publicly accessible?


What about the production build? I guess importmap paths at least would need adjustment.

Not strong on the frontend, that's why I'm genuinely curious.


Rails 7 uses importmap by default!


Good write up and the entire website is pretty tight, good design and smooth


You can use it on your sites that will probably be used by 5 people. Good luck.


I'm a fan of this actually. Thanks for introducing me to importmap type!


Have you tried the same with remote imports to save the manual download step?


Doesn't work in Safari!


I am developing a media-heavy web app, and I was surprised at the volume of features that work in every other browser but not Safari. Or features that behave just differently enough in Safari to require code rewrites. In this regard, targeting Safari feels a lot like targeting IE 6 back in the early 2000s.


Maybe this works when working alone, but try it with a team.


Seems like an incredibly inefficient way to build a SPA. Fine for a small pet project, though, but if you plan to build a SaaS or something this is just silly.



