Shouldn't we have more devices and more connection types for a more controlled experiment?
It's always 4G, mobile Chrome, and (I assume) the same device.
Very likely the same carrier at the same place, so roughly the same connection conditions in terms of latency, DL/UL bandwidth, and jitter. Also always the same device with the same CPU/GPU. Perhaps a shiny new flagship phone with a super-fast SoC that gives JS execution a head start? Or perhaps a very spotty, barely-one-bar 4G connection? (Just assumptions, maybe both are false, but you get the idea.)
I'm a big fan of client-side generation using JS too, but I don't think this experiment covers many practical scenarios.
If we saw more connection types and a wider variety of devices with different CPUs, it'd be more convincing.
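To make it concrete, here's a rough sketch of how the same page could be re-measured across a small matrix of emulated networks and device speeds. This assumes Node with Puppeteer installed; the URL, throughput numbers, and CPU slowdown factors are made-up placeholders, not the article's actual setup.

    // Sketch only: measure one page under several emulated networks/CPUs.
    const puppeteer = require('puppeteer');

    // Hypothetical test matrix: throughput in bytes/s, latency in ms.
    const networks = {
      'decent-4g': { download: 9 * 1024 * 1024 / 8, upload: 9 * 1024 * 1024 / 8, latency: 170 },
      'spotty-4g': { download: 400 * 1024 / 8, upload: 400 * 1024 / 8, latency: 400 },
    };
    const cpuSlowdowns = [1, 4, 6]; // 1 = fast SoC, 6 ≈ low-end phone

    (async () => {
      const browser = await puppeteer.launch();
      for (const [name, conditions] of Object.entries(networks)) {
        for (const slowdown of cpuSlowdowns) {
          const page = await browser.newPage();
          await page.emulateNetworkConditions(conditions); // throttle the network
          await page.emulateCPUThrottling(slowdown);       // throttle the CPU
          const start = Date.now();
          await page.goto('https://example.com/article', { waitUntil: 'networkidle0' });
          console.log(`${name}, ${slowdown}x CPU: ${Date.now() - start} ms`);
          await page.close();
        }
      }
      await browser.close();
    })();

Even a small matrix like that would show whether the gap holds up on slower radios and weaker SoCs, or whether it's an artifact of one configuration.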
Measuring the speed of a page rendered on an iPhone 14 on an mmWave 5G connection a foot from an antenna is not a worthwhile test. If it takes 5 seconds for Twitter to load a tweet (which it does on my iPhone 12 Pro on WiFi), is that somehow better? A tweet, famously limited to 140 characters, takes 5 seconds to load?
A news article or a tweet takes way too long to load on my phone; it's just ludicrous that on a mid-range phone and connection it would take 45 seconds! A copy of Frankenstein [0] (~78k words) weighs in at 463 KB. A random CNN article or tweet is not a damn copy of Frankenstein. There's no reason either should take more than a second to load and render.
An HTML document with a bare minimum of CSS to not be ugly has enough information to render and be useful to a user, and it can do that with a single request to a server. At a minimum, the same page rendered with JavaScript needs two requests to a server. It also has a higher threshold for displaying something to the user, because the JavaScript needs to be downloaded, parsed, and interpreted/JIT-compiled before the requests for the useful resources can even be made. All to do things a browser will already do for free.
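To illustrate, here's a toy sketch of that client-rendered path (the /api/article endpoint and its payload shape are invented for the example); nothing useful paints until the shell (request 1), the script (request 2), and the data (request 3) have all arrived and the script has run.

    // Shell served by the origin (request 1), e.g.:
    //   <body><div id="app"></div><script src="/app.js"></script></body>

    // app.js (request 2) -- the user sees nothing useful until this has
    // been downloaded, parsed, and run, and the data (request 3) arrives.
    async function render() {
      const res = await fetch('/api/article');   // extra round trip for the content
      const article = await res.json();
      document.getElementById('app').innerHTML =
        `<h1>${article.title}</h1>` +
        article.paragraphs.map(p => `<p>${p}</p>`).join('');
    }
    render();

The server-rendered equivalent ships that <h1> and those <p> elements in the first response, so the browser can start painting as soon as the HTML arrives, with no script on the critical path.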
There are full JavaScript applications that can't be built with just HTML and CSS; of course they need to load and run the JavaScript. But a tweet or a news article is not an application. Neither needs to load the equivalent of a copy of Doom to display a dozen paragraphs of text, or just 140 characters of it. The modern web's obsession with JavaScript everywhere is asinine.
4G on a Moto is basically the worst-case scenario, but it's also how half the world interacts with the internet at large. If you're going to pick one scenario that describes a lot of users, they're pretty much dead on.
I think 4G doesn't tell the whole story, especially for businesses that target users in specific conditions (e.g. tourism, where your users are on poor, unstable 4G) or specific markets with poor average mobile connections.
Some of the tests in the article are run on Desktop Chrome using a "cable" connection speed instead of 4G, which appears to have about a 6x faster round-trip time than their 4G profile does. Those results are a little less striking but still significant (still many seconds faster in some metrics).
More testing environments would just show the results being more or less pronounced, as you'd expect.
In ideal browsing conditions the impact will be smaller, and on the spotty, barely-one-bar connection you mention, the difference would be much more dramatic than in the 4G examples in the post.