Are there any documented cases of a business failing because their JS payload was too large? I get that smaller code is easier to understand and work with, but I've never been able to internalize the desire for a small payload -- it just doesn't seem like it ever matters outside of philosophical reasons for saving users' bandwidth (especially if it comes with a steep tooling cost).
It strikes me as a technical pursuit in search of a problem -- but certainly willing to be convinced otherwise.
My own customers have seen a 4x difference in conversion rate when grouped by page load time (i.e. the same pattern as the Walmart slide on page 37 above, where the peak group is 4x higher than the lowest-performing group).
Your business probably won't fail because it's slow, but it will certainly make less money because it's slow.
But in this context, the issue becomes how much JS bundle weight it takes to cause a 0.5-second delay. And there are many other things that can slow down page load time (and especially perceived page load time) that either don't involve JS at all, or aren't directly caused by the amount of JS that gets loaded.
Depends on your connection. I gave a talk about CLJS page speed and server-side rendering at the recent Clojure conj (https://www.youtube.com/watch?v=fICC26GGBpg). Trying to load my site on the conference wifi, 200kB took 3 seconds, i.e. about 60kB/s -- so at that speed, just 30kB of JS would already have been enough to cause a 0.5-second delay.
"Ok, that's not real world". Today, the cable guy is here fixing my home internet, so I'm tethering my iPhone 5s. Loading the .js for my site took 40kB in 143ms = 300kB/s, so 150kB would have been enough.
So the threshold is certainly less than half a meg of JS, which is a very common size to see in SPAs.
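To make that arithmetic explicit, here's a tiny sketch (purely illustrative; the throughput numbers are the two measurements above and the 0.5s figure is the delay discussed upthread):

```typescript
// Back-of-the-envelope JS budget: how many kB of script fit into a given
// delay at a measured download throughput.
function jsBudgetKB(throughputKBps: number, delaySeconds: number): number {
  return throughputKBps * delaySeconds;
}

// Conference wifi: 200 kB took ~3 s, i.e. roughly 60 kB/s.
console.log(jsBudgetKB(60, 0.5));  // ~30 kB before half a second is gone

// Tethered iPhone 5s: 40 kB in 143 ms, i.e. roughly 300 kB/s.
console.log(jsBudgetKB(300, 0.5)); // ~150 kB
```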
Certainly, there are tons of other things that can slow down a site. But the JS is one of the "easiest" to solve, because devs are responsible for it and it has well-known solutions that people just don't apply consistently: reduce dependencies, serve only one file, use webpack or the Closure Compiler, use a CDN.
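For the "apply the well-known solutions consistently" part, one low-effort option is to have the bundler enforce a size budget so regressions fail the build. A minimal sketch of a webpack config that does this (the 250kB threshold is an arbitrary illustration, not a number from this thread):

```typescript
// webpack.config.ts -- production build with a hard bundle-size budget.
import type { Configuration } from 'webpack';

const config: Configuration = {
  mode: 'production',              // minification enabled by default
  entry: './src/index.ts',
  output: { filename: 'bundle.js' },
  performance: {
    hints: 'error',                // fail the build instead of just warning
    maxAssetSize: 250 * 1024,      // per-asset budget, in bytes (illustrative)
    maxEntrypointSize: 250 * 1024, // total entrypoint budget (illustrative)
  },
};

export default config;
```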
I agree there's certainly some correlation, but we can fix e.g. "slow connections" and "slow computers" by not serving 9MB auto-playing hero videos (edit: when it's not appropriate for your user demographics).
I'm not in a position to intentionally slow down my customers, so I haven't run a controlled test. Google's test was controlled, though.
I don't know that it would result in a business failing, but there's certainly lost opportunity. Marks and Spencer is a fairly recent example: perhaps not JS payload specifically, but lower conversions related to site performance.
Considering the audience might help as well. For example, if rural U.S. dwellers are a demographic you want, many of them are on low bandwidth connections.
I think there are circumstances where it is very valuable - it depends on your target audience.
For example, on mobile, connection speeds in major cities in developed countries may be rocketing, but substantial areas still need to make a lot happen with very little bandwidth. So it can be a small competitive advantage.
See my reply to your sibling comment. You can measure very real changes in SEO, user engagement, bounce rate, conversion rate and sales from even small changes (100ms) in page load times. I have personally seen conversion rates vary by 4x based on page load speed.
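If you want to reproduce that kind of grouping on your own traffic, a rough sketch using the standard Navigation Timing API looks something like this (the bucket boundaries and the reportToAnalytics stub are hypothetical placeholders, not part of my setup):

```typescript
// Bucket real users by full page load time so it can later be joined
// against conversion data. reportToAnalytics is a hypothetical stub.
declare function reportToAnalytics(payload: { loadMs: number; bucket: string }): void;

window.addEventListener('load', () => {
  // loadEventEnd is only filled in after the load handler returns,
  // so defer the read with a zero-delay timeout.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];
    if (!nav) return;

    const loadMs = nav.loadEventEnd - nav.startTime;
    const bucket =
      loadMs < 1000 ? '<1s' :
      loadMs < 3000 ? '1-3s' :
      loadMs < 5000 ? '3-5s' : '>5s';

    reportToAnalytics({ loadMs, bucket });
  }, 0);
});
```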