The problem is mostly with images and JavaScript.
That is why I want the next-generation image format to push the quality-to-size ratio. I think JPEG XL currently has the best chance of succeeding JPEG.
We should have faster Internet (5G and fibre) for everyone, and better image compression for everyone. Hopefully in 5 to 10 years' time this problem will be a thing of the past, assuming we don't bloat every website into a web app that tries to download 10MB before it even loads.
No there's not. Web bloat is a problem, but it's not useful to compare media-rich audio-video capable applications with the lightly formatted, text-only, Kindle book format.
I'm comparing content to content. Yes, some of it is media-rich, but if I want media-heavy, I'm likely using Netflix/YouTube or similar anyway. If I'm using a web page, I actively don't want media-heavy content.
And you can get graphic novels on Kindle nowadays; to pretend it's just text is to undersell the format. Could I claim it's bloating? Certainly. It still has a long way to go before it reaches current web browser bloat, which only seems to be marching on, with no sign of restraint in what to pursue.
It’s also not useful for a web page you’re visiting for a few tens of kilobytes of text and a few images to be a media-rich audio-video capable application full of trackers and ads-that-might-be-malware.
Google says the average size of a Kindle book is 2.6Mb. Web pages are often bad but they're not that bad. If you regularly find Kindle books download faster than webpages I suggest you talk to your ISP.
You've picked some very heavy websites, and only two (or three, depending on Twitter) of your six examples are over the 2.6Mb threshold, but fair enough. Popular sites tend to be full of images, and images are usually quite big.
However, this is good, because those are great examples of how browser lazy loading is going to help. When I loaded Reddit it pulled down 7Mb of data, but more than 5Mb of that was images. Looking at the content above the fold, my browser downloaded about 4.5Mb that it doesn't need until I scroll. This change to the HTML spec will get all those sites' first loads below the average size of a Kindle book. Awesome.
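For context, the spec change being discussed is the loading attribute on img (and iframe) tags, so opting in looks roughly like this (filenames invented for illustration):

    <!-- Above the fold: fetched immediately, as today -->
    <img src="masthead.jpg" alt="Masthead">

    <!-- Below the fold: the browser defers the fetch until the image
         is about to scroll into view -->
    <img src="comment-photo.jpg" alt="Comment photo" loading="lazy">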
All the examples are indeed over 2.6Mb, as that's only 325kB. You probably mean 2.6MB instead, but since you're using Mb consistently and the discussion is about transfer sizes, I'm not sure.
My Kindle books download in a few seconds, and then they're loaded completely and I can jump around in them rather well. Properly linked books even let me jump from exercises to answers and back rapidly.
And there is some irony in the fact that I'm likely tracked heavily on what I've read, and certainly on what I note. So it isn't as if I'm clamoring for no scripts; I just find it odd that the push for web applications has destroyed the use of web pages.
Websites make a gazillion requests and run a bunch of JavaScript that is in their (not necessarily your) interest. Download size is not the only driver of slowness.
The majority of those requests aren't blocking, though, and nor is the JS code when it's parsed and run, so it won't have much of an impact on the parent's perception of how long a page takes to load. There will be exceptions on poorly made sites, but those are unusual these days. If you sit with devtools' Network tab open you'll see a ton of stuff going on, but most of the bad stuff happens after the page has rendered and become interactive.
Web apps are a different story, because they often load a couple of megabytes of JS before anything happens, but as long as things are being cached correctly that's only a problem occasionally.
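To illustrate the blocking point (a minimal sketch; the script name is made up): whether a script holds up first render mostly comes down to how it is included.

    <!-- Parser-blocking: HTML parsing stops until this downloads and runs -->
    <script src="tracker.js"></script>

    <!-- defer: downloads in parallel, runs only after the document is parsed -->
    <script src="tracker.js" defer></script>

    <!-- async: downloads in parallel, runs whenever it arrives -->
    <script src="tracker.js" async></script>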
Surely browsers could already do that without a lazy loading spec. I mean, I’m pretty sure browsers already queue requests so that if you load a page with 10,000 img tags it probably won’t do 10,000 simultaneous requests.
Not simultaneously, no, but it does queue them with some parallelism (6 requests per host for Chrome, IIRC).
The page basically dies; any Ajax requests are at the back of the queue. Scroll halfway down such a page and you'll be waiting five minutes for the images in the viewport to load, since they're processed in order. You can roll your own lazy loading, but it's a pain and often done poorly. A good browser implementation would be great for most pages (though 10k+ images might still require custom work).
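For reference, "rolling your own" today usually means something along these lines with IntersectionObserver (the data-src convention and the 200px margin are arbitrary choices for the sketch, not anything standardised):

    // Hand-rolled lazy loading: images are authored as <img data-src="...">
    // so the browser doesn't fetch them up front.
    const observer = new IntersectionObserver((entries, obs) => {
      for (const entry of entries) {
        if (!entry.isIntersecting) continue;
        const img = entry.target;
        img.src = img.dataset.src;   // kick off the real download
        obs.unobserve(img);          // each image only needs this once
      }
    }, { rootMargin: "200px" });     // start a little before it scrolls into view

    document.querySelectorAll("img[data-src]").forEach(img => observer.observe(img));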
If they're annoyed they'll go on the Internet and find out how to do it. Defaults should be useful to the many, and the few can reconfigure their UA.
Think about all the use cases you could build. Maybe you could later set "I'm on a metered connection" at the OS level, have your browser pick that up, and not overuse your metered connection, and so on. Maybe you could have that in a browser extension that you manually toggle. No, this is far too useful.
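For what it's worth, a small piece of this already exists: Chromium-based browsers expose the user's "data saver" preference to pages through navigator.connection.saveData (support elsewhere is patchy, so treat this as a sketch; the data-hires attribute is invented for the example):

    // Respect a "reduce data usage" preference where the browser exposes one;
    // navigator.connection is Chromium-specific, so default to false elsewhere.
    const saveData = !!(navigator.connection && navigator.connection.saveData);

    // Hypothetical pattern: only swap in heavy high-res variants when the user
    // hasn't asked to save data.
    document.querySelectorAll("img[data-hires]").forEach(img => {
      if (!saveData) img.src = img.dataset.hires;
    });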
I'm still skeptical. Is there any precedent for similar features being used today? The closest I can think of is turning off JavaScript, which, honestly, this seems made to combat. And I doubt that has penetration worth talking about.
And again: just don't put so many giant graphics on a page. Problem mostly solved, with less tech and likely faster results.
Please read the spec. Your accusation of this being intended to defeat turning off JS is dealt with here[0]. It's only a +188 -66 patch and eminently readable. If you are discussing this without reading it, this is going to be an unproductive discussion.
The size is a non sequitur. This is something that otherwise requires JavaScript. Full stop. No?
Now, I don't actually think this is being done to sidestep people who turn off JavaScript, mainly because I just don't think that's a market worth worrying about.
But I can't see why this is a feature we need. Progressive loading I could almost see. But by and large, high-resolution images are just not compatible with high-speed page loads, and I'm not seeing how this feature actually changes that.
The link is to the part of the diff that enforces that it behaves identically to current unannotated tags when JavaScript is disabled. Obviously, if it's designed to be no different from the current state when JS is disabled, then it can't combat disabling JS.
See, for me, disabling it would mean not eagerly loading anything. This literally puts the site into what is assumed to be its least optimized state if scripts are off. Ironic, as scripts-off should typically be faster.
If Chrome disables lazy loading by default, then websites will just continue to use JavaScript lazy loading as they do today. Disabling lazy loading in your browser is inherently something that can only be useful if it's not the default.