
Web pages are in fact that bad. Testing just now:

* https://www.cnn.com/ is 1.3MB of data.

* https://www.nytimes.com/ is 5MB of data.

* https://www.reddit.com/ is 6MB of data.

* https://www.google.com/ is 400KB of data.

* https://www.facebook.com/ (not logged in) is 2MB of data.

* https://twitter.com/home/ is 1-3MB of data depending on the ads it decides to show.

Those are all on-the-wire sizes, so after gzip compression and whatnot.
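
For anyone who wants to reproduce these numbers, here is a minimal sketch of one way to measure on-the-wire page weight, assuming a recent Puppeteer and the Chrome DevTools Protocol; the pageWeight() helper and the example URL are just for illustration. The encodedDataLength field is the compressed transfer size, i.e. the figure quoted above.

    // Sketch only: sum the compressed transfer size of every request a page makes.
    // Assumes a recent Puppeteer; pageWeight() and the example URL are illustrative.
    import puppeteer from 'puppeteer';

    async function pageWeight(url: string): Promise<number> {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      const cdp = await page.createCDPSession();
      await cdp.send('Network.enable');

      let totalBytes = 0;
      // encodedDataLength is the on-the-wire (post-gzip/brotli) size of each response.
      cdp.on('Network.loadingFinished', (e: { encodedDataLength: number }) => {
        totalBytes += e.encodedDataLength;
      });

      await page.goto(url, { waitUntil: 'networkidle0' });
      await browser.close();
      return totalBytes;
    }

    pageWeight('https://www.cnn.com/').then((b) =>
      console.log(`${(b / 1024 / 1024).toFixed(1)} MB transferred`));

This counts subresources (images, scripts, ads) as well as the HTML document itself, which is why the totals are so much larger than the pages' markup alone.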




You've picked some very heavy websites, and only two (or three, depending on Twitter) of your six examples are over the 2.6Mb threshold, but fair enough. Popular sites tend to be full of images, and images are usually quite big.

However, this is good, because those are great examples of how browser lazy loading is going to help. When I loaded Reddit it pulled down 7Mb of data, but more than 5Mb was images. Looking at the content above the fold, my browser downloaded about 4.5Mb that it doesn't need until I scroll. This change to the HTML spec will get all those sites' first load below the average size of a Kindle book. Awesome.
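
For context, the spec change being referenced lets pages opt individual images into native lazy loading via the standard loading attribute on img. Here's a rough sketch of how a site might apply it from a script; the data-below-fold selector is a made-up convention for this example.

    // Sketch: opt below-the-fold images into native lazy loading.
    // The data-below-fold marker is a hypothetical convention for this example.
    function enableLazyImages(): void {
      // Feature-detect; unsupported browsers simply keep loading eagerly.
      if (!('loading' in HTMLImageElement.prototype)) {
        return;
      }
      document
        .querySelectorAll<HTMLImageElement>('img[data-below-fold]')
        .forEach((img) => {
          // The browser now defers fetching until the image nears the viewport,
          // so off-screen images stop counting against the first load.
          img.setAttribute('loading', 'lazy');
        });
    }

    document.addEventListener('DOMContentLoaded', enableLazyImages);

In practice you'd usually write loading="lazy" straight into the markup; the script form just makes the feature detection explicit.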


All the examples are indeed over 2.6Mb, as that's only 325kB (2.6 megabits ÷ 8 bits per byte ≈ 0.325 megabytes). You probably mean 2.6MB instead, but since you're using Mb consistently and the discussion is about transfer sizes, I'm not sure.



If you felt mistreated, I'm sorry. But it was a genuine question.




