
A "fast website" is super-easy to create if you don't add dozens of megabytes of useless crap to each page.

Two decades ago, hardware was slower, bandwidth was far more constrained, and browsers had fewer features and consumed fewer resources --- and yet pages often loaded faster than they do today!

Indeed, most of the "problems" web developers complain about are self-inflicted.




Things weren't just slower; they were orders of magnitude slower. It's ridiculous how we've managed to make computers ludicrously fast, and the developer response has been to keep adding garbage until they're slow again.


Same with disk sizes and game file sizes: it's common to see 60 GB+ installs for AAA titles.


Games in particular take up space primarily because of art assets. A high-fidelity modern AAA game full of thousands of 4K+ textures, super-high-poly models, detailed animations, and hours of high-quality audio is going to take up a lot of space. There are certainly optimizations to be had, but there's no way a 60 GB+ game today would ever fit on, say, a DVD, at the same level of fidelity while maintaining the same player experience.

Higher compression means longer load times, so there's an incentive not to compress more than absolutely necessary. And the biggest games tend to have huge worlds, which means you can't hold the whole world in memory at once, so you have to stream it in.

It's a constant balancing act between your nominal hardware target, the space the game will take up, up-front load times, and the amount of stuff that can be in a scene before the hardware can't keep up and you get model/texture streaming pop-in, stuttering, or an otherwise degraded player experience.
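To make that balancing act concrete, here's a minimal TypeScript sketch of budget-limited chunk streaming; every name in it (ChunkId, loadChunk, the byte budget) is hypothetical rather than taken from any real engine:

    type ChunkId = string;

    interface Chunk {
      id: ChunkId;
      bytes: Uint8Array; // decompressed geometry/texture data would live here
    }

    class ChunkStreamer {
      private resident = new Map<ChunkId, Chunk>();

      constructor(
        private loadChunk: (id: ChunkId) => Promise<Chunk>, // disk read + decompress
        private budgetBytes: number,                        // memory budget
      ) {}

      // Called periodically with the chunks the camera can currently see.
      async update(wanted: ChunkId[]): Promise<void> {
        // Evict chunks that are no longer needed, freeing budget first.
        for (const id of [...this.resident.keys()]) {
          if (!wanted.includes(id)) this.resident.delete(id);
        }
        // Load missing chunks until the budget is exhausted; whatever
        // doesn't fit is what the player sees as pop-in.
        for (const id of wanted) {
          if (this.resident.has(id)) continue;
          const chunk = await this.loadChunk(id);
          if (this.usedBytes() + chunk.bytes.length > this.budgetBytes) break;
          this.resident.set(id, chunk);
        }
      }

      private usedBytes(): number {
        let total = 0;
        for (const c of this.resident.values()) total += c.bytes.length;
        return total;
      }
    }

The evict-then-load ordering is the whole game: anything that doesn't fit in the budget this frame is exactly what shows up as pop-in.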

Perhaps in the next few years we'll see games use AI super-resolution to produce usably high-res textures on the fly from lower-res installed ones, faster than a directly compressed equivalent could be decompressed... or use the same technique to take what they already ship and make it even more detailed...


> but there's no way a 60 GB+ game today would ever fit on, say, a DVD, at the same level of fidelity while maintaining the same player experience.

I mean, you could make a modern open-world game where all the textures are procedurally generated from set formulae, à la https://en.wikipedia.org/wiki/.kkrieger .

It might even load quickly, if the texture generation algorithms were themselves parallelized as compute shaders.
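For a flavor of what "textures from set formulae" means, here's a toy TypeScript sketch (a simple plasma-like pattern, emphatically not how .kkrieger actually does it):

    // Ship a tiny function instead of pixels, and evaluate it at load time.
    function generateTexture(size: number): Uint8Array {
      const pixels = new Uint8Array(size * size * 4); // RGBA
      for (let y = 0; y < size; y++) {
        for (let x = 0; x < size; x++) {
          const u = x / size;
          const v = y / size;
          // A couple of layered sinusoids stand in for a real noise stack.
          const value =
            0.5 +
            0.25 * Math.sin(u * 32 + 4 * Math.sin(v * 8)) +
            0.25 * Math.sin(v * 32 + 4 * Math.sin(u * 8));
          const byte = Math.max(0, Math.min(255, Math.round(value * 255)));
          const i = (y * size + x) * 4;
          pixels[i] = pixels[i + 1] = pixels[i + 2] = byte; // grayscale RGB
          pixels[i + 3] = 255; // opaque alpha
        }
      }
      return pixels;
    }

    // A 4096x4096 RGBA texture is 64 MB of pixels, but only a few lines on disk.
    const texture = generateTexture(4096);

Every pixel here is computed independently of every other, which is exactly why this kind of generation maps so naturally onto a compute shader.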

But it'd be a 100% different approach to an art pipeline, one where you can't just throw industry artists/designers at the problem of "adding content" to your world, but instead have to essentially hire mathematicians (sort of like Pixar does when they're building something new). Costly!


I don't know if you're being ironic or not. We would obviously not have as many great games if only math nerds were allowed to design them.

In fact, games are one of the few areas where all those compute/storage resources in personal PCs are mostly justified.


I’ve shipped almost-pure-HTML landing pages (single-digit kilobytes of total JS, none of it render-blocking) in the last few years, and inline CSS and preload headers help a lot, especially for older devices. Expectations for image/video resolutions in particular have gone up since “the good old days”. You can really see the effect on much older devices, which become the majority once you look outside the US and EU.
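For the curious, both techniques fit in a few lines. A minimal sketch using Node's built-in http module (the file names are made up):

    import { createServer } from "node:http";

    const page = `<!doctype html>
    <html>
      <head>
        <style>/* critical CSS inlined: no render-blocking stylesheet fetch */
          body { font-family: system-ui; margin: 0 }
        </style>
      </head>
      <body><img src="/hero.avif" alt=""></body>
    </html>`;

    createServer((_req, res) => {
      res.writeHead(200, {
        "content-type": "text/html; charset=utf-8",
        // Preload header: the browser can start fetching the hero image
        // while the HTML is still arriving.
        "link": "</hero.avif>; rel=preload; as=image",
      });
      res.end(page);
    }).listen(8080);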

IME, most people trying to optimize their way out of tag-manager and monolithic-SPA hell don’t bother with these kinds of features, beyond turning on all the Cloudflare rewriting and praying. If performance were truly important to them and they knew what they were doing, they’d fix those problems first.


This is like when people complain about legacy code the first time they see it, without understanding the context it was written in. Or like saying C would be safe if everyone were just more careful and more skilled.

It's just not true that it's super easy to write fast pages. There's a huge amount of background you need to understand to optimize fonts, images, CSS, your server, your CMS, caching, scripts, etc. There are multiple approaches to everything, each with different tradeoffs.
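Caching alone illustrates the point. Even the textbook split below (a sketch using Node's http module; the paths and lifetimes are illustrative) only works if your build pipeline hashes asset filenames, and getting it wrong either ships stale pages or throws away performance:

    import { createServer } from "node:http";

    createServer((req, res) => {
      if (req.url?.startsWith("/assets/")) {
        // Filenames carry a content hash (e.g. app.3f9a1c.js), so these
        // responses can never go stale: cache them aggressively.
        res.setHeader("cache-control", "public, max-age=31536000, immutable");
      } else {
        // HTML must be revalidated on every load so deploys show up promptly.
        res.setHeader("cache-control", "no-cache");
      }
      res.end("...");
    }).listen(8080);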

Even if you have the skills to do this solo, you might not have the budget, the time, or the approval of management. Big websites also require you to collaborate with developers who have different skill sets and different goals, and you need to keep people in sales, graphic design, SEO, analytics, security, and other roles happy too.


By my recollection, typical page loads were much slower two decades ago than they are today, unless you were lucky enough to have something better than 56k dialup.



