Slower processors with more efficient software? For as far as we've come with hardware, the actual user experience hasn't changed anywhere near as dramatically, because faster machines just let people push out slower software. It's the same way that hard drives moving from MBs to GBs to TBs just meant games and applications bloated to fill all available disk space, and websites bloated to consume the extra bandwidth as we went from 56k to broadband.



The reason why games are tens or hundreds of gigabytes in size is that modern textures are high-resolution and models are high in detail; if you took away high-capacity storage, then you'd just end up with worse-looking games. The reason why Web sites are big is largely images, videos, JS, and CSS; if you took away broadband, you'd end up with worse-looking and less functional Web sites. It's like the broken window fallacy in economics: the fact that having limitations gives smart developers an opportunity to work cleverly within them doesn't change the fact that limitations leave everybody worse off overall.


> The reason why games are tens or hundreds of gigabytes in size is that modern textures are high-resolution and models are high in detail;

not "the reason", just "a reason". A lot of it is just laziness. Not compressing audio and video or doing a very poor job of it. For example a Fortnite update took the game from 60GB to 30GB without making it look like garbage so how did they do it? Optimization. Why didn't they do it sooner? Because they didn't care.

> The reason why Web sites are big is largely images, videos, JS, and CSS; if you took away broadband, you'd end up with worse-looking and less functional Web sites.

Video and images are again "a" reason (and, again, often the issue is poor compression), but so are bloated JS frameworks, user tracking, and ads. Even very simple websites can grow larger than full novels (there are some good examples of this here: https://idlewords.com/talks/website_obesity.htm). You could cut the ads, the tracking, and the JS bloat without any impact on the content delivered or the functionality.
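
If you want a quick sanity check of that claim, here's a rough sketch (the URL is just a placeholder, and it only counts the HTML document itself, not the images, scripts, or trackers it pulls in, so it understates the real page weight):

    import java.io.InputStream;
    import java.net.URL;

    // Rough page-weight check: downloads one page's HTML and compares its
    // size to a plain-text novel (~500 KB is a common ballpark figure).
    public class PageWeight {
        public static void main(String[] args) throws Exception {
            String page = args.length > 0 ? args[0] : "https://example.com";
            long bytes = 0;
            byte[] buf = new byte[8192];
            try (InputStream in = new URL(page).openStream()) {
                int n;
                while ((n = in.read(buf)) != -1) bytes += n;
            }
            System.out.println(page + ": " + bytes
                + " bytes of HTML alone (a full novel is roughly 500,000 bytes of text)");
        }
    }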

People just don't want to put in the effort to lower file sizes, which is why people have to turn to things like repackers who can cut download sizes by more than half. Somehow they manage, and they do it for free no less, but game publishers can't?

We could do much better without sacrificing anything (that users care about) in the finished product. If we had to go back to slower processors, people would be forced to care enough to write better code and optimize for speed. At least until some new trick for faster speed was developed, at which point very little in our lives would be faster, but the code would be slower again.


Also, it does no good to compare the best software of the past with the average or worst software of the present. We tend to forget the average software of the past, and I would guess that most of us weren't exposed to the worst of it.

Inefficient software has been a problem for practically all of the history of personal computing. I'll illustrate with two anecdotes:

I was in high school when Office 97 came out. I only have a vague memory of this, but I do remember one of my classmates complaining that it was sluggish on whatever computer he had at the time.

The first commercial product that I shipped, in 2002, was a desktop application written in Java (though, as shipped, it only ran on Windows). I didn't do most of the original development; it was developed by an outsourced development shop, and then I had to take over the project. Whether on my underpowered 366 MHz laptop or my boss's much more powerful machine, the app took considerable time to start up, so much so that we put in background music during some of the startup process (the app was a self-contained audio environment for blind people, so that was our equivalent of a splash screen).

I never really dug into what caused the slow startup, but in hindsight, I would guess that it was the late-binding nature of Java, particularly the fact that classes had to be loaded individually as they were first used, often from multiple jar files, not to mention loading native code from multiple DLLs. The peanut gallery here may say the app should have been a statically linked native executable, but for all practical purposes, that would have meant C++, and if that had been a hard requirement, the app would never have shipped at all.

And while we struggled to sell that particular app, it did have some happy users (indeed, for some of the target users that we did manage to reach, the app was positively life-changing in its impact), so I don't regret that we shipped it in its inefficient state. If the same app were developed today in Electron, with any decent build setup, I'm guessing it would be up and running in a few seconds.
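
To make the late-binding point concrete, here's a minimal sketch (the class names are made up): the JVM doesn't initialize a class until it's first used, so an app that touches hundreds of classes across many jars pays that cost during startup. Running it with "java -verbose:class LazyLoadDemo" prints each class as it gets loaded.

    // Minimal sketch of lazy class initialization (names are made up).
    // Heavy's static initializer does not run until the class is first used.
    class Heavy {
        static { System.out.println("Heavy initialized"); }
        static void doWork() { System.out.println("Heavy.doWork()"); }
    }

    public class LazyLoadDemo {
        public static void main(String[] args) {
            System.out.println("main() running; Heavy not initialized yet");
            Heavy.doWork(); // first use: Heavy is loaded and initialized here
        }
    }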

Whether in the 90s, the 2000s, or today, most development teams have never had the resources to produce the highly optimized gems from the past that we so often pine for. But the software that we do manage to ship is used and even enjoyed anyway. And, to bring this back to the original discussion, advances in hardware enable more under-resourced teams to ship more useful and enjoyable software that would otherwise not exist.


> for all practical purposes, that would have meant C++, and if that had been a hard requirement, the app would never have shipped at all.

This is a really good point. Slow software certainly has its place. Not everything needs to be as optimized as possible. I don't think that the loss of speculative execution would put us back so far in terms of performance that it would hurt slower languages like Java or Python, but I think it might encourage putting more effort into optimization and probably create more interest in lower-level languages. It might even lead to new creative approaches to speeding things up. That said, I'd really rather processors stay fast if they can do it while still being secure.


I see it as a ladder of reducing complexity. There's you, power Joe, and regular Joe. You can write a program in assembly or C with syscalls and char pointers, but power Joe cannot. You both can write a program in C# or Java with their runtimes, but regular Joe cannot. All three of you can write Electron and PyQt.

If we hadn't climbed from KBs/MHz to GBs/GHz, only a few vendors could ship their software, and that would suck even more.

For some reason there is no simple compiled language with a simple but powerful runtime that could do what Electron does in KBs/MHz. It is not unrealistic, and I think the problem lies within us, in our methodologies and our tradition of overcomplicating everything we touch. So anyone who tries to build such a ladder has to cut through layers and layers of nonsensical abstractions, sacrificing performance and memory here and there, and only then do you get something that business people can use.



