Perhaps what's surprising to non-programmers is that older code tends to be better/faster/more optimized than new code, which is counterintuitive because new hardware is orders of magnitude more capable.
It's an inverse relationship. As the "just release a patch" method of development takes over, the idea of a 'gold master' disappears. When software shipped on floppies and no one had modems, it generally had to work 'well enough' out of the box because it was not easy to fix afterwards.
Also, older software tended to be better documented because there was no 'fire off an email' or 'ask Bob on Slack' to find out what a piece of code means. The barriers to quick communication were higher, so reading the code comments was the path of least resistance.
I completely agree. I wonder if there would be a niche for high-performance languages that are maybe a little harder to write than Python but still easy, and closer to the performance of C.
“Better” in the sense of often applying rather absurd hacks for the sake of performance, yes; hacks that very often also compromised the quality of the end result.
In particular, some old video games feature absolutely wonderful hacks to achieve what they wanted with the available hardware, but those hacks also often led to the engines having to make particular compromises in what environments could be designed, compromises that many players never even noticed.
Many older game engines were actually incapable of stacking walkable planes on top of each other: though the levels were three-dimensional and one could ascend and descend, there was never any walkable surface directly under another, a restriction that made certain optimizations possible.
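To make that concrete, here is a minimal sketch of the kind of "2.5D" map representation such engines roughly used (hypothetical C, loosely modelled on Doom-style sector maps, purely for illustration): the level is indexed only in two dimensions, and each footprint stores a single floor and ceiling height, so "room over room" simply cannot be expressed.

    #include <stdio.h>

    /* Each sector has exactly one floor and one ceiling height. */
    typedef struct {
        int floor_height;
        int ceiling_height;
    } Sector;

    /* The whole level is a flat 2D grid of sectors; there is no third
     * index, so two walkable floors can never share the same footprint. */
    #define MAP_W 4
    #define MAP_H 4
    static Sector level[MAP_H][MAP_W];

    /* One answer per (x, y), by construction. */
    static int walkable_height(int x, int y) {
        return level[y][x].floor_height;
    }

    int main(void) {
        level[1][2].floor_height = 64;    /* a raised platform */
        level[1][2].ceiling_height = 192;
        printf("walkable height at (2, 1): %d\n", walkable_height(2, 1));
        return 0;
    }

With only one floor height per (x, y), a bridge you can walk both over and under is impossible to represent, but lookups and rendering stay extremely cheap, which was the whole point of the restriction.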
Indeed. For anyone who grew up in the industry after the Internet was widely available, it might be hard to imagine how much effort was spent on making sure the code was correct before shipping.
Shipping meant producing that gold master build and sending it off to a floppy replication service, from which it got distributed to store shelves. That 1.0 version was frozen until you shipped a new version a year or more later. There was no second chance for that release; the code had to be as close to perfect as possible the first time out.
> Is it arrogance to say that medicine is far better than it was in the past?
Irrelevant. Medicine and software development are not correlated just because we want them to be. That's the point. Software development results seem to have devolved on quality metrics (binary size for equivalent reliability and functionality, and notably time to release), despite improvements in almost every aspect of development tooling.
Also interesting to note: these quality metrics follow the expected curves (accounting for development tooling) when DUPLICATING (sometimes with small improvements) working software. This tells us a great deal about where a big part of the problem with modern software lies.
People expect many more features from modern software than before, and accept worse reliability because of the faster bug-fix cycle.
Sometimes even playing music doesn't work on my mobile phone, and I have to restart it. I never had that problem with my cassette player 30 years ago. But that doesn't mean I'll go back to using a cassette player instead of my phone for listening to music.
Interesting that "avoids causing unnecessary costs" is not a quality metric to you. It's fun to see how it can suddenly become a really really important metric on platforms with size limitations, even if they do not care about "cost to users", which is the more common effect of binary size.
That cost is frequently swamped by development costs, runtime costs, and support costs. Platforms with super restrictive size limitations are a small minority; these days you can plop a 5-10 MB binary on your toaster.
To the vast majority of software developers and software projects out there, except for very constrained embedded environments.
Server back ends don't care about it, front ends don't care about it despite the lip service paid, games sure as hell don't, etc.
A small minority of web devs seem to care, together with creators of already gigantic mobile apps and, as I was saying, embedded devs working in very constrained environments.
The market, as a whole, maybe has this as its 100th priority. Which means, in effect, it's not one at all.
This is often the excuse we hear for why modern software is bad: "well, normal people don't care about that", a strawman argument that implies anyone who does care is abnormal and therefore worthless.
> anyone who does care is abnormal and therefore worthless.
No, they're not worthless, they're worth less :-)
Don't tell me, tell the business people financing everything we build. By and large, they don't care; it's reflected in the incentives and it's reflected in what we build.
> Don't tell me, tell the business people financing everything we build. By and large, they don't care; it's reflected in the incentives and it's reflected in what we build.
Another common scapegoat for developers who write bad software. I just wish we had a little more integrity as an industry, especially since so many of us insist on calling ourselves "engineers".
To use a more extreme example, regular engineers build tanks and jet fighters and drones and machines that make mustard gas, etc. The software world is just a reflection of the outside world, at this point.
I don't think that's quite the point they're making. They're saying that small binary size doesn't always translate to user value. Technical excellence can be viewed as a goal in its own right, but it's not the same thing as user value.
Yeah, for 99% of software produced, smaller binary sizes do not offer increased user value.
Software designed to spec and user requirements does (where those requirements could be fast development time, high performance for whatever the performance criteria are, longevity, easy and cheap extensibility, etc.).
Binary size, as I was saying before, is something like number 100 on the list of priorities for most categories of software. Mobile apps sometimes have it as a priority and embedded apps frequently have it, too. But for mobile apps those limits are loosening, so soon they'll stop caring, too, and embedded apps are a minuscule percentage of all apps developed out there (most apps are web apps, especially Line of Business (LOB) apps).
Computer science has evolved a lot; we have self-driving cars on the road. Of course computing power has improved enormously, but deep learning algorithms are also improving faster than Moore's law.
It's just that improving classical computer science has fewer practical benefits than machine learning nowadays, so the focus of research has changed.