
Years ago, a DigitalOcean virtual server of mine stopped working because I had never upgraded Ubuntu to a newer major version. After a few years, that version of Ubuntu was no longer supported by the DigitalOcean hypervisor, and the droplet couldn't mount or boot at all.

In my experience, yes, you absolutely need maintenance. In the past I've had to upgrade from HTTP to HTTPS, upgrade the OS, upgrade to newer versions of external APIs and embedded components because the old ones were deprecated, handle a domain registrar shutting down, and then, yes, absolutely apply PHP updates and security upgrades, which then start giving you warnings because the less secure versions of functions are being deprecated...
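
To make the PHP churn concrete, here's a minimal sketch of the kind of breakage I mean, assuming a site originally written against the old mysql_* extension (deprecated in PHP 5.5, removed in PHP 7.0). The database, credentials, and schema are made up for illustration:

    <?php
    // A lot of ~2005-era PHP talked to MySQL like this. The mysql_*
    // extension was deprecated in PHP 5.5 and removed in PHP 7.0, so a
    // routine OS upgrade silently breaks every page that uses it:
    //   $link   = mysql_connect('localhost', 'user', 'pass');
    //   $result = mysql_query("SELECT title FROM posts WHERE id = $id");
    //
    // The minimal fix is mysqli with a prepared statement
    // (hypothetical 'blog' database and 'posts' table):
    $id   = (int) ($_GET['id'] ?? 0);
    $link = new mysqli('localhost', 'user', 'pass', 'blog');
    $stmt = $link->prepare('SELECT title FROM posts WHERE id = ?');
    $stmt->bind_param('i', $id);
    $stmt->execute();
    $result = $stmt->get_result();
    while ($row = $result->fetch_assoc()) {
        echo htmlspecialchars($row['title']);
    }

And that's just one extension; the same dance repeats for ereg_* (also removed in 7.0), magic quotes, and so on.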

And frequently, updating the one thing that's broken necessitates upgrading a bunch of other things, which in turn breaks still other things.

I literally cannot imagine how you would keep a PHP site running on a virtual server for 10 years without any maintenance. I need to address an issue roughly once a year.




These are all problems that shouldn’t exist. You have succinctly described the problems with modern IT. Software doesn’t need to have an expiration date; it doesn’t decay or expire. But because of our endless need to change things, rather than just fix bugs, we end up with this precarious house of cards.

If, as an industry, we focussed on correctness and reliability over features, a lot of these problems would disappear.


But the hardware does expire. Computers aren't just magically "faster" than they were decades ago; they're altogether different under the hood. An immense number of abstractions have held up the illusion of stability, but the reality is that systems with hundreds of cores, deep and wide caches, massively parallel SSDs, NICs, etc. require more specialized software than their much simpler predecessors to be used effectively. Feature bloat is a major annoyance, and running old software on new hardware can appear much faster, until it locks everything up, or takes forever to download a file, or can't share access to a resource, or thinks it has run out of RAM, or chews up a whole CPU core doing nothing, etc.


Yes, these problems shouldn’t exist. But they do.

One of the big strengths of the Web is Mozilla's commitment to "don't break the web".

But this is hitting its limits, because the scope of JavaScript is being expanded more and more (including things like filesystem APIs for almost arbitrary file access, as if we had not learned from Java Web Start that that's a rabbit hole that never stops gifting vulnerabilities). So to keep the new features safe despite their much higher security needs, old features get neutered.

I lost a decentralized comment system to similar changes.


I agree there's some truth in what you say. But I do think these upgrades are part of a path towards correctness and reliability (bug fixes, security vulnerabilities, etc.).



