
Over the past decade, software updates in general have mostly become another form of malware, as software makers care more about extracting money from their users or adding pointless shiny features than actually making their software fast, stable, or better able to serve the user.

As a result, my default policy now is "If it ain't broke, NEVER fix it." Install, then disable all updates, forcibly if necessary. On Windows and your browser at a bare minimum, and probably on any other software you use often or rely on. Yes, there are risks and downsides to doing this: security holes won't be patched, bugs won't be fixed, and new features won't be present. But everything in life is a balance. And on balance, the bad of updating is almost always worse than the good is good. This article highlights that in brilliant neon letters. How on EARTH do you justify boot and reboot times doubling or tripling, and just about every metric getting worse over time? Why would I let Microsoft fuck with my system if they're going to make it shittier?
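
For what it's worth, the "forcibly" part on Windows usually comes down to the old AU group-policy key. A rough sketch, assuming a Windows box where you have admin rights (and assuming you really want this, since it blocks security patches along with everything else):

    # Rough sketch: force-disable automatic Windows updates by setting the
    # NoAutoUpdate group-policy value. Needs admin rights, and it blocks
    # security patches along with everything else.
    import winreg

    AU_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, AU_KEY, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "NoAutoUpdate", 0, winreg.REG_DWORD, 1)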

And of course, the article leaves out the worst part. Never mind gradual performance drops, there's a good chance that update you just downloaded just broke something. Entirely. Something you NEED to work. When is a fix coming? Anyone's guess.

Install Windows 10 LTSB. It's the only one worth using. And unless you NEED a new feature or bugfix, never EVER update.




I can never relate to these posts. I always update all my stuff pretty much immediately when I see an update available. I run Windows, so I'm talking Windows Updates, browser updates, drivers, anything and everything. It actually gives me a pleasant feeling knowing I get bug fixes, security fixes, maybe some new feature every now and then. I'm definitely naive enough to hope for performance improvements rather than worry about performance regressions. Historically, it's extremely rare that an update messes something up for me.

Maybe I'm not as quick to update as I think I am, giving vendors time to fix broken updates before I get them? I dunno. I'm also privileged in that I update my hardware quite often. Maybe that hides any worsened performance from my perception.

I'm not sure if I understand your strategy correctly, but disabling (security) updates on Windows and browsers sounds like a recipe for absolute disaster. To me that sounds waaay riskier than whatever risk comes with installing (potentially broken) updates from MS/Mozilla/Google.


> I can never relate to these posts.

I can relate to them very well. I've wasted far too many hours cleaning up after one bad update or another. Windows and driver updates have been among the worst offenders. You could argue that the good updates might have protected me from malware that would have wasted even more, but I have no evidence to suggest this is the case.

As a result, I tend to be very binary about updates now. If it's something that involves direct contact with remote systems, it gets updated almost instantly, at least if the update is anything security related. Browsers, email clients, phones, publicly accessible servers, anything like that. The risk of not updating promptly in that situation is too high, even though I've seen many adverse changes when updating those kinds of products too. For most other things I use, if it's doing its job OK already, it probably gets updated if I have a specific reason to want a newer version and otherwise gets left alone.

I detest the modern trend for bundling essential updates like security patches together with other changes that users might not want, as the likes of Microsoft, Google and Apple all now do. Fixing a defective product is one thing. Changing it arbitrarily is something completely different.


And yet, I see this attitude pretty frequently in the software world. I too don't understand it (all my packages move to the latest dependencies as soon as possible). When a new version breaks you, things very rarely start magically working again on their own, and from there it's just a ticking time bomb until some random CVE comes around and makes your app exploitable in all sorts of interesting ways.

Yet so many software devs take the approach of "Well, this version works, so why bother with the next?".


It’s a good idea to keep updates enabled so that you get security patches, but it’s ridiculous that you have to do this. The industry seems to have given up on the idea of making finished software, so instead you get endless churn - the bugs and vulnerabilities are infinite because the bug fixes are mixed into the same update stream as new features which themselves come with new bugs…


Sony messed up a phone update a few years ago and it took them maybe two to issue another update to fix it. I can't remember what it was now, but it was annoying enough that I stopped updating until a few other people confirmed it was OK.

I've had the same happen with two Android apps as well, so I have auto-update turned off now and just go through the list a few times a year. If the changelog just says "bugfixes" but I'm not noticing any bugs, then I don't bother updating.


I used to find new technology to be exciting. Now, new technology (speaking about more than just software here-- appliances, cars, etc) causes me to instinctively respond with "What features, capabilities, ownership models, etc, are being taken away from me with this new thing?"

I assume it's a function of my getting older (44), to a large extent. It can't all be about me getting older, though. I can't be alone in thinking that a substantial portion of "advances" in existing technology have been for the benefit of manufacturers rather than owners. Surely "normal" people are starting to notice this too.

Edit:

I also think: "What capabilities that I already had are going to be sold as an add-on or, more likely, rented back to me on a recurring subscription?"

A close second is: "Will the thing be usable when the company goes bust and takes their 'cloud' infrastructure with them? Or how about when they decide not to update their thick-client mobile device app for new device OS's?"

A third is usually: "How will this new thing spy on me?"


Ever since Windows 10 came out, I don't think there will be a return to an era of general-purpose efficiency tools for the masses.

At some point, enterprise-grade software companies become inefficient in terms of fixed costs. They somehow need to make enough money to cover all the overhead on top that has nothing to do with the development and design of the product itself.

So, as long as a software-producing company is a public company with shareholders and market expansion is no longer possible...things necessarily _have_ to get crappier for end users so that the company can keep producing more money out of nothing.

You can see this everywhere in enterprise-level software companies. Microsoft, SAP, DATEV, just to name a few that incrementally stopped making their software better and instead artificially created lock-in for their end users so they can't switch in the future. Even if you argue that those companies serve many market segments, I'd go as far as adding all custom OEM Android apps to that list, as well as smart cameras, smart TVs, everything related to IoT, etc.

Most smart devices are about removing control from users and gradually forcing them to pay more money later for the same feature set, without them noticing at first.

Now with Google FLoC we will basically have the same thing applied to the web, and it won't go back to the good old "just block cookies" ways. Google will keep tracking everybody, and they have the leverage to do so.


I've been the same since Android removed call recording capabilities.

Internet feels slower each year and the screen space dedicated to content and not ads or popups shrinks every day.

It's like in Idiocracy, where the guy has a huge screen but maybe a tenth of it is for the content.


I'd forgotten that TV UI from Idiocracy. Now that I see it it certainly does seem a bit prescient: https://scifiinterfaces.com/idiocracy-tv/

(Then again, sadly, a lot of that movie seems prescient...)


I don't remember your name popping up as my HN doppelganger the other day, but you could have been me writing that comment.

I don't think it's because we're getting older so much as because we can remember a time when new technology didn't suck like this and so we see straight through the rationalisations and marketing doublespeak.

We live in a time when a right to repair can be controversial, copyright is being distorted to limit what users can do with tech they have bought and paid for in ways that have nothing to do with giving copies to other people, and the working life of equipment is being dramatically shortened by software limitations or a lack of ongoing fixes for software defects from the manufacturer. There should have been laws protecting ordinary people against these kinds of dangers long ago.


I'm young (20) and I already have this negative view. I used to be excited about how fast computers would be. How good graphics would become.


In that case, install Debian Stable and stick to Firefox ESR. Nothing will ever change without warning, and you will have the most blissfully boring user experience.


Debian is great; personally I favour it most with either the LXDE or XFCE desktops - they're blissfully boring and functional!

I would perhaps have recommended Ubuntu LTS to some folks previously because of the long release/support cycle (even if you only decide to install security updates), but I guess with software packages like snap infecting the OS, I can no longer make that recommendation.

Previously I would have suggested that some folks also look at CentOS because it's wonderfully stable and releases are supported for ~10 years, but I guess all of that was ended by Red Hat with CentOS 8. Maybe Rocky Linux will once again provide a stable RPM distro for free, without resorting to Oracle Linux, but only time will tell.

Is it just me, or have many once-stable OS releases been killed off in one way or another in the past decade, either forcing people to migrate to paid projects, forcing automatically updated software that can't really be controlled easily (snaps) upon them, or engaging in other shady practices for no discernible reason?

That said, I personally also only update software (like Nextcloud, GitLab, OpenProject etc.) manually between larger releases, once I have made and re-checked backups of all of the data, before archiving the old versions and then migrating over a copy. I'm not sure whether I could live that way with OS updates or versions, though, without absolutely minimizing the attack surface - maybe with something like a locked-down Alpine Linux.
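
In practice that routine looks roughly like this sketch (the paths and the upgrade command are hypothetical placeholders, not real Nextcloud/GitLab tooling):

    # Sketch of the backup -> re-check -> upgrade routine described above.
    # DATA_DIR, BACKUP_DIR and UPGRADE_CMD are hypothetical placeholders.
    import datetime
    import pathlib
    import subprocess
    import tarfile

    DATA_DIR = pathlib.Path("/srv/myapp/data")           # placeholder
    BACKUP_DIR = pathlib.Path("/srv/backups")             # placeholder
    UPGRADE_CMD = ["echo", "run the real upgrade here"]   # placeholder

    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = BACKUP_DIR / f"myapp-{stamp}.tar.gz"

    # 1. Make the backup.
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(DATA_DIR, arcname=DATA_DIR.name)

    # 2. Re-check it: a backup you haven't verified is not a backup.
    with tarfile.open(archive, "r:gz") as tar:
        if not tar.getnames():
            raise SystemExit("backup archive is empty, aborting upgrade")

    # 3. Only then run the upgrade.
    subprocess.run(UPGRADE_CMD, check=True)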

Either way, it feels like perhaps automatic updates that can't even be controlled being forced upon users are the inevitable future. It's nice that there's Debian, but my question would be: "How long before it goes the path of Ubuntu?"


You write as if that would be a bad thing. Debian is probably the gold standard for reliability, longevity and user control today. If all software was so well managed, most of us would be better off. If there was also a good process for updating application software to a new major version if you wanted it, that would be ideal.


Even with Win10 LTSB there are updates. The last update cost me two days of fixing my other apps again.


I understand that it sucks to deal with bugs that may ship with updates. That said, "never update" is both bad and irresponsible advice. It is important to keep your systems up to date, even if you choose to lag your updates by a week or a month.

Security is extremely important and in some cases can save your personal information and money from being needlessly stolen.

I can empathize with the slowdowns, bugs and deprecations that may occur, but I can never agree that never updating is a good alternative.
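
For anyone taking the "lag by a week or a month" route on Windows, the lag can be set as policy instead of fighting the updater by hand. A rough sketch, assuming admin rights and the standard Windows Update for Business deferral values:

    # Rough sketch: defer quality (monthly) updates by 30 days instead of
    # disabling them outright. Needs admin rights; the value names are the
    # Windows Update for Business deferral policies.
    import winreg

    WU_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, WU_KEY, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "DeferQualityUpdates", 0, winreg.REG_DWORD, 1)
        winreg.SetValueEx(key, "DeferQualityUpdatesPeriodInDays", 0,
                          winreg.REG_DWORD, 30)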





