Ask HN: Do you think software quality is going downhill? If so, why?
24 points by bazil376 8 months ago | 30 comments



No. Rose-colored glasses. Back when software was purchased in shrink wrap, quality was definitely a crapshoot. It was normal to reboot your DOS machine because the memory hack you used for one program didn't work for another (himem.sys and emm386.exe). FAT required regular defragmentation and integrity checks. You rebooted just about every day up to the early 2000s because any desktop OS of the time became more and more unstable with memory leaks as the days wore on. Games were buggy, and there were no regular patches you could just download over your 2400 baud modem.

Software is more complex than ever before by a wide margin, but for every piece of Electron bloatware there's a ton of sleek Swift-driven amazement. And speaking of Electron, as maligned as it is, those apps are still Unicode-compatible, networked, and self-updating as table stakes.

There are some amazing old programs out there (have you seen the engineering behind coreutils?), but things are most definitely not worse off today.


Absolutely, yes. For example, Windows peaked at Windows 7. These days one just cannot expect software and websites to be usable even on the happy path. It may not crash, as others have pointed out; it just does not accomplish the task satisfactorily. With very few exceptions.

I see two main reasons. First, too much reliance on automated testing, which may catch crashes but does not guarantee good UX per se.

Second, the quality of recent talent in the industry. Too many people are in it for money, and could not care less about craftsmanship.


You could argue that the demand for software developers dilutes quality. When we're desperate for bodies, that's going to mean we settle for inexperience.


One time my dad installed a new driver on a Windows 95 machine that somehow broke the sound card output. Sound on that computer didn't work again until I re-installed the OS. Rebooting that computer meant walking away and finding something else to do for a while.

The Macs I got to use in school felt inhumanly slow. I never got to use one of the fun colored Macs, but I heard those were a big step up.

Windows XP was a big improvement. Downloading and installing SP2 and SP3 was daunting on a 56k connection. Still, the OS was plagued by malware. I remember running various malware scans constantly.

Mac OS X on the first Intel-based Macs was a massive step up from Windows XP. It converted me from a die-hard Windows fan to a Mac user. I still prefer Macs.

For a while I used Windows 7 and it was OK. Better than XP and everything that came after 7, but still not that great in my opinion. One thing that drove me crazy was fixing fuzzy text issues in Windows 7.

Not long ago I re-installed Mac OS X 10.4 (Tiger) on my 2007 MacBook. I was surprised by how snappy it was compared to my 2018 MacBook Air, and even my M1 Pro. To this day the computer I miss the most is my 2013 MBP.

I would argue that Mac OS kept getting better until 2013-2017-ish. Windows peaked at Windows 7. Linux desktops struggle to compete with Windows. The operating system plays a huge role in user experience. Declining UX quality in operating systems makes the good stuff not as good and makes the bad stuff really stand out.


No, software is not going downhill. Rather, it has reached the bottom of the hill.

The incentives are completely upside down. The chief "innovation" of the past decade has been to shift software from being something that at least theoretically worked in the interests of its users to something that works in the interest of a burgeoning digital surveillance industry.

The field has always been ruled by a kind of fashion. But with Metcalfe's law plus the new mainstream popularity of the industry, the definition of "success" has become whatever has the most advertising money faking social proof for it. In other words, what is expected to be the most lucrative to exploit. Metrics now revolve around things like how much of users' time has been wasted, or whether they were tricked into buying something.

And to continue the landscape analogy - with remote attestation, software is now poised to be pushed into a canyon. Widespread computational disenfranchisement: corporations can wield software against individuals, but individuals will no longer have even an escape hatch of using software to cut through that generated complexity.


I switched to the Go ecosystem a couple of years ago and I have to say, I like it. At first I didn't like a lot of the opinions that Go enforces on you, but nearly all Go software adheres to the single upstream standard: one code style, one unit testing convention, one build system, etc.

The beauty of it is that a lot of Go devs at some point decide to build pure Go libraries for performance gains (especially in the networking area I work in), and I love it.

There are a lot of very well maintained libraries out there, and the golang.org/x packages are just the start.
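For anyone who hasn't seen those conventions up close, here's a minimal sketch (the package stringutil and function Reverse are made up for illustration). The toolchain itself defines where tests live and how they run; no third-party test framework is needed:

    // stringutil.go: a hypothetical library package.
    package stringutil

    // Reverse returns s with its runes in reverse order.
    func Reverse(s string) string {
        runes := []rune(s)
        for i, j := 0, len(runes)-1; i < j; i, j = i+1, j-1 {
            runes[i], runes[j] = runes[j], runes[i]
        }
        return string(runes)
    }

    // stringutil_test.go: `go test` automatically finds any file
    // ending in _test.go and runs every func named TestXxx(t *testing.T).
    package stringutil

    import "testing"

    func TestReverse(t *testing.T) {
        if got := Reverse("hello"); got != "olleh" {
            t.Errorf(`Reverse("hello") = %q, want %q`, got, "olleh")
        }
    }

Format with gofmt, test with go test, build with go build. Every Go repo works the same way, which is a big part of why the library quality feels so consistent.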


I certainly deal with a lot more errors day to day. My tasks are constantly interrupted by software bugs.

It could be that there's just a lot more software now. I do a lot more on my phone without human interaction.

The biggest issue is the lack of an alternative when software fails. When the computer says no, you'll have a really hard time reaching a human. You just get a hard-to-find contact page that forces you to talk to a chatbot.

Software is also slow now. It's crazy how many delays modern UIs have. It also has more and more dark patterns that make you feel like you are being used by the software and not the other way around.


Software is great for what it does; unfortunately, the good stuff in the commercial space is all subscription based.

FOSS is better than ever, but it hasn't kept up with commercial software on cross-platform support. It's just too much boring, tedious work that nobody really wants to do without being paid.

There's no FOSS alternative to Google Keep with full parity, despite how simple it is. P2P has stagnated and been replaced with self-hosted stuff, which I have just about zero interest in.

But Jami, BitTorrent, and SyncThing are going strong, so p2p isn't quite dead yet.


Not really, we've always had a mix of great software, terrible software and everything in between. Take the gaming world for example. There are definitely a lot of buggy messes being released on Steam and modern consoles (like many titles in early access), but there were about as many car-crash-level disasters being released for the NES or Atari too.

There's not much of a quality difference between, say, The Day Before and King Kong the Rise of Kong vs. Big Rigs and Superman 64. They're all poorly made trainwrecks that fail miserably at doing what they set out to do.

And the same goes with most software. CMS systems and things for web sites? There are about as many great and terrible examples now as there were during the 80s. Same with desktop software, mobile apps, web browsers, etc.

The thought that things are getting worse likely comes from the nostalgia filter, and from how terrible products/art/whatever tend to get forgotten by history. Over time, the only works that stand out are those that are either amazing (and seen as classics to this day) or complete disasters that gain a certain amount of infamy. So you see those standouts compared to the flood of mediocrity around at the moment, and think the olden days were better. There was probably about as much mediocre crap released then as there is now; it just kinda got forgotten, like most mediocre things are.


Software does infinitely more now than before and there's way more of it - are there specific aspects you think are going downhill, or that you're asking about? You could always point to examples like Google search quality subjectively getting worse, or video streaming/AI getting way better, but "software quality" doesn't seem very well defined. And hardware has improved a lot too, so now we can have awesome things like Electron that let you make desktop apps using web technologies, which is very cool. Flash is dead (still sad about Flash games tho), so we've got HTML5 and stuff making the web way better too. Just the level of accessibility means so much more software that does so much more - a lot might be poor quality, but then you also get things like Figma and Zoom and Slack and Discord that are all very good.


I think the answer is complex... I used to get terminal viruses on my PCs... on operating systems I loved. Now, operating systems spy on me, but I never have issues with "blue screens of death" (perhaps my internet hygiene is impeccable?).

Small apps are 300-500 MB where they would have been 3 MB .exes in the past. This bothers me.

Subscriptions bother me, as do adversarial business practices (the entire 3D drafting landscape is fucked, for example... try to find one company that charges decent prices for hobbyists and doesn't involve the cloud or make everything difficult).

However, software is more complex than ever, because what we expect to do with it only gets more and more complex.


I don't think the quality has declined; I think the job has gotten bigger and the quality has stayed consistent, so most people perceive lower quality.

The scope of software has exploded.


My anecdotal experience as a user is that software used to crash a lot more than it does today. I rarely experience crashes these days. Maybe once a month at the most.


> My anecdotal experience as a user is that software used to crash a lot more than it does today. I rarely experience crashes these days. Maybe once a month at the most.

I presume you don't use Windows.


I do, but it's not my main OS.

Stuff on Windows used to crash a lot more 15-20 years ago.


What do you mean by quality? Reliability, security, efficiency?

Efficiency has certainly got a lot worse because it matters less on better hardware - but the piling up of multiple inefficiencies can still make things sluggish.

With the other two, I think it may be that we have failed to make things better rather than actually making things worse. Software quality has stayed the same, but we feel it more because we are so much more reliant on software now.


IMO, the people who are running the company have very little or no perspective on software quality and how it will have an impact in the long run, especially if the company is run by a sales guy (CEO). They just want to "release" features, even if there are known bugs. It's a sad reality at a lot of software shops.


Definitely not as a whole. Sure, there is a lot more bad software than ever before. But that is because there are a lot more bad developers than ever before. And that’s because there are a lot more developers than ever before.

It is easy to find examples of crap code out there, as they tend to stand out. But we take for granted all of the good stuff.


> But that is because there are a lot more bad developers than ever before.

The overall quality has decreased. We learned to expect more from computers, yet the computers evolved into pieces of spyware with dumbed-down UIs, making people spend more time to do the same thing.

When there is 1 (one, uno, eins) text input field in an application that has focus, why doesn't that field have keyboard focus? Why do I have to click with the mouse in the year 2024, after more than 30 years of Windows, to be able to write in that field? 25 years ago this worked.


No!

Developers these days are far, far better resourced in almost every way imaginable, and the ever-climbing salaries of the industry have pushed them to make good use of those resources.

Much better to ask if the quality of your software has gone downhill. If you ask and honestly answer that regularly, you will end up better for it, I promise.


It's easy to remember the best software over the decades and forget about the not-so-good software.

I think it's a bit the same with music. Music didn't use to be better; we just remember the best music and forget about the forgettable music.


The nature of the Software Crisis has changed, but the crisis is still here.

At the outset of that crisis, the predictions were simply that software wouldn't get built because it was getting too complicated and expensive. The 80's saw a lot of developments around programs using module systems, dynamic linking, etc.

But the software being shipped at that time was still mostly either "appliance" or "back-office" software: data came in through the keyboard and left by the printer. Most memory discipline revolved around static allocation and overlays, and the microcomputers were not doing a lot of multitasking (although it did come in with the 16-bit machines). There were more interesting things happening in big firms and universities, but there the hardware was also specified around the type of thing they were doing, and there was admin staff physically monitoring those shared machines. There was a lot of software that, in Unix "do one thing well" fashion, could do very little, but also did it reliably. And there was a lot of eye-wateringly expensive commercial software of this type: thousands of dollars for a compiler or database.

In the 90's, software got uniformly bad for a while because the PC became so much more powerful in such a short time, and the approaches that worked before turned into a moving target of "plan for the machine spec of 18 months from now". Wintel machines just weren't made to do the things they became capable of, and you had little way of knowing whether your crash was the application, Windows, or your drivers. Macs were no better.

There were really three things that crept to the forefront in this period:

1. The hardware being commoditized, but not open. To this day, Nvidia wants to control their drivers. And as long as we keep buying from them, they can dictate part of the software stack. They are hardly alone - every big player knows the game.

2. Essential complexity being addressed with accidentally complex protocols. For example, everyone uses UTF-8 now. It addresses a significant issue in a reasonably good way. But you still have a lot of systems in the wild using UTF-16 (see the sketch at the end of this comment). USB is designed to do everything, which gives it a high "floor price" for a system implementor compared with the classic parallel/serial mechanisms. The web browser ended up with Javascript. And so on.

3. A drift towards financialization within software. The "dot com" hype of the late 90's was what it was because the VCs had found a formula for getting companies to IPO with no revenue. When shrink wrap software became a business, it had lots of competitors, but by the 90's consolidation meant there was one "industry standard" per industry (Microsoft, Adobe, Autodesk, Oracle, etc.), and it became more interesting to target consumers with Internet appliances. This project of building the ultimate consumer platform describes most of the past 30 years or so, in different phases and across different facets.

If it weren't for open source it would not be possible to host so much complexity. Kicking things down a layer to a dependency has been the way in which the software crisis has been handled, but a lot of the hasty or accidental standards are still standard.
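To make the encoding point concrete, here's a small Go sketch of what living between the two standards looks like in practice (Go strings are defined to be UTF-8, but the standard library still ships unicode/utf16 for the UTF-16 systems that remain in the wild):

    package main

    import (
        "fmt"
        "unicode/utf16"
        "unicode/utf8"
    )

    func main() {
        s := "héllo" // Go source and string values are UTF-8
        fmt.Println(len(s))                    // 6: UTF-8 byte length, not characters
        fmt.Println(utf8.RuneCountInString(s)) // 5: Unicode code points

        // Interop with UTF-16 systems (Windows APIs, Java, JavaScript)
        // still requires an explicit conversion step:
        u16 := utf16.Encode([]rune(s))
        fmt.Println(len(u16)) // 5 here; surrogate pairs can make it longer
    }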


Well said! In particular: "Essential complexity being addressed with accidentally complex protocols."

I agree with everything you say; however, there is one important point which needs to be mentioned here: "Software Industry Velocity" is working to the detriment of "Software Quality". The whole idea of Agile/fail-fast/first-to-market/MVP/etc. has been distorted beyond commonsense practice. Moreover, with distributed architectures having become the norm, all sorts of failures have become part of the design, and hence they no longer carry the stigma they used to. Add the fact that, due to the software explosion of the past few decades, there are now so many tools/frameworks/libraries available that the barrier to entry is considerably lower, allowing less skilled programmers to churn out lower-quality software, and you have the perfect storm of everything coming together to degrade "Software Quality" in service of "Business Objectives".


It's simple, one can make good money maintaining bad software.


Software quality has risen but requirements have risen faster.


> Ask HN: Do you think software quality is going downhill?

yes.

> If so, why?

CADT (the "Cascade of Attention-Deficit Teenagers" development model). Fixing bugs is much harder than a complete redesign. Addressing existing problems is much harder than implementing new ideas.


The PHP ecosystem has never been healthier or stronger, and that has much improved what used to be an enormously buggy stratum of the web.


> The PHP ecosystem has never been healthier or stronger

... which is not saying much.


If you are talking about updates eventually bringing down the quality of a specific piece of software, then yes, I think so. I wish it were possible to just get security updates.

Also, I think that bug-free software has never existed... hopefully one day it will, though I'm not too hopeful it will happen in my lifetime.


> I think that bug-free software has never existed...

Sure did. E.g. in 1985 I shipped an implementation of the real-time multi-tasking music programming language AMPLE in which no bug was found over its entire commercial life.

https://www.retro-kit.co.uk/Hybrid-Music-System/



