
Doesn’t reality suck the same?

My gas car stinks, destroys the planet, needs yearly maintenance, and crashes into everything the second I stop paying attention.

My house decays day after day. Floors need constant cleaning, walls have holes from small impacts, paint chips into inedible fragments and gives off noxious gases.

Bees are building nests on my balcony, which is definitely not what it was built for, nor where they should be.

How can we tolerate such a life?




I live in an old house, and routinely discover ugly hacks done by the previous owner, presumably out of laziness, cost, or just lack of skill. For example, they buried tons of stuff (toys, furniture, a water heater, etc.) in the backyard and built a terrace on top of the pile to cover it up, apparently because they were too lazy to take it to the dump. The terrace decayed, so I had to tear it down, but in doing so I also had to clean up their mess so I could actually use the garden. I'm not annoyed at the planks for decaying, as that is to be expected, just like you are expected to e.g. keep up with the third-party dependencies you have chosen to include. Discovering a mess like the one I found in my garden, however, evoked the same feelings in me as when I look at a badly written code base and just wonder how anyone could ship something of such low quality to paying customers.

I guess my point is that there is a difference between things sucking because of the laws of nature, and things sucking because of incompetence, laziness or indifference.


To be fair, the previous person didn't know you were going to try to plant vegetables in their landfill.


But those same owners failed to mention the landfill during the handover.


Not to mention, it was almost certainly illegal.


Out of sight, out of mind!


To be fair, ulrikrasmussen didn't know the previous owner was trying to plant scrap trees.


A similar thing happened at a relative's house: a long-disused storage space under a deck needed to be cleaned, and whatever natural forces were at work had accumulated enough new dirt to actually bury the items stored under there (a similar array of items, since no one had played with the children's stuff, and a few old chairs and such had been thrown there).

It's a lot of work to dig a hole large enough for a water heater, so I wouldn't be surprised if something similar happened. (I probably would also have checked inside the water heater: if you wanted to bury something and keep it dry, you might consider a water heater tank as a possible container. Not sure it actually works, but it's a natural idea.)


When is the last time leaving your keys in the car caused your house to suddenly slide 10 feet southwest?

When is the last time you flipped a light switch, and suddenly your pool disappeared?

Have you ever had French doors appear in your dining room because of a "Windows Update" on Wednesday morning?

Have you ever had to wait for half an hour for your house to boot later on that same Wednesday?

When is the last time you closed a door, and were killed by a hailstorm of bowling balls?

At least with a light switch, you know it's very unlikely to cause structural issues, or plumbing issues, or drain your bank account. Computers are singularly horrible in the ways things can fail.


I agree with your underlying point, but it's also important to point out that computers are also singularly wonderful in that it's usually much faster and easier to reverse failures, and then to diagnose and debug in a non-impactful manner.

To take your second example - if I could then flip the light switch back, and the pool reappeared, then I'd be miffed but not particularly annoyed (assuming I was able to fix that obvious bug either myself or with an update in a timely fashion). If the pool stayed gone, then yeah, I'd be pissed.

Of course, that whole argument goes out the window when the tech in question isn't controlled by you. Which is often the case.


Tell that to the 346 people who perished because of negligent and (in my opinion, malicious in terms of regulatory deception) undocumented, uncommunicated programming of the speed trim system of the 737 MAX.

Or the folks who perished because of badly programmed software interlocks on the THERAC-25 radiotherapy machine.

Just knowing or figuring out to flip that switch may be an insurmountable barrier depending on the circumstances when a failure state occurs. Especially when the implementation is intentionally hidden so as to facilitate continued market value extraction opportunities from the happy accident of information asymmetry.


I agree with the sentiment of the post and the replies.

Yet your examples hint at something more.

Those massive failures are caused by people, not by tech: mismanagement, incompetence, and systems designed to obfuscate accountability.

Which happen aplenty in non-tech fields.


In wiring a house, there is a built in assumption that something could go wrong and disrupt the wiring. That's why we had fuses, and now circuit breakers, grounding, ground fault interrupters, metal conduit, etc. All of these serve to limit the side effects of faults.

When you turn on a switch... it's part of a circuit which is current limited, and in fact there are several limits on that current, all the way back to the source... each designed to protect a link in the chain. Each of those breakers limits the capability to source current further downstream.

When you run a task in any modern OS, it runs with the full privileges of the user id with which it was launched. This is like hooking a generating station directly up to the floor lamp in your living room with no breakers. If the process has a fault, there is nothing the Operating System will do to prevent it from being used to subvert other parts of the system; there is no limit to what it can do.

There are systems that require you to specify how many resources a given task is to be allowed to access. It turns out that such systems can be just as user friendly as the ones we're used to, but they do require things be re-written because the ground assumptions in the security model are different.
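
As a concrete sketch of the idea (here using FreeBSD's Capsicum, one real capability system; the filename below is just a placeholder), once a process enters capability mode it can no longer acquire new resources, only use the descriptors it already holds, each limited to the rights it was explicitly granted, much like a breaker limiting what can be drawn downstream:

    #include <sys/capsicum.h>
    #include <err.h>
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        /* Acquire the one resource we need before dropping ambient authority. */
        int fd = open("data.txt", O_RDONLY);        /* placeholder filename */
        if (fd < 0)
            err(1, "open");

        /* Limit the descriptor to reading only. */
        cap_rights_t rights;
        cap_rights_init(&rights, CAP_READ);
        if (cap_rights_limit(fd, &rights) < 0)
            err(1, "cap_rights_limit");

        /* Enter capability mode: from here on, acquiring *new* resources
           (open(), connect(), ...) fails with ECAPMODE. */
        if (cap_enter() < 0)
            err(1, "cap_enter");

        char buf[256];
        ssize_t n = read(fd, buf, sizeof buf);      /* allowed: CAP_READ */
        if (n > 0)
            fwrite(buf, 1, (size_t)n, stdout);      /* stdout is a held descriptor */
        return 0;
    }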

Capability Based Security (also known as "Multi-Level Security") was born out of a need to have both Sensitive and Secret information shared on a computer that scheduled air traffic during the Vietnam Conflict (if I remember the situation correctly). The flights themselves were sensitive, and the locations of the enemy radar were top secret (because people risked their lives spying to find them).

It was extremely important that information could not leak, and solutions were found, and they work!

About 10 years ago, when I learned about this and considered the scope of work required to make it available in general purpose Operating Systems, I estimated it would take 15 years until the need for Capability Based Security was recognized, and another 5 or so until it was ready. I think we're on track... in 2025 people will start adopting it, and by 2030 it will be the de facto way things are done.

Genode is a long-standing project to bring this new type of security to the masses... I'm still waiting for the point where I get to play with it... and have been for a while.

Things will get better... these types of tools, along with "information hiding", getting rid of raw pointers and other clever but dangerous tricks will help as well.

[Edit: Re-arranged to clarify, and improve flow]


The problem with an increase in security is that it almost always comes with a tradeoff of higher complexity. Higher complexity means more difficulty tracing. It also means the state space of a general purpose machine ostensibly there to be configured to fulfill the user's goals is a priori heavily constrained.

Point being, I don't see a shift in the direction of security above usability or ease of mental modeling doing anything but worsening the problem. I could be wrong on that, but the last 20 or so years of further encroachment by industry on users' prerogative to configure their machines as they like doesn't inspire great confidence in me.

I can say I'm totally reading up on that though. I hadn't heard of it before, and it sounds interesting.


Completely agree - hence why I said _usually_. Another example of irrevocable harm is when ML algorithms dictate some medical treatment or social program.

But, _usually_, it's easier to reverse some changed-data somewhere than it is to reverse an actual change-of-state in the physical world. At least, the inherent effort required to do so is less - but policies or obfuscation may make it harder.


I’d argue computer programs’ failure modes are often less gruesome than real life’s gas and electric failures.

As a kid we had a gas range, and it was pretty easy to turn on a burner and just leave the gas flowing without lighting it. Or just start cooking something and forget about it; depending on your situation, your house is gone.


Normally the gas has quite a distinctive odor just for these kinds of situations. Sucks if you leave your house and enter it again lighting a cigarette though.


> When is the last time you flipped a light switch, and suddenly your pool disappeared?

Or the pool just disappeared for no reason and you couldn't get it back unless you sold your house and rebought it?


When's the last time you had a working car door, and it fell off when you opened it? (MVP, no tests.) [I'm not talking about a worn-out car.]


I don’t know where you got these examples, but they were fantastic.


I've just been trying to make analogies people can understand over the years.

The current state of computer security.... is like building a fort, out of cases of C-4 explosive.

How so? Almost every program has bugs, many of which can be exploited remotely. It is effectively impossible NOT to have a zero-day exploitable hole in any given computer. Thus, every single computer can be exploited... and then used to attack more computers.... in a chain reaction.... like a fort built out of C-4.


I think the difference is that the entire software/hardware stack is a world created entirely by humans, untouched by "reality" for the purposes of all these annoyances, so it feels like we should be able to wrangle it better after so many decades. It's entirely our own creation, we decide every iota of it, and yet it bites us. (Justifiably or not: it turns out that thousands of people each creating different layers of a gigantic state machine is hard to coordinate perfectly, but we might have been able to do better by now had we been more thoughtful and patient throughout.)


I hear you, but I feel like we are biased by what we accepted as normal in our formative years, and that filter doesn’t apply to what we are discovering now that we’re grown-up professionals.

For instance, books have been with us for centuries, and honestly most of them suck. Paper pages are thin and sometimes cut your fingers (how many times did you get cut by an ebook?). Most are vulnerable to liquids, yet our world is filled with liquids everywhere, sometimes coming down from the sky. Updates are painful, costly, and non-scalable. Font sizes can’t be changed; you have to use an external device to deal with that.

Not saying there are perfect alternatives or that the tradeoffs don’t make sense. Just that we learned very early that books have these limitations and we’ll need to live with them to be a member of society. And we can agree all of these aspects could be and sometimes are fixed, but most people are just ok with books ‘sucking’ in those ways.


We’ve also had centuries to improve the technology of books and I think that makes a difference.

Although the weaknesses you cite seem like problems no one was looking to solve. No one ever expected books to have variable font sizes... why would they?

Lastly, let’s recall that the book of five hundred years ago is dramatically different from the book of today. For example, your point about liquids is now in many ways resolved by the cheapness and ubiquity of books. 500 years ago, not so much.


On book font size, there actually is a market solution for the issue: if enough sales are expected, the same book (same content) will be sold in different formats (pocket size, deluxe paperback, standard edition, etc.).

Same for translations, with some books even printing two languages side by side.

I find it fascinating how the arrival of ebook readers made us rethink how we relate to books; a lot of annoyances got surfaced only now because there was no comparison point before. My favorite is how you cannot ignore the length of a paper book while reading it: you can’t pretend not to realize there’s only a dozen pages left and the story must come to an end.


While nobody expected books to have variable font sizes, the fact that ebooks do allow it to be adjusted means that people with deteriorating vision may still read them.


A magnifying glass was the original solution to this problem. They never run out of battery.


And you can use any glass with any book!


Yeah... and once you're no longer paying by the page, there's really little upside to using a small font you have to squint at, or margins that are too narrow to easily scan the page. I have no idea how long the books I read are, but I probably read them at a few hundred words per screen simply because it's way easier to keep my place. The average for a small paperback exceeds 300 words per page.


In the early days of the Gutenberg press, when most were illiterate, people would gather together and the one person who could read would read to the rest of the group. So, arguably, it was both easier and more inclusive for a blind person to read what there was to read then than now. At least they didn't have to rely on any special accommodations.


It's worth noting that it took 75 years after Gutenberg's press before some disgusted printer came up with the idea of page numbers. As the saying goes, all progress depends on the unreasonable man, who eventually becomes disgusted enough to make things change. Quality matters, pride in design and workmanship matters, and it's not at all bigoted to point out that China, which now manufactures most of the stuff in our 21st century world (or at least the components of it), has a culture of designing and producing absolute dung. We should not accept unacceptable quality just for apparently low prices.


Books are also capable of being copied before or when damaged, passed on trivially, and are not prone to sudden existential failure because a server on the other side of the world was deactivated.

They can't be stolen back by the publisher or seller, can be trivially transformed into different formats, can take annotations, and can be rebound when worn. Even if paper has its faults, reading a well-maintained page in 1000 years is as easy as on the day it was written, even if significant swathes of technological backslide occur, and it is only prone to the challenge created by human cultural evolution, as opposed to loss of the processor or software required to decode/convert/display it.

An HDCP protected chunk of media may as well not exist in 1000 years.


> I think the difference is that the entire software/hardware stack is a world created entirely by humans, untouched by "reality" for the purposes of all these annoyances, so it feels like we should be able to wrangle it better after so many decades.

Humans, as the makers of these systems, are part of that reality, which was not created by us. The reality is that we are great apes writing precise machine instructions with a general intelligence that was not purpose-built for being that precise but selected for survival. Our cognitive machinery cannot exhaustively predict all the possible failures of what we write; if we are working in teams, we still have to transfer most of our technical ideas through natural language, with communication overhead increasing combinatorially as the team size increases, etc. None of this is user-hostile; it is just human fallibility and limitation in play. And since we can't alter our cognitive capacity drastically, we can only build more machines to guard against these (e.g. unit tests), with their own limitations. I think the scale of what we have been achieving despite these limitations is just fantastic.

If anything, users are becoming too egocentric, expecting the world to conform to their comfort, with a dash of construal level fallacy, overestimating from a mile away how easy it would be to write bug-free programs with perfect designs in the real world, by real people, with real budgets, etc.


> If anything, users are becoming too egocentric, expecting the world to conform to their comfort, with a dash of construal level fallacy, overestimating from a mile away how easy it would be to write bug-free programs with perfect designs in the real world, by real people, with real budgets, etc.

Selection bias. You only hear from users who want new features. You rarely hear from users who don't want new features and just want software to stop being buggy and acting like a goddamn haunted house.


I was talking about the “users who don't want new features and just want software to stop being buggy and acting like a goddamn haunted house.” so the selection bias is yours.

Most bugs are just annoying; the consequences are not catastrophic if your favorite tuner app forgets your settings, your word processor messes up the formatting, or your PDF reader crashes. You can recover with some frustration and wasted time. The perception of these as catastrophic failures shows the sense of entitlement users have: they are used to a certain smoothness in their experience and expect everything to go their way. This doesn’t match the underlying realities of the task; it is very easy to construe a sense of a working program in one’s mind, but it is exponentially more difficult to make the implementation actually bug-free, usable, and functional the way the user wants.


> Most bugs are just annoying; the consequences are not catastrophic if your favorite tuner app forgets your settings, your word processor messes up the formatting, or your PDF reader crashes.

So what? Users get upset when your crap doesn't work. Stop being flippant and pushing back. Pushing back is not your (our) job. Complaining about how hard your job is, is not your job. Griping and moaning about irate users is also not your job. Delivering a product that does what it says it will do on the tin is actually your job. Believe it or not, you produce something people depend on!

Imagine your power steering goes out on left hand turns going north downhill. You take it into the mechanic and all you get is "That's just annoying, not catastrophic. You can recover with just some wasted time. It's exponentially more difficult to make the implementation actually bug free!"

Users quite rightly spot bullshit excuses. And we have none. Save the settings, fix the formatting, stop the crashing.


> Pushing back is not your (our) job

Please tell me more about my job internet stranger.

You’re making the same arguments without adding substance, just emotional rhetoric and unnecessary personalizing.

> Imagine your power steering goes out on left hand turns going north downhill.

Imagine that the steering wheel stopped working depending on the highway you’re driving on (software & hardware platform). Why didn’t they test this on every single highway? Because that would be combinatorially explosive.

I’m glad you’re making a physical-world analogy. Comparable physical-world projects have orders of magnitude fewer moving parts that need to interfit, and assembly gives immediate feedback on whether they fit. They also have orders of magnitude fewer inputs they are expected to operate on, which makes it easier to exhaustively test their function.
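
To give a sense of how fast that test matrix blows up, here is a trivial back-of-the-envelope sketch; every number in it is made up for illustration:

    #include <stdio.h>

    int main(void) {
        /* Hypothetical test matrix; all figures are illustrative. */
        long long platforms = 6;   /* OS / browser / hardware targets */
        long long locales   = 30;  /* localization variants           */
        int       flags     = 12;  /* independent boolean settings    */

        /* Each boolean flag doubles the space: 2^12 = 4096 combinations. */
        long long configs = platforms * locales * (1LL << flags);

        printf("distinct configurations: %lld\n", configs); /* 737280 */
        return 0;
    }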

“Shut up and just make it work” might have been popularized by certain tech personas, but unless you have Steve Jobs levels of clout, pulling that stuff in most dev shops will quickly make you very unpopular, whether you’re an IC, a manager, or a product manager.


> Imagine that the steering wheel stopped working depending on the highway you’re driving on (software & hardware platform). Why didn’t they test this on every single highway? Because that would be combinatorially explosive....

Users guffaw at this point. They do not understand why your stuff is so complicated and broken. They think you suck at your job. Both you in the collective sense (you engineers) and you in the personal sense. They start plotting ways to stop using your stuff because you are so frustrating to deal with.

> They also have orders of magnitude fewer inputs they are expected to operate on, which makes it easier to exhaustively test their function.

I think you still do not understand my point. Users fundamentally do not care about it. Everything, to them, is a problem of your creation, and they'd quite rightly regard your long-winded explanations with complete skepticism. To you it always feels like it's someone else's fault, but to users it sounds like complete BS. No matter how right you are about it being someone else's fault: someone else's fault is the very definition of a lame excuse from their perspective. They are still getting nowhere, and you are even more useless to them because you still can't fix anything and just confuse them and waste their time.

It's a very user-hostile attitude and makes for terrible customer relations. That attitude is also counterproductive and helps no one. No wonder people hate tech.


Reality creeps in from us creating tech-utopia through "legacy" systems where the internet, cryptography, multi-core architecture, and full program isolation didn't exist yet.

Software has a nasty habit of iterating on yesterday's ideas instead of rewriting for tomorrow. Not that there's anything wrong with that; it seems to be the path of least resistance thus far.


I disagree - by and large, we don't need to "rewrite for tomorrow". Almost every significant problem in Computer Science and Software Engineering was solved (well) decades ago, often in the 1960s. Security is a bigger issue now, but it was largely solved years ago.

The problem is that we do engage in so much "rewriting", instead of leveraging known good, quality code (or at least fully-fleshed out algos, etc.) in our "new, modern, elegant, and trendy" software edifices of crap.

To me, this may be the one really good thing to come of the cloud (as opposed to the re-mainframe-ication of IT): the "OS" of the 21st century, allowing plumbing together proven scalable and reliable cloud/network services to build better software. (Again, not a new idea, this was behind the "Unix Philosophy" of pipes, filters, and making each program do one thing well. Eventually, it will be in fashion to realize this really is a better way...)

We need smaller, better software, not the latest trendy languages, insanely complex platforms that no one understands, and toolchains of staggering complexity that produce crap code so bloated that it requires computers thousands of times faster than the Crays of the 1990s just to do ordinary desktop stuff. (Seriously, folks, the Raspberry Pi 4 on the next table is a rough match for the Cray we had in Houston when I worked for Chevron in the early 90s! Think about that, and then think about how little data you really need to deal with to do the job, vs what you're actually shuffling around.)


You reminded me of this quote.

“Einstein repeatedly argued that there must be simplified explanations of nature, because God is not capricious or arbitrary. No such faith comforts the software engineer.”


We probably could make gas stink less just like we could fix that bug. The ROI just isn’t there.


Gas stinking is a feature, not a bug. It's a safety measure. Particularly when we're talking gas (and not gasoline), which is naturally odorless and made to stink on purpose.


> Doesn’t reality suck the same?

No. A hardware product like a car has predictable wear and tear governed mainly by the laws of physics. The fact that I can no longer use my smart speaker because the manufacturer decided to stop supporting it, went out of business, or got bought is not at all the same. My car will still work through all of those things in the exact same way. It also doesn't throw up random dialogs (or whatever a physical equivalent would be) that stop the product from working until I interact with it. Not the same at all.


A car has tons of parts that, sure, are "governed by physics", but in effect just randomly fail. I can theoretically understand that there's a clunk in the front end of my car because I've exceeded the MTTF of a suspension bushing. To almost everyone, though, it's essentially just a random event based on nothing they've perceived.


John Deere has entered the chat.

Also, car companies have been tinkering with "electronic repossession" - remote kill switches due to nonpayment.

So ... get ready for other things to suck as we attach technology to them.


> John Deere has entered the chat.

Thank you for bringing up this point. The actual problem is not the software itself, but its proprietary nature and the infinite hunt for profit without any limits. Consider free software instead and you will see that it is improving year by year, albeit very slowly (which is logical, in the absence of infinite resources).

My Linux machine never forces me to reboot, shows ads, or suddenly changes its interface.


> It also doesn't throw up random dialogs (or whatever a physical equivalent would be) that stop the product from working until I interact with it.

I see you never had a (EU) Ford.


Sure there are things that don’t work, but it’s not nearly comparable. In my life I’ve never had a problem with a refrigerator, had maybe two problems with washer/dryer, my closet Just Works, books Just Work (with an ink smudge maybe every 50-100 reading hours) etc. I can expect each of those things to last decades. Looking at the goods I’ve bought on Amazon recently, digital electronics/software as a category doesn’t hold a candle to everything else I buy in terms of reliability.


> never had a problem with a refrigerator

These turns of phrase make me wonder what we really expect from software.

I can’t imagine you never slammed the door of your fridge and had it fail to close properly. You gave it a nudge when you realized, and it was fine, but it must have happened. And your whole food supply would have rotted if you hadn’t noticed in time.

Or do we monitor energy consumption closely enough to realize it’s eating much more than what should be expected, the same way people complain about Chrome eating too much memory?

It can also get pretty noisy but I’d assume most people just think it’s normal.

And we put the blame on ourselves for a lot of issues (didn’t put the bottle in the right place, didn’t close the door with the right amount of force, didn’t set the right temperature, forgot to leave space around the fridge for ventilation, etc.). But few users blame themselves for not having understood the software and worked around its weaknesses; we just call it broken.

That behavior is normal, but I’d take a lot of “my appliances just work” with a grain of salt.


> I can’t imagine you never slammed the door of your fridge and had it fail to close properly. You gave it a nudge when you realized, and it was fine, but it must have happened. And your whole food supply would have rotted if you hadn’t noticed in time.

But if the fridge were software it would randomly turn off and ruin all your food. The light would sometimes stay on, except when you open the door. It would require you to decline an update before you could get the milk out for breakfast. During an update the fridge and freezer compartments would switch places and then give tips about efficient ways you could manually move everything. If you bought a new fridge, part of it would be locked shut until you paid more money, but the one in the store was already unlocked. And god forbid you lose the 2FA device used to set up the fridge -- it will destroy everything inside (including irreplaceable heirloom tomatoes) upon reset. It will then update to a subscription model where custom temperature settings require a monthly fee, or you'll be limited in the number of items you can store in the fridge or the number of times you can open the door per day.


We saw a failure case like this with a microwave in the workplace kitchen. It somehow got into a mode where it only turned on when the door was opened. Needless to say, we threw it out shortly after that was discovered. We didn't bother debugging it, but my guess is it was a hardware problem, because the interlock should obviously have made it work the opposite way, and you'd hope that a software glitch couldn't get it into a mode like that!


Oh crap. That could blind a person. I thought microwaves had to have a hardware interlock for that reason.


Since this is critical health-and-safety stuff that can lead to serious injury, every one I've ever seen uses hardware interlocks: generally, the door-closed switch is in series with the microwave power supply, so it's impossible for it to emit microwaves with the door open. Only an idiot would put a safety interlock under software control when a simple switch will do.


A lot of these look like pricing and marketing issues to me.

Fridges have been with us long enough in a ‘pay everything upfront’ setting that we‘d battle to the bitter end if we had to do micro-payments or aggressive planned obsolescence.

To your point, I lived in long-stay apartments where you put in coins to make the fridge and air conditioning work, because they didn’t bother with pay-as-you-leave metered use. That’s super rare (I hope? I’d expect the same in some predatory situations targeting low-income people), but it’s to say that alternatives exist.

Otherwise, fridges randomly turning off is just a matter of time and/or build quality. Sooner or later it happens (or it stops turning on, which is arguably better, but you wouldn’t say it’s great).


> In my life I’ve never had a problem with a refrigerator, had maybe two problems with washer/dryer

I think blaming software for this is a little naive. Take a look at consumer reports for any modern fridge, stovetop/oven, washer/dryer, etc., and you will see complaints about fridge motors dying, touch panels going on the fritz, etc. -- none of which involve anything more than low-level firmware.

If you want to put a tinfoil hat on, you can consider that it may be planned obsolescence, but I would disagree with putting the blame squarely on software.


> If you want to put a tinfoil hat on, you can consider that it may be planned obsolescence, but I would disagree with putting the blame squarely on software.

You don't need a tinfoil hat when facing the truth :).

Also, while the things you mentioned aren't software-related, they're modern-tech-related. Like, the last 20 years. Fridge motors dry out because they're being made cheaper, with not enough lubricant put into them and no option of opening them up and pouring in grease. Touch panels go on the fritz because touch panels suck (that's often very much software), and they shouldn't be put on appliances in the first place. But it's cheaper, so they are.

Worth noting that there wasn't some big price reduction happening from solid appliances 20 years ago to crappy ones today. Price remained roughly fixed, but appliances started to suck more and more.


Right, but the move towards cheaper and lower-quality is more the fault of the current economic system and its incentives than it is the fault of software.


It's very instructive to look back to the 70's when electronics running a little bit of software had just come into being.

The big deal, at first, was really memory. Your alarm clock could ring at the same time reliably. If you invested in a VCR, it could record at a programmed time. If you had a synthesizer, it could store and recall exact preprogrammed patches. Pinball machines could shed weight and keep truly accurate scores instead of relying on temperamental relays and score reels. And so on, with every category of gadget getting the computerization treatment. Although not everything succeeded, there were lots of straightforward cost and quality improvements, with the main downside being that IC designs are less obviously repairable.

And then pretty much every year afterward, the push was towards cheaper with more software, with decreasing margins of real improvement, with the "smart" device representing an endpoint where the product is often price discounted because its networking capability lets it collect and sell data.

What comes to mind is the Rube Goldberg machines and their expression of a past era of exuberant invention, where physical machines were becoming increasingly intricate in ways not entirely practical. Our software is kind of like that.


"Kind of" like that?

Every other week I read about someone's entirely-too-roundabout way of doing X via an IoT device (requiring packets to probably circumnavigate the globe). Meanwhile I'm sitting here opening my garage door with a physically wired switch like a pleb.


Why do we have a touch panel on a fridge in the first place? The only thing we need to be able to specify is desired temperature...


... and at that, one could argue that even that isn't necessary.

I just checked my fridge, it has six buttons and an LCD panel, and in all my (4) years of home ownership, I haven't touched the buttons a single time.


> Worth noting that there wasn't some big price reduction happening from solid appliances 20 years ago to crappy ones today. Price remained roughly fixed, but appliances started to suck more and more.

First, the "solid appliances" weren't 20 years ago, but more like 25-30.

And though there wasn't a big price reduction in the interim:

- Refrigerators are more energy efficient.

- Refrigerators have larger internal volume for a given size.

Equivalent improvements have been made to other appliance types such as washers and dryers, but not stoves, as far as I know.

Those improvements are largely orthogonal to declining design and build quality, but it should be noted that there are at least some ways in which newer appliances have been getting better (that aren't just gimmicky features) while prices remained the same.


Right, but my $400 Bose headphones have a broken integration with my $2400 MBP. Swiping the volume controls on the headphones also moves the balance.

Conveniently, because macOS is ass, it's nondeterministic whether the balance controls display in Sound Preferences to fix the balance issue. You just have to open and close the settings panel in the hopes that it will display.

I'm a software engineer and I don't even know where to begin to debug this idiocy.

Duolingo regularly freezes audio in Chrome. Once this happens, no audio will play in Chrome until you restart or kill "Utility: Audio Service" with the Chrome task manager.


This is the second time in as many weeks I've read a complaint about Bose headphones. The first was that their Bluetooth was so janky the audio itself was delayed multiple seconds and out of sync with video playing on the device.

That blew my mind, my $20 Amazon-purchased Bluetooth earphones just work™ with no delay.


I guess it depends how fussy you are. I've had several fridges/freezers/ovens that don't actually maintain the set temperature. For ovens this is merely annoying, but for fridges, and especially freezers, this is a food safety issue. On fridges/freezers that have a dial instead of a digital temperature control, I've found that some just can't maintain a stable temperature, no matter how much I fiddle with them. After setting them "just right" with my own thermometer, I'll come back the next day to find an exploded bottle in my fridge or cold water in my ice tray.

Books work really well until a pipe bursts in your attic. Then you wake up and notice half your collection has been ruined (personal experience).


Well, digital services are much more complicated than a fridge, or a book. Not only that, but they also require a machine to run that is also orders of magnitude more complicated than a fridge or a closet.


Do you know how hard it is to produce a book from nothing? Making paper, all the same thickness, printing, and so on. Or just a steel tube... all of it can be done without computers and software, and the process behind every step is difficult. I work in industrial automation, and I can tell you: right now, with this quality of software and "computers everywhere", we are building a super-high tower on the softest sand.


Part of the complaint is that these things don’t have to be as complicated as they are. The fridge with the touchscreen is usually more frustrating than the fridge from thirty years ago. The smart TV is usually more frustrating for its smart features.

Things are getting more complicated, like you say, but they frequently aren’t getting enough better to justify the added complexity, especially given all the issues that come along with it.


> Doesn’t reality suck the same?

To me, software is as if when I open a book to read it, then, the book suddenly snaps itself shut, hurting my fingers.

Thereafter, the book gets wings, tries to fly away, but bumps into my coffee mug on my desk, so coffee spills on the floor. Then the book does fly out through the closed window — smashing the glass into pieces — and gets larger and larger wings, flying higher and higher until it disappears into the blue sky.

It's as if software was alive, and does random things I don't always agree with.

But actually — the bees building nests on the balcony: that feels pretty close to misbehaving software. Or the cat bringing in a snake from outdoors. Or a squirrel chewing through a power cable, shutting down a city.


There is a difference between design trade-offs and flawed design or deviations from the design. Your car does what it’s supposed to do, within the predictable limits imposed by the fact that it’s a gas-powered car. Since macOS 10.15.4 or .5, my 16” MacBook Pro crashes waking up from sleep due to some glitch in waking up the GPU.

Of course, people perceive that software sucks partly because it’s more complicated than they realize. I forget which book said it, but an operating system has more separate components than an aircraft carrier, and they’re more tightly coupled. (I’m not sure that’s true, but it conveys the idea.)


Houses, cars, etc are far more reliable and well designed than software. Think about all the extreme conditions cars continue to function in. How many people don't even follow basic maintenance schedules?

Another key difference is that in maintenance of your home, you have complete control. It's extremely easy to understand and act to improve or maintain it. When large software systems (like the IRS login) have problems, you are totally helpless.


> Houses, cars, etc are far more reliable and well designed than software

Buy software at the price of a house and you’ll be right to expect the same build quality.

Then even at that price you’ll have fun with mold growing inside the walls, soil that degrades in unexpected ways after heavy rain hits the hill you’re built on, and rooms that were fresh and bright enough on a hot summer day when you visited but whose orientation makes them way darker and gloomier in winter than you expected. And you’ll be paying for that house for the next 20 years.

Cars are the same at a lower price level, and you see small issues creep up as you lower your budget (or go buy a fancy vintage Italian car and you’re in for a wild ride).

> Another key difference is that in maintenance of your home, you have complete control.

In the good old days people had timers on their desks to remind them to restart programs before they crashed. Same with saving work, making backups, etc.

Of course online services are a different beast, but it’s more akin to fighting bureaucracy, which I see as our society’s software in a way, with its shitty forms that don’t have enough space for your name and other niceties.


This is a straw man argument.

Cars vary widely in their product quality. Houses vary widely in their product quality. Some things in life are inevitable facts of nature, but product quality is not. Quality is to a large extent determined by the time and care taken by the manufacturer.


>My gas car stinks, destroys the planet, needs yearly maintenance, and crashes into everything the second I stop paying attention.

That's not a good example, nor is it parallel to the dynamic the article describes.

Your car stinks a lot less than cars did 10/30/50 years ago (emits less in the way of pollutants or CO2 per mile driven), is less likely to kill you in a crash involving the same size cars/velocities (despite weighing less!), needs less maintenance, lasts longer, and can notify you of potential collisions and sometimes avoid them.

It's probably only worse in terms of education needed to perform maintenance or nominal sticker price.


But devices have also gotten smaller, lighter and more efficient, and software can also do much more today than it could a long time ago. I think the analogy is fine.


When the analogy was built off of saying that cars are bad by metric X, when the claim was that software has gotten worse by metric X, and cars have actually gotten better by metric X, no, it's not a good analogy.

And yes, there are some ways in which hardware has improved. But the claim is that, judged by what you're using it for, most UX-apparent aspects have gotten worse. Is there a clear way this is wrong? If you look at most UX-apparent metrics, there isn't. Latency from keystroke to character render has gotten worse. App start time has gotten worse. Lots of other latencies have gotten worse.

None of the nightmares described in the article were typical of software UX.

These would be arguably fine if the additional features you get were a necessary cost, but they're not.

I'm also not sure that devices have gotten more efficient in all respects. Each version of iOS gets more energy-intensive, for example.


> Latency from keystroke to character render has gotten worse. App start time has gotten worse. Lots of other latencies have gotten worse.

Do you have sources for this? I mean, I'm not sure there aren't rose-tinted glasses here.

> These would be arguably fine if the additional features you get were a necessary cost, but they're not.

> Each version of iOS gets more energy-intensive, for example.

I would argue that multitasking, camera postprocessing, widgets, background app refresh, and others are all features worthy of more resource usage. Many of those are things you can choose not to use if you want to save power.


Increasing keystroke latency was discussed on HN before: https://news.ycombinator.com/item?id=15485672

>I would argue that multitasking, camera postprocessing, widgets, background app refresh, and others are all features worthy of more resource usage. Many of those are things you can choose not to use if you want to save power.

Even with all those features turned off (both before and after), the usage increases with each version.


Not the same; we expect the problems you mention. There are just some laws of nature that we get used to dealing with. Tech has a tendency to produce random problems. The one we have all dealt with: everything was working fine and then suddenly stopped. You call tech support, and after an hour of troubleshooting with them you get the classic "We've never seen this before. It must be caused by one of your other sucky tech toys." Ahhhhhhhhhh...


I think in 20-50 years those random software issues will be what we’ve known all our lives, basically what reality is, and we’ll just give warm, patronizing looks to kids complaining that stuff doesn’t work.


software doesn't decay; it just sucks even if you preserve it



