Hacker News

The implication that software providers should be liable reappears eternally here and remains misguided, even though what we're discussing in this case is essentially hardware.

Software is perhaps the area where "good" or "crappy" is hardest to pin down. A given piece of software can be bullet-proof today, and a catastrophic hole can appear tomorrow. And even if the producer releases an update, there's no guarantee it will be picked up.

The overall situation is that what's needed are standards of software use for those companies that actually do damage. Without standards, your use of "crappy" is meaningless.




> A given piece of software can be bullet-proof today and a catastrophic hole can appear tomorrow.

Sometimes you do everything you can and things still go wrong. That's okay.

What happens in practice is totally different though. Gross negligence is endemic in the technology industry. Most companies out there simply don't give a shit. Their negligence is deliberate, calculated and premeditated. They know exactly how much damage they're causing, and they don't care, because caring costs money.

> Without standards, your use of "crappy" is meaningless.

It's not meaningless at all. For example, nearly every laptop manufacturer I've ever seen has delivered to me software that is unambiguously bad. This opinion is not controversial at all. You just need to fire up some manufacturer app to see just how incredibly bad they are.

I've posted about that here many times and people explained to me that the software is garbage because hardware companies literally don't care about it. They see it as just additional costs to be eliminated and as a result we get products which are total crap. My laptop came with a driver that intercepts my keystrokes and sends signals to the keyboard so that it can light up the LEDs under the keys I pressed. What caused an insane design like this to even come into existence is beyond me, no doubt it came down to saving a few cents in manufacturing. I replaced this functionality with free software and I'm not sure if I even want to know whether there are any vulnerabilities in that driver.


>My laptop came with a driver that intercepts my keystrokes and sends signals to the keyboard so that it can light up the LEDs under the keys I pressed. What caused an insane design like this to even come into existence is beyond me, no doubt it came down to saving a few cents in manufacturing. I replaced this functionality with free software and I'm not sure if I even want to know whether there are any vulnerabilities in that driver.

That sounds pretty cool and hackable actually.


This is how malpractice works for every other kind of injury. It isn't just about damage being caused by the software, but whether there was a violation of the reasonable standard of care.


Exactly. We engineering types tend to underestimate how much intent and judgment matter when it comes to malpractice (and similar) law.


> A given piece of software can be bullet-proof today and a catastrophic hole can appear tomorrow.

Only because you didn't notice the catastrophic hole today, and that's true of everything. When a building collapses, the construction company/architects can't really get out of it by saying "well, it was perfectly fine yesterday, it's hardly our fault!". I don't see why we'd accept that attitude from software engineers.


The state space of a complex piece of software is vastly larger than that of a building, and the desirable states within that space are not continuous, while a building's desirable state space is pretty smooth. It's like the difference between walking along the knife edge of a fractal vs. standing on top of a mostly smooth hill.

Because of that, software simply can't be treated the same as other engineering disciplines, at least not yet.


> The state space of a complex piece of software is vastly larger than that of a building

Is it? Unless we can accurately simulate reality at the atomic level and brute-force building designs against every possible scenario to make sure they don't fail catastrophically, it still all comes down to prior knowledge/experience, reasonable assumptions, approximations and measurements, just like in software. If anything, software is easier: with enough effort (formal methods, etc.) you can prove the correctness of your software, but you can't really do that with a building.
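As a toy illustration of that last point, here's what a machine-checked proof of a (trivial, invented) program property looks like in Lean 4. This is a hedged sketch of the idea, not a claim that real-world verification is this easy:

```lean
-- A tiny program and a machine-checked proof of one of its properties.
def double (n : Nat) : Nat := n + n

-- `double n` always equals 2 * n; `omega` discharges the linear arithmetic.
theorem double_eq_two_mul (n : Nat) : double n = 2 * n := by
  unfold double
  omega
```

Real verification efforts (seL4, CompCert) scale this idea up at enormous cost, which is exactly the "enough effort" caveat.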

A large chunk of software vulnerabilities comes down to outright incompetence, legitimate mistakes, or cost-cutting. The "building" equivalent of an insecure, black-box backdoorable EC like the one in this article would be using rebar or concrete of unknown specs and origin and then wondering why the structure collapsed.

Thankfully the liabilities associated with civil engineering means we mostly don't use unknown materials of shady origin and there are multiple layers of review to catch any oversights. The same can be applied to software, and while you're never going to get 100% in either domain, if software was as reliable as buildings (as in you can count the number of major collapses/bugs on one hand in the past few years) it would be a major improvement.


Disagree. First of all, the level of detail an atomic-level simulation could provide is simply useless in the grand scheme of things. Classical physics is more than enough for the vast majority of use cases, and it can be simulated by computers; e.g., you can programmatically measure the load on any part of a structure. More importantly, engineers can usually fall back on safety margins that "linearly scale" the confidence in a given case. What I mean by that is: if the calculations say a given part has to be this strong, you can trivially make it 3x wider.

A while loop in a computer program, on the other hand, will eat away at any redundancy you introduce, and the complexity of computation itself outruns even mathematics: there is no universal way to prove a non-trivial property of an arbitrary program (Rice's theorem).


> The state space of a complex piece of software is vastly larger than that of a building

Not really, we've just gotten very good at constructing buildings thanks to millennia of experience. We're terrible at writing software.


Yes. People forget how difficult it is to produce even a screw or a nail. And then there are way more difficult things like tubes.


This isn’t really fair. Most buildings, public works, utilities and industrial processes are extremely vulnerable to the most basic of attacks, they just don’t get carried out as often in physical space because it’s logistically more difficult to execute.


Your claim is fundamentally "It's logistically easier to attack software, so such attacks will happen more often".

This is absolutely true, but it's also proportionally easier to defend software. It's insanely easy to test whether your software is vulnerable to SQL injection, it's not particularly easy to test whether your building can be destroyed with explosives.
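To make the "insanely easy" claim concrete, here's a minimal, hypothetical sketch (table name, schema and data all invented for illustration) of probing a query for SQL injection with Python's built-in sqlite3:

```python
import sqlite3

# Toy in-memory database; schema and data are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

# Classic injection probe: matches no real user, but subverts naive queries.
payload = "nobody' OR '1'='1"

# Vulnerable: attacker-controlled input concatenated straight into the SQL.
vulnerable = conn.execute(
    "SELECT secret FROM users WHERE name = '" + payload + "'"
).fetchall()

# Safe: the same query with a bound parameter.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (payload,)
).fetchall()

# The probe leaks the row only through the concatenated query.
print("vulnerable:", vulnerable)
print("safe:", safe)
```

Testing a building against explosives, by contrast, consumes the building.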

Combine that with the fact that just about every piece of consumer software on the planet has a laundry list of bugs that don't require malicious intent to reproduce, and I find it very hard to accept that reasoning as an excuse for software developers shirking responsibility.


Empirical/dynamic testing for vulnerabilities is not rigorous or complete in any way, and is only viable for a relatively small subset of issues.

I'm only suggesting that holding programmers liable for security vulnerabilities has no real precedent in any other engineering discipline. That's not to say there isn't tons of shitty software being shipped with reckless disregard for quality, and some reckoning there might be useful.


We don't expect buildings to hold up to physical attacks. Most would crumble immediately. Just like most hardware doesn't catastrophically fail through normal use.

Computers are one of the only things we expect flawless defense against malice.


Today, when computers are connected to the whole world, they have to be built safe. The analogy would be a building whose front door everybody in the whole world could reach. In such a scenario, the front door would, of course, be built very heavy.


This is true, but it still doesn't validate the OP's comparison between buildings and software. Software faces a much, much more difficult task.


I agree. Standards or requirements should be enforced, and if there's a failure in a company's infrastructure, a panel should investigate whether it was due to negligence or non-compliance with the standards. Insurance should also be mandatory. If a given online platform were a physical piece of infrastructure like a school or a bridge, the current state of affairs would cause an uproar.



