I make software that is broken ALL THE TIME. And what do I do then? Find the faults and then fix them. Just because software is not perfect does not mean that I sit like a dog in a house fire saying, "this is fine."
Killing 300+ people is broken. And our obligation is to admit that it is broken, and fix that thing. When we find another thing, we admit that the system is broken there as well, and go fix that.
Trying to diminish the seriousness of 300 deaths with mealy-mouthed terms like "imperfect" is pure spin. If a plane, staffed by pilots allegedly trained on how to fly it, crashes from pilot error... TWICE, the system is broken. Terribly broken.
See also: Security, physical or digital.
Terrorists flew planes into the World Trade Center. The system was broken. We fixed some of it. We put on some security theatre too. But we didn't say, "It's impossible to fix everything, so the system is fine." We didn't shrug it off as "imperfect."
We always begin by being truthful with ourselves about the fact that we have discovered that the system is broken. And if the consequences of its brokenness are unacceptable, we fix it.
"Imperfect" is a word that should only be used for acceptable faults. Like, "The seating in economy is imperfect." It's too close together, but we don't have planes falling from the sky because they're cramming passengers together.
---
And now a footnote: Please avoid ad hominem arguments like "please name one system you've..." If the speaker's argument has a logical fallacy, point it out. If the speaker's argument is sound, it doesn't matter whether they write faultless software, or even whether they write software at all.
It's the argument we are discussing, not the person making it.
> Trying to diminish the seriousness of 300 deaths with mealy-mouthed terms like "imperfect" is pure spin. If a plane, staffed by pilots allegedly trained on how to fly it, crashes from pilot error... TWICE, the system is broken. Terribly broken.
Wait until you find out how many people die in car crashes every single day. It does not mean “the system is broken”. In the real world, systems can be improved without throwing disruptive tantrums and demanding fundamental changes.
> Terrorists flew planes into the World Trade Center. The system was broken. We fixed some of it. We put on some security theatre too. But we didn't say, "It's impossible to fix everything, so the system is fine." We didn't shrug it off as "imperfect."
Excellent example of an extreme overreaction that caused immense destruction to the aviation industry because emotional politicians wanted to fix a “fundamentally broken system”. The correct approach would have been, “we’ve found a big flaw in security, we will now install flight deck door locks”. Instead, some dim politicians went with the “system is broken” approach and now we have nudity machines (or grope-downs if you don’t submit) at every major airport in the US.
You don’t drastically change a system on the discovery of a single flaw, no matter how big. You fix the flaw, because it’s one known bad versus the giant pile of unknown flaws that come with a fundamental redesign.
"Wait until I find out?" Don't patronize me, I am keenly aware of automobile deaths, and furthermore, it's a complicated problem to solve for many, many reasons, some of which are caused by greed, lack of oversight, recklessness, and other social issues.
Nevertheless, in my lifetime cars have gotten safer by several metrics, most notably deaths per distance traveled.[1] And they did not get better by deciding that driving around in deathtraps without seat belts and swigging whisky behind the wheel was acceptable.
They got better when people like Nader and MADD stood up and defied arguments like yours and loudly complained that yes, the system was BROKEN and needed to be fixed.
And then other people went about trying to fix automobile safety. Imperfectly. In fits and starts. And sometimes entirely wrongly. But nevertheless, progress happened when people attempted to fix things they acknowledged were broken.
> They got better when people like Nader and MADD stood up and defied arguments like yours
I don’t think you understand. Cars are still killing thousands of people and the system is not considered broken right now. You’re emotionally lashing out and assuming I’m promoting drinking and driving when I said no such thing.
The entire point is that deaths alone do not indicate a broken system. Society is riddled with acceptable death rates in countless industries and people don’t childishly accuse politicians of being “okay with people dying”.
There is a world where nobody can accept any deaths, and then there is reality, where the adults exist and make trade-offs to have a functioning society.
All systems do have flaws. Even if we had very intrusive security, 9/11 may have still happened. It's possible no amount of security could have prevented it.
And it's possible that, in attempting to launch several thousand pounds of metal into the air, sometimes your testing is off and people die. That may mean the system is broken, or it may mean it's just imperfect; the deaths alone are not sufficient evidence for either claim. That people died does not mean the system is inherently broken. Life is not without risk, nor should our goal be to make that risk nonexistent.
Security, to use your example, is fundamentally a trade-off: accessibility vs security. Are we willing to trade a more intrusive screening process to prevent some set of attacks? Even if we make that trade-off and an attack still happens, should we keep it? Would you be willing to be strip-searched each and every time you fly if it lowered the risk of an attack another 10%?
Your call-out of an ad hominem is incorrect. Parent is generalizing human behavior in an uncertain world, not attacking you as an individual. Nobody can write a perfect system; even NASA has failures. A super-abstract argument about flawless systems loses to real-life statistics every time: theory is great, but only reality matters.
I think we must reiterate to help you “get” it. One accident is a tragedy, a sign of imperfection, and we allow for accidents in most cases.
The difference between an accident and what happened here is that the issue was caused _deliberately_ by a series of bad choices, reflected in the fact that it happened twice in exactly the same way. Yes, it took a lot of bad choices to get to the point of creating the issue, but that doesn’t really matter, and in fact might only serve to further the point.
When a system allows for deliberate choices to cause an issue, the system needs to be addressed.
Anecdotally, I have never worked at a company where management didn’t have an outsized say in how we split our time between product development and stability.
I suspect this is symptomatic of a system that lacks accountability for people in managerial positions, and it probably extends even beyond the scope of the FAA, MCAS, or Boeing.
Have you considered you may not be correct here? I know that may be difficult to comprehend, but you are not a teacher shedding light on the unenlightened - you're just another layperson attempting to understand a complex system.
> what happened here is that the issue was caused _deliberately_ by a series of bad choices
Are you suggesting Boeing intentionally downed its own planes? That would seem to be an extraordinary claim, and not one reflected in any news source.
> When a system allows for deliberate choices to cause an issue, the system needs to be addressed.
What was the deliberate choice here? Wiring MCAS to only one angle-of-attack sensor, or having calibration issues? Was it failing to make clear that pilots need to disable MCAS in some situations? Was it changing the plane's flight characteristics without changing the model number, which would have forced pilot recertification?
Or was the issue Boeing being able to do its own certification testing? Was it due to a lack of guidelines on the calibration and placement of the sensors feeding MCAS?
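For readers who haven't followed the single-sensor criticism: the reported flaw was that MCAS acted on one angle-of-attack vane, so a single bad reading could repeatedly command nose-down trim. Here is a minimal sketch of the kind of cross-check critics argued for. This is purely illustrative, not Boeing's actual logic; every name and threshold below is hypothetical.

```python
# Illustrative sketch only -- NOT Boeing's actual MCAS implementation.
# It shows the redundancy idea under discussion: cross-check two independent
# angle-of-attack (AoA) sensors and disengage on disagreement, instead of
# acting on a single sensor. All names and thresholds are hypothetical.

AOA_DISAGREE_THRESHOLD_DEG = 5.5  # hypothetical disagreement limit


def mcas_should_engage(aoa_left_deg: float, aoa_right_deg: float,
                       aoa_trigger_deg: float = 15.0) -> bool:
    """Engage only if both sensors agree AND both read a high AoA."""
    # Cross-check: if the two vanes disagree, assume a sensor fault and
    # stand down rather than act on possibly bad data.
    if abs(aoa_left_deg - aoa_right_deg) > AOA_DISAGREE_THRESHOLD_DEG:
        return False
    # Engage only when both independent readings exceed the trigger.
    return min(aoa_left_deg, aoa_right_deg) > aoa_trigger_deg


# A single faulty vane (e.g. stuck at 70 degrees) no longer triggers action:
assert mcas_should_engage(70.0, 3.0) is False   # disagreement -> disengage
assert mcas_should_engage(16.0, 16.5) is True   # genuine high-AoA condition
```

The point of the sketch is only that "fix the flaw" and "redesign the system" are not the same scale of intervention, which is what this subthread is arguing about.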
> I suspect this is endemic of a system that lacks accountability...
Yet we are talking about this after the fleet was grounded, a congressional inquiry has happened, and the investigation and analysis are ongoing. That doesn't seem like a lack of accountability to me.
Again, you're making the claim that the system is intrinsically broken because this occurred. You keep saying it's deliberate; do you have evidence for that? Making poor choices is not the same as intent, and a bad actor does not a fundamentally broken system make. You seem to be taking the position that because bad things happened, we must dismantle the entire system without actually proposing a replacement. It is not enough to throw stones; we must actually put forth a solution set.
"Killing 300+ people is broken. And our obligation is to admit that it is broken, and fix that thing."
This is a massive trivialisation of the problem.
This is not 'code' - it's a massively complex system of large companies, various bits of tech, supply chains, varying international standards, and a considerable amount of legislation, and that's before we get into the human issue of 'acceptable losses', because '0 deaths' may not actually be feasible. Maybe it is, but maybe not.
"Terrorists flew planes into the World Trade Center. The system was broken. We fixed some of it."
Again, this is a misrepresentation. The system was not 'broken'; culturally, people simply 'didn't do things like that'. The 'fix' for this wouldn't be 'more security' but rather to encourage a civil culture where people don't take over planes and fly them into buildings. And some of the solutions are systemic, i.e. not 'safety checks on planes' but externalities like 'invading countries and destroying places where terrorists plan things' - which isn't so nice.
These are very complicated problems that don't have obvious solutions.
They may not have obvious solutions, but we recognize that they are broken and we try to find solutions. We then find the places where those solutions don't work, and we iterate, or even replace them with new ones.
All systemic problems have complex sets of interactions and unexpected consequences. Racism, automobile deaths[1], and yes, air transportation.
We still don't shrug our shoulders and deny that the causes of deaths reflect a broken system, nor do we refuse to attempt to find solutions just because it's damn hard to make progress.
[1]: Sometimes, making the vehicle safer makes drivers more reckless. Making cars safer in practice is not as obvious as it appears in theory.