
You really want the engineer who was last on the line of decisions to go to jail?

Because that’s how this will go.




Yes! Even as an engineer. I've quit jobs over ethics / safety concerns and I want that pushback to be the norm. When I need to convince other engineers, I want to be able to point to such consequences as reasons why we shouldn't be dangerously cutting corners. And even if you only punish engineers, it will shrink the pool of talent available to unscrupulous managers and raise its cost, to the point that cutting corners may well not be worth it: money saved by cutting corners would be lost trying to find competent engineers willing to cut those corners.

Of course I want the entire chain of those involved criminally punished but punishing people anywhere on that chain, even those at the bottom is a start.


When you start criminally charging people for mistakes in critical systems, what you're going to get is coverups.

With a no-fault system, mistakes are not covered up, but fixed.

With a fault system:

1. If a mistake is discovered, it will not be fixed, because that would be an admission of criminal liability.

2. Quality will not be improved, because that is (again) an implicit admission that the previous design was faulty.

3. You won't get new airplane designs, because (again) any new design is an inherent risk.

I understand the desire for revenge, but there are heavy negative consequences for a revenge/punishment/fear based system.

P.S. I worked at Boeing on the 757 stabilizer trim system. At one point, I knew everything there was to know about it. It was a safety critical system. I did a lot of the double checking of other people's work on it.

I did not work in an atmosphere of fear.

The 757 in service proved an extremely reliable airplane. I've flown commercial on many 757s, and often would chat a bit with the flight crew. They were unanimous in their admiration for the airplane. Made me feel pretty good.


You're not the first person I've heard this idea from; it comes from the NTSB charter:

The NTSB does not assign fault or blame for an accident or incident; rather, as specified by NTSB regulation, “accident/incident investigations are fact-finding proceedings with no formal issues and no adverse parties … and are not conducted for the purpose of determining the rights or liabilities of any person”

But there was plenty of blaming / faulting going on outside of the NTSB. Hence the fraud we're currently talking about, where Boeing tried to blame the pilots who rather conveniently for Boeing died and are no longer around to defend themselves.

I'm glad you feel personally assured through your enthusiastic personal connections, but that does not work for me. That it works for you actually makes me feel less safe flying.


> That it works for you actually makes me feel less safe flying.

Feel free to ask the flight crew next time you're on a 757. The service record of the 757 speaks for itself. I fly Iceland Air because their fleet consists of 757s. I literally bet my life on my work.

> Boeing tried to blame the pilots

The pilots were partially responsible.

1. First incident - the LA crew followed emergency procedures and continued the flight safely.

2. Second incident - the LA crew restored normal trim 25 times, but never turned off the stab trim system, which is supposed to be a memory item.

3. Third incident - Boeing sent an Emergency Airworthiness Directive to all MAX pilots. The EA crew did not follow the simple 2 step procedure. A contributing factor is the crew was flying at full throttle, and ignored the overspeed warning horn you could hear on the cockpit voice recorder.

> who rather conveniently for Boeing died and are no longer around to defend themselves.

The flight voice and data recorders spoke for them. We know what they did.

If you want to know the truth, read the official government reports on the crashes. Ignore what the mass media says.


Weird to see such a strongly held belief of not blaming people abandoned so quickly.

It's pretty standard to blame the pilots. But we have the Swiss cheese model for reasoning statistically about risk, and from my vantage point, opening such large holes in the MCAS layer with the expectation that events are caught at the pilot layer is inadequate. That not all events are caught at the pilot layer is to be expected.
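The Swiss cheese intuition can be made concrete with a toy calculation. All numbers below are made up purely for illustration; the point is only that widening one layer's holes multiplies straight through to the overall accident probability:

```python
# Swiss cheese model, toy numbers: each value is a hypothetical,
# assumed-independent probability that a safety layer fails to catch
# a given fault. An accident requires every layer to fail at once.
def p_accident(layer_failure_probs):
    p = 1.0
    for prob in layer_failure_probs.values():
        p *= prob
    return p

layers = {"design_review": 0.01, "mcas_logic": 0.10, "pilot_action": 0.05}
print(p_accident(layers))  # roughly 5e-05

# Widen the holes in one layer and the product grows proportionally:
layers["mcas_logic"] = 0.50
print(p_accident(layers))  # roughly 2.5e-04, five times worse
```

Relying on the pilot layer to catch what the MCAS layer lets through only works while the MCAS holes stay small; none of these numbers are real probabilities.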

Edit: I should add that the pilots already got a death sentence for their involvement.


Finding the source of the error is not the same thing as meting out punishment. I am not aware of any pilot being criminally charged for making a terrible mistake.

Though their careers are usually ended by it. That seems fair to me. An engineer at Boeing who could not be relied on will also find his career shunted to working on non-critical stuff. That's fair as well.

> That not all events are caught at the pilot layer is to be expected.

Of course.

But consider that piloting can all be automated today. Why isn't it done? Because pilots are the backup system for unexpected failures. A great deal (most?) of pilot training is training on what to do in an emergency. One aspect of that training is turning off the stab trim system if the system runs away. It's a switch on the console in easy reach, and it's there for a reason.

Remember that Boeing also sent out an Emergency Airworthiness Directive to all MAX pilots after the LA crash. The EA pilots did not follow it.

Do you want to fly with pilots who don't read/understand/remember emergency procedures? I don't. I wouldn't put them in prison, though, I'd just revoke their license to fly. Pilots undergo regular checks for their competency. No pass => no fly.


Dying in plane crashes and having their license revoked are pretty severe punishments. Perhaps if management could similarly have their license to practice management revoked, we could consider that somehow equivalent. I'm sure many people would prefer prison time to dying in a plane crash or having their career destroyed.

I don't trust Boeing and haven't for several decades so while I agree in theory an automated system could fly aircraft with a great deal of reliability I do not trust Boeing to be able to deliver that reliability. I like the idea that someone educated in the safety of the aircraft is up in the airplane with me sharing in the risk of flying even if all they are doing is babysitting a computerized system.

Most of my knowledge of Boeing is from engineers that used to work there and complained bitterly about the degradation of engineering safety culture. And sure, perhaps flying is safer than it has ever been, I think that's more down to improvements in technology than the culture of safety. Unfortunately while technology generally improves it's hard to say the same thing about culture. In general I'm upset that flying is less safe than it could have been.


> Dying in plane crashes and having their license revoked are pretty severe punishments

Neither are punishments. If you stab yourself that's your own doing, not punishment. Removing someone from a position where they cannot be trusted is not a punishment.

Airbus has had problems with its automation systems, too, where the pilots were able to save the airplane. And crashes where the pilots forgot how to recover from a stall.

Nobody is saying safety cannot still be improved. Every accident gets thoroughly investigated, and all contributing causes to the accident get dealt with.

Whipping people is not going to improve matters.


Mark Forkner was a pilot charged in the MCAS debacle (and acquitted).


This comment seems to ignore all kinds of good practices in human factors engineering that are prevalent in aerospace.


Not sure what you mean, but pilots are trained to deal with runaway stabilizer trim. The EA pilots were also sent an Emergency Airworthiness Bulletin reiterating instructions on dealing with runaway stabilizer trim.

They're also trained to reduce thrust when they hear the overspeed horn, rather than continue at full throttle. Overspeeding the aircraft is extremely dangerous, and also makes it almost impossible to manually turn the trim wheel.


As an example, you said they “ignored” the over-speed warning. I would bet it’s much more likely that there was too much going on in the cockpit, likely with confusing or conflicting information, which prevented them from making the correct assessment in a time-critical environment. Expecting the humans to act perfectly when the system is working against them is as bad a design assumption as assuming your airspeed sensor will always act perfectly.


In the LA crash, the pilots restored normal trim something like 25 times over 11 minutes (or maybe it was 11 times in 25 minutes, I forgot). That's plenty of time to realize that the stab trim system should be turned off. They never turned it off.

In the EA crash, they restored normal trim at least once. They had the overspeed warning going off. I don't recall the exact sequence of events, but they turned off the stab trim with the airplane sharply nose down, and tried to restore normal trim by turning the manual wheel. At high speed, they should know that wouldn't work. They'd need to unload the stabilizer first by reducing the speed.

The overspeed warning should never be ignored, as it means parts of the airplane can be torn off. Especially in a dive.

Even so, if they had followed the directions in the Emergency Airworthiness Directive to use the electric trim thumb switches (which override MCAS) they could have restored normal trim.

It's not hard:

1. restore normal trim with the thumb switches

2. turn off the stab trim

That's it.

> Expecting the humans to act perfectly

Reading the EAD and doing steps 1 and 2 is not some super complicated thing. Besides, pilots who get flustered in emergency situations should be washed out of flight school.

My dad was a military pilot for 23 years. He had many in flight emergencies, but kept a cool head and properly resolved each of them. In one, the engine on his F-86 conked out. The tower told him to bail out. But he knew exactly what the rate of descent of the F-86 was, his altitude, his speed, how far he was from the landing strip, the effect of the wind, etc., and calculated he could make the field. Which he did, saving the very expensive airplane. (However, he was reprimanded for not bailing out, as the pilots were more valuable than the jet.) But he was unrepentant, confident in his calculations.

BTW, I've talked to two different 737 pilots on different occasions. Their opinion is the pilots should have been able to recover.


>In the LA crash, the pilots restored normal trim something like 25 times over 11 minutes (or maybe it was 11 times in 25 minutes, I forgot). That's plenty of time to realize that the stab trim system should be turned off. They never turned it off.

Yes, and Boeing also changed the functionality of the stab trim cutout switches such that the flight computer running MCAS could never be isolated from the electric trim switches on the yoke. And use of those trim switches reset the undocumented MCAS system to re-activate 5 seconds after release. MCAS also ramped itself up far beyond the documented limits sent to the regulators, to a max of 2.5 degrees per activation, triggered by a fubar'd, non-redundant, misclassified, intentionally non-cross-checked, ultimately safety-critical sensor!
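To see why that reset behavior mattered, here's a back-of-the-envelope sketch. The 2.5 degrees per activation figure comes from the numbers above; the pilot recovery fraction is a made-up parameter, purely to illustrate how incomplete counter-trimming lets the mistrim ratchet up cycle after cycle:

```python
# Toy model of the MCAS/pilot trim tug-of-war. The per-activation
# figure (2.5 deg nose-down) is from the discussion; the recovery
# fraction is hypothetical.
MCAS_DEG_PER_ACTIVATION = 2.5
PILOT_RECOVERY_FRACTION = 0.8  # assume the pilot trims back only 80%

def residual_mistrim(cycles):
    """Cumulative nose-down mistrim after `cycles` MCAS activations,
    each only partially countered by the pilot's thumb switches."""
    mistrim = 0.0
    for _ in range(cycles):
        mistrim += MCAS_DEG_PER_ACTIVATION                            # MCAS trims nose down
        mistrim -= MCAS_DEG_PER_ACTIVATION * PILOT_RECOVERY_FRACTION  # pilot counters, incompletely
    return mistrim

print(residual_mistrim(10))  # 0.5 deg residual per cycle -> 5.0 deg after 10
```

Under these assumptions each cycle leaves half a degree of uncountered nose-down trim, so the deficit only grows until the system is cut out: that is the ratchet the re-activation behavior enables.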

>In the EA crash, they restored normal trim at least once. They had the overspeed warning going off.

Airspeed unreliable checklist on takeoff/climbout: increase throttle. Maybe they went through a different CL?

>I don't recall the exact sequence of events, but they turned off the stab trim with the airplane sharply nose down, and tried to restore normal trim by turning the manual wheel.

Because of aforementioned screwing with the cutout switches which was not clearly communicated to pilots and only came out in retrospect.

>At high speed, they should know that wouldn't work. They'd need to unload the stabilizer first by reducing the speed.

...using a maneuver removed from the documentation for several versions of the 737, one that only military-trained pilots clued into cold in a simulator, and that a civil aviation captain failed to arrive at, again, due to its undocumented nature since about the dino-737.

>Reading the EAD and doing steps 1 and 2 is not some super complicated thing. Besides, pilots who get flustered in emergency situations should be washed out of flight school.

Aerospace engineers who design airplanes and don't take into account human info processing limitations, human factors research, basic ethics, and an appreciation for the fact that "management can wish in one hand, shit in the other, and see which one fills up first, because I. Am. Not. Going. To. Kill. People." should likewise wash out. But lo and behold, we live in an imperfect world with fallible people. Which generally means we should be on our toes and up front about what we're building, instead of hiding implementation details from regulators for fear of all the money we might lose if we actually do our jobs to the professional standard we should.

>My dad was a military pilot for 23 years. He had many in flight emergencies

>BTW, I've talked to two different 737 pilots on different occasions. Their opinion is the pilots should have been able to recover.

Walter, if it had been built the way it should have, documented as it should have been, and had the simulator training it damn well should have required, this never would have happened.

Stop trying to justify it. This wasn't a bunch of "get 'er done" skunkworks engineers, working on mindbending, cutting edge, ill-understood designs.

This was a civil transport project, co-opted by a bunch of fiscal min-maxers who pressured everyone, and continue to pressure everyone, to cut every corner imaginable.

I genuinely feel bad for you. It'd rip at my soul to see something I worked so hard on fall so damn hard. I'm going through a crisis of faith on that front at the moment. It ain't fun at all.

We cannot afford to be kind to these institutions once we've left them, though. People like you were, and I certainly am, counterweights to people who think that all those corners we fastidiously upkeep and check are just so much waste, when they damn well aren't.


You can also read the Boeing documents. The hazard analysis listed MCAS as hazardous. By Boeing process controls, that should have mandated redundant sensors. Why were they optional? Because that could increase profits. (Seattle Times has done good reporting on this).


Boeing indeed has a failure here, and is responsible for it. But as the first LA incident confirms, the pilots, by following their training, were able to deal with it. It was the subsequent flight on the same airplane that crashed.

This remains omitted by the mass media, which undercuts describing any of its reporting on this as "good".

There is also responsibility in the production of a faulty AOA sensor, and the failure of LA to fix the known faulty sensor, or inform the crew, before authorizing the next, fatal, flight.


I disagree. The media has covered the training (or lack of it) extensively. It seems like you want to point to a singular cause. As with most mishaps with complex systems, it’s the result of a cascade of failures. Training is only one part of the problem. Fixing that is still only an administrative mitigation which is the least desirable form of mitigation when designing a safety system. Even if the pilots had additional training, it wouldn’t fix the root cause of the issue and is just asking for trouble because administrative fixes “only” require humans to act perfectly in accordance with their training, all the time.


> The media has covered the training (or lack of it) extensively.

I've never seen it in the media. Not knowing what the stab trim cutoff switch does is a massive failure either on the part of the pilot or the training. Not reading the Emergency Airworthiness Directive is a failure of the pilot.

> It seems like you want to point to a singular cause

Au contraire. It seems that everyone but me wants to blame a singular cause - Boeing. There was a cascade of failures here:

1. A defective AOA sensor was manufactured and installed

2. After the airplane in the first LA incident landed safely, LA failed to correct the problem before sending it aloft again

3. Single-path MCAS input could not detect a sensor failure

4. Pilots failed to apply known emergency procedures

> require humans to act perfectly in accordance with their training, all the time.

Of course we try to design airplanes so they do not cause emergencies. But we still need pilots to train to react to emergencies. You don't put a person in the pilot seat who is not well-trained in emergency procedures.

Making airplane travel safe means identifying and correcting ALL causes in the cascade of failures. Most accidents are a combination of airplane failure and pilot failure.

Sometimes pilots make deadly mistakes. That's why we have copilots, they check each other. But in the MAX crashes, both pilot and copilot failed to follow known emergency procedures. Why they didn't, I have no idea.


"Boeing" is not a cause. Things like their safety culture can be a cause, or lack of good process control can be a cause. The reason why Boeing is catching a lot of flack is because they deviated substantially from best practices. This is related to your other comment so I'll just respond here.

There is a general hierarchy when it comes to controlling hazards (remember, Boeing already identified MCAS as hazardous): remove the hazard, engineering controls, administrative controls, and PPE. We can apply a little thought experiment to see the gaps in what you're advocating when it comes to the MCAS issue. (Admittedly, this is contrived to just illustrate the point within the confines of a forum reply).

1) Remove the hazard: They could have redesigned the airframe to adjust the center-of-gravity and remove the stall issue that MCAS was developed to address in the first place. Why didn't Boeing do this? Because of cost and schedule pressure to have a new plane ready, after threats that American Airlines would take their business elsewhere.

2) Engineering controls: MCAS was an engineering control, but an incomplete one. Because MCAS was listed as "hazardous" in the hazard analysis, it required redundant sensors. Why didn't they put on redundant sensors as the default? I can only speculate, but considering they were sold as optional, a profit motive seems likely. (This also ignores the fact that MCAS should have been categorized as 'catastrophic', meaning they didn't fully understand the impacts of their system.)

3) Administrative controls: this is the training piece that you're hanging your hat on. This has multiple problems. For one, even though MCAS changed the handling dynamics of the airframe, Boeing pushed hard to reuse the same certification to avoid additional pilot training. Again, this was a business decision to make the airframe more competitive. Secondly, administrative controls are inherently weak because of human factors. There's a lot that can go wrong if your plan is to have humans follow an administrative procedure. You claim "it's not hard". Sorry, this is just a bad mindset. Usually when I hear people say things like "it's not hard" or "all you've gotta do" when they're talking about complex systems, it indicates they have an overly simplified mental model. In this case, you may be ignoring the chaos in the cockpit, the fact that the plane has handling characteristics different from what the pilots were trained on, human factors related to stab trim force at speed, conflicting or confusing information (like why MCAS comes on at changing timeframes), time criticality, etc. Having administrative controls as your main mitigation is bad practice, and sets the system up for failure.

4) PPE: this isn't particularly relevant to this case, but a silly example of PPE control is giving everyone parachutes and helmets in case things went south.

You can obviously see PPE is an absurd control. But your main point is that the main control should be administrative, which is the next worst option. Boeing ignored good safety practice to pursue profit. So they probably deserve some of the heat they are getting.
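The redundant-sensor point under (2) can be sketched in code. This is a hedged illustration of a generic two-sensor cross-check, not Boeing's actual logic, and the disagree threshold is a hypothetical value chosen only for the example:

```python
# Generic two-sensor cross-check (illustrative only, not the MAX's
# real implementation). With two AoA inputs, the flight computer can
# refuse to act on a reading the other sensor contradicts.
AOA_DISAGREE_THRESHOLD_DEG = 5.5  # hypothetical threshold

def cross_checked_aoa(aoa_left, aoa_right):
    """Return a usable AoA value, or None when the sensors disagree,
    in which case an automated trim command should be inhibited."""
    if abs(aoa_left - aoa_right) > AOA_DISAGREE_THRESHOLD_DEG:
        return None  # disagree: inhibit automation, annunciate to the crew
    return (aoa_left + aoa_right) / 2.0

print(cross_checked_aoa(4.0, 4.4))   # sensors agree -> averaged value
print(cross_checked_aoa(4.0, 24.0))  # one sensor failed high -> None
```

A single-path design has nothing to compare against, so a vane failed high feeds straight into the trim command; that is the gap the "hazardous" classification should have closed.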


"redesigned the airframe" means designing an Entirely New Airplane. Boeing didn't do this because designing an entirely new airplane would have been:

1. an absolutely enormous cost, like a couple orders of magnitude more

2. several years of delay

3. pilots would have to be completely retrained

4. the airlines liked the 737 very much

5. mechanics would have to be retrained

6. the inherent risk of a new, untried airframe

There was nothing at all inherently wrong with the concept of the MCAS system, despite what ignorant journalists wrote. What was wrong with it was it was not dual path, had too much authority, and would not back off when the pilot countermanded it. These problems have been corrected.

Pilot training - there have been many, many airplane crashes because pilots trained to fly X and Y did the Y thing when flying X. The aviation industry is very well aware of this. Boeing has been working at least since 1980 (and likely much earlier) to make diverse airplane types fly the same way from the pilot's point of view. This is because it is safer. And yes, it does make pilot training cheaper. Win-win.

> You claim "it's not hard". Sorry, this is just a bad mindset.

Yet it's true. Read about the first MCAS incident, the one that didn't crash. There were 3 pilots in the cockpit, one of whom was deadheading. The airplane did some porpoising while the pilot/copilot would bring the trim back to normal, then the MCAS would turn on, putting it into a dive. Then, the deadheading pilot simply reached forward and turned off the stab trim.

And the day was saved.

I've seen your points many times. They are all congruent with what journalists write about the MAX. Next time you read one of those articles, I recommend you google the author to see what their background is. I have done so, and each time the journalist has no engineering degree, no aeronautical training, no pilot training, no experience with airline operations, and no business experience.

You can also google my experience. I have a degree in mechanical engineering with a minor in aeronautics from Caltech. I worked for Boeing for 3 years on the 757 stabilizer trim design. (It is not identical to the 737 stab trim design, but is close enough for me to understand it.) At one point I knew everything there was to know about the 757 trim system. A couple dozen feet away was the cockpit design group, and we had many very interesting discussions about cockpit user interface design.

I'm not a pilot myself, but other engineers at Boeing were, and they'd take me up flying for fun. Naturally, airplanes were all we talked about. My brother and cousin are pilots, my cousin was a lifer engineer at Boeing, my dad flew for the AF for decades in all kinds of aircraft, with an engineering degree from MIT. I inherited his aviation library of about a thousand books :-/ Naturally, airplanes were a constant topic in our family.

I've talked with two working 737 pilots about the MAX crashes.

I've read the official government documents on the MAX crashes, and the Emergency Airworthiness Directive.

That's what I "hang my hat" on. So go ahead, tell me I don't know what I'm talking about. But before you do, please look up the credentials of the journalists you're getting your information from.

P.S. "Aviation Week" magazine has done some decent reporting.

P.P.S. Amazingly, the "Aviation Disasters" TV documentary is fairly decent in its analysis of various accidents, lacking the histrionics of other treatments. But it's rather shallow, considering it's a 40 minute show.


>designing an entirely new airplane would have been:

1. an absolutely enormous cost, like a couple orders of magnitude more

2. several years of delay

3. pilots would have to be completely retrained

Yes, this is the same case I was making. They took a higher (or unknown) risk for business (profit) motives. That's why they deserve a large chunk (but certainly not all) of the blame.

> There was nothing at all inherently wrong with the concept of the MCAS system

My claim isn't that the concept was inherently wrong, it's that the execution was wrong. Their own process documents say so, and they also betray the fact that Boeing didn't understand its airframe. The damning part of it is that they were likely wrong for the reason of increasing profit. (Still, even if MCAS isn't inherently bad, we still have to acknowledge it's not the best solution...see the discussion above about hierarchies of controls.)

>You claim "it's not hard"...Yet it's true.

This is exactly the wrong way to think about this. Just because a mitigation works some of the time doesn't mean it's the best mitigation. Can I still design a car with a coal-fired steam engine and cat-gut brake lines and drive it safely? Sure. But by modern standards, it's still a sub-par design and the likelihood of safe operation is lower because of it. That likelihood is the entire reason there is a hierarchy of controls. You are advocating against well-established best practices in safety and reliability.

>You can also google my experience. I have a degree in mechanical engineering with a minor in aeronautics from Caltech. I worked for Boeing for 3 years

Please don't do this next time and argue the points rather than appealing to (relatively weak) authority. I'm familiar with your points and can usually set a clock by the time it takes you to either bring up your experience at Boeing or some story about your daddy in these discussions. But you aren't the only one with aerospace experience. I've been an airframe mechanic. I also have an ME and additional engineering degrees to include a PhD and published aerospace-related research. I've worked in NASA for many more years than you worked for Boeing. I filled roles in aerospace, quality/safety, reliability, and software engineering related to both software and hardware design. I've worked alongside Boeing on crew-rated space systems. I've also piloted aircraft (although my ratings are no longer current). I've had dinners and discussed similar issues with pilots and astronauts with thousands and thousands of hours of flight time. But parading out your credentials doesn't make your points any stronger and tends to be the bastion of those without much else to rely upon. This isn't a pissing contest, so please make an argument based on its own merits rather than relying on credentials.


> The damning part of it is that they were likely wrong for the reason of increasing profit.

Are you still advocating designing a whole new airframe instead?

> we still have to acknowledge it's not the best solution

We don't have to agree on that at all. It's an inherently simple solution, although Boeing made mistakes in its implementation.

> Just because a mitigation works some of the time

It's turning off a switch. That switch is supposed to be a "memory item", meaning the crew should not need to consult a checklist to use it. The switch is for dealing with runaway stab trim. Reading the step-by-step of the crisis, it is impossible for me to believe that the pilots did not know they had a runaway trim problem. There are two wheels on the side of the console, painted black and white, that spin when the trim runs, making a loud clack-clack sound. They are physically connected to the stab trim jackscrew with a cable. If the trim motor fails, the crew can manually turn the jackscrew via those wheels.

> You are advocating against well-established best practices in safety and reliability

Turning off a system that is adversely working is well-established in aviation. It's quite effective.

> and argue the points rather than appealing to (relatively weak) authority

Appeal to authority is not a fallacy when one has some authority :-) And so it is fair to list what makes one an authority.

> This isn't a pissing contest

You might want to review the condescending and quite rude post you wrote that I replied to. Your reply here is also rather rude. I don't think I've been rude to you.

Thank you for listing your credentials.


>Are you still advocating designing a whole new airframe instead?

That would be the ideal solution for that hazard. But I can’t say if it’s the best risk profile overall. I would settle for a properly implemented engineered mitigation of MCAS.

> It’s an inherently simple solution

The fact that the engineers who built the thing mischaracterized it would seemingly be evidence to the contrary. I have experienced this flawed thinking often, where software is treated as a quick simple solution without considering the effects on the overall system.

>Appeal to authority is not a fallacy when one has some authority

It actually is. As Carl Sagan said, “mistrust arguments from authority.” But regardless, we seem to have different ideas on what makes someone an “authority”. You may be an aerospace authority when you’re in a room of CRUD software developers, but this forum has a much wider net than that. “Technical authority” is an actual assigned title at NASA, and you probably wouldn’t get it with 3 years of experience from decades prior.

“Turning off a switch” is the easy solution when you’re dealing with the benefit of hindsight. The pilots were operating in a completely different environment. That’s why administrative mitigations are not a favored approach. Boeing simulator results demonstrate that it was a confusing scenario in which to identify the correct root cause in a time-critical situation.

As to the tone of the post, I’ve witnessed you in many aero discussions state your credentials in an “I know what I’m talking about, so that’s the end of it” puffery tone. It in itself comes across as condescending and, worse, adds little to the conversation. I generally try to be respectful on these forums until someone shows they aren’t reciprocating. Normally I just roll my eyes and move on. I’ll give you the benefit of the doubt and assume you don’t even realize the condescending tone some of your posts take. I debated even responding but thought it might make you realize how off-putting your style can be. In your case, it comes across as very arrogant rather than curious, which is contrary to the HN guidelines. If it offended you, I apologize.


I admit to being arrogant, perhaps that's an inevitable side effect of being confident. When proven wrong, however, which has happened on HN (!) I try to admit it. I don't like making mistakes.

I'm not offended, as that's to be expected when I write things that are unpopular.

But I still accept your apology, and no worries. Perhaps we can engage again in the future!

P.S. I know about the simulator issues, but the information came filtered through a journalist and I am skeptical. What I cannot reconcile is the first incident where the deadhead pilot simply turned off the switch, compared with the simulator pilot. It didn't matter whether the runaway trim was the root cause. It did matter that runaway trim will kill you and it must be stopped. All three crews knew that, which is why they fought it.

In my work on the stab trim system, it was accepted that the first resort for trim failure was turning the thing off. Overhead in the 757 is a matrix of circuit breakers, the purpose of each is to turn off a malfunctioning system. The stab trim one, being critical, isn't located overhead but right there on the console.

I work with machinery all the time. When it fails, my first reflex is to always turn it off. For example, one day I smelled smoke. Looking around, smoke was coming out of my computer box. I could see fire through the grille. The very first thing I did was yank the plug out of the wall, the second was to take the box outside. The flames went out when the current was removed.

I simply do not understand failing to turn off a runaway trim system. Especially when it kept coming back on after normal trim was restored.

For another example, an engine fire. I don't know what the fire checklist says, but I bet right at the top it says to activate the fire extinguishers and cut off the fuel and electric power to the engine. I've done the same for an engine fire in my car :-)


>3. Third incident - Boeing sent an Emergency Airworthiness Directive to all MAX pilots. The EA crew did not follow the simple 2 step procedure. A contributing factor is the crew was flying at full throttle, and ignored the overspeed warning horn you could hear on the cockpit voice recorder.

Walter, ffs, they were taking off hot & high from Addis Ababa with airspeed unreliable because of the screwed AoA sensor; a sensor unjustifiably classified too low in terms of consequence for failure, and deliberately wired non-redundantly and kept from being cross-checked in most configurations in order to avoid training burden and cert work to compete with the NEO. Without that AoA sensor, overspeed had no basis; garbage in, garbage out.

Let it go. Boeing f*cked up. Period. Their (the pilot's) instruments and alarms were completely untrustworthy. You (Boeing) don't get to blame pilots for not doing memory items when your (again, Boeing's) technical design is screwed from the first. Your 757 work was fine. The MAX as it was was not. Nothing will ever make it so.


P.P.S. My work was checked by others, like the stress group, my lead engineer, and the test group.


> Yes! Even as an engineer. I've quit jobs over ethics / safety concerns and I want that pushback to be the norm

Yes

But it will be much more effective if the managers risk jail

Engineers often need the money and simply cannot quit like that


You had the privilege of being able to quit the job over ethics / concerns. I'm assuming you didn't wind up homeless with starving children. I have definitely not always had that in my career and many people don't.


I have ME/CFS, which means for most of my life I've barely been able to make enough in the good times to cover the extended bad times. Running out of money for me is basically a death sentence, as it would become increasingly impossible to dig my way out of such a hole. Normal people at their worst could probably stack shelves or wash dishes for a living, but that option is not available to me. I'm better at managing my condition now, but at the time I quit jobs for ethical reasons I did so knowing that if I didn't find something else quickly then I would, eh, end up taking a long walk on a short pier. To make matters worse, I'm highly noise intolerant and have to spend more money to live in quieter places, otherwise my already precarious health would rapidly deteriorate.

As engineers we are often responsible for the creation of assets more valuable than ourselves, and I think it's an essential part of the job to put our lives on the line in much the same way an airplane passenger does. And as engineers we are often also airplane passengers ourselves, and must trust that other engineers have likewise put our safety ahead of their personal wellbeing.

As the middle class is crushed then sure, such ethical boundaries will fade out of necessity, and I see this as part of a descent into a low trust society and why I expect more planes to fall out of the sky, both metaphorically and literally. Once a culture tolerates such flexible ethics, the boundaries will continue being pushed - there isn't a lower bound. This corruption inhibits the creation of valuable assets and will result in a massive erosion of our standard of living.

An often touted solution is universal basic income (UBI), which would create a safety net for engineers and those with my type of disability - but having experienced constant gaslighting on ME/CFS from doctors informed by state-funded research, and given the expansion of Canadian MAID-style solutions to people like myself, I'm very fortunate that capitalist opportunities existed such that I did not have to rely on state 'care'.


I was homeless at 17 in high school before going into the Marines in 2002. I know what having negative money is like, health problems or not.

That doesn't change the fact that it comes from absolute privilege to tell someone that they should just quit their job because you don't agree with the ethics or morality of work they're doing.

If you HAVE the privilege, the savings, the good health to not need COBRA, etc., then by all means walk away. You also don't mention having children. I know a LOT of parents, and they absolutely don't have the money to be able to just walk away and keep their kids' lifestyles, healthcare, etc. until finding something new.

But judging someone potentially desperate and already on the edge for where they work is, in most cases, pretty off. You have your own story someone wouldn't be aware of without you telling them, one that as you say may make it impossible to dig yourself out of a hole. Who knows what situation someone's in?


I disagree again, and I would much rather be homeless than have ME/CFS, I'd still make that trade today. You clearly don't know what it means to have ME/CFS and I don't begrudge that, hardly anyone does.

I would rather you be homeless than have you be part of making an airplane unsafe and I will judge you for that. I don't care how hard life is for you and I don't expect others to care how life has been for me - if you are part of making other people unnecessarily unsafe I hope you burn in hell for eternity. It's not a matter of privilege and just because you have found your way of being ok with it does not mean I still will not judge such decisions as unethical, immoral, improper and unbecoming of human in general and an engineer specifically.


What absolutely weird replies. You clearly are the victim of the world, sir. Nobody has worse problems than you!

None of this was a competition, super strange to make it so and not just recognize not everyone can quit their job today, nod your head and agree.

Because that is a fact.


What part of "I don't expect others to care how life has been for me" did you not understand?

I would like to think that even without ME/CFS I would still hold such views.


"I recognize that my life experience is not the life experience of everyone else. Some have it better, some have it worse, regardless of how good or bad MY OWN life is.

I recognize that Jane with 4 starving kids, one with cancer, making $7.15/hr with an empty fridge and a broken down car potentially can't just quit her job at the factory today even though they switched from building cars to bombs.

My life is not a comparison against other peoples lives."

Thanks. Just because you have problems doesn't mean they're better or worse than others. This isn't about you. I'm sure your response is "I'd rather be jane than have CFS," or something about how you're morally superior and would be homeless before making a bomb/whatever.

Absolutely weird thought process, especially from someone who has never been homeless but would prefer to be homeless than xyz. Very im14andedgy vibe.


Well, I've also been homeless... before my ME/CFS was bad so it was much easier to deal with. Anyway, that's irrelevant.

You can recognize that such people will make such decisions, and I agree that they will, but where I don't agree is that making such decisions is in any way morally OK. If Jane chooses to keep her job where she is knowingly cutting corners on making lifesaving widgets, I will judge her harshly for that regardless of her personal circumstances. If Jane's work is unimportant then I couldn't care less.


> None of this was a competition

You tried to invalidate their position by making this about personal experiences. When they counter with personal experiences, you don't get to cry foul and claim they made it a "competition".

And your first paragraph is just making shit up. Come on.


Honestly, meh, especially for white collar professions like ours or this one. First of all, generally we are not talking about people who have no other choice but to do something unethical. We are talking about people who have choices and are choosing either a somewhat higher salary or inertia.

And second, I do know multiple people who made the choice of not getting employment somewhere unethical. It is not some kind of rare decision making. These people have children and parents that depend on them.

It is not some great, unusual privilege to walk away from a job like this, as you want to frame it. It is where quite a lot of technicians are. You could make that argument about a factory near some trailer park or poor area, but not really about us.


> you could make that argument about factory near some trailer park or poor area, but not really about us.

That's the entire point. We're privileged. My god. My example was literally a mom in a factory with starving children who began building bombs instead of cars.


Yes. This might be unfamiliar to software engineers, but there's a reason mechanical engineering and architecture (there might be more, but these are the ones I'm familiar with) schools put students through at least a semester of ethics classes.

We are literally the last line of defense in a lot of industries. From the first line of the ASME code of ethics:

> Engineers shall hold paramount the safety, health and welfare of the public in the performance of their professional duties.

If you can't do that, you should be in another field. Maybe software engineering at Meta would be a better fit.


I went to a Science and Engineering school, and everyone had to take ethics: ME's, ChemE's, CompEng, Civs, etc.

This is such an important part of engineering as a profession.


This is an absolute joke in almost all fields of engineering. Mileage may vary, but it's basically the wild west. I can't speak for PEs, but obviously there's at least a little more regulation there.

However, a huge number of engineers critical to safety would not have one.


This is a noble goal but unrealistic and unfair at scale. It also betrays a wildly unrealistic understanding of how risk distribution works in corporations. It’s as if in your mind one engineer makes the singular call and sees the entire picture of the project and its risks, whereas more likely a downstream worker who very likely had an incomplete picture will be blamed.

Also the personal insult is not needed. It just makes you out to be an ass rather than someone making a point.

Consider your romantic notions might not be realistic.


>It’s as if in your mind one engineer makes the singular call and sees the entire picture of the project and its risks.

To a certain extent this is true. To extend the OP's example, a mechanical or civil engineer who stamps a building design takes responsibility for that design being safe enough for public use. The same could apply to the MCAS design, or the quality engineer who signs off on how the plane was put together. These are not "romatic notions"; the latter is already common practice in aerospace. Do you think if a QE stamped that the door bolts were installed should not bear responsibility for the bolts being missing?

In those cases where the engineer is supposed to be aware of the risks, there should be processes in place to account for, and mitigate, those risks. For the example of MCAS, the software was categorized as hazardous in the hazard analysis document. As such, Boeing processes required it to have redundant inputs by default. It did not (the redundant sensor was sold as optional equipment); as such, I think the engineer who signed off on the hazard analysis may bear some responsibility. If, on the other hand, there were no business processes in place to require redundancy, that seems like a CEO responsibility.


I didn't read it as a "personal insult".

It's true that most software engineers never work on safety-critical systems. The nature of the products they work on, or the nature of their contributions to those products, limits the possible impact of flaws and mistakes.

Once you work on a product with safety-critical features and/or become responsible for one, there really is a marked psychological change that takes place. I've lived through that change in my own career, and many comments (including some I've written, I'm sure) now strike me as callous and trivializing important matters to an extent that grates.

It's important to hold engineers accountable for their work in proportion to their impact, which is typically very significant. If you build machines that can kill, stakes should exist for you as well.


I don't see a personal insult, but maybe SWEs should take the ethics of their work more personally. Maybe the shape of our epidemic of social and psychological dysfunction would be different (if not smaller).

When I worked at a retailer, we were told to push buy-now-pay-later schemes on our customers (on top of the corporate credit card, and insurance plans, and service plans). Most of my coworkers simply decided that they weren't going to - in part, because such schemes are nakedly predatory, but also because it was yet another dumb metric we knew we'd be held to if it were successful. I don't know of anyone who was fired or docked pay or denied a promotion because of it, and it's not worth luring someone into financial ruin over a TV and an entry-level sales position. Which leads me to another thought: if you're so low in the chain that you have no say in how and what you work on, is it a job that's worth sacrificing your morals for? Maybe take advantage of that increased earnings potential from job-hopping and get out of dodge.


To the contrary, there are usually _several_ engineers who see the ethical/safety problems and talk about them with colleagues. They just don't report it to the right authorities.


Nobody asks the engineer to see the entire picture. (Very few, senior, engineers see the entire picture - to the shock of many newbies. But that's for another discussion). The problem is when the engineer DOES see the problem and doesn't question the project, and stays on, and doesn't figure out a way to get it questioned, and in the end doesn't report anything. THAT is an ethics problem. And yes, in some companies it will cost you your job. That's where the other jobs come in - because there ARE other jobs.


> This is a noble goal but unrealistic and unfair at scale.

It is unrealistic in the current organisation of software development because that organisation doesn't allow it.

In the building sector you can't build without your design being signed off by an engineering office, and the engineer and their office take the responsibility for it, at both the criminal and civil level.

If the same rule existed for the software industry, the organisation would adapt itself around it.


I'm sure they can try. But put yourself in the shoes of that engineer. Imagine what would go through your head. My entire MO would be CYA. Every 1on1 conversation, secretly recorded (thank you favorable NY state laws), and a paper trail a mile long. For just this reason.


I don't get what you are trying to say, though. If companies can use 'a mile long paper trail', why can't the engineer do the same? If it turns out they were forced to do X over their own objections (people do have mortgages and need medical coverage), then that manager gets the blame. Seems fair?


A manager needs to do their own CYA. If they don't, well, that's on them. But you as an individual should only focus on what you can control.


CYA emails are not exactly a new concept. Perfectly acceptable from rank and file engineers. And the more senior people you are sending that to - if they are not clueless which does happen - know exactly what this is about.

Project documentation also not a new concept. Also acceptable.


Professionally licensed engineers are required to take an oath to protect the public. What many don't realize is that manufacturers like Boeing operate under a licensing exemption, on the assumption that business and manufacturing processes are sufficient to protect the public. In cases like this, there is an argument that those processes are insufficient.

There has been a movement in some states to remove the manufacturing exemption, in which case the cognizant engineer would bear the responsibility. The optimistic view is that this will help balance the power between the engineer and the C-suite. The cynical view is that this will allow the C-suite to push responsibility down to the engineer.


There's a code of ethics for a reason.


Such decisions are way above the pay grade of random engineers


Never forget Kareem Serageldin - the one person who went to jail for the 2007 financial crisis.

In fairness, he was an exec making $7M per year - so not a nobody, but he was only managing 70 people (Credit Suisse was a ~50k person company), and obviously not the only person who should've gone to jail.


A systemic failure doesn't necessarily mean that people should go to jail; Kareem lied about the price of bonds and deserved jail time.


Everyone lied about the price of bonds, that's why our major investment banks still exist (though, I suppose CS doesn't at this point; that's another story).


What is your point? (1) That others should have gone to jail too, or that (2) Kareem was only doing what everyone else was doing?

For (1), were other people prosecuted? Maybe there was insufficient evidence or process issues that got in the way of additional convictions.

For (2), how does that excuse a crime?

Kareem's story is fascinating for many reasons and I don't find lying about bond prices to be a crime to hate someone over. But, a crime it is, nonetheless.


The point that was made earlier in the thread is definitely 1.

If there was insufficient evidence in every other case, then that's a massive systemic problem we need to fix.

And I can't think of any "process issue" that would cause this particular outcome other than "massive corruption".


Whether to do something you know is morally wrong or refuse and quit is always your pay grade.

And don't give me the Yuppie Nuremberg Defense. "I was just following orders... because I've got a mortgage!" That doesn't fly when lives are on the line.


I think people have a very romantic notion that there is some meeting where all the engineers get together and decide whether or not they are going to kill their users.


Not to "kill their users" but they realistically have meeting to vote on whether the system is "go/nogo". This is the basic structure of many aerospace designs. If you ever watch a rocket launch, they are literally checking in with every system owner for that decision. Prior to even getting to the launch date, there are meetings where they are checking with the designers for similar decisions. There are good documentaries/movies that show this dramatized in examples like the shuttle Challenger.


Yeah, and with Challenger only one engineer raised the alarm while a large group of well-meaning engineers overrode him.

This is reality. Decisions aren’t kill/not kill.


Yeah we both agree that the decision isn’t “kill/not kill”.

It’s much more ambiguous and murky, especially when dealing with low probability events. That’s why it takes expertise, but it doesn’t mean there shouldn’t be accountability for those “expert” decisions. Other engineering domains already have this; there isn’t anything that makes these decisions inherently different.


When you criminalize the role, either no one will take it or an incompetent patsy will. I don't know when HN became a bunch of keyboard jockeys white-knighting what they'd do... but reality is a very different story.


When you avoid accountability, you pervert the risk/benefit calculus. People get incentivized to roll the dice on low-probability/high-severity events because most of the time it will come out in their favor and they can continue cashing their checks. But in the off times when it doesn't, people just shake their heads and say "Who could've known!?" when in reality there were people sounding the alarm the entire time. The outcome you're advocating hides the risk in the shadows where nobody talks about it. I'm advocating a structure that isn't necessarily risk-averse, but at least lays it all out for transparent, risk-informed decision making.

You're talking to someone who spent years working in the aerospace industry. I'm not sure, but it seems like you have some hypothetical idea of what reality is, but it doesn't align with my actual experience. Not to sound disrespectful, but it sounds like one of us actually has experience and the other is going off a narrative they've created in their head.

I do hear people often talk about the "incompetent patsy" excuse. But I'm curious, where do you think that stops being relevant? Do you not hold medical doctors liable for decisions because a patient will just find another "incompetent patsy" to prescribe them whatever they want? Do you not expect civil engineers to be responsible for a structural design of a bridge because a company will find an "incompetent patsy" to sign off on a sub-standard design that is better for profit margins? We're used to holding all kinds of professionals liable, but there seems to be a cultural shift in the last 40-50 years where we've rationalized bad behavior as the norm rather than holding people accountable.


Wait, what? I'm sure here we can relate to the software side of things. How many times have you been in a position of saying "go/nogo" where saying "nogo" didn't imply it would also be your last day on the job? (or at least a severe Career Limiting Move)


For a relatively brief period, I worked as a software quality engineer related to aerospace research and development. Practically every test that used software required a “go/nogo” decision before they were allowed to run any safety critical test. The safety chief would give the overall “go-nogo” decision but the subsystems (like software safety) would require “go-nogo” decisions to be fed through the chief.

As I’ve said elsewhere, good organizations implement a distinct chain of command for those decisions so they can be made more impartially. Even then, it’s not without career risk, but IMO that’s part of the gig and why it takes a certain amount of professional integrity. As someone else said, if someone isn’t up to that task, maybe developing safety critical software isn’t the right gig for them.


I was able to formally state a "nogo" when I worked in banks. The management could bypass it, but it was then a formalized decision on their part, and it discharged me of any liability for the problem that led to my "nogo".


They aren’t in these industries. Relatively low-level engineers have the capability (some would say duty) to stop work. Typically, they have a different chain of command for that very reason.


No they are not! You are responsible for what you sign off on/seal. If you don't want that responsibility then join another profession.


Importantly here, Boeing doesn’t require licensed engineers because they operate under an industrial exemption.


Sounds like it's time for that to go away


> it would be great for people behind this to be directly tried for their criminal actions as well.

How did you jump to engineering directly from this sentence?


Yes. Absolutely.



