Are you asking about Teslas in general, or Teslas being driven on Autopilot? Autopilot so far has a fatality rate of roughly 0.25 per 100 million miles. Just for information's sake, the overall rate in the US in 2017 was 1.16 per 100 million miles, although there are plenty of caveats that should prevent you from comparing those numbers directly. The national number is for all cars, all drivers, all conditions, etc., while those driving a Tesla on Autopilot are generally in safer cars, under safer conditions, and are likely safer drivers than average.
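For concreteness, a "per 100 million miles" rate is just fatalities divided by vehicle-miles, scaled up. A minimal sketch (the input figures below are illustrative assumptions, not the actual Tesla or NHTSA source data):

```python
def fatalities_per_100m_miles(fatalities, miles_driven):
    """Deaths per 100 million vehicle-miles, the unit used above."""
    return fatalities / miles_driven * 100_000_000

# Hypothetical: 5 fatalities over 2 billion Autopilot miles.
print(fatalities_per_100m_miles(5, 2_000_000_000))  # 0.25

# Roughly 2017-scale national figures: ~37,133 deaths over ~3.21 trillion miles.
print(round(fatalities_per_100m_miles(37_133, 3_210_000_000_000), 2))  # 1.16
```

The unit normalization is the easy part; the hard part, as noted above, is that the two populations of miles are not comparable.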
For all driving protocols P, and accidents A, P would have zero fatalities if it switched to a better protocol just before encountering the situation that led to A.
That stat of 1.16 includes motor vehicles, motorcycles, pedestrians, buses, bicyclists and trucks. How is this a comparison to luxury electric cars again?
So... Two planes falling out of the sky, killing nearly 350 people, shouldn't be taken as proving anything?
Or how about one stupid decision by a radiotherapy machine? Ever heard of the Therac-25?
In each case, it was just 1 stupid decision by a machine.
The entire reason Engineering as a practice is a thing is because when you implement the capacity for a stupid decision into a system that is then mass produced, dire consequences can result.
I look down on any thought process that doesn't discriminate between 0 and 1.
If the system provably worked, that decision would not have happened (0). The stupid decision happened (1), however, which means it can happen again at a poorly understood confluence of circumstances.
To err is human, and we forgive each other every day for it.
To err as a machine is a condemnation to the refuse bucket, repair shop, or back to the drawing board.
To err so egregiously as a machine as to cause an operator and those around them to lose their lives is willful moral negligence on the part of the system's designer. Slack is cut when good faith is demonstrated, but liability is unambiguous. The hazard would not be there if you hadn't put it there.
And people have died and been paralyzed as a direct result of the flu vaccine. Deaths and paralysis that would not have occurred had they not received that vaccine.
Does that mean the flu vaccine should get dumped in the refuse bin?
Similarly, some other aircraft flying today/tomorrow has automation with an unknown bug/issue that will cause loss of life. Should we disable everything except the 6-pack and stick and rudder?
It would have saved the lives lost to automation, but we would have more aviation deaths overall.
Liability is still clear, and in the specific case of medical practice, a degree of "we can't foresee everything" is implicit in that our understanding of the governing principles of the human body is incomplete.
Aviation does not have that excuse. The 737 MAX 8's system description is enumerated from the ground up. Given how much recertification effort Boeing was spared by treating it as a derivative design, the failure to properly handle the MCAS implementation is all the more damning.
This wasn't some subtle bug. This was an outright terrible design choice. Anyone with any experience composing complex systems out of smaller functional building blocks should have been able to look at the outputs, look at the inputs, and realize there was the potential for catastrophic malfunction.
As I've said elsewhere, automation should make flying a plane easier when it's functional. When it's non-functional, however, the pilot should still be able to salvage the plane. That requires clear communication of what the automation does and what its failure modes are.