Saying it caused "a fatality" is being disingenuous. It rests on the same false premise as "we shouldn't use self-driving cars at all if they lead to even one fatality".
Fatalities aren't good things, but they're inevitable consequences of driving. And it's disingenuous to expect anyone's code to be guaranteed to work on every mile of road in the US, under all conditions, at all times. If autonomous vehicles substantially reduce overall fatalities, we are better off.
(Nobody talked about banning combustion vehicles after the Pinto, or banning Fords over their side-mounted fuel tanks, or over their Explorers that rolled over when the defective tires blew out.)
> Saying it caused "a fatality" is being disingenuous.
It is accurate.
They pushed out an update that changed how Autopilot behaved, someone had Autopilot enabled, Autopilot accelerated straight into a concrete barrier, and the occupant died.
Here is the NTSB's preliminary report on the incident:
> It rests on the same false premise as "we shouldn't use self-driving cars at all if they lead to even one fatality".
I didn't say anything remotely like that. I stated what had already occurred. Putting those words in my mouth suggests you're not responding in good faith.
"You have no proof that the accident wouldn't have occurred anyway without the update."
Well, the NTSB preliminary report doesn't make any statements about what the software update did, per se. But it does indicate that the driver's hands were not on the wheel at the time of the accident and that the "autopilot" software was activated. So it's fair to say that the "autopilot" software was responsible for the crash.
"Or that their updates didn't save one or more lives."
This is a red herring. It could be true, but there's no evidence for it, so it's not worth thinking about. Our system of moral judgments rightly puts a lot of weight on demonstrable causes and effects, and generally ignores hypothetical but totally unproven speculation like this. Otherwise anyone could get off the hook for anything by saying, "I may have done bad action X, but you can't prove I didn't also do good actions Y and Z, which could well outweigh X."
"Not an "accurate" but misleading accusation at the software without considering the hidden variables."
In ordinary life we never know all the hidden variables. We just make judgments based on the best available information (or, if necessary, postpone judgment until better information is available). The NTSB preliminary report seems credible, and I see no reason not to draw reasonable conclusions from it.