> Also, let's back up our AIs with old-school collision avoidance! Intelligence is not the same as perfection, at least for now.
The car was travelling at 38mph and never braked. Even if the collision-avoidance system had only seen her at the last moment, it would still have braked and potentially slowed the car enough that the woman would have been injured instead of killed.
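A quick back-of-the-envelope sketch of that point. The ~7 m/s² emergency deceleration and the braking windows here are my assumptions, not figures from any report:

```python
# Rough check: how much does even a very late brake application
# cut the impact speed of a car doing 38 mph?
# Assumes full braking at ~7 m/s^2 (dry-asphalt emergency stop);
# the braking windows below are illustrative guesses.

MPH_TO_MS = 0.44704
EMERGENCY_DECEL = 7.0  # m/s^2, assumed

def impact_speed(initial_mph: float, braking_seconds: float) -> float:
    """Speed (mph) at impact if the car brakes hard for the final
    `braking_seconds` before reaching the pedestrian."""
    v0 = initial_mph * MPH_TO_MS
    v = max(0.0, v0 - EMERGENCY_DECEL * braking_seconds)
    return v / MPH_TO_MS

for t in (0.5, 1.0, 1.5):
    print(f"braking for last {t:.1f}s: ~{impact_speed(38, t):.0f} mph at impact")
# braking for last 0.5s: ~30 mph at impact
# braking for last 1.0s: ~22 mph at impact
# braking for last 1.5s: ~15 mph at impact
```

Even a single second of hard braking takes a big chunk out of the impact speed, which is the point: late detection is still far better than no detection.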
I'm all for self-driving and fully believe it can improve on humans, but I don't see how it's possible for self-driving cars to be on the road if they can't properly detect the most vulnerable users in all conditions.
This is absolutely a case where an autonomous system should have outperformed a human driver. Lasers see through darkness very well. A camera-only system is insufficient.
Not to mention it wasn't even very dark on the road. If you look at other videos and images, that stretch is very well lit. The dash cam footage was poorly calibrated/selected (though it was probably not the feed used in the AI's decision process).
Humans can't properly detect the most vulnerable users in all conditions. Not an excuse for this system failing to perform, but all self-driving cars have to do is be a little bit better for net safety to improve. People are notoriously unreliable. Computers don't get drunk or sleepy, for instance.
> all self-driving cars have to do is be a little bit better for net safety to improve.
That's true abstractly but ignores several important real-world factors about the adoption of self-driving cars.
On the one hand, autonomous cars have to be a lot better than humans to prevent these sorts of PR trash can fires, or they won't be given the opportunity to improve net safety.
On the other hand, people are so bad that we're liable to soon live in a world of autonomous cars, regardless of the effect on net safety.
I hope they can be made safe, because it's vital for the future of our car-obsessed culture. But I don't have as much faith as you.