>What if the wetware system kills 100 times as many people, should you still be able to arbitrarily override the automated driver?
Yes.
Because human beings are the result of a million-odd years of evolution and are perfectly capable of making their own decisions, negotiating obstacles, recognizing patterns, reasoning adaptively, and responding reflexively. Deciding that people are too stupid and clumsy to be allowed the luxury of driving, so car companies, insurance companies, and Google should just decide what's in their best interests, is essentially a well-meaning but potentially pernicious form of autocracy.
Pun intended.
People die in house fires all the time too; maybe they shouldn't be allowed to cook their own food? People die from accidental gunshots and police misconduct, so clearly automated weapons systems should be the only ones allowed to enforce the law? Yes, sometimes there are auto accidents. But also, millions of drivers make it to and from their destinations without incident every day. Proponents of self-driving cars make it seem as if it's a bloodsport to let humans drive their own cars. Yet they seem to implicitly trust the result of those same humans building the network and infrastructure to do the driving for them.
How many people die because of a manual override that does worse than the computer? How many people die because the computer itself made a mistake that a human could reasonably have overridden safely?
I fully expect the second number to eventually be much lower than the first. And when it is, I'll be glad when manual driving is considered a lesser crime. Because make no mistake: risking your own life is your choice. Drive on tracks, smoke, drink, have unprotected sex (with an informed and willing partner), whatever. On the other hand, driving on public roads puts everyone else at risk. A small risk, but still. I say that taking such control over other people's lives is simply unethical.
> Yet they seem to implicitly trust the result of those same humans building the network and infrastructure to do the driving for them.
Of course, never trust them farther than you can throw them. But you can throw them pretty far: just sue their ass off whenever they're responsible for an otherwise avoidable accident (like a bug in the software caused by sloppy practices). Also, they can test the cars, and you can expect more consistency compared to human drivers.
Those are good arguments for the automation existing to begin with, but don't address why manual controls shouldn't be available.
Which to me is the crux of the issue: in the scenario without an available override, when something goes wrong, there isn't anything you can do about it except hope that it doesn't kill you, because at that point it's no longer about the AI and all about collision physics.