Something I don't get: what is assisted driving for? I can see it being used in a transition period, where we want to assess the reliability of computers, gather a corpus of driving data… But once the car can drive itself, why not let it? That way, no matter how unreliable you are, your driving won't influence the car.
Higher insurance rates will be only for people who insist on driving their cars. I believe this would be a very good thing.
A system in which no one ever has any control over where their vehicles choose to take them is not necessarily benign. Such a system can be tampered with, taken over, or simply glitch because some idiot caused an overflow, or the maps are inaccurate, or all of a sudden $something_inefficient.
The purpose of 'assisted' driving lies in the premise that a 'perfect' automated system doesn't exist: any real one will fail, and when it does, it should fail over to wetware.
The notion of "control" is silly: you will still be the captain of the car, just not the driver. What if the wetware system kills 100 times as many people? Should you still be able to arbitrarily override the automated driver?
I already accept, in a lot of situations, that I should let machines handle a task for me. I drive an automatic transmission rather than shift myself. I have traction control that overrides my foot to give me better control of the car. I fly in planes that are mostly flown by autopilot.
Assisted driving will only be a short stopgap. It will quickly be gotten rid of because of the cost of maintaining two systems and the ugliness of a "driver's seat". Interiors will look nothing like they do now. There will be no drive shaft (fully electric), so no big bump or center console in the middle of the car. No steering column, no mirrors, and a completely different arrangement of seats than now.
In other words, a fully automated car looks nothing like an assisted-driving car. I doubt people are going to want to pay more and sit in a less comfortable cabin just for the opportunity to override the computer driver.
>What if the wetware system kills 100 times as many people? Should you still be able to arbitrarily override the automated driver?
Yes.
Because human beings are the result of a million-odd years of evolution and are perfectly capable of making their own decisions, negotiating obstacles, recognizing patterns, reasoning adaptively, and responding reflexively. Deciding that people are too stupid and clumsy to be allowed the luxury of driving, and that car companies, insurance companies, and Google should therefore decide what's in their best interests, is a well-meaning but potentially pernicious form of autocracy.
Pun intended.
People die in house fires all the time too; maybe they shouldn't be allowed to cook their own food? People die from accidental gunshots and police misconduct, so clearly automated weapons systems should be the only ones allowed to enforce the law? Yes, sometimes there are auto accidents. But also, millions of drivers make it to and from their destinations without incident every day. Proponents of self-driving cars make it seem as if it's a bloodsport to let humans drive their own cars. Yet they seem to implicitly trust the result of those same humans building the network and infrastructure to do the driving for them.
How many people die because of a manual override that does worse than the computer? How many people die because the computer itself made a mistake that a human could reasonably have overridden safely?
I fully expect the second number to end up much lower than the first, eventually. And when that happens, I'll be glad to see manual driving treated as a lesser crime. Because make no mistake: risking your own life is your choice. Drive on tracks, smoke, drink, have unprotected sex (with an informed and willing partner), whatever. On the other hand, driving on public roads puts everyone else at risk. A small risk, but still. I say that taking that kind of control over other people's lives is simply unethical.
> Yet they seem to implicitly trust the result of those same humans building the network and infrastructure to do the driving for them.
Of course, never trust them farther than you can throw them. But you can throw them pretty far: just sue their ass off whenever they're responsible for an otherwise avoidable accident (like a bug in the software caused by sloppy practices). Also, they can test the cars, and you can expect more consistency from them than from human drivers.
Those are good arguments for the automation existing to begin with, but they don't address why manual controls shouldn't be available.
That, to me, is the crux of the issue: in the scenario without an available override, when something goes wrong there isn't anything you can do about it except hope that it doesn't kill you, because at that point it's no longer about the AI and all about collision physics.
I bet cities with congested infrastructure (read: not in the US) will be the first to go 100% autonomous just to optimize infrastructure utilization. Other cities (read: in the US) will then be forced to follow to remain economically competitive (b/c cheaper/better utilized infrastructure = lower taxes + higher economic output).
You'll probably still be able to drive in the countryside, though.
This perhaps unintentionally describes the entire population. I believe this will be a huge barrier to adoption overall.
Americans, in particular, don't drive cars for transportation. They drive cars because they wish to maintain the idea of having complete control over their lives.