I'm looking forward to seeing how the "corner cases" related to society will be handled.
E.g. somebody who wants to rob you could just stand in front of your car (or place some small obstacle), knowing that the car will stop (to avoid damaging itself and/or whatever is in front of it).
Even with simpler tech that is already generally available today, some unexpected problems might arise once people become really aware that it exists on 99% of vehicles.
For example, in the case of an at least partially selfish person, why should s/he wait to cross the street if s/he knows that the approaching car has an anti-collision/auto-brake system? I admit that I might even start doing that myself whenever I'm late for something, have to catch the bus, etc... :)
Might end up being a funny experience, sitting in a car that has to do emergency braking every other minute... :)
> why should s/he wait to cross the street if s/he knows that the approaching car has an anti-collision/auto-brake system
I would highly recommend visiting Cairo in Egypt and crossing a busy road:
With a swarm of cars going left and right, you simply 'pick a car' by looking directly at it and step out into the stream of (multi-lane) traffic. It's freaky at first, but it slows, moves and bends around you as you walk across the road. I've seen mothers with bags and children doing this at even busier places.
It seems that here in the west, when we're driving, we have a 'get out of my way' mentality that seems out of proportion with how we walk down a pavement. I.e., as we walk, we move and flow around slower/oncoming people -- we certainly don't shout at fellow pedestrians to get out of the way and then barge them aside when they don't.
Egypt also has one of the highest rates of traffic-related deaths in the world. I don’t quite understand why your interpretation of it turns into a criticism of the “west”. The Egyptian system you describe is a dangerous mess, and significantly inferior to western traffic laws.
Funny, in Hanoi I noticed a similar tactic -- make it absolutely clear that you can't see the oncoming car. Look at the ground. Walk steadily. This is "precommitment" to not flinching in the game of "chicken" (in a game-theoretic sense.)
I like to joke that it's why people wear those big conical hats in Vietnam -- you can't see the cars, so it's their job to avoid you :-).
So, once everybody owns a self-driving car, or at least one with auto-braking, nobody would have to use the Egyptian "pick a car" method, as people would just cross the street knowing that the cars would brake/stop on their own.
Would traffic come to a standstill in big cities (as it does today)? Or would "good habits" prevail? Or would it be a grey zone?
Pedestrians already have this theoretical right. In Montreal, it's heavily used. And it's fine - if you wanted to get somewhere quickly, after all, you would walk!
Maybe that would be a good way of making cities more human friendly instead of car friendly - if you want to cross the road, just go for it and the cars will stop or avoid you. Probably not a good idea on highways and fast roads obviously.
Interesting point of view - this might be the real end result of implementing such automation. Still, some details will be tricky to solve...
I saw someone criticize self-driving cars by saying that if someone just covered a stop sign, the car wouldn't know to stop. This is different than the status quo how?
The status quo is "a large number of drivers will recognize that the intersection should probably have one and at least exercise some caution".
It should be possible to program a self-driving car with a) a database of known intersections and their conditions as a fallback, and b) the ability to evaluate the conditions of an upcoming intersection for safety even without the requisite signage.
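The two-layer fallback could be sketched roughly like this. Everything here is a hypothetical illustration -- the data structures, the 50 m sightline threshold, and the decision rule are my assumptions, not any vendor's actual interface:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MapEntry:
    has_stop_sign: bool  # what the prior map database says about this intersection

@dataclass
class SensorView:
    sign_detected: bool      # did live perception see a stop sign?
    cross_traffic: bool      # is cross traffic visible right now?
    sight_distance_m: float  # how far down the cross street we can see

def should_treat_as_stop(map_entry: Optional[MapEntry], view: SensorView) -> bool:
    # a) the map database is the fallback: if it says there's a stop sign
    #    here, honor it even when the physical sign is missing or covered
    if map_entry is not None and map_entry.has_stop_sign:
        return True
    # b) with no map data, judge the intersection on its own merits:
    #    a detected sign, visible cross traffic, or poor sightlines
    #    all mean "treat this as a stop"
    return view.sign_detected or view.cross_traffic or view.sight_distance_m < 50.0
```

The point is just that the signage check is one input among several, not the sole trigger.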
I'd suspect they do that already - based on what I've read, they're not relying entirely on signage, and the lidar is looking for potential hazards. I'd suspect it would act similarly to a human in that situation.
Maybe it'd run it if there's no one else around, but a human probably would too in that case.
A while back, one of the four-way stops in our area got turned into a two-way stop. It took months for folks to stop treating it as a four-way, despite "new traffic pattern" signs.
Another imperfect analogy is power outages and stop lights. For the most part, it doesn't result in full-speed collisions; instead, folks treat it as a stop sign.
Self-driving cars are going to need a "huh, that doesn't look right..." system. Detecting missing/defaced/misleading signs should be a part of it.
That's what you're supposed to do. In some cases, the traffic lights fail to a mode where one direction is blinking red, which is treated like a stop sign, and the other is blinking yellow, which is treated sort of like a yield sign (i.e., proceed, but with caution).
I believe the law (and common sense) states that if the traffic light is out, you treat it like a 4-way stop sign intersection.
If I see rocks I can, e.g., drive around them, judge whether they're really too big for my car, drive on the sidewalk, etc. All these options are really complicated to program.
Current cars don't stop, at least not in Europe - they might brake/slow down, but it's going to be mixed with steering to avoid whoever is there causing all that fuss. And if I think that whoever is in front of me is a threat, I might just run over him/her, at least partially (or "it" in the case of an animal - a different kind of complexity - e.g. do you stop for 1 frog? 10 frogs? 100 frogs?).
When has DRM ever worked as more than a temporary measure? Also if car DRM is like games and software DRM, it will punish the honest user while not even remotely harming the thief!
ESR rhetoric aside, the iPhone's Activation Lock works pretty well, and phones are hardly as regulated as cars. If you jailbreak your car's onboard computer, it's no longer road legal.
> why should s/he wait to cross the street if s/he knows that the approaching car has an anti-collision/auto-brake system?
In the least dystopian-surveillance-state future scenarios, there will be a giant array of cameras and sensors pointed at anyone who crosses this way. How hard would it be to identify this person and ramp up enforcement as necessary? You'll never completely stop it, but I imagine it could be lowered to acceptable levels.
>For example, in the case of an at least partially selfish person, why should s/he wait to cross the street if s/he knows that the approaching car has an anti-collision/auto-brake system?
If they get advanced enough, they may pull a Terminator 2: slow down enough to hurt you, but not kill. This fulfills their directive not to kill while disincentivizing jaywalking.
In Pune, a metro city of India, we used to cross 5 lanes of a highway each side by crossing one lane at a time. You look at lane traffic, get a spot, cross that lane, then repeat 9 times.
> E.g. somebody who wants to rob you could just stand in front of your car (or place some small obstacle), knowing that the car will stop (to avoid damaging itself and/or whatever is in front of it).
They'd also have to disable any manual override controls.
EDIT:
None of the links that I posted work directly (they're accessible only through https://spectrum.ieee.org/cars-that-think/transportation/sel... in the area at the top which states "blah blah... said today in an 80-page report") - don't know why the direct links aren't working...
It would change their mainline design (the new "country-specific" look most probably wouldn't look as good as the original one). On top of that, a different design only for the US market (especially involving the front, which is extremely important to the overall look) would probably have to be marketed under a different name (e.g. not "A4" but "A4us"), with all kinds of repercussions on direct and indirect costs (storage space for replacement parts, marketing, research on future evolution, ...).
Additionally, accepting a request from a specific country might trigger a chain reaction from other countries, multiplying what's stated above by the number of countries (and I personally don't think it would be a linear curve, as the costs of coordinating between the multiple lines would rise at the same time).
From the article, "Because the manufacturers haven’t yet proven there is significant enough life-safety benefit to a laser-based matrix system, regulatory reworking is a tough sell."
Because of the regulations, the burden is on the manufacturer to show the matrix light technology is actually safer than the traditional one.
Is there any evidence whatsoever that this increases safety? In my experience driving at night, traffic with LED lights is so obnoxious it's almost worse than high beams with traditional headlights.
Definitely not the case for signaling to bicyclists and motorcyclists. Intent needs to be communicated as clearly as possible. It bothers me when drivers in the city do not use their turn signals.
> It bothers me when drivers in the city do not use their turn signals.
I will never understand this. In the case of people who sometimes use signals but not always, it must take more mental energy to evaluate the situation and decide than reflexively signalling every time you turn or change lanes. In the case of people who never signal, it must be much harder to navigate a city when nobody knows what you're trying to do.
>The Model State Policy confirms that States retain their traditional responsibilities for vehicle licensing and registration, traffic laws and enforcement, and motor vehicle insurance and liability regimes while outlining the Federal role for HAVs.
Surely, liability for Level 4 autonomy vehicles would be borne by the car's manufacturer, rather than the operator. Is that maybe one of the more controversial aspects separating the definitions of Level 4 and 5 autonomy? I'm surprised this would remain unregulated at the federal level.
Liability generally goes with the owner of the vehicle, not the manufacturer or the operator (with some exceptions perhaps for proven manufacturing defects).
What about the client of a self-driving cab? Are they operating the cab when they tell it to go home?
What about a rental vehicle?
What about a vehicle you borrowed from someone?
I really don't see why the manufacturers wouldn't be the ones liable, and I totally agree that none of the people in the vehicle should be liable. Even better: they should be able to sue the manufacturer too, since they were made part of the accident.
The hand-wringing over this is unwarranted. In practice, if the owner is liable, insurance will be required, and accidents will result in massive PR consequences for the manufacturer.
Manufacturers can make unsafe cars right now. Regulators and consumers seem pretty good about stopping most of that from happening.
The manufacturer could have prevented it by supplying a vehicle capable of avoiding collisions.
I certainly understand the owner/operator having some form of liability insurance based on usage, but an at-fault collision shouldn't be the responsibility of the owner or operator, nor should it affect their rates. That fault lies entirely with the manufacturer, or maybe eventually the subsystem supplier.
Thinking about it, I'd not be terribly surprised if it plays out with manufacturers successfully lobbying to avoid any liability, and insurance gets pooled among owners of specific Level 4 autonomy subsystems. Maybe the best we can hope for would be an equitable split of liability between the manufacturer (who controls the behavior) and owner (who controls the level of usage).
Is no one going to mention how stupid the concept photo looks? If you don't have a steering wheel, why wouldn't you have a console for the driver or per person? If I have nothing else better to do, I'd want the controls the driver will interact with to be more accessible to the driver.
Seems OK to me. Why increase costs and unreliability by adding extra screens and controls? As for having it "for the driver", that's old-car thinking. In this car, both front seats are equivalent, there's no need to select one as "the driver".
There's no driver position, though. It could be the left or the right side. Perhaps for drive-thru reasons people would prefer the left, but there's not a lot of required left-handedness when you're not driving.
'primary passenger' then. And my initial thought was that there should be two consoles, then I reverted back to a single console for the driver without making the final leap to, 'oh wait, there is no driver'.
Honestly what I'd like is a safety oriented cabin designed to minimize whiplash. Restrain my head and neck, surround my body with impact absorbing material and put a screen in front of my face and a keyboard within reach of my hands and we're done.
I don't want a large void space inside the passenger area of my car at all.
Parking valets in ultra-cramped garages come particularly to mind for me, after having watched a group of attendants perform an elaborate repositioning of a dozen cars in one of those mechanical-car-lift parking areas with barely enough free space to fit a single car at a time.
This is just stupid. Why would you ever want to completely remove the driving experience? It should be so that I can drive when I want to, and let go when I don't want to.
Oh yeah, this is a great idea. Let's all just not own cars. I can't wait to wait 10 minutes for an automated taxi to arrive, with the interior left gross and nasty by the previous riders, who were doing God knows what inside. Also, if you ever need to leave in a hurry (you get a call that a loved one is in the hospital, etc.) and need to leave right the fuck now, too bad. And when your shitty taxi arrives, it won't go any faster; it is programmed to never break the speed limit.
Sounds like fucking HELL. NO WAY, NEVER, EVER, EVER. I will fight to the death to keep manually driven cars.
Tiller steering is dangerous, that's why. I know of someone who did that to a lawnmower and had it ripped out of his hand only to swing around and split his hand open.
Any non-power-assisted system can hit you back hard in the right conditions. If something is regularly a pain in the butt to operate, then it's going to go away, just like manual steering, unsynchronized transmissions, non-power-assisted brakes, and hand-crank starters. No need for a law.
Is this likely to happen in our lifetime? As far as I know it's reasonable from a technology perspective, but I'm imagining this being a profoundly divisive political issue where politicians stall meaningful legislation for decades in the name of liberty (at least in the US). I'm basically imagining a watered down version of the debate over firearm ownership without the obstacle of a constitutional amendment. Maybe I'm projecting present circumstances onto the future a bit too much.
I can imagine it possibly happening on a small scale limited to urban areas in the near future, possibly as an attempt to manage congestion, and I can see trucking being automated (at least on highways with some human intervention) and taxis, but I don't see the public accepting a total transition until the technology has proven trustworthy and seems so obviously more safe than the alternative.
That could happen, but if it did, it would probably require a generational shift in public perspective.
> I don't see the public accepting a total transition until the technology has proven trustworthy and seems so obviously more safe than the alternative.
This would definitely be the minimum requirement, but it would be consistent with history (cigarettes, firearms, climate change, etc.) for people to deny any safety advantages or simply not be persuaded by them long after research suggests safety benefits.
I don’t think that quote works so well given the insane number of car deaths we have. Cars are also something we created in the last 150 years, not some universal thing.
If most of the cars on the road are auto-drive, that might be exactly what happens. Imagine trying to negotiate crossing a busy highway when all the other cars know what they're each doing, and you're invisible to them. It'll be 'playing chicken' all day long.
If that's the case, then self-driving cars will never take off. What, are they only aware of other self-driving cars? How about children? Or a tree that fell in the road? They would have to be aware of any object.
A quick Google search for average car age hints that it's nearly 12 years [0]. Even if everyone buying a new car bought an EV, we're probably 12 years away from 50% of the cars on the road being EVs. And since the age trend is actually increasing, we're probably looking at 13-15 years in actuality. And again, that assumes 100% of new vehicles purchased are EV, which definitely isn't the case.
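The back-of-the-envelope arithmetic here can be made explicit. A minimal sketch, assuming a uniform fleet-replacement rate and the rule of thumb that average fleet age is roughly half the total service lifetime (both my assumptions, not data from the cited source):

```python
def ev_fleet_share(years_elapsed: float,
                   avg_fleet_age: float = 12.0,
                   ev_share_of_new_sales: float = 1.0) -> float:
    """Fraction of the on-road fleet that is EV after `years_elapsed`,
    assuming vehicles are replaced at a uniform rate and that average
    age (~12 years) is about half the total service lifetime."""
    lifetime = 2.0 * avg_fleet_age
    replaced_fraction = min(years_elapsed / lifetime, 1.0)
    return replaced_fraction * ev_share_of_new_sales

# Even with 100% of new sales being EVs, half the fleet takes ~12 years:
# ev_fleet_share(12.0) -> 0.5
```

With a more realistic EV share of new sales well below 100%, the 50% fleet mark recedes correspondingly further out.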
This doesn't even begin to look at self-driving. We're still years away from reliable level 4 automation.
> Self driving cars will have their AI-winter time soon
I don't see this happening any time soon. There might be a contraction in highly speculative investments (i.e., VC cash) aimed at autonomous driving systems. However, even if the tech never gets beyond level 2, this will still be a high-growth and high-wage field for the next couple of decades as automobile manufacturers and their tier-one suppliers incorporate existing ADAS across the lineup.
And that's to say nothing of the many limited domains where level 4 is definitely doable, including closed-environment mining and manufacturing sites. As well as adjacent industries (e.g., maritime) where even levels 1 and 2 could help a lot.
Self-driving is a scorching hot fireball right now. It might cool down as people realize that level 4-5 is not happening any time soon, but a prolonged winter is hard to believe.
>And that's to say nothing of the many limited domains where level 4 is definitely doable, including closed-environment mining and manufacturing sites. As well as adjacent industries (e.g., maritime) where even levels 1 and 2 could help a lot.
Yeah, I have no doubt about special purpose level 4 in limited environments. My comment was about seeing general purpose 4 and 5.
The definition of level 5 being "under all roadway and environmental conditions that can be managed by a human driver" means that whatever mechanism being used would have a better "brain" (at least in the area of "driving") than the human one, correct?
If yes: kind of scary, as such capability would probably have to include the comprehension/understanding of e.g. "context", the ability to "abstract", the ability to generalize from indirectly related information, etc., to understand potential upcoming dangers.
For example, driving behind a truck whose trailer wobbles continuously left and right (but which can still keep its lane) because of a partially flat tire would probably require a very advanced AI to recognize the potential danger.
Yes... the problem of autonomous cars is likely that of strong, general purpose AI.
Unfortunately, the politics around autonomous cars insists that human beings are basically idiots behind the wheel, so the problem seems more trivial than it is. After all, how hard can it be to design an AI smarter than an idiot?
>For example, driving behind a truck whose trailer wobbles continuously left and right (but which can still keep its lane) because of a partially flat tire would probably require a very advanced AI to recognize the potential danger.
I think it would be surprising if self driving cars didn't launch in the next 10 years. Consider that most people never drive more than 10 miles from their home on a given day. It's pretty trivial to imagine even "limited" AI vehicles being able to serve vast swaths of the population.
In that case, a lot of people would likely want to avoid the hefty price of owning, maintaining, and insuring a vehicle, let alone driving, which everyone knows is more dangerous than most things a person does in a given day.
> Consider that most people never drive more than 10 miles from their home on a given day.
This is simply untrue if you're considering average commute distance of drivers in major cities.
> hefty price of owning, maintaining, and insuring a vehicle
These costs are highly variable, and many reasonable options are quite cheap.
I don't see self-driving cars catching on any time soon in the US. Possibly in Europe. I think they will be limited to smaller vehicles that operate in glorified bike lanes, only on known routes.
> This is simply untrue if you're considering average commute distance of drivers in major cities.
I'm sure that'd be correct if you'd said suburbs and rural areas, but according to this[0], the average commute for almost every major city is under 10 miles.
>In that case, likely a lot of people would want to not have to invest in the hefty price of owning, maintaining, and insuring a vehicle, let alone driving which everyone knows is more dangerous than most things a person does in a given day.
All of the costs are worth it, as long as I don't have to share transportation with anyone else.
Even if self-driving cars become the norm in developed countries, there is no way they'll be so in the rest of the world.
In many places, there are no lines drawn on the road and you have to imagine the lanes. It could be a space big enough for 3 or 4 lanes. It makes me wonder if cars would recognize the general direction of the road and not drive diagonally through the imaginary lanes when there's a turn.
Some lanes may also have horrible holes that seem like they could easily take your wheel off if you fall on them with speed. Sometimes they're sinkholes, other times they're part of a construction job that was left midway for months seemingly until someone has a horrible accident. Some lanes look like they're at risk of becoming large sinkholes, and you'd rather avoid them lest the whole car suddenly falls meters below the ground. For both of these, you know they're there, and you know they're basically unavoidable once they become visible. How would you communicate these risks to the car?
Jaywalkers on high-speed, high-traffic highways might be common due to lack of bridges or any other alternative to crossing the road. They coordinate their movements with the incoming traffic and the drivers also coordinate their movements with them. Behaving unexpectedly, like simply changing lanes, even at a distance, could be fatal because it changes the shape of the incoming traffic the jaywalkers depend on once they decided to start crossing. Emergency stopping might worsen the situation when tailgaters are common.
Pedestrians might also be suicidal. You might be able to discern their intent from a distance by watching their behavior, but the car is not going to interpret that, and it'll get close enough for the pedestrian to throw themselves into the road when it can't avoid them.
Advancing when a traffic light turns green might generally be unsafe in zones where it's common for cars to cross at high speed when the light is about to turn or has just turned red. Would cars be on the lookout for high-speed traffic coming from the left or right at a distance?
Simply put, there's lots to be on the look out for, especially when pedestrian city infrastructure, vehicle city infrastructure, traffic law enforcement, driving education, etc. is lacking. If self-driving cars become the norm in developed countries, I think it'd be in great part because they're not lacking in any of these things, and the car software can deal with a somewhat consistent environment. That wouldn't be the case elsewhere, though.
As to what my opinion is on the issue at hand, as a software developer, I wouldn't put so much trust in software as to not have a way to take manual control when this software has the ability to bring physical harm or death to me and others. Even disregarding the possibility of malice through malware and assuming whatever code that executes was written with the best intentions, there's just too many ways for things to go wrong in this problem domain and the consequences can be deadly.
Good design lies in simplicity. I cannot imagine the behemoth of code that must be required to implement safe automated driving. That's a lot to put faith in when lives are on the line.
Yeah, if there is no way to mechanically control the car, then that's kind of a problem. In that case, I'd have to be given some pretty ironclad indemnification before I'd get in one.
If the car runs down a pedestrian in a crosswalk while said pedestrian has the right of way, on whom, exactly, does the civil liability fall? The manufacturer, or me, the occupant? Even more importantly, on whom does the criminal liability fall?
If the answers to those questions are not spelled out, explicitly, in the regulations congress puts out, then I agree, it would be hard for me to get into one of those things.
Especially if they said you were at fault because you told the car, "Take me home", or something. I can hear it now, "Your honor, the occupant was obviously exercising control over the vehicle!"
As you probably already know, a drone pilot used the "return home" feature of his drone, and the pilot was held liable for the damages it caused when it collided with a helicopter in an area with a temporary flight restriction.
A friend of mine used the "return home" feature of his drone and instead of returning home, it drifted off over the fence of the nearby BAE Systems establishment. He couldn't do anything except go home.
Later in the pub he told someone who worked at BAE, and was fairly high up, and the guy said, "yes, I heard about that, you need to contact them."
He did, and he got the full "men in black" treatment with several interviews/interrogations to determine exactly who he was and what his intentions were, and threatened with jail time. Eventually they believed his story and he even got his drone back, although the camera was obviously missing.
Apparently it had landed next to something "extremely sensitive", which may or may not have been the cause of it drifting off in the first place.
It's not equivalent, though. A car without a steering wheel will be legally required to be L4 - to allow the user to take their mind off the route, or even sleep. A drone makes no such promise - the feature is known to be "dumb", and you therefore must judge whether it's safe to use it.
If the manufacturer claims your vehicle is L4 (and even goes to the extent of not adding a steering wheel), then for liability purposes it ought to be L4.
It may fail in practice, but the claim is all that should matter in court.
Yet, I'm sure somebody somewhere is going to be thrown to the lions to protect some rich company. So I'm certainly not going to be one of the first to buy one.
> Even more importantly, on whom does the criminal liability fall?
Observing how these things are usually handled, I would say the liability would fall on no one. Unless the driver is intoxicated, car crashes, even those resulting in deaths, are treated as unavoidable accidents, with charges rarely being filed.
When a cab maims a pedestrian in NYC the typical response is the NYPD releasing a list of pedestrian safety tips.
You wouldn't be alone, I doubt automakers would sell a single solitary L4-capable vehicle without locking it down legally first. My guess is that they'll have to assume liability but with exceptions for if the user modifies the device in any way or otherwise breaks the user agreement. If accidents happen they'd have to have some procedure where they treat it like a plane crash and pore over every detail and issue recommendations/updates to some regulatory body.
The downside to this is that these vehicles would likely make even the most cautious, myopic octogenarian look like Speed Racer.
If it's L4 and no steering wheel so the occupant can't override control, it's 100% the manufacturer's fault.
However, if the manufacturers are smart, they'll make the self-driving hardware record video footage they can use to defend themselves in court. You wouldn't want people to be able to jump in front of a moving car to try to get a payday after they get hit. This already happens and is why many people have dash cams. See https://www.youtube.com/watch?v=ibjWz9ehrJ8 for some good examples.
That's not how an electronically assisted steering rack works. There is a manual connection with a little bit of slop, and that slop produces a difference between the steering wheel position and the input shaft position (at the steering rack or gear), which triggers the power assist. It's basically the same as an old-school power steering gear.
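The control logic implied by that mechanism can be sketched in a few lines. This is an illustrative toy model, not any manufacturer's real controller; the gain and torque limit are made-up numbers:

```python
def assist_torque(wheel_angle_deg: float,
                  input_shaft_angle_deg: float,
                  gain_nm_per_deg: float = 8.0,
                  max_assist_nm: float = 40.0) -> float:
    """Electric power steering assist, sketched: the small mechanical
    'slop' (torsion-bar twist) between the steering wheel and the rack's
    input shaft is the measured signal. Assist torque is proportional to
    that twist, clamped to a maximum. With zero twist (no steering
    effort), assist is zero -- the manual connection still steers the
    car unassisted, which is why losing the electronics doesn't mean
    losing steering."""
    twist = wheel_angle_deg - input_shaft_angle_deg
    assist = gain_nm_per_deg * twist
    return max(-max_assist_nm, min(max_assist_nm, assist))
```

So the "slop" isn't a defect: it is exactly what the controller measures, just as the torsion bar in an old hydraulic gear opened the assist valve.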
I can envision this being OK for urban use, such as taxis, shuttles, and buses, but not so great for personal use for everyone. For example, I want to be able to drive my car across the lawn to load/unload stuff.
What bothers me most about this tech is we have a pretty long history of auto manufacturers knowing about problems involving safety issues that they evaluate with a cost/benefit methodology as opposed to a safety first and always approach.
The Ford Pinto [1] is a good example of that.
I still trust me more than an autonomous machine but I do look forward to a day when I don't.
You may find me in one, at some point in the far future, but I won't buy one. If they're really that good I'll use a service to have one come pick me up to take me places. But I've not heard much about self-driving cars performing well in all types of conditions to think that this will be safe enough (or at least reliable enough) any time soon.
There are plenty of places where I wouldn’t mind using one now. Anywhere where the speed is under 30 miles an hour, for example.
Have you ever been at the Las Vegas airport at the start of CES? I’ve waited an hour in line for a taxi. Of course they should have built the metro to the airport.
If you commute by train, this type of car would be perfect to get you to and from the station, especially if it could drive itself back home during the day.
I'm sure some did. I base my decisions on risk ranking. Let's look at an elevator.
An elevator can go up or down and has inertia brakes among other safety mechanisms. Most of these mechanisms are regulated and inspected and can not be disabled by software. I have watched videos of elevators behaving badly, but it takes a comedy of errors and many inept people for this to be a thing.
An autonomous car, on the other hand, can do anything a car can do and anything that buggy or compromised software tells it to do. I will not fall victim to the first angst-filled 12-year-old who is abused by his/her parents daily, or to some burnt-out developer, or to a serial-number typo in an assassination attempt. The serial number of my car could be one digit off from some politician's or spy's.
By all means, have fun in these things. Nobody will convince me to use one of these cars. Not now, not five hundred years from now.
Sorry, that was in no way meant to be a personal swipe. Was just saying that the comment sounds similar to someone saying that. I should have said 'the comment sounds like what someone would have said about horseless carriages'
I don't think this is as much of a troglodyte statement as much as it is a lack of personal autonomy and freedom.
The user isn't against the horse-less carriage out of fear, he's against the horse-less carriage because the option to control the horse has also been removed.
It is possible that a self-driving car could still give you personal autonomy. Although you would lose direct mechanical control over the car, it is still technically possible to implement a self-driving car that always goes exactly where the occupant tells it to, without ever phoning home. Of course, none of the companies making self-driving cars actually want that, and most of the people buying them either don't care or don't realise, so it's probably not going to happen.
"jstaney you are under arrest - remain calm your vehicle will deliver you to the nearest processing facility. Do not attempt to control the vehicle. Do not attempt to exit the vehicle. Any action deemed as attempting to exit the vehicle or prevent it from arriving at the detention facility will result in your immediate admission of guilt and the appropriate sentencing will commence. Have a nice day, thank you for using GovCloud Services."
You're talking right past the comment you're responding to, which is highlighting the distinction between the inherent aspects of the technology, and the non-inherent authoritarian aspects which will be shoved down our throat by business interests and government perverts (different specializations of the same group, really).
Similarly, we could easily have mobile phones that do not track us. It's just in none of the manufacturers' interest to design them, as the vanishing epsilon of people who care enough to pay extra do not offset the economies of scale from going with the current surveillance-based designs.
Having said that, one of the themes in discussion of mass surveillance is the quantitative difference of it being easier than a police department having to expend the effort of physically following someone around. Government (de jure or de facto) controlled cars will make the act of arresting someone similarly effortless.
I mean, most advancements in technology come with restrictions on personal autonomy and freedom. Today's urban factory workers have significantly less autonomy than independent craftsmen, who had less autonomy than subsistence-farming homesteaders on the prairie, who had less autonomy than the Native American tribes they replaced.
Historically, it seems not to matter. The advancements in technology let societies that adopt them support a much higher population level. Those societies then go and kill off or enslave everyone who didn't adopt the new technologies, once a few generations have passed, their military supremacy is obvious, and there's enough cultural distance to view the adherents of the old technology as savage, subhuman primitives.
This. It's important to note that human society's major driver isn't freedom or self-actualization or any other variant of Maslow's hierarchy. The thing that drives every one of these decisions (from "let's stay in one place and grow crops" to "let's have a person spend their entire life attaching a single bolt to every single member of an infinite line of widgets" to "let's pay a human to put things in boxes and dock his pay if he takes a break") is maximizing viable reproduction and minimizing the energy to do so.
Chrysler's trying it all over again with their "turn the dial" transmission control. One of the basic rules of interface design is that the larger the effect of a control on the system, the bigger and more attention-grabbing you make it (like an industrial machine's big red button that requires a lot of force to activate and makes an impressive sound when you hit it, with a green one that behaves similarly). The push-button shifter and the knob shifter both violate that.