There's going to be a tipping point in the near future. When autonomous vehicles reach a big enough share of miles driven, there's going to be a snowball effect. All of our roads and infrastructure are built for human drivers. Robot-only roads could be far cheaper and more efficient. No more signs or lights. No more lanes. Not even partitions. You can dynamically adjust your 8-lane freeway to be 6 lanes one way and 2 the other and flip it morning and evening.
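To make the lane-flipping idea concrete, here's a minimal sketch of demand-based lane allocation for a reversible freeway. Everything in it (the function name, the demand figures, the one-lane minimum) is invented for illustration, not any real traffic-management scheme:

```python
TOTAL_LANES = 8

def allocate_lanes(inbound_demand: float, outbound_demand: float) -> tuple[int, int]:
    """Split a reversible freeway between inbound and outbound lanes.

    Demand values are relative (e.g. vehicles per hour). The split is
    proportional, but each direction always keeps at least one lane.
    Returns (inbound_lanes, outbound_lanes).
    """
    total = inbound_demand + outbound_demand
    if total == 0:
        return TOTAL_LANES // 2, TOTAL_LANES // 2
    inbound = round(TOTAL_LANES * inbound_demand / total)
    inbound = max(1, min(TOTAL_LANES - 1, inbound))
    return inbound, TOTAL_LANES - inbound

# Morning rush, heavy inbound traffic: 6 lanes in, 2 out.
print(allocate_lanes(inbound_demand=3000, outbound_demand=1000))   # (6, 2)
# Evening rush, flipped: 2 in, 6 out.
print(allocate_lanes(inbound_demand=1000, outbound_demand=3000))   # (2, 6)
```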
Once the tech is proven and accepted by enough people, it will become mandatory and we're going to rebuild everything with a much smaller footprint.
> There's going to be a tipping point in the near future.
I'm not holding my breath. I grew up hearing people talk about how we would have Mars colonies by now. Imagine my surprise a couple of decades later when the shuttle program ended and the U.S. lost the ability to send people into space for a time.
The way I see it, the clock is ticking. If self-driving cars don't reach a certain level of capability soon, I suspect they simply won't; the technology will have plateaued at a point where it will not be viable in the foreseeable future. It'll be like supersonic flight: full of promise in the 1960s and 1970s, but mothballed for decades after that waiting for some engineering breakthrough to make it commercially viable.
Another problem is consumer confidence. Self-driving accidents have a much higher impact in the news and in people's minds than human-error accidents.
The perception problem is the same as with air travel. If self-driving cars can't be essentially accident-free, passengers will be filled with the anxiety of lacking control. Even if a self-driving car is 99% safer than a human-operated one, it's a hard sell when the buyer fixates on the remaining risk, because most consumers don't weigh it against an honest estimate of their own driving skill.
I have the same problem with taxis. They might make more economic sense for me, but their drivers are terrible and I fear being a passenger with them, so I prefer to drive my own car for my own safety.
Airplanes have essentially the same problem with respect to accident publicity and we've managed to get a functional industry out of them, even if some people are afraid of flying and so never do. So while this is clearly a problem that will delay the introduction of driverless cars it's one that can be overcome.
I sort of suspect Waymo would have been selling driverless cars a year ago if it weren't for this, actually.
Novel things have a much higher impact in people's minds than routine things. Eventually self driving car crashes will be routine. 1 death is a tragedy, a million is a statistic kind of thing.
Government investigators will not have that bias. I expect that in ~10 years it will be illegal to make a new car that is not self-driving, because self-driving cars will be statistically safer. Some years (10?) after that, human-driven cars will be limited to parades.
Note, I include in "self-driving" cases where there is a human seemingly in control, but the computer will override bad input and even take over all driving as needed.
Edit: originally this implied that it would be illegal to possess a manually driven car. Governments will not remove all the existing cars from the road that quickly.
Even simple things are hardly self-driving at the moment.
Take trains, for example: no need to steer, priority over everything else, and braking distances so long that stopping for obstacles is barely an option.
Yet only a limited number of trains (mostly subways) are self-driving.
We have hardly any self-"driving" robots in day-to-day life either, and those would be a lot slower and lighter.
Given that we have no industry standards for those kinds of things, it would be surprising if even one is considered safe enough in 10 years. Let alone that they would be mandated.
What makes them work at the moment is that they formally have to have a human in control. Same thing as an autopilot in a plane.
But a few more deadly mistakes and the whole concept may get scrapped.
Ha, the DC Metro makes your point. It started out being nearly all self-driving. All the operator did was close the doors. Now they have disabled one automatic system after another. They are all manually driven now, riders endure a jerky ride and wait several seconds at platforms so the operators can manually open the doors.
Except that a few accidents and news coverage will bring out the suburban and anti-big-corp folks with banners. That'll put enough pressure on the regulators to impose draconian safety rules on the self-driving cars, slowing it down substantially.
Regulations are driven a lot by feelings and emotions.
I agree with the last paragraph in practice, but I think what will have plateaued is investor interest, not necessarily the technological progress. Right now there is huge investment, and if it doesn't deliver in the expected timeframes it could cause another mini AI winter as enthusiasm crashes hard. I bet the biggest threat is patience, because self-driving has been sold as just around the corner for so long.
In practice, though, if that happens there will be fewer people getting paid fat salaries to work on it, and then progress will probably slow, so the effect will be the same.
I'm not sure how much investor interest will actually wane. GM cannot afford to ignore it completely; if Ford invests and gets there, GM faces the real danger that Ford will have all the patents and a much safer car.
What the safety reality of self-driving will be is of course an open question. But the risk that someone else gets there first and proves they are safer is too high for the big auto companies to ignore. Regulators already require every manufacturer to adopt a safety device once it proves itself.
Note that the car companies are interconnected. The ones that are not investing in this technology have cross agreements with those that are.
This effect works in reverse though: they could all make agreements to share technology that the others develop, or form alliances such that there are only one or two outfits actually doing research. Then, since that research isn't making progress and isn't going to give them a competitive edge, they collectively deprioritize it to a shadow of its former self, safe in the knowledge that they have research agreements.
I would watch for signs that one of the big companies deprioritizes internal research. I think it will be a lagging indicator that the self driving startup scene has cooled and they think that outside startup challengers will be underfunded and easy to buy or compete with. Right now it is accelerating in the opposite direction, but I don’t see why it has to if we don’t see progress in the next couple of years.
The agreements already exist. Note the plural: they are watched closely by governments because they come so close to illegal monopoly behavior. One of the reasons it is not a monopoly is that each agreement excludes several big players, who form their own alliances.
Of course they could have secret agreements that governments don't know about, but cartels like this are a dangerous game that tends to backfire on the members. Everyone who agrees has to play along, but if any one member decides to work in secret, it gets to scoop the competition and win. Since the agreement is already illegal under most national laws, there is nothing the others can do about the cheater.
I'm glad to see I'm not the only one who is very conservative about real technological progress. I'd love to see the Star Trek future, but I just don't believe in it anymore.
Never forget that you're on the Internet, using probably a device that 20 years ago would have been sci-fi, complaining that the future may not come fast enough.
20 years ago I was arguing with strangers on the internet over a dial-up modem. It's not so very different. Moore's law has been quite magical, but the real changes have been qualitative, like smartphones.
It was sci-fi 50 years ago. By the time of the Newton (25 years ago), the iPhone was a predictable evolution of existing technology. So was the Star Trek voice command interface. The question is: are self-driving cars iPhones (where the technological curve played out as expected) or voice recognition (which never really got as good as people expected)?
For me, Star Trek future is mix of politics, technology and physics.
In terms of politics, I see just deterioration in my region. We get technology we want (smartphones), but not what we need (carbon sequestration, better food sources). Our current tech is ruining the planet.
And as a developer, I see a lot that is new and different, but little that is better. Though that may also be colored by my existing worldview.
The second essay in David Graeber's book "The Utopia of Rules," which I think is called something like "On Flying Cars and the Declining Rate of Profit," has an interesting discussion on why the future you described (and, let's face it, were more or less "promised") never came to be. I'd say it supports the conclusion you've already intuited.
Yeah, but at the same time, if you look at most sci-fi movies from 20 years ago, they way underestimated the actual progress that happened in things like computers.
The opposite is true. Compare an 8086 running DOS (1978) to a Pentium II running Windows 98 (1998) to a Core i7 running Windows 10 (2018). Progress has slowed down immensely. Look at other things like voice recognition, video conferencing, and VR. Dragon NaturallySpeaking came out in 1997; I don't think anyone predicted that more than 20 years later, voice recognition on iOS/Android would still be so awful (and not work without an Internet connection). Likewise, I don't think anybody would've anticipated that video conferencing would still suck, and that most people would still prefer voice phone conferences. At a more technical level, I don't think anyone anticipated that a 1970s-era UNIX clone would be the dominant operating system, that we'd still be writing apps in Objective-C, etc. Visual programming, microkernels, object component systems--none of that stuff panned out either.
I'm fairly confident they will reach a good level within the next 5 years - that's pretty much what Nissan and Ford projected a couple of years ago and things have gone along those lines since. It's people saying the cars are here now or they will be here in 6-12 months that are creating ridiculous expectations.
I think they're already good enough. We set the bar extremely high due to safety concerns, but in reality we'd be better off doing a full cutover now. The biggest hurdle is the transition of having some autonomous and millions of manual vehicles on the same roads. The technical hurdles remaining are all sensors and software. Things we already know how to do and just need to improve. Space colonies and fusion reactors are still in the unproven science phase.
PEOPLE have not gotten driving on the highway correct and PEOPLE are running down people in streets. If you define "good enough" as perfection then of course the tech will never get there.
How well do they operate with iced roads, deep snow, and active snowfall? I'm not aware of any such operating conditions for SDVs yet, but maybe I missed it.
"Once the tech is proven" is the big caveat. This all is a fun thought experiment, but we are nowhere close to the future where this tech is reliable enough to fulfill all of these functions in an economically beneficial way. These companies exist because the VCs have money to burn, and self-driving [X] is the hot trend of the day, but for all the dreams to pay off the tech has to prove out, and I see no evidence in this article that these companies are any closer to solving the real problems of confronting real-world real-weather real-roads autonomous vehicles than anyone else. Not to mention the additional gaps in service. These robot cars can't bring my deliveries to my doorstep; they can't improvise when mapping data is outdated or invalid; they can't understand or react when the product they deliver is damaged or wrong.
The only tipping point we're nearing is when people finally realize that 80% of a self-driving car may as well be zero percent, that the last 20% is way more complicated than anyone wants to admit, and that even to the extent they ever "work", flooding our streets with robot cars is not the ticket to utopia.
No doubt the problems are difficult and time-consuming. A self-driving vehicle is the culmination of thousands of technologies coming together. It's complicated.
And while it may take longer than everyone expects, the demand for a self-driven car is not coming from VC money. People, for the most part, would trade driving for the convenience of doing something else.
I think the force of demand will bring about the technology, sooner or later. It’s difficult technology, but most people don’t think it’s impossible.
I am absolutely with you on replacing family cars. The last 100m problem is really hard and isn’t going to get solved for many real world scenarios any time soon.
However there are a huge range of driving situations in which the last 100m can be designed around self driving vehicles. Many trucking and general pickup/delivery end points could be designed this way. Even many taxi style services could use this approach, with designated parking points for self driving buses and cabs. So I can see near future self driving tech accounting for a significant fraction of road traffic over the next decade, given that it will take time to develop the infrastructure to match.
> situations in which the last 100m can be designed around self driving vehicles
Although that's a true statement, I'd argue that it's essentially meaningless for existing cities on a time scale less than multiple decades. Retrofitting is difficult, expensive, and takes a very long time, if wheelchair accessibility is any example.
As a foreigner to the US, when I was visiting Burlington, MA, I found it really odd that there were almost no sidewalks, so it was really difficult to get around on foot; everything was built for traveling by car from place to place. (That's not the case in the cities; it's the pattern in the suburbs.)
Following your thoughts, this pattern will extend and multiply, and some places will probably become even more hostile to pedestrians.
I am worried about that too. Right now the biggest challenge for autonomous cars is to recognize obstacles like pedestrians. I wouldn't be too surprised if we ended up with a situation where it's up to the pedestrian to get out of the way of autonomous vehicles because the vehicles always have right of way.
I hope the opposite. When humans don't drive, they won't mind the car taking an extra minute to drive around the other side of the building. There'll be more opportunity for pedestrian-only streets.
I have a slightly different take on what will drive adoption of these cars.
Insurance.
If there is a significant reduction of cost from an insurance perspective, the cost to insure a human-driven vehicle will increase substantially to the point that it will become a burden to own a vehicle that isn't driven by software.
That is, unless the government feels the need to intervene here. But business seems to have the ear of government a lot more than the citizens do nowadays.
>> No more signs or lights. No more lanes. Not even partitions. You can dynamically adjust your 8-lane freeway to be 6 lanes one way and 2 the other and flip it morning and evening.
Trains. Such concepts have been around for a long while in rail travel, and implementations never go smoothly. The reality of automating fast-moving boxes full of living people is stark. Even where trains are driverless, with extensive signalling and redundant navigation systems, they don't get closer or go faster than human-operated trains. As systems grow more complex, to the point that failures are a regular occurrence, the dream of hyper-efficient automated travel disappears.
Actually driverless trains DO get closer and faster. This is why some lines have been automated in Paris for instance, after manual driving reached the limit.
Subways in some cities (I'm in Vancouver) do some of this. But they are low-speed and run on dedicated track. Trains hauling freight and people between cities have to share track with other types of trains, on irregular schedules. That's what an automated road would look like: not one coordinated fleet of cars but a random collection of vehicles that must somehow share the same infrastructure. The number of vehicles per mile would also be vastly higher than in any subway system.
Automated train systems are still monitored by humans. When things go wrong humans make the little changes to keep the system going. The computer doesn't decide whether to offload the passengers if a train is stuck, or make them wait for some other humans to fix the train. A team of humans can do this for a subway system with a few hundred cars on a dozen lines, but a few thousand cars on many hundreds of roads is another matter.
That's likely going to happen in some underdeveloped country with a government willing to take risks, not in developed, heavily regulated ones. And globalization might prevent it even there; something about Tower of Babel and all nations cloning the same thing that doesn't work well comes to mind. We probably need diverse cultures more than global ones for the sake of progress and original solutions, i.e. optimization exploring different local optima instead. I think China was experimenting by marking whole regions as following a certain development philosophy, i.e. this part behaves like Germany, that one like Russia, another one as UK etc.
You cannot compare the times back then with what's happening now. A lot of things have changed. We are living in the information age, and the growth of science and knowledge is exponential.
Dystopian flipside; it then becomes impossible to use the roads (and thus, travel other than by foot, struggling through unmaintained footpaths and dirt trails; many places simply impossible to get to on foot) for anything not approved by society. Your insurance company gets a say in where you go to eat. Google and chums get a say in where you go to shop. You pay extra to get to your destination sooner. You ask to go to one bar but get dropped off at another; one that paid to have you delivered there.
I wonder which of these will come true. Some of them, in part, I'm fairly confident of.
(Adapted from a deeply-nested too-late-to-notice reply I made on another thread[1], but which I suspect more people would benefit from seeing)
I am skeptical that AIs capable of piloting fully driverless cars are coming in the next few years. In the longer term, I'm more optimistic. There are definitely some fundamental breakthroughs which are needed (with regards to causal reasoning etc.) before "full autonomy" can happen -- but a lot of money and creativity is being thrown at these problems, and although none of us will know how hard the Hard problem is until after it's been solved, my hunch is that it will yield within this generation.
But I think that framing this as an AI problem is not really correct in the first place.
Currently car accidents kill about 1.3 million people per year. Given current driving standards, a lot of these fatalities are "inevitable". For example: many real-world car-based trolley problems involve driving around a blind curve too fast to react to what's on the other side. You suddenly encounter an array of obstacles: which one do you choose to hit? Or do you (in some cases) minimise global harm by driving yourself off the road? Faced with these kind of choices, people say "oh, that's easy -- you can instruct autonomous cars to not drive around blind curves faster than they can react". But in that case, the autonomous car just goes from being the thing that does the hitting to the thing that gets hit (by a human). Either way, people gonna die -- not due to a specific fault in how individual vehicles are controlled, but due to collective flaws in the entire premise of automotive infrastructure.
So the problem is that no matter how good the AIs get, as long as they have to interact with humans in any way, they're still going to kill a fair number of people. I sympathise quite a lot with Musk's utilitarian point of view: if AIs are merely better humans, then it shouldn't matter that they still kill a lot of people; the fact that they kill meaningfully fewer people ought to be good enough to prefer them. If this is the basis for fostering a "climate of acceptance" then I don't think it would be a bad thing at all.
But I don't expect social or legal systems to adopt a pragmatic utilitarian ethos anytime soon!
One barrier is that even apart from the sensational aspect of autonomous-vehicle accidents, it's possible to do so much critiquing of them. When a human driver encounters a real-world trolley problem, they generally freeze up, overcorrect, or do something else that doesn't involve much careful calculation. So shit happens, some poor SOB is liable for it, and there's no black box to audit.
In contrast, when an autonomous vehicle kills someone, there will be a cool, calculated, auditable trail of decision-making which led to that outcome. The impulse to second-guess the AV's reasoning -- by regulators, lawyers, politicians, and competitors -- will be irresistible. To the extent that this fosters actual safety improvements, it's certainly a good thing. But it can be really hard to make even honest critiques of these things, because any suggested change needs to be tested against a near-infinite number of scenarios -- and in any case, not all of the critiques will be honest. This will be a huge barrier to adoption.
Another barrier is that people's attitudes towards AVs can change how safe they are. Tesla has real data showing that Autopilot makes driving significantly safer. This data isn't wrong. The problem is that this was from a time when Autopilot was being used by people who were relatively uncomfortable with it. This meant that it was being used correctly -- as a second pair of eyes, augmenting those of the driver. That's fine: it's analogous to an aircraft Autopilot when used like that. But the more comfortable people become with Autopilot -- to the point where they start taking naps or climbing into the back seat -- the less safe it becomes. This is the bane of Level 2 and 3 automation: a feedback loop where increasing AV safety/reliability leads to decreasing human attentiveness, leading (perhaps) to a paradoxical overall decrease in safety and reliability.
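One way to see why this feedback loop can be perverse is a back-of-the-envelope expected-harm model. A Level 2 crash roughly requires the automation to miss a hazard *and* the driver to be inattentive at that moment, so risk scales with the product of the two. The numbers below are invented purely for illustration, not Tesla's data:

```python
def crashes_per_million_miles(automation_miss_rate: float, driver_inattention: float) -> float:
    """Toy model: a Level 2 crash needs the automation to miss a hazard
    AND the driver to be looking away at that moment.
    Both inputs are illustrative, not measured rates."""
    return automation_miss_rate * driver_inattention

# Early Autopilot: misses a lot, but wary drivers watch the road closely.
print(crashes_per_million_miles(automation_miss_rate=10.0, driver_inattention=0.05))  # 0.5

# Later: automation is 5x better, but drivers nap or climb into the back seat.
print(crashes_per_million_miles(automation_miss_rate=2.0, driver_inattention=0.6))    # 1.2
```

Under these made-up numbers, a 5x improvement in the automation is more than cancelled out by the drop in attentiveness, which is exactly the paradox described above.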
Even Level 4 and 5 automation isn't immune from this kind of feedback loop. It's just externalised: drivers in Mountain View learned that they could drive more aggressively around the Google AVs, which would always give way to avoid a collision. So safer AI driving has led to more dangerous human driving.
So my contention is that while the AVs may become "good enough" anytime between, say, now and 20 years from now -- the above sort of problems will be persistent barriers to adoption. These problems can be boiled down to a single word: humans. As long as AVs share a (high-speed) domain with humans, there will be a lot of fatalities, and the AVs will take the blame for this (since humans aren't black-boxed).
Nonetheless, I think we will see AVs become very prominent. Here's how:
1. Initially, small networks of low-speed (~12mph) Level-4 AVs operating in mixed environments, generally restricted to campus environments, pedestrianised town centres, etc. At that speed, it's possible to operate safely around humans even with reasonably stupid AIs. Think Easymile, 2getthere, and others.
2. These networks will become joined-up by fully-segregated higher-speed AV-only right-of-ways, either on existing motorways or in new types of infrastructure (think the Boring Company).
3. As these AVs take a greater mode-share, cities will incrementally convert roads into either mixed low-speed or exclusive high-speed. Development patterns will adapt accordingly. It will be a slow process, but after (say) 40-50 years, the cities will be more or less fully autonomous (with most of the streets being low-speed and heavily shared with pedestrians and bicyclists).
Note that this scenario is largely insensitive to AI advances, because the real problem that needs to be solved is at the point of human interface.
People love to talk about trolley problems, but they almost never happen. Every accident that I or anyone I know has gotten into was entirely the result of simple human error, with no difficult ethical decisions involved.
It's a more generalised problem than you probably think, and is a factor in virtually every multi-party accident. Even if the root cause of the accident is someone else's human error, this still leaves surrounding vehicles with a choice about how to respond to it: whether to take damage or inflict damage, and if the latter, then inflict damage upon whom.
For an AV, that's a trolley problem. For a human, it's not. We don't rationalise about such things: it occurs in an eyeblink, and gets covered under "shit happens". We convince ourselves that the outcome was inevitable, just a function of physics. But it isn't. Unlike humans, AVs will have the ability and obligation to make choices in such situations; this is critical for their harm-reduction capabilities. When all of those choices are to some degree bad (which is very, very often), it's a trolley problem. Which is why people who work on AVs are legitimately concerned with them.
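A crude way to picture the kind of choice the planner faces: enumerate the feasible maneuvers and pick the one with the lowest expected harm. The options and cost numbers below are entirely hypothetical, not any vendor's actual decision logic:

```python
# Hypothetical harm-minimisation over a handful of feasible maneuvers.
# Probabilities and harm scores are invented for illustration only.
options = {
    "brake_straight": {"p_collision": 0.9, "harm_if_collision": 6.0},  # hit stopped car ahead
    "swerve_left":    {"p_collision": 0.4, "harm_if_collision": 9.0},  # oncoming lane
    "swerve_right":   {"p_collision": 0.7, "harm_if_collision": 3.0},  # shoulder / barrier
}

def expected_harm(opt):
    return opt["p_collision"] * opt["harm_if_collision"]

best = min(options, key=lambda name: expected_harm(options[name]))
print(best, expected_harm(options[best]))  # swerve_right, about 2.1: least bad, still bad
```

Every choice here inflicts some expected harm on someone, which is what makes it a trolley problem rather than an engineering problem.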
Eh. Or you can just let the auto-driving auto-mobile take the same "shit happens" approach that humans do. Whatever the code happens to spit out, that's the direction it goes, ethics ignored.
Yep. All that stands in the way is every personal injury lawyer, class-action lawyer, consumer interest group, public safety regulator, competitor, and short-seller in the world.
I could be wrong, but my understanding is also that fatalities per event tends to be in the low single digits. There are just a high number of fatal collisions.
That, of course, makes sense if cars generally have 1 or 2 occupants and accidents are usually single or dual vehicle.
It would make more sense for self-driving buses to be touted for safety and the trolley problem instead (though, granted, they may already be far less likely to be in accidents, having professional drivers).
There's a hypothetical world in which organ distribution is so efficient and prevalent that the average fatal car crash saves net lives. What a strange kind of "trolley problem" that raises...
Total vehicle miles will probably increase, but fleet vehicles tend to require less maintenance because they're more regularly maintained on average (I'm told.) Maintenance will probably also become more of a fleet/B2B thing than B2C "individual car problem" thing. Partnerships will be formed or maintenance will be brought in-house, and margins will go down.
I think the police will switch from using driving as an excuse, to using online activity as an excuse. "This guy has been googling the same stuff that ISIS does, better bring him in!" And then they send a signal to whatever self-driving car you're in and tell it to lock the doors and drive you straight to the station for questioning.
What will really change is your ability to hop in a car and literally drive anywhere you want, be it some random corner of your yard, backing down a boat ramp, or some unmarked gravel road deep in the countryside. Not to mention the day companies and governments have the ability to electronically restrict where you or the population at large can visit. It will become trivial to lock down entire cities and road systems.
I think people will under-appreciate this incredible freedom in travel which we have now until it's gone.
Interesting read, though I think they're overestimating the cost of the human driver in relation to the total cost of running a route/taxi/shuttle/bus/etc. If you go completely autonomous, maintenance costs don't go away. (And in fact vandalism probably goes up.) Fuel costs, which at some point are going to become a big deal again, don't go away. And permitting fees either stay the same, go down or go up, depending on how local governments handle it.
I guess we can approximate vandalism fairly easily by looking at both shared-car platforms such as car2go (for users who vandalize when left alone in the car) and vandalism against taxi substitutes such as Uber (for rage against displaced driver jobs).
It doesn't seem that car2go-style cars get vandalized a lot; next-user reporting catches most of what goes beyond a baseline level of vandalism (users smoking and some trash in the car are normally not reported).
Rage against Uber by taxi drivers has been more rampant (IMHO); so if we add bus drivers, lorry drivers, taxi and Uber drivers, etc., there might be a lot of damage in the short term. But once these displaced workers find other jobs, that vandalism will disappear, so it will be transient during the adoption period only.
You haven't captured every motive for vandalism, but I agree with you that it's probably mostly transient. Kind of like how the people who go crazy with drugs & prostitution in Amsterdam (well with pot legalization I probably no longer have to travel quite so far to fetch this example, but anyway...) are mostly visitors for whom such a thing is novel. Most Dutch natives are pretty blasé about having access to those things.
I notice the Car2Go (or e.g. Zipcar) next-user reporting system depends on their knowing the identity of each user at every point in time. Which is presumably also true if you've used an app to hail or summon an autonomous vehicle... Still I guess I was nostalgically imagining something like the relative anonymity of today's cash-fare bus services. (Though even those are loaded up with surveillance cameras.)
At the same time, the world moves on marginal changes, and many industries that depend on shipping are inherently low-margin anyway (think grocery stores). Business models that are barely viable now could be substantially viable with a ten percent reduction in transit cost.
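To make the "marginal changes" point concrete, here is the arithmetic under some invented figures for a low-margin delivery business (none of these numbers come from the thread or any real company):

```python
# Invented figures: a grocery-delivery order where transport is a sizable cost slice.
revenue        = 100.00
transport_cost = 20.00   # driver + vehicle share of the order
other_costs    = 78.00   # goods, picking, overhead
margin_before  = revenue - transport_cost - other_costs     # 2.00, i.e. a 2% margin

transport_after = transport_cost * 0.90                     # transit gets 10% cheaper
margin_after    = revenue - transport_after - other_costs   # 4.00, i.e. a 4% margin

print(margin_before, margin_after)  # a 10% transit saving doubles the margin here
```

The point is not the specific numbers but the leverage: when margins are thin and transport is a meaningful cost, a modest transit saving can flip a marginal business model into a viable one.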
Autonomous vehicles aren’t public spaces. They’re privately owned capital. I’m not making a value judgement here, but I am saying that other people do.
I think that a large amount of skepticism is justified... There's a reason that all self-driving car trials are running in states with mostly sunny weather and no snow.
Let me know when a self-driving trial is being run in Vermont or New Hampshire. In November.
> all self-driving car trials are running in states with mostly sunny weather and no snow. Let me know when a self-driving trial is being run in Vermont or New Hampshire.
Yes, they just won't run. Many sunny places don't have plows and such to handle snow. If it happens to snow, the entire place grinds to a halt. Even if the cars can handle the bad weather, the drivers are so unaccustomed to it that they're terrible on the road to the point of being very unsafe.
Would you buy a car which can drive itself in good weather? Many people would, because depending on the weather the car would just drive itself most of the time and you could just chill out.
If the weather is good it drives itself. If the road conditions change then it asks you to take over.
If I could get an automated driver 50% of the time or more, I'd take it without hesitation.
There will probably be remote operators for bad weather, but not in the sense of a remote steering wheel; rather, the operator will tell the car "these are the limits of the road" and the car will be able to use that info instead of needing to see lane markings.
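A rough sketch of what such a remote-assistance message might contain, as a guess at the idea rather than any real vendor's protocol (all field names and values here are hypothetical):

```python
from dataclasses import dataclass

# Hypothetical payload a remote operator might send instead of steering inputs:
# geometric constraints the on-board planner can follow when snow hides the
# lane markings. Purely illustrative, not based on any actual system.
@dataclass
class RoadConstraint:
    segment_id: str
    left_edge: list[tuple[float, float]]    # polyline of (lat, lon) points
    right_edge: list[tuple[float, float]]
    max_speed_mps: float
    valid_for_seconds: int

hint = RoadConstraint(
    segment_id="I-89-nb-mile-42",
    left_edge=[(44.2600, -72.5750), (44.2610, -72.5740)],
    right_edge=[(44.2599, -72.5748), (44.2609, -72.5738)],
    max_speed_mps=15.0,
    valid_for_seconds=600,
)
print(hint.segment_id, hint.max_speed_mps)
```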
I think this is an artifact of where US self-driving car developers are located, like how German cars are great for driving on the autobahn. Yandex has posted some videos of its self-driving car in snow, for example.
If I were building a self-driving car, I’d be pretty eager to provide a counterpoint to the GP to the point that I’d be willing to open an office in Boston.
When this starts scaling, there is definitely going to be a culture clash between these autonomous vehicles which follow the exact letter of the law, and the human drivers who routinely don't.
When you get stuck in the line of cars behind one of these things in the fast lane, and it's not going 3-8 mph over the posted speed limit like everyone else...
I'd bet that they are going to have dedicated lanes.
This is always the first thing I think of when I picture a future with wide adoption of autonomous vehicles.
Not only is there road-rage potential, think about the municipal revenue from traffic enforcement. The public purse will remain thirsty for the funds, but autonomous cars will be very difficult to fine (either because they don't break the law or because every ticket gets disputed with hard data from companies that can afford to defend themselves). So what will happen? People driving older cars will start to become "overfished". As time goes on, it will only be poor people getting tickets (even more so than today) and therefore disproportionately minorities, and the social unrest between those people and an antagonistic police force will worsen.
The way laws are made is an interesting and complicated topic. Most people have an innate understanding of what is "correct", and when the laws don't match up with that, there are problems. Speed limits are kept a little below that number because it lets the police pull over cars they don't like at will, and because of the ticket revenue.
Honestly, I think once autonomous vehicles pass the 50% mark, they will have to increase the speed limits just to keep everyone safe.
Right now the law reflects neither the way people act nor the way the infrastructure was designed (see the 85th-percentile rule vs. politically determined speed limits).
People want to go what is a safe and reasonable speed for the conditions (55mph is safe but not reasonable for an interstate highway in most non-traffic jam conditions), the driver-less car operators want to be able to get people and things from A to B just as quickly and the highway engineers know that having two groups of traffic following different rules on the same infrastructure is a recipe for disaster.
The law is what will change, because changing the law doesn't require millions of dollars the way creating separate infrastructure does.
As a bonus the people who follow the law without considering their surroundings (e.g. people going the speed limit in the left lane) will be less annoying to the majority of the population when the letter of the law becomes aligned to how the majority of the population acts.
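For reference, the percentile rule mentioned above works roughly like this: engineers measure free-flow speeds on a road and recommend a limit near the 85th percentile of what drivers actually do. A toy calculation, with made-up observed speeds and a simple nearest-rank percentile:

```python
# Toy example of the 85th-percentile rule used in speed-limit studies.
# Observed free-flow speeds (mph) are invented for illustration.
observed_mph = [58, 61, 63, 64, 65, 66, 67, 68, 68, 69, 70, 71, 72, 74, 78]

def percentile(data, pct):
    """Nearest-rank percentile: the value at or below which pct% of observations fall."""
    s = sorted(data)
    k = max(0, int(round(pct / 100 * len(s))) - 1)
    return s[k]

print(percentile(observed_mph, 85))  # 72 with this sample; the limit would be posted near 70
```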
The letter of the law in many states says that if you are not passing you must use the rightmost lane. So, these cars should pool up in that lane while human drivers zip by to the left of them. There are humans that go the speed limit today, and this is usually what happens.
In my experience, the cars in the slow lane are often going well below the speed limit. So you could easily have self-driving cars, driving at the speed limit, in the passing lane.
I'm curious if they would be programmed to 'move in' to allow a faster (exceeding speed limit) car to pass.
I can't help but think there are going to be significant unforeseen consequences of this -- consequences that will give rise to business models we haven't even begun to consider yet. Off the top of my head: why have retail stores to which you have to drive 5 to 15 miles? Put a densely packed warehouse (one that looks more like a large home than a warehouse going ten levels below grade) of the most commonly needed groceries right in a cluster of neighborhoods and orders can be made online, picked by robots, and delivered by autonomous vehicle (and restocked from larger, more central warehouses in a similar manner).
And kicking it up a notch: how about the domestic robot signing for the order and beginning dinner preparations with the just-in-time delivered products while the humans in the house... well, play video games....
While I am enjoying the Jetsons-like future that is envisioned in your post, I'm quite curious how we can maybe do simpler solutions. For example, we used to have a milkman who just dropped off milk on our front door. I can easily imagine dairies owning their own regional distribution networks with autonomous vehicles performing milk deliveries for much cheaper. No domestic robot really needed to sign for the order.
Same goes for groceries, I suppose. A co-op of farms operate an automated warehouse together whose responsibility is to consolidate the various fruits and vegetables and distribute via autonomous vehicles to homes. It gives a competitive advantage to regional farms so they can actually still do things at a reasonable economy of scale.
Just my two cents. Also, yes, video games all day. :)
The domestic robot and video games part was tongue-in-cheek but your idea of locally-sourced, fair-trade, free-range, antibiotics-free, non-GMO, brownie-points-loaded options could greatly benefit from this type of logistics setup. Really what this does is make it a lot easier to never have to leave your home for any reason if you don't want to. There are some people with this type of phobia now who can work from home and have everything they need delivered to them, but they are an abnormal exception. As the cost over going out to get your own stuff versus staying at home and having it all come to you drops toward zero I wonder if anti-social reclusion might become an unintended consequence.
I like Nuro's concept car; it looks like a moving shelf. Maybe it could be scaled up to a bigger size, where each customer can pick up their groceries.
Random thought: an autonomous milk car with fresh dairy from your local farm.
A lot of efficiency and responsibility/insurance issues will be solved if the store used its own fleet of delivery cars. Other than that, we can hope this materializes. Domestic robots will be a revolution.
I'm not sure. I'm thinking it will be on my way home my car stops at the grocery store for a few minutes to fill my trunk, thus saving the energy to drive my car and their car to my house.
I've had a smart fridge, I will never own one again, my fridge should not need a reboot. I will also not buy Samsung again. They use a bunch of custom software for no reason.
Living in a popular rural area with a lot of seasonal tourists ... I see the future of self-driving cars as mobile homes and hotels, sort of like a mobile Airbnb. A lot of my friends chase wind, waves, and events around the area, so they pack up a caravan or home-made camper van, drive to the location, and party and surf. There is a crowd of 20 or so who share van space and give lifts. I can only see this getting more popular. They already rent vans out. When these become custom self-driving surf wagons complete with shower, kitchen, etc., people will live on the move, travelling at night while the passengers sleep.
They don't even need the kitchen/shower. When traveling I tend to eat at restaurants, or find a park with a grill (sometimes I bring a camp stove). Truck stops already have showers - they will need new users for them as there are no longer truck drivers.
I've long wanted this, travel for 10 hours at night while I'm asleep, check out a historical museum in middle of nowhere Kansas, drive an hour, have lunch, and check out some county park (again in the middle of nowhere), then have supper, drive a bit and watch the sunset over some historical marker and the process repeats until I'm at Grandma's house. Much better than the current plan of watching cotton fields go by.
Not mentioned here, but likely highly dependent on self-driving technology, is the whole range of flying vehicles (example: https://lilium.com/). The advantages are numerous: 3D traffic, zero road cost, direct-line flight, speeds exceeding highways. The primary real downside is energy cost.
Self-driving RVs and Bay Area rent/real-estate prices: imagine being able to wake up in front of your office, then walk the 10 feet back "home" at the end of the day. Then, after an hour or so of hanging out in your mobile home's living room, walking outside to your spot in the Muir Woods.
Integrate into this mix all the future possibilities created by new materials and AI including better drones, flying cars, advanced forms of collapsible bicycles, and new forms of intercity transport including self carrying luggage and we can expect churn in the ways we transport ourselves and our stuff for a long time.
Interesting idea, but I don't think it works in practice.
I have foods and spices scattered across many cabinets in my kitchen - just like every other family in the world (except for a few hunter-gatherer families). Making them all autonomous vehicles would be expensive.
I also sometimes come in at random times and snack - the fridge with the snack I want better be there, not out to the store to be refilled. This again is a situation that every family in the world faces some variation of.