People are excited about self driving cars for their potential to reduce accidents by removing human error from the equation. It's not just human error that causes accidents though.
Human emotion in general causes dangerous situations. When we drive now, we're restricted to driving and little else. We can listen to the radio, but for the most part that's it. Because of this, we tend to want to get to our destination as quickly as possible. We drive fast to get the restrictive experience of driving over with.
However, if we were free to do other things while in our cars, like get work done or sleep or watch TV, I think we would generally be much more tolerant of travelling at a lower speed. While our self-driving systems may eventually be capable of being very safe at very high speeds, I think the reduced speeds people would tolerate could offer an extra layer of safety.
Have you never had the experience of being on a crawling train and counting the minutes after the meeting you were supposed to be at has already started?
It seems you're confusing the effects of travel being slow with the effects of travel being unreliable. It's not the slowness of a train that's annoying you, it's that the train has slowed unpredictably.
That said, I suppose there's every reason to think traffic impacts of self-driving cars (more cars circling, waiting for passengers) would make traffic slower and more unpredictable.
Edit: And, of course, the people in the non-self-driving cars would get the slowness without any direct benefit to themselves.
Well technically you could telepresence into the meeting, because your car can be outfitted with a nice TV/monitor, audio recording equipment, and high-speed wifi... so you can have meetings while you ride to work. Or you spend your quiet time in the car working, and instead use office time for meetings.
Besides, in the hypothetical self-driving future, transit times will likely be far more predictable than they are today... with AI running and optimizing everything.
We can already telepresence into the meeting. I moderately understand your argument, but I think the parent is talking about meetings you have to be in physically. And if your argument is "Why do you have to be there physically?", I would counter: if you don't have to be there physically, why do you ever need to get into a self-driving car?
I'm curious what Utopia you live in that has network coverage reliable enough in subway tunnels or on trains going through rural communities to make teleconferencing from mass transit a tractable solution. ;)
(No, but more seriously: I'm a bit envious that this is working for your teams. It's completely impractical in the East Coast US cities I've frequented).
In the SF Bay Area, Caltrain is aboveground and is more than quiet enough to call into meetings. Think NYC's Metro-North: there's no reason you wouldn't be able to make a call until you get into the tunnels near Manhattan. On the other hand, BART is underground with signal repeaters just about all the way through - but it's very noisy, so you could call in and listen but probably not contribute.
If your transit system is underground without repeaters though, I agree that you're out of luck. I can't imagine trying to take a call on the T.
Here in Taiwan, we also offer free Wi-Fi access across the High Speed Rail system (~180 mph, ~200 miles of coverage), reliable enough even in tunnels, as of this year.
In suburban/regional areas, certainly, but most large cities are already heavily congested; how many more people could possibly fit in peak hours? As optimised as we can make it, there's still a predictable one-directional crush every day.
The real optimisation of roads would be incentivising companies to stagger their employees' start/finish times outside of peak hours.
A self-driving future will likely still heavily involve public transport in the short term at least.
Hopefully fully autonomous vehicles will get far smaller: travelling pods which are both cheap to buy and maintain, much like a motorbike but without the downsides that turn most people off.
Smaller vehicles, especially trucks, will quickly be needed to reduce road wear and offset the likely cost of greater road usage. Smaller personal transportation would also be a boon to road efficiency.
Ultimately the key point is predictability: being able to plan when your meeting will be. Even if congestion doesn't change or gets worse, it will be easier to plan as trip times become more predictable thanks to machines. That way you won't be late for meetings, as the OP implied.
Self-driving + fleet cars will most certainly reduce the average car size. People buy 4-5 seats for the edge cases when they might need to pick up some friends or more than one child at a time, or merely because it's the most common type of vehicle available, which people buy out of habit or whatever social reason.
But look at the cars on any highway and you'll find more than one or two occupants only a minority of the time.
When we eliminate the need to buy a single car for all general use-cases, and have a high quantity of self-driving Uber-style cars available, then 1-2 seater cars will become the most popular size. Which in itself will reduce climate damage and increase capacities on roads.
Especially as self-driving cars can drive at higher speeds and closer together, potentially linking up like trains for long-distance rides.
You're not going to feel any better if you're late in your car on the way to the meeting.
I take the bus every day to work even though in total it takes exactly twice as long as if I took my motorcycle. Why? Because I like to read on the bus. It's not actually important to me to get there any quicker. In fact, on days I have earlier meetings I just leave my apartment earlier. I think the idea that people wouldn't mind going slower may be real for a lot of people.
That idea isn't real for anyone who has obligations outside work at fixed times, especially parents of small children who have to be reliably picked up and dropped off.
> blame it on something that nobody can lambaste me for blaming
I once had somebody pull this shtick for being late to a meeting. I put it in the same category as using "traffic" as an excuse. Disasters which make the news are acceptable. Routine inconveniences that delay you are okay once in a while, but never if treated as an immunity against criticism. One should have planned better, by leaving earlier or taking another mode of transportation. Then there are the characters who leave the office late and blame it on traffic every time.
Well I assumed the example in GP's case is some kind of notable breakdown or delay. If you're just late because you don't account for random 10 minute delays in the trains and such, that's entirely your fault, as much as being late because an accident slowed down highway traffic today.
But if your train is stuck in the middle of a tunnel for 3 hours because a crazy person ran onto the track and authorities are chasing them, that's a pretty fair excuse.
In my experience, chronic lateness is a function of the person. The person who is late at 9am will be late at 6pm or when they choose the time or when it's a block from them or a city and a half away. (Note: I used to be this person when I was younger. I now leave for any meeting in my city one hour before and then hang out until the start time.)
> People who are idling all day to be available for meetings likely aren't part of a high-value organization
Why would one be idling? The founders and executives who do well have a handle on their schedule. They avoid over-scheduling, keep things short, show up on time and get shit done. At the larger scale, my friends at Google, Apple or Amazon don't run late for anything, be it a personal brunch or M&A negotiation. Properly functioning as an adult, and fixing problems over making excuses, is simply well internalized.
I find meetings tend to start a few minutes late. Also, at one of those companies you mentioned, people did run late or just didn't show up, especially the more senior they were (not out of laziness; they tended to run more important meetings late, or be double-booked).
I hear in Germany if people aren't there when the meeting starts, it gets cancelled. It's definitely part of the culture.
Not Germany but at my company we wait about 5-6 minutes and if you don't show up for the meeting everyone drops off. Exceptionally rude to schedule something and expect people to wait. Doesn't happen again after everyone hangs up the first time.
I have been unfortunate enough to have that experience. Being on a commuter train is like being a sardine packed into a can though. I imagine fully autonomous vehicles will offer more comfort.
I train commute everyday and despite the ability to watch a movie or whatever, I still just want to get to my destination and move on with my life.
I think it'll be interesting to see society push the time-to-arrival vs. general safety balance. I think we'll see a rising speed limit as the risk becomes more and more mathematical.
I am excited to see what effect moving transit into private vehicles will have. I currently enjoy my commute time with a book but there are other things I'd prefer to be doing. What if I could enjoy a bit more evening in exchange for shifting my showering/breakfasting/working out into the morning commute? I'd be pretty happy with that.
Especially if we move sleeping into self-driving cars, that long commute of yours might start before you're awake.
What I think would be even better is to make exercise the commute: many folks I know have the joy of this option, e.g. biking 10 mi or less to work and showering there.
I'm more excited about hybrid bikes (allowing even older folks to be more mobile) than self-driving cars.
Having spent the last two years in various Asian cities, I feel confident in saying most American cities could halve average commute times with no sacrifice to safety. To name a few cities that have much better commutes than the bay or NYC: Tokyo, Hong Kong, Shanghai, Taipei, and Seoul.
Whether things get better in American cities is not a matter of technical innovation at all, but politics and economics.
>I think we'll see a rising speed limit as the risk becomes more and more mathematical.
Risk and speed limits are already mathematical. See the 85th percentile rule. If self-driving cars make obvious the disparity between the letter of the law and actual practice, then the law will get changed.
Pretty much the only stakeholders that want the current mismatch between the 85th percentile speed and the speed limit are the cops who want a pretext to go fishing and the states that want revenue.
Self-driving taxis, delivery vehicles, etc. that are doing (for example) 55 mph "because the law" when the road should really be an 80 are a terrible user experience, especially on roads with mixed human/AI traffic. Insurance, self-driving fleet owners, car companies, etc. all have a huge incentive for self-driving cars to be able to get from A to B just as fast as a human-driven car. Nobody will take your self-driving taxi if a human taxi is faster and comparably priced.
We'll see higher speed limits because the automakers, fleet operators, etc. will lean on the states, and if that doesn't work they'll lean on the feds (NHTSA, DOT and similar) to lean on the states to set limits that are actually consistent with the 85th percentile rule.
It could have the unintended effect that people are ok with spending longer traveling in a car, though. E.g. people commute 2 hours a day to get into London via train.
That's not necessarily a bad thing if you make transportation efficient enough. A self-driving pod that hooks up to public transport, giving me privacy the entire time? Sure.
2 hours is time for art, digital or pen and paper (I draw while on airplanes and such and have learned to compensate for bumps and things). I could knit. Watch things. Read. My spouse would sometimes make music. Some of these are good even if the commute is shorter, and would be an improvement for anyone commuting a good distance already due to traffic and whatnot.
4-5 hours? Split sleep schedule. (I have no children).
It also means folks can better commute to work from small farming towns that lack jobs. I know that an hour commute each way (by car) is too much for me, but I've had to just to make decent money. This was living in the midwest, mind you, and it might be different other places. It wouldn't have been an issue if I wasn't actively driving.
Yes, I do understand that some of the hope would be that people live closer, but that's the hope and push now and it breaks often enough.
I'm worried about them because they will replace human error with computer error, which although rarer, is systemic and harder to predict and deal with. Human error is bad, but I don't want to wake up one day and find a Heartbleed-level error in self-driving cars, or the equivalent of a hardware-level recall.
I'm always surprised people are so accepting of it, to be frank.
If fewer people are killed in auto accidents, it’s a win even if the type of error is different. Dead people is dead people.
Why are you surprised? People are cool with massive metal murder machines flying down roads just because it’s a human driving it and not a computer? I seriously can’t wait for all us flesh drivers to be replaced with computers. The thing is that over time computers become better and better drivers. Humans don’t, actually we just get worse as we age and stay on the road anyway. Our human reflexes and biology also limit how fast we can react and subject us to road rage, stress, fatigue and other problems that can’t be debugged. Why is it worse to have software that we might actually be able to fix?
> If fewer people are killed in auto accidents, it's a win even if the type of error is different. Dead people is dead people.
Well, I think the GP was sketching a scenario where fewer people might be killed until one day when a lot of people are killed by a bug/hack/zero-day that suddenly generates huge crashes.
Where does blame get placed when there is a computational error and my self-driving car kills another human-being? Am I still liable under my insurance policy even though the error was out of my control? Is there any conversation going on about insuring self-driving vehicles and responsibility when an error does occur and someone is killed?
> Where does blame get placed when there is a computational error and my self-driving car kills another human-being?
Why do you assume the courts are incapable of handling this, or the law doesn't deal with it? Most likely you won't even own a self driving car, you'll be getting into a car owned by Uber or Google and they'll have to face liability, the same way you aren't liable as a passenger in a taxi if the driver gets into an accident.
On the victim side, you're actually more protected because a self driving car is less likely to be uninsured. If an uninsured driver hits you, you're just out of luck, they may go to jail but nobody is paying your bills. In the near term Uber or Google will have loads of insurance and loads of money to pay out claims.
There was no assumption, just a simple question. If we are using assumptions, I assume it won't be as easy for the courts to handle initially as you suggest. It's a newer technology, therefore there won't be any/many precedents. I'm not a lawyer, but it seems quite obvious that this type of thing could get complex fast.
There are a lot of conversations happening about this. In short, right now the autonomous features available to consumers are considered level 2 so the driver is required to be alert and ready to take control at all times (similar to using cruise control).
Thanks, fitzroy. That is interesting; I'll have to read more about this.
It's just my opinion, but at this time I can't see self-driving vehicles as a viable option for anyone. It just seems like another layer of complexity added into an already complex situation.
In the case that I need to assume control of the vehicle, am I going to be fighting against the autonomous system for control? Not only will I need to understand the current environment and what others are doing around me; I'll also need to understand how the system is computing the situation and maintain control based on my observations and my observations of what the system is trying to do.
It has been talked about from the start. I remember a documentary from the late 90's about self driving cars (it may have been BBC Horizon) and the liability issue was discussed.
Maybe the answer will be Uber-like car fleet services that self-insure.
Is it actually harder to deal with? About 1/6 of fatalities are drunk drivers[2], and they cause another 1/6 of car accident fatalities[1]. Almost half of all front-seat (driver + front passenger) fatalities aren't wearing seatbelts, while on average 90% of Americans do wear seatbelts[2]. There's no way of solving either of these problems without identifying people individually and improving their behavior, which is a pretty intractable issue.
I would argue that aggressive education campaigns (similar to anti-smoking campaigns, which drastically reduced the number of smokers) could help with seatbelt awareness. Showing how far someone crashes through a windshield even at 35 mph can be pretty effective at getting the point across.
I don't think we will have much choice about accepting the risks of computer errors/bugs. Transportation systems generally are probably reaching the limits of humans' ability to manage them; we will have to turn them over more fully to computers at some point.
Road rage is one of the bad emotion-induced states as well. Something competitive about human nature makes us bristle at minor events like another car getting ahead or cutting in line, and likewise makes ourselves do dangerous things.
Humans driving faster than is safe, even if due to emotions, is still human error. The primary limiting factor on safe speeds (at least on "proper" highways) is humans and their poor reaction times.
Couldn't be said enough. What's sad about people who speed, too, is that they mistakenly believe they are the only ones affected by their speed. In fact, most of the affected people are the ones who feel safe pulling into a left-turn intersection knowing the speed limit is 25 and realize too late that someone is barreling toward them at 45. I wish there were more statistics about this phenomenon, but I wholeheartedly believe the mistake speeders make is thinking they have complete control; while they may be good drivers in a certain respect, ignoring commonly known road speeds is why they are, in fact, not good drivers.
I tend to get motion sickness and eye strain when I try reading or working in a moving car. Maybe TV would be a bit easier on the eyes, but I think it's not a healthy habit in general.
I can't read for long in a moving car. I can, however, draw or paint, knit, watch tv, or any number of things like this. It helps to take breaks often to look around or focus on things further from your eyes so you aren't noticing every bump.
I've had that experience as well, but this is different. I don't generally get motion sickness in vehicles - but reading will make me get an upset stomach, regardless of who is driving. I just learned how to minimize some of it and realized the stuff I can do on long trips.
In my experience a lot of BigCorps in my area (SFBay) already do that. The inevitable morning meetings filled with status updates, daily standups and plans for the day. Also with people beeping in and out, "sorry I was on mute" and all the associated nonsense that goes with it. No work gets done until after 10am...
That's an excellent (unintended?) consequence. I would certainly be more amenable to a longer commute if I could work remotely from the car and count it as work time.
We're not gonna reduce speeds. If anything we'll increase them because it's a near zero cost (in terms of taxpayer $) way of increasing capacity/throughput without more infrastructure.
I fully welcome pizza delivery at triple digit speeds.
The cost of travel (time, money, $otherMetric) is basically economic dead weight.
How would we increase speeds? Self-driving cars aren't magic and can't defy the laws of physics. In most cities, roadways are at capacity already, and convincing people to switch from dense modes of transit to sparse modes of transit (in terms of people per square meter) would only exacerbate those capacity crunches. Surface roads with at-grade intersections can't be sped up meaningfully, and there's no space for relatively high-speed highways, not without spending exorbitant amounts of money on building them.
If anything, the fact that humans tend to do things like drive too fast for conditions or leave too little a gap for safety means that properly safe cars will go slower than the average driver.
Most commuting in the US is done on highways with no at-grade crossings, and most congestion is caused by the imperfection of human driving (look up the rubber-banding effect in traffic). Self-driving cars can use the same 4/5/6-lane highway MUCH more efficiently than mere mortals can...
If you watch the track (not at a station) of a "dense mode" of transit, say BART, what you will see is multiple minutes of zero density and then a few seconds of high density. If you can have some density all the time, you can have a car-like amount of space for each person and still have the same throughput as current, standing-room-only train systems.
But cars need lots of space too. If you actually run the numbers, a section of track is more likely to be physically occupied than a section of roadway.
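A back-of-envelope version of "running the numbers" (every figure below is an assumption picked for illustration, not a measurement): people moved per hour past a point by one rail track versus one highway lane.

```python
# Back-of-envelope throughput comparison; all parameters are assumptions
# picked for illustration, not measured values.

def train_throughput(passengers_per_train=1000, trains_per_hour=20):
    """People per hour past a point on one track."""
    return passengers_per_train * trains_per_hour

def car_lane_throughput(speed_kmh=100, headway_s=2.0, car_length_m=5.0,
                        occupants_per_car=1.2):
    """People per hour past a point in one lane, assuming each car keeps
    a fixed time headway plus its own length of roadway."""
    speed_ms = speed_kmh / 3.6
    spacing_m = speed_ms * headway_s + car_length_m
    vehicles_per_hour = speed_ms * 3600 / spacing_m
    return vehicles_per_hour * occupants_per_car

print(train_throughput())       # 20000 people/hour on one track
print(car_lane_throughput())    # ~1980 people/hour in one lane
```

Under these particular assumptions one track carries roughly ten lanes' worth of people, so whether car-sized pods can match train throughput hinges entirely on how small the headway between them can safely get.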
I envision, with automatic cars, a situation where the cars run bumper-to-bumper, like train cars, and then can detach and exit the track. Best of both worlds.
If cars run bumper-to-bumper, that means the entire line must come to a halt when you detach one car; then the front half must move forward to give some clearance, the car can then leave, and then the rear must speed up to catch the front half.
The experience from the train world is that having to switch trains on a line severely crimps performance: you can get 40tph on an unswitched line, but only about 30tph on a switched line. The limiting factor on trains tends to be ingress and egress.
If they are physically coupled then yes, but if they are independent it would likely be more flexible, and you could create and close spaces with smaller speed adjustments. Surely someone has already modeled this?
The biggest problem I see with approaches like this is how it's going to get going while they have to share roads with individual traffic. Street infrastructure isn't designed for long convoys that have to stick together, otherwise something like this would be a good start e.g. for trucking: Lead truck can even be human-driven, and a group of trucks in automatic mode follows as a closed unit while their drivers rest.
We would increase speed limits. Self driving cars will likely be limited by the law. Nobody wants their self driving taxi to be going 55mph when the humans are doing 80. I described this in reply to a different comment.
>Surface roads with at-grade intersections can't be sped up meaningfully, and there's no space for relatively high-speed highways, not without spending exorbitant amounts of money on building them.
Every red left or right arrow could be replaced with a blinking yellow or solid green for starters. I think we'll see laws about stop signs being equivalent to yield if turning right.
There's a lot of inefficiency baked into the system as a means to mitigate stupid people at high volume points.
For example, we don't bother stopping cross traffic for someone to take a left on solid green but we do at an intersection with a dedicated left turn lane/arrow because of volume. The volume in the latter case is high so we bother to include redundancy to protect the several people per year that would cause problems if they tried to take the left without cross traffic stopping. It's not worth it to have a lane and an arrow to protect at an intersection with a lower volume of left turns so we use a solid green and left turning traffic yields. As self driving cars decrease the number of people that wouldn't be able to take the left on a solid green or blinking yellow then we'll gradually see more of those and less dedicated turn lanes.
>If anything, the fact that humans tend to do things like drive too fast for conditions or leave too little a gap for safety means that properly safe cars will go slower than the average driver.
What's "too fast" isn't defined by us. It's defined by how people actually drive in the conditions. A self driving vehicle that drives like a student driver and gets from A to B no faster than the letter of the law allows is at a huge disadvantage. Nobody's gonna buy a car that's 20% slower everywhere and frustrating to watch operate. Nobody's gonna recommend it to their friends if they have to press the override button every time they want to pull out into traffic in a timely manner or if they're always getting honked at.
Sure, you can hand wave about it not being a problem with full adoption but you need to get to full adoption first and that requires a product that's better in at least one way and comparable/acceptable in all the rest. Look at electric cars. They didn't exist in the high end until Tesla made one that was better in a few ways and comparable in the rest.
> We would increase speed limits. Self driving cars will likely be limited by the law. Nobody wants their self driving taxi to be going 55mph when the humans are doing 80. I described this in reply to a different comment.
Higher speeds result in lower capacity. At higher speed, you need extra stopping distance for safety.
> Every red left or right arrow could be replaced with a blinking yellow or solid green for starters. I think we'll see laws about stop signs being equivalent to yield if turning right.
There could be sensors placed at the intersection that broadcast information about pedestrians to approaching cars. If a car cannot accept those signals (not compatible, etc.), it must stop; if it can, then it can include that data in its decision process. We have offline tools like this already: mirrors in parking garages, and flashing advance-warning signs on fast-moving highways when a light ahead will be red.
That is completely unrealistic and downright dangerous. There are no sensors or pattern recognition algorithms which can detect pedestrians (and animals and bicyclists) reliably enough to even consider doing something like this. A false negative would literally be deadly.
There are also currently no mainstream self-driving cars. I believe the context is brainstorming ideas for things obtainable in the (near) future, not limiting ourselves to current tech. Although I feel a $50 motion sensor from the JC Penney dressing room may cover (or come close to) what you mentioned. We've been to the moon! Let's not be so bleak.
The car in front is not going to instantaneously stop. If it could then you can prove by induction why this conversation is irrelevant ;)
The effective road space occupied by a car is the car plus the distance in front of it. The distance in front increases with reaction distance, not stopping distance. Assuming everyone is (on average) always following at the safe minimum following distance for that speed, you get the same capacity and less travel time.
However, at lower speeds people (on average) tend to follow at greater than the theoretical safe minimum, so as speed increases the average following distance gets closer to the safe minimum, and higher speed = higher capacity. This is basically a long-winded way of saying that following distance is determined by reaction time, not stopping distance, and that high-speed traffic negates the outliers who use high-speed following distances at low speed.
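A quick sketch of that reaction-time argument (the 1.5 s gap and 5 m car length below are assumed values, not data): lane capacity as a function of speed when every car keeps a constant time gap rather than a full stopping-distance gap.

```python
# Lane capacity vs speed under a constant reaction-time gap.
# The reaction time and car length are assumptions for illustration.

def lane_capacity(speed_kmh, reaction_s=1.5, car_length_m=5.0):
    """Vehicles per hour in one lane if each car occupies its own length
    plus (reaction time * speed) of roadway."""
    speed_ms = speed_kmh / 3.6
    spacing_m = car_length_m + reaction_s * speed_ms
    return speed_ms * 3600 / spacing_m

for v in (30, 60, 90, 120, 150):
    print(v, round(lane_capacity(v)))
# Capacity climbs with speed but flattens toward 3600 / reaction_s
# (2400 veh/h at 1.5 s) because the time gap dominates the spacing;
# with stopping-distance gaps it would instead fall at high speed.
```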
> The car in front is not going to instantaneously stop. If it could then you can prove by induction why this conversation is irrelevant ;)
In normal conditions, no. But there are several circumstances in which cars could come to abrupt halts: sudden failures in road conditions (e.g., a pothole), crashing into another car, etc. There's a reason 100-car pileups are a thing.
> However, at lower speeds people (on average) tend to follow at greater than the theoretical safe minimum
What speeds are you thinking about? On urban highways, it is virtually impossible to travel at a safe driving distance, and this is at both freeflow speeds and congested (but not stop-and-go) speeds.
But are they even close to delivering on that promise? Self-driving cars are marketed as a safety feature, and safety features compensate for bad and unpredictable conditions. Think ABS, airbags, traction control, and even seat belts.
Yet how many demonstrations of self-driving are anything but well-lit and well-marked roadways? Self-driving won't be a thing until a person can have a car take them somewhere regardless of weather and road conditions. Self-driving will have to be smart enough to protect people who are actively driving from making a fatal error.
Per that last point, if self-driving becomes a thing, cars that can do it will have to intervene, or have legislation absolving the system of fault if it does not. Even the act of letting a self-driving car exceed the speed limit raises legal issues. Then come privacy concerns: if you are not doing the driving, you can bet law enforcement is going to push to do a lot more on the grounds that a self-driving car has no rights.
I personally feel the better option (prioritizing energy savings and time) is to avoid the commute altogether. We can figure out ways to structure workplaces and organizations where physical presence could be substituted by phone/email/skype or VR.
95% of the value to me of self-driving cars is being able to sleep/nap on the way, so I'd agree with you that a slower trip is generally going to be more tolerable.
Well, given skyrocketing housing prices, a self-driving mobile home would be a perfect solution. Living in a vehicle by the side of the road is against the law but living in a vehicle that's always moving from place to place seems like it's legal.
"Welcome to perverse-incentives-R-us, what paradox of this world of plenty can we serve you today?"
If self-driving cars go mainstream, people's attitude toward the car as merely a transport vehicle may change. Even in congested traffic, one may view it as a secondary home where they can work and eat breakfast and not care about the time it takes.
It will create a group of hyper-commuters who travel to work longer than is standard today because it will become slightly more convenient. Nothing I'm looking forward to.
That has interesting implications for real estate. It might singlehandedly revert the trend of living in dense urban areas.
It wouldn't be as bad if cars become fully electrified and smaller as we shift away from individual car ownership (where you buy a 4-5 seater for edge cases but only use 1-2 seats 95% of the time).
When you order your automated fleet car to pick you up you can say I only need one seat, which will massively reduce car sizes and energy expenditures.
Self driving cars can potentially link up like trains on highways to further reduce congestion and reduce energy use.
I could just as easily see it going the other way. With self-driving we could free up tons of urban land from being parking, as all cars can essentially valet themselves to a more optimal, consolidated parking location that may not be adjacent to your destination, or put themselves in "Uber mode" and run errands while you're not using them. Uber becomes even cheaper, thus making it even easier for people to live in cities without a car, and greatly improving transport in places where public transit is not great.
Thus the reduced need for car ownership and all that parking that is now on the market for redevelopment should lower the cost of living in cities which is a huge deterrent that keeps people out right now.
I imagine that there will be some flux in both directions, with the less desirable suburbs being the losers in that equation.
Urban parking lots would essentially become high traffic charging stations. Those cars will need to charge every 100-200km, so owning parking lots will become highly profitable soon.
If I was a long-term real estate developer I'd be buying up a bunch of urban parking lots and converting them into charging stations. Then selling a stock of them to Tesla/Uber in 5-10yrs at a high mark-up.
Apartment/grocery stores/office/etc buildings with underground parking could technically lease out their land as well.
It would still very likely free up far more land than is needed for charging, as so much parking space is used for far longer than the average charging/waiting-for-fleet-calls would need. Not to mention the AI space optimization they would utilize.
More like every 400-500 km. Even current electric cars generally have ranges of 350 km or more, and they're getting better every generation. Unless my work provides me free charging, I'll be charging at home where my electricity is supplied by my solar panels.
Oh wow, it was actually 200 miles for a Tesla Model 3, which is about 320 km. And it's only going to improve.
That means in an urban environment they'll be able to drive for quite a long time before charging... that's interesting, as it means even less real estate is needed for charging.
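For a rough sense of how much charging real estate that implies, here is a toy sizing calculation; the fleet size, daily distance, range and charge time below are all assumptions chosen for illustration, not data.

```python
# Toy sizing of fleet charging needs; every number is an assumption.

fleet_size = 10_000         # assumed robotaxi fleet for one city
daily_km_per_car = 300      # assumed distance each car covers per day
usable_range_km = 320       # assumed usable range per charge (~200 mi)
charge_hours = 1.0          # assumed time for a full charge

charges_per_car_per_day = daily_km_per_car / usable_range_km
charger_hours_per_day = fleet_size * charges_per_car_per_day * charge_hours
stalls_needed = charger_hours_per_day / 24   # if stalls are used around the clock

print(round(charges_per_car_per_day, 2))   # ~0.94 charges per car per day
print(round(stalls_needed))                # ~391 stalls for 10,000 cars
```

In practice charging demand bunches up overnight rather than spreading evenly over 24 hours, so the real stall count would be higher, but the calculation gives a feel for how much smaller the footprint is than the parking those cars replace.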
The implications of this technology on daily life in the next decade will be nearly as interesting as the deployment of consumer internet in the 1990s. There are so many possibilities, it's very exciting...
Yeah, the demand for parking will collapse, which would bring down the value of parking lots, that's going to be a far bigger trend than space for charging.
I don’t see why they’d need to charge near demand though. Charged vehicles can move into dense areas, drive around for a couple hours and then grab one rider to go out to a charge station in a cheap area, then grab someone else and head back in to the city center.
Seems like there will be plenty of fungibility in charge locations.
It's not just charging though, space is also needed for waiting for a fleet call while conserving energy. I'm sure the usage time will be highly optimized but there are still going to be significant differences in demand/number of cars on the road between surge/peak hours and usual demand. Potentially meaning extra vehicles needed for surge are not being used 75% of the day.
It makes no sense for an individual to buy a self driving car, honestly. Remove the cost of paying for a driver from a taxi ride and taking taxis everywhere becomes much more economical.
My commute is currently around 2 hours each way. If I had a self-driving car I'm still probably not moving much further away, even though it would be a significant cost-of-living difference.
On the other hand I could see people whose commutes are 30 minutes tolerating a 1 hour self driving commute especially if it means they are paying 1/2 their rent.
I don't and won't understand either. My job is ~5 minutes by car, 15 by bike and I did that on purpose. To spend 4 waking hours essentially unable to do anything other than listen to music/podcasts/books while putting up with other drivers and weather would drive me insane.
I have to imagine that there are extenuating circumstances that force people into these absurd commutes.
Congratulations on your good fortune! In my case, I would either need two incomes or significantly reduce my family’s standard of living (live in a cramped apartment or townhouse) to move closer. Welcome to the Bay Area!
Self driving cars will be life changing. They can’t come soon enough.
Or maybe people who live in a self-driving van; it can park someplace out in the country in the evening and deliver you to your work just in time to start..
I heard a statistic that the average commute across the country is 30 minutes, because people will always move far enough away for cost but close enough for travel time. I agree it could be possible to impact real estate if commuting is no longer stressful and congested.
What, like an incredibly space inefficient train or bus? It will also cause insane traffic congestion if people cared less about the congestion.
At a certain level of congestion, people think it will take too long and avoid the journey. All this will do is move that threshold even higher. Most metro areas need to look at transit as the solution. Self-driving vehicles may play a role in feeder routes, but the idea that congestion will suddenly be solved because of self-driving vehicles is crazy; IMO it will actually be the opposite.
Buy first-class train tickets then? On DB you can even get a private office on ICE trains. They're really expensive, because that amount of space is by its nature expensive.
The US has largely abandoned new road construction, which IMO is a large part of the issue. There is an old saying that you can't build your way out of congestion, but building really does work in many areas. Just look at a small town building a bypass.
The problem is that when congestion is so bad it's suppressing trips, there is a middle ground where improvements are less obvious. That, and cities need to be designed to absorb highway traffic, or highways just end up as expensive parking lots.
Remote work requires eye contact, which doesn’t exist yet. It will happen rapidly once we have VR with eye tracking.
The other thing missing from remote work tools is collocated workspaces (can’t fit that on your monitor). Existing VR tech solves that problem though.
Instead of working on Google Docs in your office, you’ll work IN Google Docs in VR, and your coworkers will be there nearby, and you’ll be able to talk face to face without leaving your workspace.
Certainly not all remote work requires eye contact, including the vast majority of white collar jobs--especially technical jobs. This argument is silly and never comes with anything but anecdotal evidence and arbitrary standards. In reality, managers want people in the room with them because it's easier on them.
Yes but you've probably got 100 people on the bus if it's packed in a few square metres of road space. If you had a self driving vehicle that was comfortable, you'd need probably 100s of square m of road space to take all the people. Which means even more immense congestion.
Personally I'm not convinced. I wish we could just skip the self driving car phase and go straight to personal self flying drones.
Fewer edge cases to deal with, and since most people are not pilots, the drone must be 100% automated, turning it into something more like a horse-drawn carriage from days of old. Since air traffic is more tightly controlled, they would not have to compete much with piloted aircraft. No pedestrians to worry about, minimal possibilities for accidents. Flying away and parking somewhere away from your location is not just a feature: it's mandatory, since our world has no accommodations for consumer aircraft parking. Since most landings can be done on the rooftops of tall buildings, it's conceivable that we may reduce the amount of time we spend walking on dangerous or dirty streets, and it also opens the door for more creative skyward architecture and retail. And of course there are the magnificent views you'll get of a city even on a boring daily commute.
I will probably not own a self driving car given current technology, but I can't wait to see the age of true flying transports for the masses.
We already have the infrastructure for self-driving vehicles. How do you handle a mechanical failure of your personal flying drone? Does it drop you to the ground or onto a group of pedestrians nearby? This surely only increases the anxiety of people walking below. What is the recommended height for a self-flying drone? Hundreds of feet above ground? Not to mention there are no drones (to my knowledge) that can legally transport a human today in 2018; how long until we develop/regulate that technology? (I understand it's the same with self-driving vehicles, but we've been working on those for _years_ already and things have improved significantly in that time period.)
While I like your outside-the-box thinking, I disagree with the notion that there are fewer edge cases than with self-driving cars; at minimum it's on par with the number of edge cases self-driving vehicles face.
I also believe there are far more edge cases to deal with: specifically the edge of the blades on the propellers.
Whirling blades with an engine powerful enough to lift and transport a human plus cargo whizzing through downtown areas, and everyone owns one? It only makes sense in sci-fi movies.
If your drone fails catastrophically such that it no longer has enough working propellers to land safely, then you crash and die. Simple as that. The same can happen in a vehicle, or any other aircraft. In the end, the loss of life amounts to a handful of people, no worse than car accidents, which kill people daily.
Recommended height? It's already laid out, just look at aeronautical maps.
The technology will come. What we require is more battery density and perhaps better materials. Prototype drones for human transport are already out there.
There are certainly a lot fewer edge cases. But again, when it comes to aircraft, people seem to exaggerate dangers and make it seem like a big deal. Aircraft are safer than cars when you know what you're doing.
Because humans also don't currently fly drones to get around, you don't have to deal with all the human cultural and behavioral quirks that come with traffic. Good luck with your self-driving car on India's streets.
I have come to believe that by the time self-driving tech is truly viable at level 5 autonomy, personal drone transports will have long been viable.
Driving a horse-drawn carriage fast and safely takes more skill than driving a modern car. Pulling a carriage isn't a natural behavior for horses. They have to be carefully managed. It takes a fine eye and careful touch developed over years of experience.
Very few tall buildings can accommodate aircraft landing pads. There isn't space due to HVAC machinery and antennas, plus they weren't designed to support extra weight.
Flying requires more energy. In a world where energy efficiency is only increasing in importance, that's a big problem.
Also, if everyone is commuting to work by landing on rooftops, then streets will no longer be dangerous: rooftops will be because they will be crowded with drones dropping off thousands of people.
Flying cars will never see the light of day as a mass transportation option over land. The noise is far too great. Given the way in which cars already take away from the human experience of moving through the world, imagine how much worse that would be with cars + drones always on top of you.
Personal self-flying drones won't happen unless the technology becomes near silent (compared to now), meaning different technology will need to be developed before this happens.
I don't think it'll be fundamentally different from taking a black car, for example, or any Uber for that matter. If you think you can dine and work in a black car, then you can in a self-driving car too, but I haven't seen that happening yet. There are challenges beyond the fact that you are not driving: the car moves constantly, makes turns, etc., which is not very friendly to a dining experience or to getting work done.
Perhaps if there was a dedicated lane where self-driving cars would move slowly like a train that would be more plausible. Or maybe even freeway driving, no stop and go traffic though.
I recently finished watching the anime "Legend of Galactic Heroes".
In an early episode, a character summons a roaming [empty] driverless hover-car taxi [hailed with a transmitting credit-card-sized hailing device no less] to help an injured compatriot on his way. Watching the scene was surreal; when it first aired no doubt it was totally SF, but now it's actually within the grasp of reality.
The entire vehicle's interior was, basically, Limo seating. No room for any driver at all. The car lands, they get in and close the doors, and the car takes off all 5th Element but driverless to their destination -- one character laid out injured, the other consoling and helping whilst sitting upright.
I just have to believe that interior configurations are going to change similar to this.
I don’t know about “dining,” but people here in Los Angeles eat entire meals while driving every day. Having hands free won’t necessarily be like hanging out at home, but there is a whole lot you can accomplish if you’re used to it.
If anything, it'll probably be less likely for the majority of people to eat, for example, while riding one, especially if the fleets are operated by a company. Think subway: you're not allowed to eat or drink, yet currently people do eat and drink in their personal cars.
If self-driving cars are individually owned, then there may be a trend towards a different experience, but that may come much later, after we have self-driving cars hitting the road, as the cost and risk to operate one make it unfeasible for the manufacturers to sell them at scale to the general public instead of operating their own ridesharing fleets.
Current cars aren't exactly designed for "leisure." In the future we will probably see more car designs centered around leisure such as sleeping, working, or eating.
Not sure we have space for everyone to have a mobile home on the road. We hardly have enough space on the roads for the cars/commuters we already have.
That will happen immediately for sure. Super tinted windows and a nice work setup in the vehicle will make society significantly more efficient. It will probably get sucked into corporate profit rather than improvement of lifestyle, but there is room for hope.
I just wonder how that will affect the people (i.e. me) that get very nauseous in cars, especially when looking down/reading something. I don't know the science, but it's a lot worse when it's stop-and-go traffic. Can self-driving cars be programmed to drive and react much more smoothly to help us folk that are sensitive to this?
There is the short story "The Roads Must Roll" by Robert Heinlein which -- as a plot device or similar -- has these vast roads which enable such things as mobile restaurants. Some characters commute over a business dinner, for example.
We can see the beginnings of this for those of us who ride transit. Reading, "sleeping", A/V media on tablets/phones... and then to make-up and phone calls... anything reasonable and semi-reasonable in public that's not active driving [and non-reasonable for some of the fringes].
Even though this is what car manufacturers might want, I think it's more likely that autocars greatly reduce the rate of car ownership. Why own a car when you can just call one whenever you need it? In this scenario autocars are more like Taxis or buses, i.e. not secondary homes.
Actually, a self-driving RV is quite alluring if one can work remotely. Traveling cross-country while not actually having to drive. I could see people making them their primary home.
Like electric trains. And to make infrastructure costs low, make virtual trains of self-driving electric cars on dedicated lanes.
I really believe the US will get self-driving cars first, as a first form of pervasive public transportation. I call it a second-world technology effect: just like we in Eastern Europe have much better internet and cellular service (not to speak of online banking) because it was built after the first-world countries already heavily invested in obsolete technology, the US can use the latest technology (SDC) in building its first public transportation system.
The US has public transportation, it's just not as effective as it could be because most of the cities are designed around cars. SDC could solve the last mile problem (especially if paired with Musk's idea of underground high-speed sleds for cars.)
There are two use cases that particularly excite me:
* For commuters, get a ride from your own vehicle to work, and then have it park cheaply outside of town (instead of paying for a parking spot close to work, which is common where I live).
* Have a self-driving van with a bed, have it drive you to a skiing resort while you sleep. Go to sleep in the car Friday night, have it drive at safe speeds using smaller roads, wake up where you wanted to go. Go to bed Sunday evening, wake up back at home. Shower, dress, go to work.
> For commuters, get a ride from your own vehicle to work, and then have it park cheaply outside of town (instead of paying for a parking spot close to work, which is common where I live).
Not sure if I'm excited about that. This would double the traffic during rush hour. I'd sincerely hope they introduce a tax on vehicles driven without driver.
...if there's ten of them. If there's 1000 of them, now the rush hour traffic is both ways. That is the expected scenario, not "I have my very special car like nobody else has."
The self-driving cars are not in a hurry to get to their parking space. The commuters don't want to be late for a meeting, but the self-driving cars don't care about being late for a parking space.
Yes, and...? Doesn't change the fact that they're creating congestion (not only for themselves but also for other road users); plus the return trip can be expected to be "pick me up at $location at $time", which is again a scheduling problem.
Rush hour traffic in that single direction might get heavier if the drivers of single occupant cars can do other things with their time. It simultaneously removes some of the disincentive of a long commute, and also removes much of the benefit of not being the driver.
That might work in theory but I doubt it'd work in practice. You already get congestion on small roads that happen to be the fastest route according to Google Maps. I don't think vendors will be able to develop a fleet that intelligently coordinates routes, especially because that would make some cars' routes slower, which isn't something people would accept.
> I'd sincerely hope they introduce a tax on vehicles driven without driver.
I sincerely hope they do the exact opposite. Taxes are often added to behaviors the state wants to limit (ie. Sin tax on Alcohol/Cigarettes/etc), it would be in everyone's best interest to convert to driverless cars so the tax should be on people without driverless cars and/or more importantly on people who don't carpool.
Sorry, I meant empty cars. Even though that would create a job market where you pay people to sit in a car to avoid taxes (hire teenagers for that?), so maybe just tax cars by miles driven instead of on a fixed basis.
The only reason we don't tax by usage is because we don't have the data. If cars have that data, it would be by far the superior system. Why tax someone driving 100mi the same as someone driving 100k miles?
I think the intention was to say self-driving vehicles with no passengers, though I could be wrong.
The reason being that a car driving itself around with no one in it is essentially wasting resources and spewing out emissions. The example comes up often in what-ifs about self-driving cars.
Ahh, well that I can get behind IF they are in congested traffic because one of the big advantages of driverless cars is mine can drop me off at work then go park in a charging garage across town. I wouldn't want to be taxed for my car driving on a wide-open road to a more efficient location.
What we really need to do is encourage self driving cars as being a form of shared transit, not something that people sleep in, with massive taxes on usage not involving high occupancy.
Self driving vehicles allow ride-sharing to be far more effective as you won't have to pay for the cost of the driver. Right now bus routes are studied, and are used less often than they could be due to the last mile problem, and should we grow the scale of HOV lanes, ride sharing will be further encouraged. We could take a fleet of minibuses and focus on the real time optimization problem, rather than making car travel more luxurious but also make congestion far worse.
Converting on-street parking can help to increase capacity but I don't think you can convert a lot of parking lots into roads. And where real estate prices are high, they're often underground anyway. Making more efficient use of the roadway is great but they'll probably not more than double efficiency (as long as not all cars are self driving and the road is also used by pedestrians), so doubling the number of trips will still increase congestion.
Regarding your first point, would it have to be your own car? Not to derail the conversation into whether or not you should actually own a self-driving car, but what if you just used a shared car for your commute?
It would mean the car doesn't have to drive off to park somewhere, but just goes to pick up the next person. I'd imagine it would be possible to have enough cars on the road so the wait is negligible, but still have markedly fewer cars on the road. Might this be a realistic future?
In before the usual "imagine how people would treat those cars": car sharing already exists in many cities, and the driver is usually asked before starting the journey if the vehicle is clean, and if it isn't, the previous driver will be billed/fined, so that's a solved problem.
Park-and-ride schemes are a much more sensible solution to the former. Public transit runs at high frequencies from parking places on the outskirts into the city centre. You park, walk a short distance to a tram stop, bus stop or whatever, get on, ride to work in the city. In the evening you reverse the process. The density possible is much higher with this approach than if everybody has a car, even a self-driving one.
> have it drive you to a skiing resort while you sleep
Everyone within ten hours of a ski resort has already thought of that. Prepare for ski traffic to get several times worse, and lift lines to stretch up the mountain.
Food, lift tickets, etc will also probably get more expensive if it becomes easier to avoid lodging, which is a critical revenue component of functioning ski towns.
Is there any Californian citizen here who is worried that the IT companies will move fast and break things?
Are the laws in place to determine who is guilty and who will pay the damages in case of accidents?
As a developer I am always worried when I have to put some big update live, I think I personally could not handle the responsibility of doing such an update for self driving cars.
I've been concerned about the "move fast and actually break things" aspect of autonomous devices for the past year and a half because I view malcode pushed from the update server as a mass damage event / equivalent to weapons of mass destruction if intentional.
I'm not sure how it is now, but according to someone talking to me privately at one company, early programmers were SSHing into fucking moving cars with passengers and running commands on them, and had it not been for the layers of redundant code, they would have crashed the car.
We need to regulate autonomous devices and we need to do it before there are billions of them. We need a public effort to get policy makers open-source outlines of regulations; otherwise the car companies are going to be writing them, and they aren't going to write in protections against black swan events, and we're going to get the autonomous equivalent of a BP oil spill or worse one day.
I think if there is a large enough penalty for crashes then the car companies would take quality more seriously; if the costs are too low I can see them pushing updates with just a few hours of testing.
There is also the issue of adversarial inputs to the AI: you could have someone put stickers everywhere that would crash the AI.
I am not so concerned with guilt and liability as I am that this is TESTING. As in, not sure yet whether it's ready for production. Looking for bugs. In a risky environment, among the general public. Normally, beta testers are volunteers, who know what they are getting into, and nobody else is involved. This is playing with lives. Every pedestrian the car approaches is an unwilling bug checker.
For one, the human has a brain and human sensors are less easily fooled despite being able to "see" less. LIDAR technology combined with ML is so poor that it can easily be fooled by a stop sign with a sticker on it.
A human has context where a machine does not. A human automatically knows that a ball being thrown in a yard can end up in the street.
A learner's permit also requires that a licensed driver be present. So there's a built in redundancy for decision making.
Machine learning or not, computers are simply profoundly stupid.
I'm not against autonomous vehicles but I'm fairly skeptical about them. They've improved a lot in a basic way but they don't handle corner cases very well. Unfortunately, driving involves a lot of corner cases due to the complexity of the environment.
And how many pedestrians/passengers were killed by autonomous cars during this entire testing period? How many by human drivers? How can you trust human minds so much after the answers to those questions?
For a European this feels horrible; here some people take the exam as many as 10 times until they pass, or just give up. You must learn all the rules for driving in intersections and all the other relevant rules, and after that you get a test in the city with a policeman. Also, before taking the exam there is a minimum number of driving lessons (I think 40 hours) with an instructor.
What you describe can be the cause of the large number of accidents.
Unfortunately the driving lessons won't prevent young drivers from acting out and causing accidents in their first years as drivers. Because of those bad young drivers, insurance is very expensive for all young drivers, so some young people register their cars in their parents' name; here the insurance is paid by the owner, not by the driver.
My insurance was also very expensive that first year, $4600 a year. I avoided cities and traffic for the first couple of months while I taught myself and have never had an issue since.
Careful when saying statements like "Not in the US."
Growing up in Michigan we had lots of training with an instructor (who had his own set of brakes) before getting a permit that allowed us to drive with our parents. Then after a year or so you could get your full license.
Our learning AIs will have an instructor, just remotely:
> the DMV will require that those testing autonomous cars without a driver present have a dedicated communications channel that ties the car to a remote operator, who can take over if needed.
> Is there any Californian citizen here who is worried that the IT companies will move fast and break things?
I'll only have faith in driverless-car technology when the companies that make the cars have their own families drive alongside the cars and walk around in a mock-city testing facility for an extended period of time. SITG (skin in the game).
Are you implying that the scenario where someone dies is strictly better if there was a human behind the wheel? Do you require a human to punish if an accident happens?
No, the implication is that there are an enormous number of edge cases that a human would appropriately respond to (i.e. no death) that the "A.I." cannot.
The rational part of my brain agrees, the parent in me does not, and I'm not sure either part is wrong.
Yes, I'm aware that a computer will statistically be a lot less accident prone than me (even though I'm completely accident free knock on wood). But even so, if my child died in an AI edge case where I know without a doubt I would have avoided the accident, it would matter a hell of a lot to me.
I'm not sure if it's right, but we will expect a lot more of AI drivers (and other) than we do of ourselves. Failing to see a stopped service truck in the middle of the highway isn't something a human does. And any death resulting from that will be highly scrutinized.
I expect a lot more from AI. Years ago I could play Quake III Arena and get pwned by bots.
I expect modern AI to recognize every car, pedestrian and physical object. I expect it to recognize all hostile targets and avoid them when feasible. I expect it to predict every possible move for both automated and manually controlled vehicles and pedestrians. My AI needs to have an anti-gang mode that will protect the occupants even if that means acting as a weapon against hostiles. After all, people may be sending kids to grandma alone in the car (you know they will).
I expect all of this to occur without the need of network connectivity. My vehicle is to be prohibited from communicating without my express permission until after reviewing the data it wishes to upload.
I would like to add that if the vehicles depend entirely on their AI, there should be at least 3 or 4 compute nodes and a voting system. Should a node fail or report false information, the other nodes should override it.
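For what that could look like, here is a minimal sketch of the majority-vote idea under stated assumptions: each redundant node emits a discrete decision, a failed node reports None, and disagreement degrades to the safest action. The decision labels and fallback are made up for illustration.

```python
from collections import Counter

def vote(decisions):
    """Pick the majority decision from redundant compute nodes.

    `decisions` is a list like ["brake", "brake", "coast"]; a node that
    has failed reports None and is simply ignored.
    """
    live = [d for d in decisions if d is not None]
    if not live:
        raise RuntimeError("all compute nodes failed; trigger a safe stop")
    winner, count = Counter(live).most_common(1)[0]
    if count * 2 <= len(live):      # no strict majority among live nodes
        return "safe_stop"          # disagreement degrades to the safest action
    return winner

# One node returns bad data, the other two agree: the outlier is outvoted.
assert vote(["brake", "brake", "accelerate"]) == "brake"
```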
I think I agree that we should expect more from AI drivers. But how much more?
Even if they managed to decimate the number of fatal accidents, I'm sure that a large portion of people would still be really irrational about it and that worries me a lot
> I'm sure that a large portion of people would still be really irrational about it and that worries me a lot
I used to think that. The Tesla AutoPilot accidents have inspired no such reaction, though. The moment might come more when a self-driving car is used by a terrorist or school shooter as a getaway vehicle than because of someone getting mowed down by an errant algorithm, the latter being a cause of death we've been relatively numbed to.
Even if the cumulative number of fatal accidents is reduced and certain types of accidents happen less often, if different types of accidents start happening more often, say accelerating into a wall uncontrollably, that is not optimal. I worry that we will accept 'slightly better' as good enough, along with the consequences that come with it.
At some point, economics and actuaries come into play. It will be worrying when 'AI is safer than humans!' can be pointed to as a reason to not correct expensive, perhaps intractable, AI failures.
That's all in a hypothetical future in which autonomous vehicles work as intended, are well tested, and are better than people. We're still nowhere near that kind of Level 5 automation, and are just testing on open roads.
Be careful of the utopia trap: “anything is acceptable now for the rosy future imagined.”
Absolutely, you're right. But as a singular entity, I'm a human, and an AI is "other". I know I perform better in 2 cases, and AI statistically perform better in 100 others, but I believe I'm more special than statistics in those 100 other cases, so I only care about the cases where I could have made a saving difference.
I guess I'm suggesting there's bound to be an (irrational) paradox where everyone wants other drivers to be replaced by AI, so they themselves can drive safely and avoid the edge-case accidents.
I’m looking forward to having a self driving car. You value control so much that having amortized years of your life back isn’t worth it? How many hours of your life have you and will you spend driving when you could have been surfing the web, talking to loved ones, reading, eating or literally anything else?
The problem is not self-driving cars but unsafe self-driving cars, you know, the "move fast and break things" kind. Self-driving AI is not ready yet; when it is ready and safe I will accept it. I wish we got safety at the level of NASA or airplanes, not the safety we have in web apps and regular software.
It could easily be the other way around. Right now cars are murder machines. Your child could be hit and killed by a human driver where an AI would have avoided the accident.
Or a self-driving bus with children could get confused by a weird sticker and jump off a bridge, or hit a giant truck stopped in the middle of the road.
We are speculating here; we need numbers to see if these self-driving cars are better. How do we get those numbers, though? Is it fair for the citizens of a city to be forced to be part of these tests?
Are we sure the correct number of incidents is reported from these tests? Are the tests good enough for the real world (are they testing in all conditions, or avoiding some streets or some weather to keep the numbers looking good)?
They're testing in California. I.e. flat-ish, wide roads, relatively nice weather. Sure, it's ideal for baby steps, but passing this off as "look at our safety record, therefore it's safe anywhere" is disingenuous at the very least.
Humans cause accidents in certain cases, mostly by carelessness.
AI causes accidents in certain cases, mostly by its stupidity.
These are different cases they fail in. You shouldn't be comparing a driverless AI to an unaided human, but to a human aided by the driverless AI acting as a driving aid (you know, like automatic emergency braking). This way you get the best of both worlds: the attentiveness of the AI to dumb things, and the smartness of humans.
I am sure a driverless AI, even a good one, won't beat a human aided by the same driverless AI acting as an "autopilot/supervisor".
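To make the "driving aid" framing concrete, here's a toy time-to-collision check of the kind an automatic emergency braking aid runs while the human keeps driving; the threshold and inputs are illustrative assumptions, not any real system's tuning.

```python
def should_emergency_brake(gap_m: float, closing_speed_mps: float,
                           ttc_threshold_s: float = 1.5) -> bool:
    """Intervene when the time to collision drops below a threshold."""
    if closing_speed_mps <= 0:        # not closing on the obstacle
        return False
    time_to_collision_s = gap_m / closing_speed_mps
    return time_to_collision_s < ttc_threshold_s

# Example: a 20 m gap closing at 15 m/s gives a TTC of ~1.3 s -> brake,
# even if the human hasn't reacted yet.
assert should_emergency_brake(20.0, 15.0)
```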
This would mean that the car companies won't invest in improving as long as they have fewer accidents than some number; maybe they can put some money into lobbying so that number is the largest number possible, obtained with some statistical gymnastics.
The number of car crashes is not the same in all countries, states or regions, so I would hate for a safe city to become less safe in the future because of this.
> This would mean that the car companies won't invest in improving as long as they have fewer accidents than some number; maybe they can put some money into lobbying so that number is the largest number possible, obtained with some statistical gymnastics.
I really really doubt that! The car manufacturers will fight tooth and nail to be seen as the company providing the safest cars
Them competing to be the safest means that none of them are as safe as they could be working together, though.
If every company has their own secret suite of test cases then different companies can specialize in different aspects of safety, and different AIs will be tuned to watch for different conditions.
Imagine if instead of that they all worked together to define a rigorous test suite. Then they would all be striving to excel against all of the tests that the best of them could come up with. Wouldn't that result in more rigorous testing than any individual company would do? Especially if the results of all of the tests were public?
To go another step further, imagine an open carAI platform that had the aforementioned test suite and a full simulation platform for testing changes, with different car manufacturers represented on a committee that oversees the carAI platform. Separate the smarts from the base car a bit and have some sort of abstraction layer between the smart bits and the car bits. As long as the abstraction layer is configured properly, different AIs would be interchangeable/upgradeable on the same base hardware. All car companies (and tech companies, and interested individuals) could collaborate on building the best, most efficient, safest car AI possible. People on older hardware would get all the same safety improvements as people on newer hardware (though hardware improvements would obviously improve things like sensor quality and quantity and the like), and there wouldn't be fragmentation between AI ecosystems, with poorer people trapped on older releases with lower safety standards while the rich get the latest, greatest and safest cars, etc.
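As a rough sketch of what that abstraction layer might look like, assuming a hypothetical open "carAI" platform (all names below are illustrative, not an existing standard):

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class SensorFrame:
    camera_images: list      # raw frames from each camera
    lidar_points: list       # (x, y, z, intensity) tuples
    speed_mps: float

@dataclass
class DriveCommand:
    steering_angle_rad: float
    throttle: float          # 0.0 .. 1.0
    brake: float             # 0.0 .. 1.0

class VehicleInterface(ABC):
    """What the base car exposes to any interchangeable driving AI."""
    @abstractmethod
    def read_sensors(self) -> SensorFrame: ...
    @abstractmethod
    def apply(self, command: DriveCommand) -> None: ...

class DrivingAI(ABC):
    """What any vendor's AI must implement to run on conforming hardware."""
    @abstractmethod
    def decide(self, frame: SensorFrame) -> DriveCommand: ...
```

Any AI that implements DrivingAI could then run on any car that implements VehicleInterface, which is what would let older hardware keep receiving newer brains.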
Obviously competition is better than nothing, but is it really better than an open, collaborative alternative?
I agree. I also think this self-driving component should be standardized, maybe with more than one standard, but when you buy the car you should be able to decide whether you want the AI or not, and if you want it, to choose the AI package from company X, Y or Z.
Maybe making the component open source would be the best for the citizens.
Yes, competition is better than collaboration. Collaboration gets bogged down in committees, with each stakeholder trying to protect their turf. Competition leads to improvements as companies try to find an edge/advantage over their competitors.
>I really really doubt that! The car manufacturers will fight tooth and nail to be seen as the company providing the safest cars
We have seen competition not working in many sectors, like ISPs in the US, or operating systems for desktops and mobile phones...
If the self-driving component in the car could be swapped, so you could buy, say, a Tesla but get the self-driving package from BMW or from Google, that could help competition. Without that, if all companies have a similar failure rate, they can concentrate on competing on marketing, horsepower, price and efficiency rather than on adding extra safety, at least until some important person or a large number of children get killed by a very stupid issue, like a car hitting a wall it did not see because it got confused by some drawings on it.
I don't think car manufacturing is one of those sectors. Even if you take into account the periodic scandals (emissions tests, having to recall cars because they were literally killing people[0], etc.) that pop up, cars have gotten massively safer over the past 50 years.
The market doesn't bear this out with current cars, so why should autonomous cars be any different? Cars are marketed as powerful, sexy vehicles that imbue you with the strength to traverse the harshest conditions in the worst weather. Macho vehicles. The only carmaker that seems to have a sterling reputation for safety is Volvo, and they're a niche manufacturer.
On a larger utilitarian scale sure, but once a driverless car kills a person many legal and ethical issues will arise that you can't just brush off with that argument.
I am considering the case where a company would refuse to upgrade the sensors (or the software) if doing so would cost, say, 25% more than paying for the people killed.
I think the point is that the car would not have been in traffic otherwise, thus avoiding any death.
Though it does lead to interesting questions about whether the deaths that may occur now are acceptable if traffic deaths are reduced in the long term as a result of the testing.
I'm not Californian, but this crash-test culture is terrifying. Companies are currently driving driverless cars on public roads with safety features turned off. They have not faced any consequences, even after a highly publicized video of a near accident. That terrifies me.
Interesting development, happening earlier than I had assumed!
The TechCrunch article states that companies running fully driverless vehicles will need to enable the ability for secure remote access and control of vehicles, but I wonder whether:
1. Companies developing autonomous vehicles have focused much of their efforts on remote piloting
2. Latency won't be a killer, especially if a quick intervention is required (see the rough numbers sketched after this list)
3. From a safety point of view, this regulation seems premature, as some companies with lagging tech may attempt to offer driverless services without a safety driver to generate PR, please investors, etc. - basically "fake it 'til you make it", but playing with people's lives this time.
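On point 2, a back-of-envelope calculation suggests why latency matters; the round-trip time below is an assumption, and real teleoperation latencies vary widely.

```python
# How far a car travels before a remote command even arrives, for an assumed
# round trip; operator reaction time would come on top of this.
speed_mph = 65
round_trip_ms = 250
speed_m_per_s = speed_mph * 1609.344 / 3600      # ~29 m/s
blind_distance_m = speed_m_per_s * round_trip_ms / 1000
print(f"{blind_distance_m:.1f} m")               # ~7.3 m, roughly two car lengths
```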
I guess their hope with latency is that cases where they have an issue with latency and cases where they have an issue with the car behaving erratically in a way which needs direct human intervention won't tend to overlap. I'm not sure how good an assumption this is...
> Latency won't be a killer, especially if a quick intervention is required
Probably not controlling them but observing vehicles and steering in case they stop working. E.g. if a car stops on a busy motorway because of a software fault, a remote controller could drive it to the shoulder or next exit to avoid congestion.
My assumption is that during these trials the remote control will be based in another moving vehicle following / observing the self-driving one. That allows for direct RF control, removing latency as a factor.
If this requirement continues past the trials stage, I guess we'll have some issues to work out.
Nope. Also, handwaving - "let's assume that there's perfect 5G coverage, now a quarter of our issues are gone, success!" In reality, this would be more of a "move fast and break things" scenario.
I doubt remote piloting is intended for quick intervention. Probably more if the thing is stuck in the middle of the road because it's closed off and similar.
My prediction: A new kind of lifestyle will emerge.
Nomads 2.0, i.e self-driving car travellers: People who are constantly traveling to new destinations while being able to work from the comfort of their car.
I wonder if it'd be easy to steal one of these cars? Drop a black-cloth Faraday-cage net over the top of the car as it is driving by... It'll lose sight and radio communication, so will naturally stop and be at your mercy! I'm not a car thief, so I'm not sure what I'd do after, but it was just a thought...
What do you do after you've stopped it? Now you have a parked car, you don't have the ability to drive it away. (If you did, you would have skipped the faraday cage net.)
So you need a tow truck to actually take the car into your control where you can do with it what you will (sell it, etc.). But if you had a tow truck, you could just steal any parked car, so how is the self-driving car worse off?
This is cool tech, but more cars/miles travelled is bad, not good. We would be healthier, happier, less harmful to environment, richer, have more friends, be better connected to our communities, etc. if we built mobility around walking, biking, transit, AND cars... not ONLY cars.
I have just read an article that talks about this matter, especially when there's an accident involving self-driving cars, at https://www.lemberglaw.com/self-driving-autonomous-car-accid.... I think regulation will be the most important thing to get right before these cars are publicly released.
The main draw for me is saving time by not needing to drive. It is currently a luxury to have your own chauffeur so that you can focus on more interesting things than keeping your attention on the road during the daily commute. Mass transit allows this, but it doesn't really allow you to fully focus on something else while you are traveling.
And from engineering perspective, it's absolutely awesome to work on it, I wish everybody could experience the feeling when a self-driving car you programmed makes its first drive on its own (I have experienced it).
Ethical risks are real though, both about who is responsible in case of a fatality and about whether the technology is going to be abused for military purposes (I already declined to lead the self-driving part of an off-road military vehicle project in the past).
What's the guarantee that a taxi driver will take you where you want to go? You can't inspect his software either. A taxi driver could feasibly just leave the taxi and take his keys with him, but in practice that doesn't happen. Also, you would have a cell phone, so you could call their support or arrange another car.
The taxi driver is a human, humans have different, less spectacular failure modes. In a world with self-driving cars, it's plausible that there's a bug/hack that makes EVERY car on the road do something wrong or malicious like take you to the wrong destination/drive in to a crowd of people etc.
You could ask the same question about any car made in the past 5-10 years. They all run millions of lines of code, and none of it is open source. It turns out not to be a problem.
IANAL, but it seems straightforward to me. The vehicle will have to be fully insured. While this is a strange concept in America, where each driver is insured for a specific vehicle, this is how insurance is done in many countries in Europe; an insured vehicle is insured no matter who drives it. So American insurance companies will just need to start insuring vehicles by themselves, and if something happens then the insurance company will pay up.
But it is not only about the money; humans can lose their freedom over major accidents. What if you get a bad update or bad sensors and your car kills some people? Some engineers or managers should have to pay if they did something wrong, not just point to the insurance company.
I do not want some manager to think something like "don't worry about that bug, it is very rare so insurance will cover us in case something bad happens, we need to push the update now"
> But it is not only about the money; humans can lose their freedom over major accidents. What if you get a bad update or bad sensors and your car kills some people? Some engineers or managers should have to pay if they did something wrong, not just point to the insurance company.
Software automation has already happened in other areas. If the bug was introduced on purpose, or if there was insufficient oversight or outright negligence, people can go to jail. But not every error is automatically a crime. If you kill someone with your car, that doesn't mean you'll be charged, even if the accident was your fault.
I understand, but is there something in place to handle this issue? You would need people who will analyze the car software, the updates and the practices, and decide who is responsible. My question is: is there something in place, or do we need to wait for these laws to appear?
Right. How many people will be horribly injured or die in ‘the incident’ that leads to the lawsuit that gets these companies to take this stuff seriously?
> some engineers or managers should have to pay if they did something wrong, not just point to the insurance company
That's already how insurance works. You pay by paying for insurance premiums. Why do you think people shouldn't be allowed to use insurance to cover accidents in just this situation?
Say the manager knows about a bug in the AI but considers it more expensive to wait and properly fix it than to push an update with the bug present. Should the decision whether to fix a bug that can cost lives be influenced only by how much the insurance will cost?
I am not arguing that a person should pay with his freedom for a mistake; people make mistakes. I am afraid that the bad behavior of the software industry will carry over to self-driving cars: the "move fast and break things" mentality, shipping with known bugs, etc.
> Should the decision whether to fix a bug that can cost lives be influenced only by how much the insurance will cost?
But the price of the insurance takes into account the risk of a bug costing a life. They know if they make riskier decisions their insurance premiums will go up.
Insurance doesn't magically remove costs. It just amortises them. Loss adjusters have been experts at this for centuries.
But this isn't a new problem at all. Loss adjusters have already been doing this for many industries where someone can be killed. Insurance for normal cars already does it even! Why are self-driving cars any different?
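As a toy illustration of how a premium folds that risk back onto the manufacturer (every number below is made up, purely for the sake of the arithmetic):

```python
fleet_size = 100_000
p_fatal_defect_per_car_year = 1e-5       # assumed probability of a fatal defect
payout_per_claim = 5_000_000             # assumed settlement cost
expected_annual_loss = fleet_size * p_fatal_defect_per_car_year * payout_per_claim
premium_per_car = expected_annual_loss * 1.3 / fleet_size    # 30% load for margin
print(expected_annual_loss, round(premium_per_car))          # 5000000.0  65
```

Double the defect rate and the premium doubles with it, which is the mechanism being described: riskier engineering shows up directly in the insurance bill.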
I am sorry, I am not that good with financial business practices (I was raised to avoid them and never trust banks or similar institutions; I am not from the US).
It is an interesting point, though: how would an insurance company calculate the risk when there is not that much real data?
If an airplane crashes, all similar airplanes are grounded until the cause of the crash is found and fixed in all the affected airplanes. I want similar responsibility from cars.
I see how easily they get approved to test self-driving cars without the citizens having a vote on whether they should be test subjects, and I am worried that these companies have a powerful lobby and that the rules and laws won't be fair to the citizens.
Another thing that worries me is that you allow self-driving cars without radar or any sensor that can detect a frontal collision; this also feels like letting companies test their software at the expense of people's lives.
These are my opinions; I hope you can understand what people like me think about this situation, and why we think this is something new that needs good and fair laws and rules for the citizens.
Ok, that's financial liability. What about criminal liability? What if the car breaks the rules and kills someone? Will the programmers be responsible? The CEO? Can companies be criminally convicted in the US? Or will it be on par with other "machine accidents", where there is typically no criminal responsibility?
I don't understand this obsession with finding someone to hold responsible for an accident before the accident happens. When an accident with a self-driving car actually happens, if someone is found to have committed a crime, that person will be charged.
If you run a red light and kill someone, it's pretty clear who is responsible though. If a driverless car runs a red light because of a hardware defect, who is? No one? Or do you think there will be a committee trying to establish what exactly went wrong and where? With planes, when autopilot does something stupid, Boeing/Airbus will ground all planes of the same type until they can figure out what the issue is. I can't imagine we will ever hold cars to the same standard as we hold airplanes, that's just impossible.
We already have a framework in place to deal with this.
If your car's brakes fail and you run the red light because you had no way to stop, you are not at fault. Neither is the engineer that developed the braking system, unless they were criminally negligent.
If the software that controls the traffic light fails and gives everyone a green light, and you run the red and get in an accident, you are not at fault, and the person who wrote the code will not be going to jail, again unless they are found to be criminally negligent, or malicious.
If a systemic fault is found at any point, a recall will be done to remedy it.
This isn't some new class of problem here, this is the same old thing we have been dealing with for many years. You already rely on many many people doing their job correctly every day to not die, this is no different.
If your car's brakes fail, it's probably a mechanical fault due to natural wear and tear, for which it is easy to avoid assigning blame. If a car's autonomous capability spectacularly fails, it's far more likely to be due to a bug at design stage. Many of these bugs are likely to cause vehicles to behave in ways human drivers would certainly be prosecuted for, and we really haven't got a whole lot of precedent for auditing responsibility for bugs in complex software performing varied life-threatening interactions with other humans which replicates tasks humans themselves perform imperfectly and are frequently prosecuted for negligence over.
It would be a strange system which punishes humans for individual serious driving flaws but merely requires them to remedy it for introducing systematic serious driving flaws.
At which point they've crossed the line into a crime, so the liability is criminal. These concepts are already in play in other industries that use software on devices that could harm or kill.
You mean other than, "Your vehicle software is defective and unsafe and therefore banned for public road use by NHTSA as non-conforming to federal safety standards"?
A machine that damages property or causes injury via its own ineptitude is defective, not accidental. Hitting a deer that jumps out from roadside cover is an accident. Plowing over a bicyclist in a well lit environment isn't. There is going to be a lot of the latter with self driving cars.
>> this is how insurance is done in many countries in Europe.
Not all though. For example in Poland you insure the vehicle and anyone with a valid driver licence is allowed to drive it and is fully covered. As a matter of fact it's enshrined in law that insurance companies have to provide cover no matter who drives the car, as long as there is insurance on the car.
I know, it was really strange to me when I moved to UK - what do you mean, I can't just let someone else drive my car? I have to add them to my insurance? Why?
I mean, I understand how it works now, but to me it still seems like an excuse to extract a lot more money from British drivers.
I'm not sure about that. The flip side of that is that my standard insurance insures me on my car, but I'm also insured to drive someone else's car too without further premiums required.
Well.....that's not quite true - British insurance usually covers you for driving other cars, but only gives you 3rd party insurance....so I would never risk driving someone's car unless it was some really old beater not worth much.
It seems reasonable to me. The risk is primarily determined by the ability of the driver, and can be gauged by their claim history and factors like age.
E.g. a 65 year old female driver with no claims and a low insurance premium. If she arbitrarily allows her 18 year old male neighbour to drive her car, the insurer would be exposed to a high risk. The only way to offset that would be to raise premiums for everyone.
It does not. In fact, it explicitly assumes that the risk depends only on the car - insurance on a 500bhp car will be a lot greater than on a 100bhp car, even if the owner of both is a 50 year old lady with a perfect driving history. As I said - the law guarantees that the insurance cannot differentiate between drivers, so letting your 18 year old son drive your 500bhp Range Rover has no impact on your premium.
Tbf - it self regulates to some extent. My dad wouldn't let me drive his car, because he knew that if I crashed it he would lose all of his discounts and pay a lot more. So even though I technically could drive his car and be insured, he wouldn't let me as that was a financial risk to him.
Not my particular area of expertise, but you can probably go after everyone involved:
Operators and owners--negligence for using car without a driver, strict liability for ultra-hazardous activity
Manufacturers, distributors, suppliers, retailers, and anyone who makes products available--are liable under products liability for any manufacturing defect, design defect, or failure to warn of dangers.
Potential criminal penalties for anyone involved (including engineers) if they put out cars knowing they aren't ready yet, or fake results, etc.
One thing that keeps most car accidents from going into full scale massive litigation is that the drivers causing the accidents are usually not worth that much money.
But if I'm a plaintiff lawyer and my client got killed by a Google car? I'm not asking for the 500k dollars the insurance coverage provides. I'm asking for big numbers because juries are going to make Google pay for each death.
The manufacturer is the one "driving" the vehicle so I would assume that they have all the liability. As an occupant, it's certainly in no way my responsibility if a vehicle I'm just sitting in hits a pedestrian or otherwise gets into an accident. It also looks as if there will still be a safety driver; they will just be remote vs. being in the car.
That prompts an unusual edge case for passenger liability: idiots doing things to the vehicle as it's driving. Inevitably, across a zillion miles, passengers will do everything that can be done to a vehicle. They're guaranteed to cause wrecks in the early years (even if it's extraordinarily rare), until or unless the manufacturer idiot-proofs everything in the style of the inside of a McDonald's. So does the manufacturer carry liability for the behavior of the people inside, or will that be pushed onto the passenger? I suspect it'll be a dual-case scenario: the manufacturer will have to carry some passenger liability for such events, and the passenger will still be simultaneously pursued.
Damage liability remains with the vehicle owner, just like if you loan a vehicle to a friend. You don't insure your ability to drive safely; you insure the vehicle for damages to it and caused by it. Fault will likely be determined exactly how it is for a human driver.
I don't know how common it is, but professional liability insurance for engineers definitely exists. I expect that it's more common for consultants and others operating their own businesses than it is for employees of large companies. But there are certainly areas like construction where someone with a professional engineer license has to sign off on drawings where it may be more of a thing.
That sounds fair: you would have a software lead who is responsible for the software, so management can't push him around to release fast; he would make sure to follow all the rules (we would need such rules if they are not there yet; maybe NASA or the airplane industry has them).