This is a great video to study on many levels beyond self-driving cars. Some observations:
* The car behind didn't show a turn signal, so the driver probably wasn't paying attention.
* A small car escapes with not too much damage while the SUV rolls over many times.
* From a physics point of view, the car behind produced an almost perfect torque that started the horizontal spin of the SUV.
* Collisions with a barrier at high speed can produce lots of rolls. Many times we see cars upside down in an accident and wonder how that happened. This is how.
* The middle barrier did its job wonderfully, even against the massive inertia of the SUV. Cars on the other side remained unscathed!
The biggest thing I noticed was just how easily SUVs flip over. This has been a problem forever; I'm just surprised that regulators don't take more notice of it. Rollovers are far more likely to result in injury or death.
Most cars just don't flip; all the traction their tires have is simply not enough to do it on flat ground. Yes, anything could flip over once you go off the road, but the SUV in this video looks like a rolling death machine.
Because people think these SUVs are "safer" due to their size. And then it becomes an arms race; you see the highway full of large SUVs, and you feel vulnerable in a tiny compact car, because you think they will crush you in a collision, so others start buying SUVs to "protect" themselves. These land-beasts might be comfortable, but they are unsafe, take an excessive and unfair amount of space on the road, consume too much fuel, and drive an arms race. They need to be regulated to death for most users.
Yes. IMHO insurance costs should have to figure in the externalities to society of their increased risk to other vehicles on the road, even when the SUV driver isn't at fault. Crashes will happen, but even when the root cause is another vehicle, an SUV in the mix increases the likelihood of a crash due to its worse braking distance and increases the severity of the damage to other vehicles due to its mass. SUVs make the road measurably worse for everyone else because of these negative externalities.
This is one of the reasons I ride bicycles and motorcycles. I take on more personal risk, but the overall effect is a reduction in accident, injury, and mortality rates for road use.
Anecdotally, my former gf managed to flip my Taurus by striking a parked car on the road at 15mph. Police verified that she wasn't moving faster because she practically flipped it in place, rather than ending up a significant distance away. Just a case of hitting at just the right angle and riding it up, I guess.
I was wondering how the SUV rolled, because for that to happen you need a force applied to the car along the Y axis (taking +X as the forward direction), which ultimately produces the torque causing the roll. After watching the video at low speed, here's how we got that force vector:
1. The hatchback hits at a tangent, producing a torque and spinning the SUV by 90 degrees.
2. The SUV collides with the barrier, which produces a force vector normal to the collision surfaces and slightly off the forward direction.
3. The driver now seems to have pressed the brakes really hard, which produces another force vector. But this vector is now well off the original forward direction because the car had been rotating.
So the moral of the story is that if an SUV is rotating and you press the brakes really hard, you will get a roll. This is purely because an SUV has higher clearance (and so a higher center of gravity), so even a small lateral force can produce enough torque to cause a roll. It looks to me like if the driver hadn't pressed the brakes so hard, the rolls could possibly have been avoided or reduced.
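To put rough numbers on the "small force, large roll" claim, here's a back-of-the-envelope sketch (the geometry values are illustrative assumptions, not specs of the vehicles in the video): a sideways-sliding vehicle starts to tip when the lateral force acting at the height of its center of gravity produces a larger moment about the outside tire line than gravity can resist.

    # Quasi-static rollover threshold, illustrative geometry only.
    g = 9.81  # m/s^2

    def rollover_lateral_accel(track_width_m, cg_height_m):
        """Lateral acceleration (m/s^2) at which tip-over about the outside tires begins."""
        return g * (track_width_m / 2) / cg_height_m

    sedan = rollover_lateral_accel(track_width_m=1.55, cg_height_m=0.50)
    suv = rollover_lateral_accel(track_width_m=1.60, cg_height_m=0.75)
    print(f"sedan tips above ~{sedan / g:.2f} g of lateral acceleration")
    print(f"SUV tips above ~{suv / g:.2f} g of lateral acceleration")
    # Roughly 1.5 g for the sedan vs 1.1 g for the SUV: tire friction alone
    # (under ~1 g) rarely tips a sedan, but a tripped, spinning SUV has far
    # less margin.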
I've seen a crash that was very similar. A tiny car hit the rear bumper of a SUV, on the right half, not really even a hard hit. The SUV got turned to the left and hit the slightly raised center of the road (maybe 10 cm high?) and spun and slid on its roof. Everybody was okay in the end.
I was in a very bad accident when a GMC pickup t-boned my 98 Subaru Impreza 2.5RS at ~55mph. My car crumpled and was totaled but took the hit and went for a 4 wheel slide across the road. A nice balanced car takes hits really well. And I'm very glad I wore my seatbelt.
That also depends on the momentum (weight ratio, impact speed and angle, etc.). The lower the weight, the worse off you are during impact - we've seen that too often in NHTSA crash tests.
Flipping depends entirely on geometry. SUVs have tall suspension, carry a lot of weight up high, and have a higher overall height-to-width ratio. Also, their aerodynamics don't push down as much as on lower cars.
I'm feeling rather dumb, but I can't really intuitively understand the physics here. I see a small car hitting a pretty huge one, and yet it spins the huge one several times. I always thought those bigger cars are safer / stronger / more resistant to hits?
Is it purely because it hit it at the corner, so all the force went into the "spinning axis"? (did I say I feel dumb?)
> I always thought those bigger cars are safer / stronger / more resistant to hits?
Physics would have you start with a spherical, uniformly dense car, but cars aren't that. Crumple zones engineered for passenger-safety requirements mean that intuition falls short unless you really dig into it and add things like center of gravity to the model.
Also think about why you believe that - looks can be deceiving, and marketing, especially for rare expensive purchases like cars, very much plays on emotional appeal rather than actual science. ("Ultimate driving machine"? Pft, really?). The original SUVs were designed off truck platforms and did quite poorly in safety tests, and their higher center of gravity doesn't help at all. (Though the one in the video looks like a minivan.) Just like two falling objects will fall at different speeds in the real world due to air resistance, bigger != better with cars.
You can't see crumple zones just by looking at how pretty the car is; vehicle safety is one area where government regulation has made cars safer by requiring all new cars sold in the US to meet a very high safety bar, and that's improved over the years, evident in this video of an intentional crash between a 1962 Cadillac vs a 2002 model. https://youtu.be/O-WYKYrq5FI
The SUV's center of mass is probably higher, making it easier to tip.
Both cars were braking, shifting their centers of mass forward and lower. That means the small car's momentum was concentrated low at the collision point, while the SUV's rear end was already lifted and thus lighter.
The direction of the collision helped transfer the energy into torque. The barrier provided the final push that started the car tumbling.
The SUV had forward momentum, yet it was suddenly facing sideways due to the side-back impact from the smaller vehicle. That means the tires were skidding and "gripping" the car at the base rather than "rolling", while the whole body wanted to go forward. Hence, it toppled and kept rolling because there was nothing to stop it.
The way I see it, the small car just got the SUV to ram the middle barrier. The tumbling momentum resulted from that crash and from the SUV facing perpendicular to its dominant direction of motion.
Take two cereal boxes. Put one flat and the other on its end. Smack the flat one into the tall one. Which one begins to roll? This is severely exaggerated but you get the idea of what's happening between the car and SUV.
A) The main factor is that the SUV happened (just before starting its somersaults) to be positioned orthogonal to its previous direction of motion. In this position all of the SUV's wheels created massive 'resistance' to continuing forward.
B) It happened partially because of the middle barrier. Were it missing, the SUV would probably have moved on an arc without somersaults.
The change of orientation was caused by the small car, but note that up to moment A) the SUV was more or less moving fine.
meaning that a small force applied over a long lever arm can produce a huge torque! You can move a huge rock if you use a long enough metal bar for leverage. This is why Archimedes said,
Give me a lever long enough and a fulcrum on which to place it, and I shall move the world.
The large angular acceleration due to the large torque then follows directly from Euler's equations.
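For reference, the relation being invoked reduces (about a single principal axis; the full Euler equations couple all three) to the lever formula plus Newton's law for rotation:

    \tau = r\,F\sin\theta, \qquad I\,\dot{\omega} = \tau
    \;\Rightarrow\; \dot{\omega} = \frac{r\,F\sin\theta}{I}

So for a fixed moment of inertia I, the same force applied at twice the lever arm r produces twice the angular acceleration.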
The SUV was pushed into the crash barrier and the small car wasn't, so it's a highly unfair comparison.
We did a lot of safety research before buying our current round of cars, and you need to not generalize about cars based on classifications like "SUV". Look at actual data from actual cars on actual roads, e.g.
Note that you can find vehicles like the Honda Pilot that are enormously popular and have in effect ZERO fatalities. What's horribly unsafe to everyone is passenger cars built on truck frames.
> What's horribly unsafe to everyone is passenger cars built on truck frames.
Have any examples? The vast majority of new SUVs (which the one in this incident appears to be) are unibody rather than body-on-frame ("truck frame" style).
> Have any examples? The vast majority of new SUVs [...] are unibody rather than body-on-frame
Apparently there are 18 models of Body-On-Frame SUVs that are being sold in the US in 2016:
Cadillac Escalade
Cadillac Escalade ESV
Chevrolet Suburban
Chevrolet Tahoe
Ford Expedition
GMC Yukon
GMC Yukon XL
Infiniti QX80
Jeep Wrangler
Land Rover LR4 *
Lexus GX460
Lexus LX570
Lincoln Navigator
Mercedes-Benz G-Class
Nissan Armada
Toyota 4Runner
Toyota Land Cruiser
Toyota Sequoia
I'm not sure what you mean by "the car behind didn't show turn signal".
If you're referring to the pink hatchback that ultimately rear-ended the SUV - the pink hatchback seems to be making a lane change to the right after passing. The driver seems to have signaled and looked to the right to clear his/her blindspot at the same moment the SUV ahead slams on the brakes. It really seems like bad luck, that traffic would suddenly stop at the exact moment the driver of the pink hatchback looks to the right to change lanes.
> It really seems like bad luck, that traffic would suddenly stop at the exact moment the driver of the pink hatchback looks to the right to change lanes.
Pet peeve of mine: "Bad luck" is incredibly rare in traffic. This accident occurred because the hatchback was too close to the car in front of it, not because of bad luck.
At 0:00 the pink hatchback is well behind the car it's trying to pass. At 0:02 the brake lights of the SUV illuminate; the pink hatchback is still not quite caught up to the car it's trying to pass. Instead of braking, the hatchback keeps accelerating trying to beat the closing gap. No signaling at any time. Looks much more like bad judgment than bad luck.
Well no -- that wasn't bad luck at all. He simply was way too close for the speed.
Back in the day when I got my driver's license, we were told to keep 3 seconds of distance: one second to react, one second to brake, and one second for safety, if I remember correctly.
Around here they recommend 2s. As part of a science fair project, I measured the following distance of cars on a nearby highway. It's been about 15 years but IIRC, the mode was ~0.5s.
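For a sense of scale, here's a quick conversion of those time gaps into distance (using the ~113 km/h readable on the dashcam overlay mentioned elsewhere in the thread; the exact speed is an assumption):

    # Convert a time gap into a following distance at a given speed.
    def gap_to_metres(speed_kmh, gap_seconds):
        return speed_kmh / 3.6 * gap_seconds

    for gap in (0.5, 2.0, 3.0):
        print(f"{gap:.1f} s gap at 113 km/h = {gap_to_metres(113, gap):.0f} m")
    # 0.5 s ~ 16 m, 2.0 s ~ 63 m, 3.0 s ~ 94 m -- a half-second gap leaves
    # only three or four car lengths at highway speed.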
Watch the video again; the little red car turns on their blinker and does a very slow shift to the right before clipping that stopped SUV. You're right that he wasn't paying attention, but his reaction time is incredibly slow - it looks like that car had just enough time to slam on their brakes or swerve to the right, yet they did neither.
I disagree that the pink car driver was not paying attention - he or she was likely turning to look over their right shoulder before making the lane change. The drift to the right isn't an avoidance maneuver executed lazily, but just a normal lane change. Bad luck that the SUV ahead slammed on the brakes at the exact moment the pink car driver checked his or her blind spot.
If it's any indication, you can read 113 km/h on the dashcam overlay. Besides, the Opel Corsa, at least for the few seconds before the crash, seems to roughly maintain the same distance from the Tesla, so it should be travelling at a very similar speed...
It's odd that the turn signal is only located at the top of the car. I guess the mid-height one was broken?
I think the situation is not entirely unclear: it seems there was a problem ahead of the SUV that made a bunch of cars slow down together. I still don't understand the smaller red car driver's decisions... maybe he didn't know what to do, hesitated, and tried to swerve as a last resort. Or maybe he just wasn't watching. The SUV's brake lights were on for quite a long time...
One of the main causes of accidents is vehicles left standing on the road and other vehicles hitting them, or the people around them, because drivers take for granted that traffic will keep flowing and are distracted or whatever.
You should never leave your children and family in a car in the left lane while you walk off. Never.
I have a neighbor who is paraplegic because of that. His entire family went to the hospital after a vehicle crashed into them from behind following a minor accident.
Your first priority is getting your car and family to a safe place. Proper triage priority number one is avoiding another accident, by removing obstacles and signaling to other cars.
In fairness the guy is saying that in the video. Stop, wait to see what the cars do behind, etc. Presumably he gets out when traffic has stopped behind him.
Exactly, he waits until the situation behind him is safe enough to get out. He also mentions a truck, so I think by the time he gets out, there's a reasonable 'impact' buffer between his family and any moving vehicles (far) behind.
Thank you both. I didn't hear what the driver said, and I didn't really realize the potential danger of the left lane. Somehow I was only worried about people on the right side because that's where the flow of cars could still move.
This is just incorrect. The NHTSA recommends staying in your normal riding position in the vehicle if you need to stop your car on the highway. Yes, you can still get hurt in some circumstances, but your neighbor could be dead instead.
I think he meant you shouldn't stop in the fast lanes to begin with. Clearly if you have stopped and cannot go then stay in the car with all of its safety gear and crumple zones, but don't stop if you don't have to.
At some point there are going to be cars stopped in the road, right? Or is every car on the freeway behind the accident supposed to pull off the road?
They clearly said, in Dutch, to wait and see what the traffic behind is doing.
I still think they should have tried to safely move to the shoulder like the other vehicles, although in their position they can keep their hazards on -- and traffic has already jammed behind them so they'll be visible. The best option would be to move and then go back with road flares, but very few people just carry those in their cars.
In the Netherlands road flares are not allowed. Only police or rescue workers are allowed to use them, and only in certain situations.
Here, when accidents like these occur, our automated road information system directly displays how fast you may drive there (it has sensors to monitor traffic speed). And within minutes a traffic monitoring team will fully close the affected lanes and signal which lane people have to switch to, using arrows pointing toward the adjacent lane.
Not just for car accidents, but this is true for emergencies in general. Always check your surroundings!
For example, if you find a guy lying on the sidewalk, he could have stepped on a downed power line, slipped on ice, or recently been attacked by some punk who might still be nearby. If you don't know why he is unconscious, proceed with caution.
I thought this an interesting video when Elon retweeted it. Basically the car is responding faster than the people do and keeping things from escalating. That said, having been on 101 when the cars in front start dancing like that, I have to say my biggest worry isn't stopping in time, it's having someone behind me not stop.
The car responding faster allows for a less steep braking, which makes a rear-end less likely as the car behind you has more time to react (even assuming it doesn't have auto-pilot).
This is why I always try to brake early and long. Nothing annoys me more than drivers leaving it to the very last second, giving the person behind them as little time to react as possible.
I expect an autopilot could deal with that situation much better than a human as well. It should be able to predict if the car behind won't stop. Certainly better than a human driver could (if the human would even notice the car coming from behind with that happening in front). Then, assuming there is some space open in front, as there was in this case, it could accelerate to avoid the collision from behind. Now I'm curious whether Tesla's autopilot is already programmed to do this.
These were not on Autopilot, but human-operated maneuvers.
Why would Autopilot have a warning for frontal collisions but not for rear collisions? And why would Autopilot be speed-limited and all that if it was able to use launch mode to avoid a rear collision?
I think the consequences of this would be too significant for Tesla to silently implement. Given an unavoidable rear crash, it could cause a multi-car accident and increase the collision's severity relative to braking.
I was discussing the case of an avoidable rear crash. Vehicle approaching from rear, either not beginning to slow by a minimum expected distance, or slowing, but with apparently insufficient space to come to a stop. Space in front, allowing for safe acceleration. Seems logical to move out of the way, and I don't see any reason why it would increase the chances of a multi-car accident.
Sometimes it would misjudge an avoidable crash, though. When an accident includes a vehicle in front, the middle car is oftentimes liable for following too closely. It might seem logical, but that "safe acceleration" could cause a pileup, or push the car into an intersection, for example.
That's why I usually break fast initially and check the rearview; extend the braking distance to allow extra room for the person behind if they are too close.
> That's why I usually break fast initially and check the rearview;
Great habit, but after seeing over 10,000 charts for chiropractic cases (I was working on IT software to categorize them) I learned another trick that many doctors tell their patients (unfortunately after the fact):
Once in an emergency braking situation, before you look in the mirror, press your neck/the back of your head as hard as you can into the headrest cushion. This trick will "unite" your head with the seat, and in a rear-end collision it will push you and the car at the same time, instead of - as I saw in 80% of the chiro cases - the headrest hitting your head at the moment it's thrown hard backward by the G-forces of being rear-ended.
I have to say, after seeing some of those photos I quickly got into the habit of doing that first and checking the mirror second - if I think someone might not stop in time and I still have room up front, I move forward as fast and as close as I safely can.
When the car behind me is driving too close, I leave even more space than usual to the car in front to give myself time to brake slowly if anything happens.
I've actually found a really good solution to tailgaters: put very slight pressure on your brake pedal, enough to get the brake lights on, but not enough to actually cause any substantial deceleration. This will cause the tailgater to also slow down, but more than you because they have no way of telling how hard you are braking. The result is increased distance between you and them.
Do that a few times and even the most persistent tailgater gets it and either changes lanes or stops riding your ass.
When I'm driving, I'm usually always thinking about having escape paths. I get nervous if I don't have a shoulder or a clear lane next to me, unless traffic slowed to a near stop.
On freeways, each lane tends to develop its own speed, and at 4+ lanes the cognitive load starts to overwhelm me, especially when the relative speed difference between neighboring lanes is very high. At this point anyone in a slow lane may decide to try to pull out into the fast lane, or someone from the faster lanes might try to squeeze in to the queue of waiting cars; meanwhile if you try to unilaterally lessen the speed difference, the people behind and around you will get antsy and aggressive, often defeating the point.
These situations are some of the most dangerous that most drivers experience: high speeds, large discrepancies between speeds, and limited escape paths.
I don't own a car but have a license and occasionally drive one. I'm not used to driving on highways with more than two lanes, and when I spent a few months in the US I had panic attacks the first few weeks when driving at the speed limit with no escape path. I had to leave the highway twice to collect myself.
That's a good habit - however, we don't know from this video whether the SUV that got flipped over braked suddenly, which can also cause a collision the rear car couldn't anticipate (e.g. the rear car's driver might have been checking the right lane just before passing on the right and missed that information for a split second right before the collision).
Please don't do this in the snow or in wet conditions. Sharp actions are what will make other cars change direction or magnify the effects of drivers that you have caused to panic. Putting on your hazards and slowing at a deliberate pace is safer for everyone behind you.
Also, sudden hard braking can cause injuries when the seat belt tightens suddenly, but software-based braking will brake more accurately than a human and, as you said, in a timely manner.
I have no idea if it looks behind, or uses that information for anything. Ideally, it would time it such that when it was rear-ended, it was touching the car in front. That would significantly increase the effective mass and decrease the risk of whiplash.
Most Teslas on the road don't have rear-facing radar. I don't think the rear-facing camera is used for anything other than parking. So that leaves ultrasound, which has limited range.
I would think in theory they could use the camera alone to detect an approaching vehicle, and take some remedial action.
When the car is motionless (e.g. immediately after it has come to a stop from the emergency braking), does the computer engage the brakes just firmly enough that the car stays put?
And would this reduce the impact from behind if it were to happen?
I'm curious if you really want the brakes on full when you get hit from behind. It seems the cars would start to crumple in place until firmly together and you overcome friction, getting launched. With the brakes off, you would immediately accelerate, but perhaps with a lesser initial burst.
That's the decision I had to make one day. I saw a car coming to hit me from the rear. I took my foot off the brake and gripped the steering wheel, leaned back and relaxed. It worked out well, but I've always wondered if the brakes make any difference one way or the other.
I think you misread me. I was suggesting the computer not put the brakes on full when stationary, but only enough to keep the car still (as the road may not be flat, etc.).
Well I would hope it would choose getting rear ended, particularly because that is the fault of the driver behind you. If you choose to slam into someone instead of letting someone slam into you, you are now at fault.
You can't control cars other than yours. On the bright side, if the car behind you is also a Tesla with AutoPilot 8.0+ with latest radar sensors, then you should be just fine :)
Yesterday, I was saying that autopiloted trucks would take over the long-haul trucking industry.
This is why.
Driver errors with semis on the highway kill thousands every year. This kind of technology will save lives - not just those of the Tesla drivers (or other autopiloted drivers), but also those of the people in the vehicles they don't hit.
It's not just driver errors. Apparently the crash detection system kicked in during the Berlin ISIS attack and stopped the truck before the attacker could kill more people.
This seems to be a fresh news story but I hope it will be more widely disseminated. Among so much aimless hand-wringing over terrorism, security theater "designed to stop it", and dragnet surveillance and intelligence gathering, here is a clear example of a new technology reducing the impact of a terrorist act and saving lives.
It kinda goes against the click-generating narrative of autonomous driving technology being half-baked gimmicks rushed to market, fatal accidents waiting to happen, and Tesla and its celebrity corporation ilk flying too close to the sun on homemade wings.
The 26% statistic is misleading in this case -- just because the death didn't involve truck driver "error" doesn't mean a different movement pattern for the truck wouldn't have made a difference.
For example, in the original video, if the Tesla autopilot hadn't braked, and the car had been involved in the crash, it's not clear to me that error on the part of the Tesla driver would have been listed as a cause.
You're implying that minimum following distance <= stopping distance. A brief Google search yields a stopping distance of 315 feet for a travel speed of 70 mph.[0] We don't train drivers to leave a football field of space between cars traveling at highway speed. At least where I live, the legal standard for minimum following distance is that a driver "shall not follow another vehicle more closely than is reasonable and prudent".
If Driver #3 can't see Car #1 past Car #2, and Car #2 is traveling at highway speed, I think it's a stretch to say that Driver #3 could be faulted.
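Rough numbers help show why a 2-3 s gap can be enough even though it's much shorter than the full stopping distance: if the car ahead is braking rather than stopping instantly against a wall, both cars shed speed together, and the gap mainly has to absorb your reaction time plus any difference in braking ability. A quick sketch (the reaction time and deceleration below are assumptions):

    # Compare a time-gap following distance with a full stopping distance at 70 mph.
    MPH_TO_FTPS = 5280 / 3600
    v = 70 * MPH_TO_FTPS       # ~103 ft/s
    reaction_s = 1.5           # assumed perception-reaction time
    decel = 0.7 * 32.2         # assumed braking deceleration, ft/s^2 (~0.7 g)

    stopping_distance = v * reaction_s + v ** 2 / (2 * decel)
    two_second_gap = 2 * v
    print(f"full stopping distance: {stopping_distance:.0f} ft")  # ~388 ft
    print(f"2 s following distance: {two_second_gap:.0f} ft")     # ~205 ft
    # The gap is far less than the stopping distance, yet it is enough when the
    # lead car also needs its own braking distance to come to rest.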
Yeah, for sure. I was implying that the other 74% will be avoided by cars with self-driving features, since the parent comment was suggesting that getting driverless trucks was more important.
Autopilot would take over trucking even if it was no safer than humans under the current safety regime applied to human drivers, simply because it could drive nearly 24/7 without degradation, improving asset utilization.
The trucking "industry" is probably pretty okay with losing drivers - its the only class of employees affected and they've had a shortage of qualified drivers for years [1]. Heck there will be a whole new class of higher paid mechanics and technicians maintaining the auto-pilot systems. There will also undoubtedly be a higher bar government mandated maintenance for self driving trucks.
They can still use the owner-operator model if they want to, and have the vehicles owned by a different entity than the freight company.
But in all seriousness, although I understand that people like to out-cynic each other, over-the-road trucks seem to be pretty well maintained. Whether that's the result of safety regulations and enforcement or simple economics (a broken-down truck is not a moneymaking truck) is left as an exercise to the reader, but either way you do not see a lot of barely-making-it or broken-down commercial OTR trucks on the road. That level of poor maintenance is generally the province of private vehicles.
I'd hope a poorly maintained automated truck which crashes would result in extremely heavy fines and lawsuits, to the point where it could seriously threaten to put a trucking company out of business. Make it uneconomical not to maintain the trucks.
I agree. My point being, trucking companies don't have to maintain drivers, they push this onto drivers themselves. So if they simply s/driver/autopilot/, they'll have to start maintaining.
No way. I use semis as a highway safety feature. They can see farther down the road and are much less likely to get in an accident statistically. When visibility is poor, I follow one (with a large follow distance).
Good. Some things to keep in mind: semis can clear obstacles you probably can't - and at least around here their drivers are trained not to brake for some road hazards. Semis are not maneuverable, and if they lose a tire the last thing they want to be doing is applying a now-uneven braking force.
My car, with myself and a friend inside, almost got run over by a truck when the driver didn't even bother to slow down for the yellow light and crossed the red light at high speed. Luckily he didn't hit anyone.
I don't know your situation, but my experience driving big heavy stuff is that sometimes a yellow light does not provide enough warning to stop 55mph * crazy weight, and you sometimes find yourself with a dilemma of entering an intersection just after a light has changed, or a few seconds after the light has changed.
It sounds to me like the driver shouldn't be going 55mph in the first place, if they can't stop before the yellow light turns red.
It's similar to icy roads. If you go the normal speed limit and you slide into another car, it's your fault. The fact that your speed was at or below the limit doesn't release you from responsibility.
It's the responsibility of the traffic engineers to make sure the yellow light is long enough to accommodate reasonable braking time and distance for all of the traffic that can be on the road, travelling at the posted speed limit. They can lower the speed limit, increase the yellow light duration, or put up a sign limiting the max vehicle weight.
It's not reasonable to expect every driver to know detailed information like yellow light times for every intersection and section of road.
In this particular road and crossing there is a sign with flashing lights indicating "PREPARE TO STOP WHEN FLASHING". It was flashing while the crossing light was still green to indicate that it would turn yellow pretty soon, but the driver still didn't slow down.
In my state, the minimum yellow light length is 4 seconds, but it's obviously not uniformly applied. You can get a device to time the yellow lights from the state if you get a ticket, and then use that info to dispute it. Not much help if there is an accident.
It's a big enough problem that there's a group trying to do something about it.
Icy conditions are maybe predictable (ok, it's under 0°C outside), but yellow light times (edit: the time between the light turning yellow and the light turning red) are not.
Yellow lights are predictable in that you know they'll happen, just not when. So if you have a heavy load or other adverse conditions, you should slow down to a speed at which you could stop within the available distance. Software should do an even better job of predicting the weight-to-speed-to-stopping-distance relationship.
You don't know how long the yellow light will last before turning red. That time is variable from light to light and essentially unpredictable to drivers.
But the problem is you don't know how long a yellow light will be. They are all different. State to state. Town to town. Hour to hour. No standardization.
Chicago was caught making their yellow lights purposely short on some intersections to make more money.
Right. I actually wonder how self-driving Semis will handle such a situation.
Given that it's all math at that point, I wouldn't doubt if regulations surrounding minimum yellow-light timing are updated to more accurately represent the stop time of heavy loads. Otherwise it should be easy to prove in court that stopping in time would be impossible given the speed limit & weight restrictions (or lack thereof) in place at a given intersection.
You shouldn't be driving a vehicle through a non-freeway (i.e. a road with stop lights) at a speed where you can't stop in time. In many jurisdictions that is considered reckless driving. If your vehicle is too heavy then you (or the software) needs to slow it down.
I sincerely hope there will be such a technocratic solution to this problem. Recall that there was a trend in the late '00s where cities shortened the yellow light time so they could reap greater fines from red light cameras. Having a definitive length of time that yellow lights 'should' be will be helpful.
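There is in fact a standard kinematic formula for this (the ITE yellow-change-interval recommendation). The sketch below applies it with typical illustrative parameters, so treat the exact values as assumptions:

    # ITE-style yellow change interval: Y = t_r + v / (2a + 2*G*g)
    # t_r: perception-reaction time, v: approach speed, a: comfortable deceleration,
    # G: approach grade (decimal), g: gravitational acceleration.
    def yellow_interval_s(speed_mph, reaction_s=1.0, decel_ftps2=10.0, grade=0.0):
        v = speed_mph * 5280 / 3600  # mph -> ft/s
        return reaction_s + v / (2 * decel_ftps2 + 2 * grade * 32.2)

    for mph in (30, 45, 55):
        print(f"{mph} mph -> ~{yellow_interval_s(mph):.1f} s of yellow")
    # 30 mph -> ~3.2 s, 45 mph -> ~4.3 s, 55 mph -> ~5.0 s. A fixed 4 s minimum
    # is clearly tuned for lower approach speeds, and a heavily loaded truck
    # (lower usable deceleration) needs even more.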
This road is primarily used by heavy trucks. There are signs with flashing lights indicating "PREPARE TO STOP" before the crossing exactly for that reason. That driver sure knew what he was doing.
Some were arguing yesterday that there'd be a strong negative emotional response when a driver-less car causes someone to die. I hope there will be an equal and opposite reaction when people are saved by this technology.
Not least because truck autopilots don't play video games while they drive!
An all-too-common practice in Europe is to drive by ear: the truck driver doesn't watch the road unless the wheels are running over the side lines, which makes a noise.
Here is a video by Bjørn Nyland demonstrating the ability of the Tesla radar to detect the motion of the car in front of the car you are following: https://www.youtube.com/watch?v=cG3Jp5GyPoc
It shows how the radar detects that the lead car brakes first - watch how it gets highlighted on the dashboard. This ability helped the Tesla handle this accident.
Cool video. I guess that while self-driving cars may have inferior analysis to humans for a while, they may outperform them through better sensors, like the radar here.
I think this accident could've been entirely avoided had the small pink hatchback been a Tesla.
My read on what happened here is that the pink car was making a passing maneuver, and just as the pink car driver checked their right-side mirror/blind-spot and hit the right turn signal to get out of the passing lane, the car ahead abruptly stopped. That explains why the pink car drifted so slowly to the right - it wasn't an avoidance maneuver, it was a simple lane change while unaware of the stopped car ahead. Presumably an autopilot in the pink car would've been able to see the stopped traffic ahead and hit the brakes even while the driver was looking to the right to change lanes.
A safer following distance is crucial too, fwiw. It honestly feels like 90% of the vehicles on the highway travel at insanely close distances to each other.
My mom got in an accident due to driving too close to the car in front of her. It caused me to really take to heart my Driver's Ed teacher's words of wisdom: always leave enough space in front of you to be able to brake safely. I wish everyone would do that. It also makes driving in traffic less stop-and-go.
Yeah, I try to do that, but then other cars just merge into the space I leave in front of me and force me to be on their tail. Meanwhile, the guy behind me never leaves enough distance either. Safe driving doesn't mean anything when others drive recklessly.
Safe follow distance isn't an evolutionarily-stable strategy. The number of accidents avoided by people following it is more than offset (at least psychologically in the minds of the drivers) by the perception that people are constantly cutting you off if you adhere to it. Since punishments for failing to adhere to it are few and far between, drivers get sloppy.
It's the sort of thing you could, however, program into a robot and expect the robot to adhere to it.
> It's the sort of thing you could, however, program into a robot and expect the robot to adhere to it.
I've often wondered what people will initially do when being driven around by a robot. How anxious will it make them feel, not to give up control, but to see a robot driving "so slowly". Not going 10 over, letting other drivers in when possible, always slowing to keep a safe distance, not passing aggressively if someone is going a few miles under, etc.
In the far future, I suspect people will be so used to being driven that they won't care how fast they're going. They can browse, work, surf, do whatever they like. But the first wave of drivers being driven... it's going to be interesting to see how people respond to it.
And that's not even accounting for the drivers who are still driving themselves but seeing robots on the road. I bet they'll always want to pass robots ASAP.
I think that you were fooled by the news coverage. It's only on non-divided highways that the new restriction applies, and there was already a restriction to speed limit + 5mph on such roads. On divided highways, you can set any speed up to 90mph.
The freight vehicle or bus is pretty similar to a robot today. They have much stiffer penalties for speeding, much more difficulty changing lanes (including often being banned from passing lanes), and much more difficulty stopping. Experienced truckers and bus drivers do all of these things.
Other people's poor driving shouldn't make you drive worse. In the Drivers Ed classes I've taken, the recommended advice is that if someone is tailgating you, you should leave EVEN MORE space in front of you, so that if you need to hit the brakes you won't have to hit them as hard, and thus won't be as likely to be rear-ended.
That's not a reason to use the handbrake over the brake pedal. Typically the brake pedal is more effective (all wheels, ABS) than the hand brake (sometimes 2 wheels, probably no ABS). The handbrake is just for emergencies and parking.
When I learned to drive in the US (Washington state) I received advice similar to that; however, when I took lessons in the UK I was advised that when you are rear-ended you will have a hard time keeping pressure on the brake pedal, so you should do both.
When people behind don't respect the safety distance (and, for me, 1s is my level of tolerance), I switch on my warning lights. I keep them on for 20km if necessary. I know it's illegal, but if that's what it takes to make others respect my safety distance, I think it's worth it. And it's better than other (bad) solutions like hitting the brakes to test the guy's safety distance...
We should just be allowed to carry police-verified dash cams which issue fines live ;)
> Yeah, I try to do that, but then other cars just merge into the space I leave in front of me and force me to be on their tail.
Which is perfectly fine - it's safer that way. Not wanting to feel like a 'loser' is understandable - but when it's life and limb on the line, I know which side I'd rather be on. On the other side is road rage and stupid games of chicken with your life at stake.
Well, first and foremost, I don't think a safe following distance is "just barely enough for one car to squeeze in front of you". If a car merges in front of you and you're suddenly in danger mode, you were probably already following too closely.
Secondly, even if you are following a safe distance to the car in front and someone merges, causing a now less than safe distance, it is expected that you'll have to move back to a safe distance.
The only way that could be any different is if you somehow always kept 2-3 "safe distances" in front of you, and that way you were never at an unsafe distance if someone merges in front of you.. but that seems a bit of a slippery slope.
Regarding "how is it safe"... it's a safe following distance... i don't understand how you could even not consider it safe. What you seem to be suggesting is that "it's not safe enough, so i never want to be safe", which just seems a bit asinine. Ie, because if someone merges in front you now feel unsafe, you'd rather feel unsafe all the time by never leaving enough room for someone to merge and always driving at an unsafe distance.. that is just crazy, imo.
It likely is slower, you are correct, but safe driving isn't about shaving off 3 minutes from your morning commute. It's about safety, pure and simple.
It's kind of hard to tell with the low camera and no depth perception, but it looked to me like the hatchback did between one and three unsafe things there. An autopilot would presumably just not do these things, thereby avoiding the accident. A good driver would do the same, though.
1) Approaching the SUV too closely (can't be sure of this one, it's possible that the SUV moved into the left lane just prior to the start of the video). Even at the beginning of the video, the hatchback was too close to the SUV.
2) Attempting a lane change while too close to the vehicle in front and the vehicle to the right. Lane changes are the most dangerous maneuvers in highway driving, because they divide your attention and may not be anticipated by other motorists. You want more space than you would normally need both in front and to the side you're merging toward.
3) Accelerating during the lane change. I'm not sure that this is what happened; it's difficult to tell without stereoscopic depth perception. What I think happened is that the driver of the red hatchback accelerated during the lane change, approaching the speed they wanted once they were in the clear lane. This is very commonly done, but it trades away a lot of safety in exchange for a little time. In any significant traffic, you should complete the lane change first, then accelerate to the desired speed for the new lane.
That's not what I got from watching the video. It looks like traffic was slow/stopped in the passing lane. You can see brake lights in front of the pink car, especially in the slowed down clip. The driver of the pink car seemed to be in the process of passing, but didn't pay attention to traffic ahead of them, and thus crashed into the SUV.
Autopilot isn't even necessary. Any car with Forward-Collision Warning With Braking would likely have not gotten in an accident, or at least one not nearly as severe. Fortunately, many major manufacturers have committed to eventually making this technology standard.
This is exactly how Elon described the system. The forward-looking radar now bounces under the car in front, getting back the location of the second car ahead, and responds to sudden braking, etc.
Exactly so. The Tesla Model S dashboard display shows the car in front, and the car in front of that. The little car cartoon goes blue when that car is controlling the speed of the Tesla, and red when that car is causing an emergency stop.
I've seen this happen, thankfully without the crash and rollover, in my car (Model S). My car has stopped when the second car in front decelerated fast.
This kind of forward-looking speed control can also help dissolve compression-wave traffic jams.
Radar's a tough signal processing deal: a soda can on the road can mimic a large vehicle.
Possible next step in systemic forward-looking safety control, and much cheaper than radar: a few bits of data embedded in stoplights could announce the state of the stoplights of the next car forward (and even the one in front of that). Stoplights are all LEDs now, and they're bright enough to be seen by a really cheap camera even through mist on the glass.
> a soda can on the road can mimic a large vehicle.
This might not be a bad thing... in my early twenties I saw what I thought was a paper cup in the road and ignored it. When I hit it, my car felt like it was thrown up in the air - turned out to be someone's starter motor. I ended up with a cracked wheel rim and a flat tire.
Just brilliant technology at play there. I had to rewatch a couple of times to really grasp how early the Tesla recognized the emergency braking scenario.
One thing to note is that most dash cams, mine included, shoot at a fairly wide angle. The Tesla was likely following a good bit closer than it appears from the video alone.
This is also an interesting visual demonstration of why that Armco barrier between the opposing directions of traffic is so terrific. It's quite possible that having that in place saved several lives in this accident. It clearly absorbed some of the impact energy and prevented the SUV from crossing into oncoming traffic.
30 vs 11 actually, one of the breaks is inside a blockquote.
I wouldn't really want 'break' becoming an acceptable alternative, considering how cars braking and them actually breaking are likely to turn up in the same context and be really confusing.
That may very well reflect reality, but you also point out that it might not. Would it be better, therefore, not to post the claim in the first place?
(I'm not trying to accuse you of doing something wrong. This is just a minimal, relatively harmless example of a larger and more concerning phenomenon in this age of information spreading so quickly with so little verification. I suspect we're going to need to take a sharp turn towards self-censoring those little unsupported claims, among other strategies.)
It may well be better not to post the claim, but you also seem to accept that it may be. Would it be better, therefore, not to post your claim in the first place?
(Personally, I see nothing wrong with posting anecdotal evidence as long as it's clearly identified as such.)
I was asking a question, not making a claim. Though I obviously made it clear what I think the answer likely is, I'm genuinely interested in the thoughts others can contribute. I'm not sure what the snarky response contributes, though.
I'm just illustrating that you're doing the same thing you're advocating against, so it's rather self-defeating. If this sort of vague, personal thinking is OK, then your post is pointless. If it's not, then your post is bad. Either way, it doesn't make any sense.
There's a deep difference between a conversation that invites disagreement and those with solid evidence to step in, and a strident claim that will only be addressed and corrected by people willing to be confrontational.
Are you thinking I said your comment is a strident claim that requires confrontation to disagree with? I said the opposite: both your comment and the one you originally replied to are conversational and easily amenable to contrary views.
Fair enough. We're making different judgments. And it occurs to me that there might be another important distinction there: I was bringing up a question of value judgment, whereas the commenter I was responding to was making a claim about the actual frequency of a phenomenon.
What puzzles me here is that the SUV flips over after being hit in the right-rear corner by a relatively small car and bumping into the side barrier. Flipping over in a case like this looks more likely to happen to an SUV than a lower-centered car, and makes me question the safety of SUVs.
SUVs are not as stable as cars with a lower height. Think about the typical race car, which is designed for turning fast, something that requires stability: light, low, and wide. SUVs? Heavy, high, and narrow relative to their height. Electronics can do marvels, but given the same electronics, physics wins.
I remember I could easily outperform a couple of BMW SUVs on a highway descent from a mountain pass. They had passed me on the way up: more HP, and going uphill gravity helps handling. Then I kept taking turns at 130 km/h on the way down, where gravity is detrimental to handling. They were suddenly much slower, especially one of them (a prudent driver, I guess).
I came down I-70 into Denver one time in a rented Ford Expedition. Wasn't going that fast, but when I hit the first curve, the body and frame started to really lean onto the outside suspension. I was scared to hit the brakes and cause it to pitch more, so I just barely touched them and had to let it drift into the next lane a bit. The turn went on forever!!
On the very same road I wrote about, I've driven another car that couldn't take a turn at 90 km/h uphill without drifting wide. It was not an SUV. Suspension setup is very important for safety: a car that can turn fast is a safer car. I tested an Audi TT on ice at low speed (ice tires) and it was like turning on asphalt. Other, more normal cars, not so much.
Hitting the brakes hard is typically the first instinct, but is rarely the correct course of action. Ice is also something that afflicts drivers in Colorado.
It's the standard speed limit on highways in Italy. It can be lower when the road or its conditions suggest prudence.
It can also be 20 km/h faster on wide straight roads, but I have yet to see that. I guess nobody is risking the bad press after the first accident at 150 km/h.
The Coquihalla highway (BC-5) is notorious for its steep hills, and the speed limit is 120 km/h in ordinary conditions. I could probably find a lot of highways in BC that are similar.
This is a commonly known problem of SUVs, their center of gravity is a lot higher so they're a lot easier to flip over. That's why I'm puzzled by people who buy urban SUVs, that is, jeep-like cars that they never intend to use off-road.
1: Station wagons that seat more than 5 people basically don't exist anymore (last I checked, only Mercedes-Benz offered one in the US).
2: Minivans have an image problem (still associated with the "soccer mom" stereotype for Gen X; millennials appear to be less averse to them). In addition, most minivans are elevated so that they can have a flat floor, so they similarly have an elevated center of gravity.
Yeah, if I was willing to drop $90k on a car that would have been an option. The E-class station wagon is cheaper, but still more than we were looking at. The Mazda 5 was an amazing car, but is now discontinued in the US, meaning there are now no compact cars that can seat 6+.
They flip over more easily, but they are heavier, which is safer in crashes. In a head-to-head crash between a car and an SUV, the SUV is significantly safer. I believe SUVs have lower fatalities per mile driven.
> I believe SUVs have lower fatalities per mile driven.
I would be interested to know how this compares with other vehicles if 'fatalities per mile driven' included fatalities that involved non-SUV occupants.
Height is the big advantage of SUVs and minivans. Kids stay in car seats much longer now and the seats can be quite bulky. It's a lot easier to load them into something with some height.
Is the assumption here (flipping over = very bad) correct? Is it possible that [edit: the force] can dissipate more easily if it flips (provided the car is built correctly to withstand the force of a flip)?
NB: I know nothing about this topic, genuinely curious to hear from someone who does.
I think you're on to something. The people in the small car were almost certainly harmed a lot more than those in the SUV. Just look at how hard it slams into the SUV. Provided the seat belts are able to hold the passengers in the SUV in place, I think they survived without much harm. I'm more worried about the passengers in the small car.
I'm genuinely curious about this too. In my uninformed opinion, for a car that's heading toward an obstacle (e.g. car, truck, pillar, wall, lamp post) there are a few possibilities:
1) heading straight while braking: probably the best case, as you have the collapsible front end between you and the obstacle, and you can reduce the speed, even if only by a little;
2) spinning without control and crashing with the back or the sides: worse than 1), but not so bad as there are side airbags and the headrest. The car is at least somewhat protected on all sides, plus there's some tire friction for reducing the speed;
3) rolling over: less friction as the car is rolling, plus the car may hit the obstacle with its roof or floor. Both are certainly more than a layer of tin, but not as well protected. Plus, there are objects flying around the car and limbs in an unsafe position when the crash happens.
Modern cars are incredibly good at absorbing shock from the front and rear. They are much worse in any other direction, whether lateral or vertical. The flipping means everybody in the car took a huge shock the first time it fell, probably worse than anything the people in the other car suffered.
Flipping also does not help the car stop faster (in case there's something dangerous ahead). The fastest way to brake is by using the wheels, just as the car is designed to do.
My understanding is that a lot of recent year car models at least in Europe come with a safety feature exactly like this. Here is the video promoting Skoda's "Front Assistant" feature: https://www.youtube.com/watch?v=ounFkvTuobY
If Skoda has it, I am sure all other models in the VW Group's lineup (VW, Audi, SEAT, etc.) should have it.
That's not the same. The Skoda video only shows detection of the car in front of you and responding to that. In this case the Tesla also bounces the radar under that vehicle and responds to the motion of the car in front of the car in front of you.
If you watch the video again carefully, the Tesla alerts before the car in front brakes, because the car in front of that is braking.
You are right, this video only hints at it at the end, but if you read the specs closely, Skoda seems to promise exactly that - automatic braking to avoid a collision. Here is a blurb on their Front Assistant feature (http://www.euroncap.com/en/ratings-rewards/euro-ncap-advance...)
"Skoda Front Assistant is a system designed to help avoid or to mitigate accidents into the rear of preceding traffic. A long-range radar, positioned at the front of the car, can detect vehicles up to 80m ahead which the car is likely to hit unless action is taken."
Keep in mind, Skoda is not an upscale brand, they are on the budget end of things. The VW and Audi models might/should have better systems.
The new Peugeot 308 also offers a 'Emergency Collision Autonomous Braking System' (http://www.peugeot.co.uk/showroom/308/5-door/p=safety/) but on closer look this is actually 'an automatic brake application to reduce speed by a maximum of 12mph to reduce the severity of impact'. Much less useful, but at least moving in the right direction.
Even a 12 mph reduction can greatly reduce the severity of a crash. Every foot per second makes a difference.
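To put numbers on that: kinetic energy scales with the square of speed, so even a modest speed reduction before impact removes a disproportionate share of the crash energy (the speeds below are just examples):

    # Kinetic energy scales with v^2, so shaving 12 mph off the impact speed matters a lot.
    def remaining_energy_fraction(impact_mph, original_mph):
        return (impact_mph / original_mph) ** 2

    for before in (30, 40, 50):
        after = before - 12
        print(f"{before} -> {after} mph: {remaining_energy_fraction(after, before):.0%} of the original crash energy")
    # 30 -> 18 mph: 36%, 40 -> 28 mph: 49%, 50 -> 38 mph: 58%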
One problem with suddenly braking hard to avoid a collision in front is then you get rear ended (happened to me). Sometimes the best approach is to swerve onto the shoulder instead (also happened to me, and I watched the guy behind me plow into the guy in front).
Yeah, the Tesla can see where the driver cannot, because the leading vehicle is obstructing the view (it doesn't brake, so you don't even get alerted by its brake lights...).
I'm sorry but is everyone here sucked into the "reality-distortion field" that Tesla Auto Pilot seems to be generating? Nothing happens in this video that a competent driver wouldn't have done.
Detailed Explanation:
In the UK, part of the driving theory test is a "Hazard Perception"[1] exercise that tests candidates' awareness of hazards around them by playing short video clips and getting candidates to click when they first spot a hazard they would need to respond to.
When watching the linked video, I 'click' at 0:04 when I see the multiple brake lights through the car directly in front. This is coincidentally when the Tesla responds with its audible warning.
The factors that led to the Tesla not being involved in an accident in this video were not related to Auto Pilot but down to a competent driver: 1) maintaining an appropriate braking distance from the car ahead, to i. be able to stop in time without being tail-ended due to hard braking, and ii. have 'thinking distance' to allow for slowed reaction time; 2) watching the road ahead and noticing the hard braking of vehicles ahead through the car directly in front. There is NO AUTO PILOT MAGIC IN THIS VIDEO.
I do not dispute that in other circumstances, and perhaps in other videos, Tesla Auto Pilot HAS prevented an accident that a human would not have. This video is NOT such an example.
Your comment describes exactly why this is so amazing - the autopilot executes the same as a competent driver! It's magic because it's the first time in human history that we've had such skills available in a car autopilot.
It's like saying cruise-control is worthless because any competent driver can maintain their speed - you're missing the point.
We're achieving parity between autopilots and human drivers, except the autopilot will never be distracted or tired, and always operates with the skill of a competent driver (and many drivers are not competent).
> Your comment describes exactly why this is so amazing - the autopilot executes the same as a competent driver!
I can agree with this. Too bad it isn't the default sentiment instead of over-the-top optimism or cynicism.
I don't think anyone here is surprised that people get into stupid, easily avoidable car accidents all the time. The argument seems to be about where the technology currently sits. In this video the Tesla braked and avoided rear-ending the colliding cars; so did the vehicles with no autopilot in the right lane. This is "impressive" to some people, but it's also the bare-minimum level of acceptability for self-driving vehicles.
> it's the first time in human history that we've had such skills available in a car autopilot
I doubt this assertion is true: lots of cars with run-of-the-mill adaptive cruise control can keep a safe following distance and detect when the car ahead is braking. Roof-mounted LIDAR can see several cars ahead (and behind), and it would be negligent not to apply the same collision-avoidance logic to car n+1.
I think we'd all agree this isn't magic; the point is that it works in spite of the driver. A "competent driver" is going to be especially alert during an exam, but over the course of countless errands and commutes, their guard could be down, their reflexes may suffer due to lack of sleep, etc.
Autopilot doesn't get tired, and never lets its guard down. That's what matters.
My intention was in no way to underplay or diminish the achievement of Auto Pilot, but simply to make the point that in this video, of this event, Auto Pilot adds nothing. In this case the driver was "alert and competent". This video does NOT showcase or demonstrate Auto Pilot preventing an accident, or reacting faster or better than a human.
That is my point. People seem to be extolling how Auto Pilot saved that day in this video. It didn't.
How do you know? Driver could have been looking back at his kids for all we know when the car beeped and he then paid attention.
And yes, I would say that the autopilot reacted faster than a competent human driver not expecting a crash while watching a 30 second clip. Probably by time measured in seconds.
I do agree a safe following distance made the auto-braking pretty irrelevant in this specific case, as there was plenty of time for a human to react. But it's still impressive to see things working, and a bit of positive public PR hype over this tech can't be an entirely bad thing.
I agree that some of the headlines around this video are a bit overstated, but it is nevertheless a deserving credit to Tesla's autopilot system for handling the incident so well. I also agree that an "alert and competent" human driver would handle the situation the same way, but I don't think it's safe to assume that all or even most drivers are usually "alert and competent".
Imagine if you could be confident that the cars sharing the road with you were all "alert and competent" at all times because they were using a solid autopilot system.
You have highlighted a fundamental misunderstanding people have about Auto Pilot: in its current form it is an SAE Level 2[1] driver assistance function.
It is not supposed to be functioning in spite of the driver. The driver is in charge and should be responding to events. It is dangerous to behave otherwise - examples include pretty much all Tesla Auto Pilot attributed fatalities.
[1] Level 2: The driver is obliged to detect objects and events and respond if the automated system fails to respond properly. The automated system executes accelerating, braking, and steering. The automated system can deactivate immediately upon takeover by the driver.
Well, yes. The amazing part is that the Autopilot does things that a competent driver would do.
Very often, humans do not do those things. And despite the carnage that results, over more than a century of experience with automobiles, we have not managed to make people significantly better drivers, nor come up with a licensing scheme that does more than weed out the most obviously medically-disqualified or incompetent people. At least here in the US, you take your drivers test (which is a ridiculous, comic farce) exactly once in your life, after which you can basically drive until you are blind, comatose, or--very rarely--you kill someone. But rarely the last one, not because people don't kill each other, but because it rarely results in a license suspension. We've just tacitly accepted that people are bad at driving.
The UK driving test is significantly more rigorous than the US, and it shows: UK drivers are amongst the safest in the world (2nd lowest road fatalities per 100k people; top five per 100k vehicles). The US isn't even in the top fifty (and for calibration to the American reader: Canada is in the top 30, but hey, at least the US beats out Mexico). At the end of the UK practical examination process I would expect all new drivers to be "competent" by this standard.
As a Brit, though it would be nice to think we drive better, I think most of the effect is down to road engineering. Roundabouts have lower fatalities than traffic lights for example.
> Nothing happens in this video that a competent driver wouldn't have done.
If you go by the street definition of driving competence, then you get exactly the accidents like the one on this video.
Safe driving distance. The primary thing most "competent" drivers don't give a fuck about. Second being speed limits.
And yes, I understand many people drive that way not because they want to, but because other drivers aren't leaving them a choice. Ironically, this area is already heavily regulated. What I believe is needed is much, much stronger enforcement of those regulations.
I was very confused about this video until I watched it with the sound on. The Tesla beeps at the driver clearly before the accident has occurred, but doesn't appear to decelerate until afterwards.
I disagree. Watch the horizon, e.g. by putting your mouse on it. During the beeping, well before the collision, the horizon moves upward suddenly, indicating that the nose of the Tesla is dipping due to sudden deceleration or braking.
Most automatic systems brake proportionally to danger. On their own (without driver assistance) the goal is usually to bring an accident down to a "safe" speed like 30mph rather than prevent the crash completely.
I guess the point is to strike a balance between trust in the automatic systems and trust in the driver to make the right decisions.
Another way to look at it is that the system is designed to protect you, not the car. Use automatic braking to bring the crash down to something the car can protect you from; the rest is up to you.
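A minimal sketch of that mitigation idea, using basic constant-deceleration kinematics; the ~110 km/h closing speed, the gaps, and the 8 m/s² braking figure are all assumptions for illustration, not anything from Tesla:

```python
def impact_speed(closing_speed_ms: float, gap_m: float, decel_ms2: float) -> float:
    """Speed (m/s) left when the gap closes, assuming constant braking;
    returns 0.0 if the car stops before reaching the obstacle."""
    # v^2 = v0^2 - 2*a*d  (constant-deceleration kinematics)
    v_sq = closing_speed_ms ** 2 - 2.0 * decel_ms2 * gap_m
    return max(v_sq, 0.0) ** 0.5

v0 = 110 / 3.6                      # ~110 km/h closing speed, in m/s (assumed)
for gap in (30.0, 50.0, 80.0):      # hypothetical gaps in metres
    v_impact = impact_speed(v0, gap, decel_ms2=8.0) * 3.6
    print(f"gap {gap:>4.0f} m -> impact at {v_impact:5.1f} km/h")
```

Even where the crash isn't avoided outright, the impact speed (and with it the energy, per the earlier comment) drops sharply.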
After watching the footage several times, it seems to me like the car is braking either just before or immediately at the point of impact, certainly way faster than a human driver could.
What I'm curious about is whether the autopilot predicted the crash due to the relative speed of the two cars ahead of it, or if it was reacting solely to the change in speed of the car that braked.
If you look through the red hatchback you can clearly see the brake lights of the 4x4 go on (visible from 0:02, though not yet clear) as it brakes hard to a stop. The hatchback barely seems to react.
I imagine it'd get pretty annoying (and dangerous) if it slammed on the brakes every time two cars got close. It knew what was happening and was ready to brake when the accident actually occurred.
That's not what the article says is happening. The Tesla is alerting because the radar sees the (black) car in front of the (red) car in front (of the Tesla) is decelerating, even if the red car isn't; it's alerting because of that. It doesn't "know" there'll be an accident, it just predicts that it will have to slow.
Relying on error-prone, distracted, slow humans to guide massive killing machines is playing Russian roulette. AI driving will no doubt be eventually better than the best human driver within a decade, in nearly all instances.
The glorification of tech/denigration of humans juxtaposition is getting old fast. This attitude may seem benign, but it's insidious, carries a hint of elitism, and spills over into labor, class, and other aspects of human life.
Why can't we appreciate tech advances without devaluing people?
Because certain types of technical advancement require an element of devaluing people.
"Calculator" used to be a job performed exclusively by humans. Moving to mechanical & digital calculators didn't only require gains in efficiency, but also a recognition that this was a job humans were simply worse at than machines, so it was irresponsible to task a human with it.
Similarly self-driving cars can't simply be a nice thing to have, eventually we have to have the discussion that it's irresponsible to have human driven vehicles on the road.
Promoting the tech and respecting people are not mutually exclusive.
I'm not arguing that we should endanger people and my comment is not about this one instance. It's about a general attitude of devaluing people. Where do you think that road leads?
When Uber vehicles were found to have run red lights a couple of weeks ago, Uber's response was that these were human operated and served as an example of why they must rid the roads of error-prone human drivers as quickly as possible. This, as if humans were a scourge that must be eliminated. There is a strange rising anger with people for being human.
And, once we've ridded the roads, labor force, economy, etc. of all the pesky humans, then what? Who owns the tech? Who has the power? Will they be as benevolent as the machines we've learned to worship? What is the value of a human life by then?
Quote whatever stats you like, but normalizing this attitude of fallible, dispensable humans is dangerous in many ways.
What do you even mean by "devaluing"? Of course a more error prone, dangerous and expensive driver has less value.
Is pointing this out somehow wrong? I think not pointing it out devalues human life.
The road of getting rid of fallible humans from boring automatable jobs leads to Utopia, and I reject your notion that this is somehow inherently different from e.g. automation in agriculture or computing.
A nice sci-fi trope is that the unpredictability of humanity, and when/how it will "fail" when something better would "succeed", results in the preservation of humanity itself. It's nice to consider in the wake of the elitism.
Well, I think there's something to the notion that there is value in humanity, in spite of--or perhaps even because of--its flaws and non-binary approach to the Universe. And, of course, there's the whole "I, Robot" deal, wherein the robot elects to save the life of the adult vs. the child based purely on a sterile odds-of-survival calculation.
That's not really what I had in mind with my comments here, but it is an interesting point.
And maybe it is another reason to consider prudence as we rush headlong into a world increasingly reliant upon technology.
> Well, I think there's something to the notion that there is value in humanity, in spite of--or perhaps even because of--its flaws and non-binary approach to the Universe.
We are approaching a very scary time when this conjecture will be subject to cold empirical evaluation.
I think we'll pass the test. But wishing that humans were good at things we aren't doesn't help anybody.
> I've been clear about my meaning. Can't simplify it any further.
You really haven't. This entire thread is you taking disposablezero's "Relying on error-prone, distracted, slow humans to guide massive killing machines is playing Russian roulette" comment out of context as some moral statement about human beings.
There's a bunch of things humans are worse at than machines, you're the only one asserting that this somehow "devalues" people without any further clarification.
I'm also worse at shoving nails into wood than a hammer is. How does that devalue me?
> [Because certain types of technical advancement requires an element of devaluing people]. That's your quote. What did you mean?
The set of things machines are better at than humans is only increasing over time. Today nobody disputes that a combine harvester is better for the task than a team of humans, but there's a tendency to overvalue human work as machines are getting better.
I've noticed that changing with "handmade" within my lifetime, and we're probably about to see it change with "human-driven vehicle" within the next decade.
> There's a difference between preferring tech to do the driving and saying the human has "less value". You don't see that?
No, I really don't. You haven't defined what you're talking about when you're saying "value" or "devalue", but I will. I'm talking about value in the economic sense. I.e. if you've made a combine harvester instead of needing 50 people to do manual labor for 16 hours a day you've optimized your economy and added net value to our civilization.
I really don't buy this argument that saying a human has less value is a negative. The logical conclusion to that argument is saying that we should unwind civilization and technological advancements until we're all hunter-gatherers again. Because surely the existence of the food industry devalues hunters everywhere.
> In any case, a hostile regard for human fallibility that approaches promotion of the dispensability of humans isn't a good thing.
No, it's the exact opposite. It's saying that we shouldn't waste a human being's potential on some menial task like manually picking corn in a field or driving a car.
To someone living in the future we're hopefully heading towards, the notion of someone needing to make driving a vehicle their full-time job will be as ridiculous as the notion of a human being needing to manually carry freight is today, as opposed to using a truck.
Well, I guess I haven't. Perhaps that's because there are really a myriad of contexts for the word "value" here. For instance, you've chosen to focus mostly on "economic value".
And, sure, I have an argument with that too. For instance, you say this:
>It's saying that we shouldn't waste a human being's potential on some menial task like manually picking corn in a field or driving a car.
Sounds great, but nothing in recent history bears out a desire for this as a driving force or even a significant consideration. In fact, the reality is so far off that this argument is intellectually dishonest. That is, we're not handing over millions of people to a life filled with purpose, now that they are released from the "menial tasks" that were once destroying their potential. Instead, we're really just leaving them rudderless (and penniless) in a society where our value as humans is significantly defined by our work/economic-output.
Further, the economic benefit of this automation accrues to a relative few at the expense of those "newly freed" individuals. It is dramatically redistributing power and wealth and threatening self-determination as well as the democratic model. In short, it tacitly promotes a very real form of oppression.
So, piling onto that with a further denigration of the value of those being disenfranchised is unhelpful to maintaining a healthy society--to say the least.
>The logical conclusion to that argument is saying that we should unwind civilization and technological advancements until we're all hunter-gatherers again.
You have to know that's classic reductio ad absurdum. I've never advocated that we don't leverage tech. But, if you really want to carry an argument to a logical conclusion, consider the very real eventuality that all jobs will be displaced by technology, including "knowledge-workers". That's the difference between the current reality and the relatively low-tech technological displacements of the past to which you keep alluding: here, even skilled labor is at-risk.
This is happening on a continuum. So, I'm suggesting that it is becoming increasingly more important to define our relationship to technology in healthy ways.
Instead, what I'm noticing overall is that there is a certain hostility towards humans (devaluation) when championing tech these days. Whether it's in the labor force, with autonomous vehicles, or otherwise. Until more recently, we've known about and relied upon various tech to keep us safer and perform other critical functions. But, they played those roles as enhancements to humans. Increasingly, however, we speak overtly about replacing humans and we juxtapose the value (any value) of a human with that of technology. It's a dangerous framing.
But, it seems we just disagree here. You think there's nothing wrong with that. I do.
People tend to hate on anything which threatens their job/identity, even if it rationalizes not addressing preventable causes of millions of injuries and deaths. That is the definition of an SJW.
That's actually not the definition of SJW. You may want to look up the history of the term and consider what your eagerness to bash people over the head with it says about you.
This is something that kills tens of thousands of people per year in the US alone- already greatly reduced by technological advances, and still falling. Excuse me if I don't shed a tear for the feelings that might be hurt among human drivers who consistently and measurably overestimate their abilities.
>Excuse me if I don't shed a tear for the feelings that might be hurt among human drivers who consistently and measurably overestimate their abilities.
Indeed. If only we could find them all and kill them before they strike again.
My point was in how we frame these things--the human relationship to technology. Specifically, it's about how we can raise the important point that humans can be made safer by technology, without it being a value statement about the human race.
I don't know how one can relate how good humans are at a task with how good a state-of-the-art machine is at a task without making some sort of value statement about the human race.
Either way there is a huge difference between "I'm fine with hurting people's feelings" and "I'm fine with killing people"
Every death in my family that I'm capable of remembering was auto-related and the driver was at fault. All of them were easily preventable if the driver was not distracted or wasn't straight up breaking the law. That is 6 family members I would still have if not for complete human incompetence.
The romanticism of "human control" is not an idea I am willing to support. Humans are terrible at most things they do. We get tired. We get distracted. We become complacent. We bend the rules. We forget steps in the safety protocol. We make incorrect decisions. If there are only 99 ways to do something wrong a human will invent the 100th way of doing it wrong.
If mechanical failures and software errors can be proven to be a safer risk than human error - then humans should be exchanged for robots. Jobs and hobbies be damned if there are lives at stake.
I'm a lot less inclined to take that attitude in a space where there were 35,000 deaths due to vehicular accidents and the most common causes of accidents are distracted driving, drunk driving, speeding, and reckless driving. That's basically "humans," "humans," "humans," and "humans;" I suppose we could spin that by pointing out all the millions of miles per year people drive successfully without killing themselves or each other, but that doesn't change the fact that removing humans from the equation should push the death numbers down further.
Driving is at that special saddle-point of attention-thirsty and tedious that people can be good at, but not perfect; the models of how attention wanes in a tedious environment are pretty solid. It's precisely the sort of thing we should be trying to automate if we want to drive the fatality rate below the current 11 per 100,000 population number in the US.
Humans weren't built to drive. We can't support our attention during an entire trip, we can't deal with several variables at the same time, and we have all those emotional heuristics that are completely counterproductive when driving.
Joking? People can't fly an F-16 or most rockets, but computers can. People get drunk and plow into traffic in the other direction, or run over a crowd of people because they're nuts. That's the truth, get used to it.
I am sure that Tesla didn't publish this video; an individual person did :-) (from what I have read so far). This is indeed good PR for self-driving cars.
>why does lately every Tesla-related post have a token "PR stunt" comment?
Why does every single bit of Tesla "news" make the front page? I see 50 videos a day on Reddit of people miraculously avoiding accidents, never once on HackerNews. This is kind of cool, but I'm going to remain skeptical that it "predicted" the accident. There were other cars that also stopped in time. Did we expect it to bulldoze into the back of the cars?
I mean, do people actually believe that Tesla has developed self-driving AI that is actually "predicting" accidents? Get enough cars on the road using the technology and we'll see amazing things that are hard to explain.
> I mean, do people actually believe that Tesla has developed self-driving AI that is actually "predicting" accidents?
I don't understand what would be difficult to believe about this. It doesn't take super-sophisticated math to predict car X and car Y are too close and moving too fast to avoid an accident. It's mostly a matter of having the sensors to build an accurate map, and it seems like Tesla has that.
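Something like the following toy check captures the idea. This is not Tesla's actual logic, just the kind of gap/closing-speed test the comment describes, with made-up radar readings and an assumed maximum deceleration:

```python
def collision_likely(gap_m: float, closing_speed_ms: float,
                     max_decel_ms2: float = 8.0) -> bool:
    """True if the trailing car cannot shed its closing speed within the gap."""
    if closing_speed_ms <= 0:
        return False                 # not closing at all
    stopping_distance = closing_speed_ms ** 2 / (2.0 * max_decel_ms2)
    return stopping_distance > gap_m

# Hypothetical readings: red car 15 m behind the SUV, closing at 20 m/s.
print(collision_likely(gap_m=15.0, closing_speed_ms=20.0))   # True -> warn early
```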
>I don't understand what would be difficult to believe about this.
It seems like it could instantly react to the car in front drastically changing speeds (read: colliding with the car in front of it), and brake immediately. But predicting seems like a different issue, with far more false positives (Tesla hitting the brakes when it anticipates a potential collision). Is that happening?
You could be right, but this video certainly hasn't convinced me.
Watch the video again: the Tesla stopped well before the other cars did, probably because its speed wasn't as high. The car at the extreme right stopped _after_ the crashing car had spun twice; the Tesla stopped almost instantly.
Does a Tesla continuously record all conversations within the vehicle? The sound recording is clearly from before the warning beep of the impending crash.
It's nice to know that Tesla can do that, but sorry, I just don't see anything "magical" in this video -- speed wasn't that high (~110 km/h), the gap was quite wide and the two cars on the right lane also managed to pull over and stop in time too. I have personally avoided much closer encounters driving both car and a motorcycle. In such situations, you should pray that there's no truck behind you.
"speed wasn't that high (~110 km/h)" -- That is as fast as you can go on major roadways here in Brazil> I would say it is pretty fast...
" In such situations, you should pray that there's no truck behind you." -- So true. Wouldn't the autopilot account for that before braking that fast? It would cause a much bigger problem if not...
Where is that? Where I live there is a 100km/h limit, and the local police sometimes post on social media when they ticket someone going 160km/h+.
That said, 100km/h made sense in 1970, but now with the safety of modern cars, there's no reason that 130km/h shouldn't be the standard everywhere modern cars are prevalent.
The right-lane cars can see the braking SUV; you can't from this POV. This could be the moment you reach for a candy, check something on the passenger seat, or look in the rear-view mirror longer than usual, and 1-2 seconds is all it takes at 110 km/h.
Isn't that because the Tesla can see the brake lights on the car two cars ahead? As soon as the red car swerves slightly to the right and the brake lights become visible, the Tesla brakes. Moreover, the cars ahead of it are suddenly slowing down.
Also, I think the driver deserves some credit. Note how he keeps calm and rational and tells the other adult occupant in the car to not rush out into the road and cause another accident. His first thought is to make sure he understands what is going on behind him. Cool head, good job.
This is the bit driving schools rarely (if ever?) teach and which many drivers never get a chance to think about until they find themselves in a stressful situation where they can end up making dangerous decisions. (And then, of course, most people do not think).
Road and Track says no.[1] "Except, if you watch the video, you can clearly see cars stopping ahead of the driver who isn't paying attention starting at the two second mark." What you're seeing here is that radars are much better at range rate than vision. Range rate data from a radar is as good at long range as it is at short range.
Fortunately, they were right behind the accident. If the wreckage was stopped and partially blocking the left lane, Tesla's autopilot probably would have plowed into it. Like this fatal crash in China[2] and this non-fatal crash in Germany.[3]
I don't understand why people continue to believe that SUVs are safer just because they are huge. Seems like a very common myth.
Also Tesla's radar which can detect vehicles 2 cars in front of you by bouncing signals off the car in front is pretty neat tech. Who else is doing this? I'd assume most of the luxury car makers with auto pilots have this too?
I think the most exciting thing (and we're way behind schedule on this, a lot of this could have easily been done literally 30 years ago, no problem at all) is caravan functionality, where a group of cars moves together. If you imagine 20 cars at a standstill for some reason, if they begin to move forward slowly but at the same time (less than 50 milliseconds apart) because they're coordinated, then they can get up to speed almost as fast as if it were only one car (let's say they have to speed up a bit more slowly for a safety allowance). But if you factor in human response time, you get a phantom traffic jam. There is no reason for a traffic jam because there is clear road all the way ahead of the group - yet a traffic jam slowly makes its way back anyway. Like this: https://www.youtube.com/watch?v=goVjVVaLe10&t=1m55s
coordinating this stuff doesn't require computer vision or truly self-driving cars or anything like it and could have been done 30 years ago (1986) over an AM radio standard and some kind of coordination between cars - just nobody sat down and designed that standard.
today wireless coordination technology offers dozens of choices (in the gigahertz domain) and caravans could be assembled all but "trivially", cars can know where they are via gps with no problems at all. That would be an exciting move forward. Of course, it only works when everyone is doing it, but if just a few traffic jams instantly disappear (the ones where are the cars happen to have caravan functionality and coordinate) it would improve things for everyone: the traffic jam will disappear whenever it bunches up in a way that happens to consist of coordinated cars. (e.g. a 3-car phantom jam slowly moving backward will disappear whenever it crosses 3 caravan-enabled cars - not every car has to have this functionality). But I haven't read anything about Tesla coordinating even with other Teslas, let alone some standard of intercar coordination. A shame - this stuff is way easier than the self-driving stuff Tesla and others are doing. It's very low-hanging fruit.
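A toy model of the start-up latency point made above; the 20-car queue, the 1.5 s per-driver start delay, and the 50 ms coordinated delay are all assumed numbers for illustration:

```python
def time_until_last_car_moves(n_cars: int, start_delay_s: float) -> float:
    """Seconds until the last car in a stopped queue starts moving, assuming
    each car only starts start_delay_s after the car directly ahead does."""
    return (n_cars - 1) * start_delay_s

cars = 20  # assumed queue length
print(f"human drivers (1.5 s each):  {time_until_last_car_moves(cars, 1.5):.1f} s")
print(f"coordinated caravan (50 ms): {time_until_last_car_moves(cars, 0.05):.2f} s")
```

That gap between tens of seconds and under a second is exactly the latency that lets a phantom jam persist and creep backwards.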
I think you're trivializing even the required effort to coordinate two cars, let alone multiple cars.
How do you direct the signal to only cars in your lane? In your direction of travel? How much efficiency is lost if all of the drivers don't give themselves enough space between the car in front of them and their own vehicle? Now they have to wait to move forward. How do you get all drivers to roll forward together at the same rate of acceleration?
You could have the last few questions handled by a computer, but you'd still have to overcome a lot of other hurdles (namely location awareness, peer-to-peer communication, infrastructure changes to road signals if you don't have self-driving cars which can detect light states, etc) before you could even accomplish something this simple at scale.
it's absolutely trivial compared to self-driving cars which have to perceive their environment, and I don't need to give details. I'm not an automotive engineer.
it's trivial. To show you this is trivial without pretending to be an automotive engineer I'll use the example of a screaming auctioneer who sits on top of your steering wheel. A conversation with him might go: "HEY I NOTICE WE'RE STUCK IN A JAM IS IT OKAY IF I SIT ON YOUR STEERING WHEEL AND OBSTRUCT YOUR VIEW AND THEN I'LL START PRESSING DOWN ON THE GAS TOGETHER WITH ALL THE OTHER CARS IN THIS TRAFFIC JAM SO WE ALL START MOVING AT ONCE OKAY OKAY OKAY JUST TELL ME IF YOU'VE HAD ENOUGH READY OH LOOK WE HAVE CONTACT BETWEEN THE BUMPER TRANSMISSION LED AND RECEIVER LED AND THE CAR AHEAD OF US'S REAR BUMPER TRANSMISSION LED AND RECEIVER LED OH THIS IS AWESOME THE CAR IN FRONT OF US ISN'T THE ONLY ONE WITH THIS EQUIPMENT THE CAR IN FRONT OF IT ALSO HAS THIS AND LOOK THE CAR IN FRONT OF THAT ONE ALSO HAS IT SO WE'RE ALL HOOKED UP AND WHEN THE FRONT CAR STARTS MOVING WE'LL START GOING AHEAD TOO ALL RIGHT JUST PRESS THE BRAKE IF YOU WANT OUT OF THIS BECAUSE WHEN THE FRONT CAR STARTS MOVING I'LL PRESS THE GAS AND MOVE TOO! HEY IT'S MOVING!!! I'M PRESSING THE GAS!!! PRESS THE BRAKE IF YOU WANT THIS TO STOP!!!!! WE'RE MOVING!!! WE'RE MOVING!!!!"
it's an absurd example but shows that it's trivial. The only thing necessary to get a stalled phantom traffic jam moving is for all cars to slowly start moving together. they can maintain distance between each other.
it's trivial even if it requires special equipment such as bumpers that communicate with the car in front or behind them, proximity sensors/detectors, a way to transmit this information between cars, and a way for the system to collectively brake or apply gas (like cruise control).
if you're an engineer I'm sorry if you can't see how absolutely trivial it is COMPARED WITH SELF-DRIVING CARS. It could easily have been done in the 80's with zero environmental perception, zero awareness of road conditions, nothing, and not hooked up to the steering wheel in any way. You don't need the system to handle any steering, period, in order to be able to break up phantom traffic jams.
if you've never invented anything in your life then you'll have to take my word for it that this is trivial, sorry. We're 30 years behind schedule on it and have suffered 30 years of phantom traffic jams that could have disappeared whenever they made their way to high-end cars that could have been equipped with a limited caravan system to start moving together with the drivers' permission. A phantom traffic jam can disappear at every point. Even if just 5% of cars were equipped with the standard.
you don't get what a huge deal it is for a few cars to be able to start moving together, coordinated, instead of one after the other. You don't get how 500 ms - 2500 ms of human driver latency means that traffic jams form which otherwise wouldn't, and that once one is formed it is impossible to break up and snakes its way back - but could be broken at ANY point by any group of cars that briefly acted as a caravan. you'll have to just trust me that the technology for this is easily in the realm of what could be done in the 80s. self-driving cars are lightyears ahead of that.
We don't need to speculate as to the speed or proximity of the two vehicles ahead. The data the autopilot system used has it. Does Tesla own that data, or the vehicle-owner? It's collected on the vehicle, which is _owned_. I'm sure Tesla gets a copy -- do they retain the rights as well?
I would not be surprised if Tesla were subpoenaed (or otherwise requested of) for the data, in a case like this (similar to the recent Amazon Echo news, or cellular providers with geolocation in abduction cases).
Very impressive demo. I guess this is possible due to great sensors (radar/lidar), computing power (Tesla has the most computing power in a car) and well-trained deep nets and algorithms (again, Tesla, Cruise & Waymo have the biggest investment here).
Does anyone know how I can get my hands on an NVIDIA Drive PX2 board and a Quanergy lidar? It seems the lidars and such NVIDIA boards are only sold in huge quantities to auto makers and not as single units available for people to make their own robots.
I don't think Tesla use lidars, and I don't think the small Quanergy lidars are shipping yet. PX2 isn't widely available but you can pick up a TX1 which is pretty powerful, if you need something small/low power but still fast enough for your own robot. Runs Ubuntu and CUDA so you can run the same deep learning frameworks (and possibly models, if you optimise them) as on your desktop
> software update enabled Teslas to “see” through vehicles traveling immediately in front of them. In this case, however, the Autopilot seems less like X-ray vision and more like straight-up clairvoyance
I don't believe the car is clairvoyant, but my guess is radar bouncing off the asphalt under the cars.
And maybe a glimpse of the brakelights?
Yes, Tesla is using signal processing to detect radar reflections of the car in front of the car in front of you. They showed this when they introduced the firmware with this feature.
It reacts to the car braking in front of the red car - the trick is to use radar reflections off both the red car and the one in front of it, which propagate below the red car. This feature was advertised with a recent firmware update.
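Not the actual radar pipeline, but a sketch of why that under-car reflection is useful: if you can track the far target separately, its speed can collapse before the car directly ahead shows any sign of braking. The sample rate, threshold, and readings below are invented for illustration:

```python
def far_target_braking_hard(speeds_ms: list[float],
                            dt_s: float = 0.1,
                            threshold_ms2: float = 4.0) -> bool:
    """True if successive speed estimates for the far target (the car two
    ahead) imply it is decelerating harder than the threshold."""
    decels = [(a - b) / dt_s for a, b in zip(speeds_ms, speeds_ms[1:])]
    return any(d > threshold_ms2 for d in decels)

# Invented samples: the far target braking at roughly 8 m/s^2 over 0.4 s,
# while the car directly ahead hasn't slowed at all yet.
print(far_target_braking_hard([30.0, 29.2, 28.4, 27.6, 26.8]))   # True
```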
I have seen plenty of 3-car accidents where the last car involved is there purely because it could not brake in time, while the first 2 cars are involved due to human stupidity.
I live in Dubai, which is a mix of many cultures, and people drive like crazy here. I have seen a single accident involve 6 cars because most of the rash drivers tailgate.
Dubai, at least, really needs a lot of Tesla cars to avoid such accidents, which are very common here.
I watched a similar video recently, but there a car ahead avoided an accident and the car the video was taken from was rammed in the back because of the sudden stop. Which makes me think maybe Tesla's response wasn't the best possible.
The best human driver would have the same constraint. The best you might do is flash the brake lights to get the guy's attention behind you, if you smell trouble, and then use medium braking force instead of full. A swerve to the shoulder or adjacent lane might be available, and a tesla would have the upper hand here on situational awareness, since it always knows what's alongside.
At least with the original sensor suite, Teslas are not aware of cars behind them, except for within a couple feet with the ultrasonic sensors. There is no rear radar and the rear camera data is not processed.
The Tesla obviously expected a crash due to the small red car changing lanes across another small car on the right. It had nothing to do with the SUV, I think.
The underlying software for this mainly plugs into the Instagram servers. When it detects unusually high activity in proximity to the car, like commenting, liking, and especially video posts, it makes a cautionary beeping sound.
I believe it's onboard, they have an onboard GPU - cloud-assisted would have latency problems and would also have problems in tunnels and areas with bad reception.
To be pedantic, GPUs are no better than CPUs at multitasking. GPUs are best at performing one task on a moderately large set of floating-point data (SIMD).
The Tesla vehicles use cameras, ultrasonics and radar. So they are processing graphical data (a video stream) and signal data. That's exactly the kind of task that GPUs are good at.
However, this only gets you as far as perceiving the world. The vehicles also have to decide which action to take -- following the traffic laws of the jurisdiction they are in -- which is exactly the kind of branchy imperative task that CPUs are good at and GPUs are terrible at.
What I heard was an alert go off after the first (red) car slightly veered off to the right, allowing sensors to detect the second (black) SUV's abrupt stop.
Are you sure you're viewing it with the sound in sync? There's no veering when I watch until well after the audible alarm. But if you look through the red car's tinted windows you do see the brake lights of the cars in front - presumably it's that braking action which the radar unit is picking up.
My take on this is that the red car wished to switch lanes beforehand and was distracted by that, thus not reacting quickly enough to the braking. The right indicator goes on and it slowly starts to move across just before impact (but not at a speed / timing which would indicate an effort to avoid impact).
I work on these units as a hardware engineer for a manufacturer. They are very clever bits of kit and a lot better than the other offerings on the market.
And to support your comment - yes, this isn't exclusive to Tesla by any means!
I drive one (C350e), and indeed it monitors traffic and brakes when another car in front does too. Now the question is: "does it really matter that the car can see through the car in front"? Because if the car is self-driving it will keep such a distance that it should be able to brake & stop safely if something happens in front of it. Meaning: the Tesla should still be able to come to a safe stop even if it only detected the nearest car was braking - just because of a -say 2 second- distance between the cars.
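A rough check of that 2-second claim, with assumed figures (a short system reaction time and identical ~8 m/s² braking for both cars); none of this comes from Tesla's specs:

```python
speed_ms = 110 / 3.6        # ~30.6 m/s, the ~110 km/h mentioned elsewhere in the thread
gap_m = 2.0 * speed_ms      # a 2-second following distance (~61 m)
reaction_s = 0.3            # assumed system reaction time
decel_ms2 = 8.0             # assumed braking, the same for both cars

# If both cars brake equally hard, their braking distances are identical, so the
# follower only needs enough gap to cover the distance travelled while reacting.
gap_needed_m = speed_ms * reaction_s
print(f"gap: {gap_m:.0f} m, needed: {gap_needed_m:.1f} m -> "
      f"{'stops with room to spare' if gap_needed_m < gap_m else 'collision'}")

# If the lead car instead stops almost dead (i.e. it crashes), you need its full
# stopping distance too: v^2 / (2a) plus the reaction distance.
full_stop_m = speed_ms ** 2 / (2 * decel_ms2) + gap_needed_m
print(f"lead car stops dead: need {full_stop_m:.0f} m vs {gap_m:.0f} m gap")
```

So under these assumptions the 2-second rule holds comfortably when the car ahead merely brakes, but becomes marginal if it hits something and stops on the spot, which is exactly the case shown in the video.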
Man: Wacht even wacht even eerst kijken wat er achter ons gebeurt (Wait, wait, first take a look what is happening behind us, don't get out yet.)
Man: Blijf zitten blijf zitten. (Stay seated, stay seated.)
Woman: Niets doen niets doen niet eruit gaan alsjeblieft. (Don't do anything, don't do anything, don't get out of the car yet please.)
Man: Nu kunnen we kijken wat er aan de hand is. (Now we can check out what happened.)
Woman: Maar zet hem eerst even aan de kant. (But first put it [the car] on the side of the road.)
We now see the man in a blue shirt running to the car on the left side of the screen.
Woman to the kids: Jongens jullie moeten even in de auto blijven zitten. (Guys, you have to stay in the car for a little while.)
Kids: Ja. We willen er ook niet uit. Moet jij ook blijven zitten? (Ok. We don't wanna go out. Do you [mom] have to go out as well?)
Woman: Ja ik moet er uit, ik ben een arts. (Yes I have to go, I'm a doctor).