Hacker News
New video of Tesla crash demonstrates the problem of semi-automated driving (theautopian.com)
297 points by frxx on Jan 12, 2023 | 664 comments



I'm trying to reason about why the Tesla stopped. In the second video on The Intercept's website, I can see a left exit before the point where the Tesla stopped. Something I've noticed with Google Maps (I don't have a Tesla, and maybe Tesla's navigation system is similar) is that it sometimes thinks tunnels can access the road above them (in a tunnel near my home, Google Maps always tells me "turn right" in the middle of the tunnel, without understanding that I need to get out of the tunnel first and take the ramp). With this in mind, I think the Tesla's navigation system could have been similarly stupid and tried to turn left onto a ghost road, only to discover there's no road, and so decided to stop.

This is just one possible explanation, and I'm definitely not suggesting that this is what actually happened.

This is Google Maps saying "go right" in the middle of a tunnel: https://goo.gl/maps/G89cyQT2APUQuu6g6 (the two roundabouts are connected by a tunnel)

PS: If you use Street View you'll see how it looked 3 years ago, before the tunnel was built


Yeah, seems plausible. Tesla should implement a confidence-based system: if it detects an area with high chance of errors (e.g. tunnels), it should put the driver on high alert or disable the system.

They probably already have a heatmap of driver interventions, which could be a starting point. The data might be usable for training a generalized confidence map.
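Purely as illustration, gating on such a map might look something like this (every field name and threshold below is invented, and nothing here reflects Tesla's actual system):

    # Hypothetical confidence-gated assist policy; all names and numbers are made up.
    from dataclasses import dataclass

    @dataclass
    class TileStats:
        assisted_miles: float   # assisted miles logged in this map tile
        takeovers: int          # driver interventions logged in this tile

    def assist_mode(tile: TileStats, min_miles: float = 500.0) -> str:
        if tile.assisted_miles < min_miles:
            return "alert"      # sparse data: keep the driver on high alert
        rate = tile.takeovers / tile.assisted_miles
        if rate > 0.05:         # more than one takeover per 20 miles
            return "disabled"
        return "alert" if rate > 0.01 else "normal"

    print(assist_mode(TileStats(assisted_miles=1200, takeovers=80)))  # -> disabled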


> it should put the driver on high alert or disable the system.

The precedent is there; prior EAP did this. On one interstate I frequent, it would handle the merge on-ramp/junction fine going north, but heading south (with a 270-degree right turn) it would beep loudly and force a driver takeover.


I found it easier to understand your example by comparing to openstreetmap, which seems up to date at that location: https://www.openstreetmap.org/#map=18/54.64443/25.05401


In this OpenStreetMap example, the two crossing ways are separated by a `layer` tag. This means that https://www.openstreetmap.org/way/919604270/ has `layer=-1`, essentially putting it underground, while https://www.openstreetmap.org/way/921763546/ has no such layer tag, making them clearly not connected.
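For anyone unfamiliar with the OSM data model: two ways are only connected if they share a node; crossing geometry alone means nothing, and `layer` just records the vertical order for renderers and validators. A toy sketch of that check (node IDs invented; this is not any real router's API):

    # Two OSM ways whose geometries cross are connected only if they share a node.
    way_tunnel  = {"id": 919604270, "nodes": [101, 102], "tags": {"layer": "-1"}}
    way_surface = {"id": 921763546, "nodes": [201, 202], "tags": {}}

    def connected(a, b):
        # Topological check: shared nodes, not visual intersection.
        return bool(set(a["nodes"]) & set(b["nodes"]))

    print(connected(way_tunnel, way_surface))  # False -> no turn between them exists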

Should Tesla's autopilot get confused by this, that would be really bad.


I've definitely seen my Tesla think that signs / lights on an overpass were on the freeway (that I was cruising down). I think I've also _not_ seen it make that mistake in a long time, since that spot is one I regularly pass, but I haven't made a study of it so :/


I have an X and a 3; this is exactly how the car behaves when it has missed a turn because of a mismatch between map and vision data.


To confirm: when the navigation system fails, the entire car halts?

That seems undesirable, since it can lead to rear-endings in situations like this.


Correct, it behaves very stupidly when it fails. It's as if the outputs from the network are directly controlling certain behaviors: things like the tires turning back and forth in place, over and over, as though it's stuck in the process of trying to make a decision.


I have a Tesla; using just cruise control, it has slammed on the brakes:

because of shadows that confused it

because of concern about cars crossing the intersection ahead, when there was only a need to slow gently

because of a car crossing an intersection ahead that had already crossed!

because of a steep bridge that confused it

We basically can’t use cruise control in the car; we decided it is too dangerous at worst and jarring at best.

Because there is so much hype around Tesla, people get the impression their software is “hardcore”, but a lot of it is absolute garbage.


The worst part of phantom braking is not just the need for a sudden human intervention. It’s that the required human intervention is extremely non-intuitive.

Intuitively, when a car suddenly slows down, it feels like the right response is to press on the accelerator. However, in my experience this puts you in an unstable regime where you are “fighting” with the car. That is, autopilot continues to try to stop but temporarily accepts your override, but only as long as you maintain consistent pressure on the accelerator. Unfortunately, if you even briefly let up on the accelerator, the car proceeds to violently slam on the brakes again. This can lead to a feedback loop where the car’s rapid acceleration/deceleration pattern makes it difficult to maintain consistent pressure on the pedal, so the experience is like being in a rodeo. Worse, there’s no obvious way “out” of this cycle except to take your foot off the accelerator and let the car (briefly) win.

Counterintuitively, the “correct” way to deal with phantom braking is to avoid the accelerator entirely and instead dive right for the brake pedal: this instantly disengages cruise control. But this is not an intuitive response, you have to learn it the hard way.


> Counterintuitively, the “correct” way to deal with phantom braking is to avoid the accelerator entirely and instead dive right for the brake pedal: this instantly disengages cruise control. But this is not an intuitive response, you have to learn it the hard way.

It is absurd that Tesla is allowed to sell cars that have this problem. Why isn't the NTSB (or whoever) insisting this is fixed and/or removing these cars from the road? They're unsafe. (I had a Tesla; glad I don't now)


> this puts you in an unstable regime where you are “fighting” with the car

This and your following description sound a lot like analyses of the two 737 crashes caused by MCAS.


It's interesting that in all my talks with friends who have a Tesla and have tried driver-assist features from other companies, Tesla's is the only one where we worry about phantom braking (i.e. a false positive), and so we keep a foot near the gas instead of the brakes.

I’ve also developed a reaction to reach for the stalk with my other hand and disable it at the same time so I’m no longer fighting it.


Protip: apply the accelerator override, then push up on the right control stalk to disengage automation. That's the way to smoothly transition from autopilot to human control.

(Then learn when/where/why the car thinks it should be behaving that way, drive it past those zones, and re-engage autopilot. Smooth ride!)


There's a button on the steering wheel of my Impreza that I can press to turn off the cruise. I press that and then brake or accelerate as needed


I don’t think CV is good enough to rely on for self driving features.

My 6 year old car uses radar for adaptive cruise control and has only tried to (arguably) improperly stop or slow when someone crept over the line into the lane I was driving in. I have no issue turning it on and leaving it for hours at a time.


I have a 3-year-old Jeep Cherokee Trailhawk, and before that the exact same model but 4 years older. On any sufficiently long road trip (800+ miles), at least once, and sometimes more often, adaptive cruise control will randomly SLAM the brakes and beep wildly during the day, on a freeway going 60+, with literally nothing in front of me but empty road. This was true of the older one as well, and I've never been able to figure out why. It's never caused anything more than a scare, no close calls with a vehicle behind me or anything, but yeah... all these systems have a ways to go.


> because of shadows that confused it
> ...

After all, a Tesla is a pretty good substitute for a horse!


I think what's interesting here is it's likely the first instance where Tesla FSD has been involved in an accident which affected other drivers. [Edit] From the video, the Tesla is making a lane change and stopping simultaneously, which means there could be a case of the Tesla FSD/driver making an unsafe lane change.[1]

Most of the time FSD just wrecks the Tesla itself or injures the driver of the Tesla (e.g. running into trees/dividers, or into much heavier freight trucks).

It will be interesting to see whether Tesla steps in with monetary support to prove the legal case that Tesla FSD is not at fault, or whether the Tesla driver (and his insurance) will be left to fend for themselves.

In the short term I could see Tesla not supporting the driver and absolving themselves via fine print/ToS, etc.

But the long-term effect of not legally supporting drivers in Tesla FSD accidents will be that new customers won't trust this $10,000 upsell, a product offering that's highly profitable for Tesla.

I could also see 3rd party (non-Tesla) insurance companies refusing to sell coverage to Tesla FSD drivers.

It could also make Tesla's first-party insurance untrustworthy to customers, and could become a huge liability for Tesla.

It seems like it will be a great litmus test to see if Tesla has the guts to step up for its own product.

[1] First video shows a potential unsafe lane change https://theintercept.com/2023/01/10/tesla-crash-footage-auto...


> I think what's interesting here is it's likely the first instance where Tesla FSD has been involved in an accident which affected other drivers.

Autopilot has killed multiple motorcyclists and is suspected in many other cases, totaling 19 fatalities. This isn't the first; our regulatory bodies are just incredibly slow at this.

https://arstechnica.com/cars/2022/08/tesla-faces-new-probes-...

https://www.theverge.com/2022/7/27/23280461/tesla-autopilot-...


But motorcyclists rank just above bicyclists, and just below deer. Nobody actually gets in trouble for hitting them on the road.

I would love to see manslaughter charges for more accidents. If I do a whoopsie and stab someone in my home, I'm not going to get off with an "oh my god, I'm so sorry! I was tired and it was foggy." People driving should be extended the same courtesy.


Honestly, it feels like a car is the perfect murder weapon: if you want to kill someone, you can run them over, especially if they're on a bike, and you'll get a reduced sentence compared to doing it with what is normally considered a weapon (knife, gun, ...).

I think if you endanger someone’s life with anything, be it a car, motorcycle, knife, gun, or your hands, you should always get the same sentence.


This came up a lot in my criminology program in university. Criminologists (especially students) spend an inordinate amount of time thinking about how to kill people and get away with it. General consensus is with a car.

The crazy part is that whether or not you knew the person you ran over in your car can factor heavily into charges and sentencing.


> The crazy part is that whether or not you knew the person you ran over in your car can factor heavily into charges and sentencing.

Knowing the person leads one to question whether or not there was motive.


Right, but if I walk out my front door and shoot the first person I see, I'm not going to dodge prison time because I didn't know the person. Someone is still dead.


We cannot omit the context: in most places, people need to drive to work, to get groceries, everything. A car is tremendously useful, but a 5-second distraction can result in a death. You can't expect people who drive every day to be alert every single second, so there's leeway when accidents happen.

That's a tradeoff society made.

But I'm all in favor of practicing more careful driving, and banning the shit out of these incomplete automated driving mechanisms!


Or at least take their license once they have proven they are a hazard to others.


Or just solve the problem by banning motorcycles. Motorcycles should stay on race tracks where they belong, not on foggy roads with tired commuters.

Same with normal bikes, by the way: cycling on the same road with fast-driving cars should not be an actual phenomenon. I cycle to work every day, and love it, but would never do it if there wasn't a separate bike lane.


You’re right, wherever there aren’t protected bike lanes, we should ban cars


You are being sarcastic - but this mindset change should happen in American cities if we are to move towards a sustainable future. We should have regular roads for bicyclists and pedestrians and "car lanes" on some roads, "truck lanes" on others etc.


Or you could prosecute dangerous drivers who hit pedestrians for manslaughter / attempted manslaughter, and work on the assumption that the person in charge of the most dangerous vehicle has a duty of care towards other road users; if they don't discharge that duty, they shouldn't be allowed to drive.


I think “attempted manslaughter” is just “reckless endangerment” or similar. Manslaughter means without intent; you can’t very well attempt to do something without intent.

/pedant


Prosecuting bad drivers is very important, at the very least in a "Strike One to Educate One Hundred" way.

On the other hand the reason for most car accidents is bad road design, especially designs that encourage high speeds.


Yes! It's also a policy failure to allow vehicles with such high hoods. I'd be okay with requiring a CDL/professional insurance as a compromise


If harsher punishments don’t deter intentional violent crime, why do people expect them to deter accidents?


There are two different axes: the severity of the punishment, and the predictability of receiving that punishment. Severity, applied sporadically, doesn't really provide a meaningful deterrent. A less severe sanction, applied predictably and reliably, does.

If the way you operate a motor vehicle causes a death, you should be charged. If there were a meaningful risk of jail time for bad driving (including driving tired, speeding, driving recklessly, etc), people would either drive less, or would drive more carefully.


At what margin is this true? Is the claim that at all margins, harshness of punishment never has any effect on violent crime? It seems unlikely to me that people would not adjust their behaviour with respect to a type of crime, between it being unenforced vs punished by death.


Typically the claim comes up when discussing crime; one side usually advocates for harsher penalties for violent criminals and the other side argues that harsh penalties are ineffective and cruel—and usually some stuff about how the criminals can’t help it, they’re a product of their environment, etc. Typically, in my country at least, the side arguing for softer penalties tends to overlap a lot with the folks who are most likely to advocate for severe driving penalties. This seems like a contradiction to me, so I’m curious about how people reconcile this.


Not the person you’re responding to, but typically this mindset is about punishment more than deterrence. It won’t reduce the incidence of bad outcomes, but some people feel better knowing that judicial vengeance will be meted out.


Fair enough, but do those people hold the same attitude about punishing violent criminals? Seems like there’s a lot of overlap between the “we shouldn’t punish violent offenders” and the “reckless drivers should be punished” people, but maybe I’m misreading the situation.


Yes, there's still a categorical difference between "there should be some consequences for recklessly killing someone" and "jailing people forever is the only right response to a violent crime".


So you agree that there is some degree of punishment that is acceptable for violent criminals? And I don’t think any serious person thinks jailing forever is the only right response to violent crime; the much more defensible position is “jailing until they are rehabilitated”, but this isn’t about punishment, it’s about protecting society. As it happens, rehabilitating prisoners is extremely hard and expensive, so sometimes we have long sentences and other times we let violent offenders out to further victimize their communities.


Banning doesn't solve the problem. It just obscures one of the symptoms.

Motorcycles _and_ pedestrians equally take an unfair share of the mortality associated with vehicles. Which hints at a broader infrastructure and design problem, that makes it very much seem like "automated" cars are thrown into this mess without any design changes with the hope that they will also obscure the underlying error.

Aside from that, even my 650cc motorcycle got 52mpg most days. Yes, absurd power to weight ratio, but also absurd fuel efficiency and reduced lane occupancy. Very green.


Or, you know, hold drivers accountable for the carnage they cause. This is much less a problem in Western Europe. It's almost wholly cultural in the US...

We build shit transportation infrastructure, force everybody to drive, don't build enough housing (forcing longer commutes), and then people like you complain that cars aren't given enough leeway?


If someone's impaired driving in unsafe conditions results in an accident, they absolutely should be held liable. I'm tired of car-brained apologists.


That's just further entrenching car dominance. Every mode of transport should be allowed to safely use the roads, and for what it's worth motorcycles are way better for the climate as a mode of individual transport because they weigh maybe 10% of your average modern car.


Motorcycles aren’t viable in many places due to weather, and they aren’t practical for most people (people with kids, pets, large items, etc to transport).

I don’t think “everyone should be allowed to use the road safely” is a statement anyone disagrees with, but the laws of physics make this very difficult to implement in any practical way. Pretty sure that even in Europe, cycling and motorcycling are far more dangerous than driving.

Electric cars are the only viable, general purpose solve for climate change as it pertains to personal transit. We aren’t going to get everyone to start (motor)cycling or taking public transit over the coming decades, but EVs are a drop-in replacement for most personal transit use cases.


> I don’t think “everyone should be allowed to use the road safely” is a statement anyone disagrees with, but the laws of physics make this very difficult to implement in any practical way. Pretty sure that even in Europe, cycling and motorcycling are far more dangerous than driving.

Oh, of course we can practically implement road safety for everyone:

- limit inner-city speeds outside of major influx roads to 30 km/h

- build dedicated bicycle and bus lanes in cities

- build dedicated pedestrian lanes (not an issue in urban Europe, a bit of an issue in rural areas though)

- enforce speeding and distance-keeping regulations

- make sure the quality of the roads and pedestrian ways is acceptable (i.e. no potholes, even surface) to minimize accident risk

- keep heavy haul traffic on highways wherever possible, prevent toll evasion

- build out public transport to reduce the amount of individual traffic

- provide elderly citizens with taxi vouchers or other forms that ensure their mobility without having them drive themselves

- get old vehicles outside of historical preservation interests off the road to increase the amount of cars with up-to-date safety features

- enforce regular technical check-ups (Germany, for example, requires one every two years) so that vehicles in dangerous condition get taken off the road and owners of vehicles in barely-roadworthy condition also get the hint

Countries that prioritise safe infrastructure for bicyclists, like the Netherlands, fare significantly better in road accident statistics [1] than countries that just say "fuck it, cars first" like the US.

[1] https://en.wikipedia.org/wiki/List_of_countries_by_traffic-r...


Those things improve safety, and we should certainly do them, but that’s likely not enough to bring all modes of transit into parity with regard to safety.


We can get near parity, though; in the end it's probably a classic Pareto-distribution problem. The last 20% of traffic fatalities will be really hard, I agree, but we can reduce the vast majority of traffic accidents very, very easily.


Agreed, although I think by and large we’ve already progressed a good ways into the 80%; however, the variance is huge because some jurisdictions take safety very seriously and others ignore it to the extent allowed by national law.


Have you ever ridden a motorcycle? They’re not as efficient as weight would imply, because aerodynamics dominate when you’re not speeding up/slowing down.


Motorcycles typically get 40-50mpg. A little better than cars, but not a lot. And they transport at most two people.


You're outright wrong; I've detailed that here: https://news.ycombinator.com/item?id=34351151

And yes, motorcycles transport two people at most - but really, look out a window and count how many car trips are made by single occupants. In the UK, for example, it's 60% [1].

[1] https://www.statista.com/statistics/314733/single-occupant-c...


You'd be surprised. They're woefully unaerodynamic, and emissions standards are pretty lax for them.

Something like a CBR300F has about half the emissions of a hybrid, and a Grom can get lower, but plenty of motorbikes are actually worse than a compact car and getting close to SUV territory.


> They're woefully unaerodynamic, and emissions standards are pretty lax for them.

Still more aerodynamic and fuel-efficient than someone driving a full-blown SUV to work solo.

> but plenty of motorbikes are actually worse than a compact car and getting close to SUV territory.

Not everyone rides a Kawasaki Ninja H2 R with 310 hp or whatever the top record is these days. Per the German ADAC, the average motorcycle consumes 2-3 liters/100 km [1], whereas the average car is at 7-8 liters/100 km [2].

[1] https://www.adac.de/verkehr/tanken-kraftstoff-antrieb/tipps-...

[2] https://www.umweltbundesamt.de/daten/verkehr/kraftstoffe
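To put those figures on the same scale as the mpg numbers upthread, the standard conversion is mpg ≈ 235.215 / (liters per 100 km):

    # Convert L/100 km to US mpg: 235.215 ≈ 100 * (3.785 L/gal) / (1.609 km/mi)
    def l100km_to_mpg(l_per_100km: float) -> float:
        return 235.215 / l_per_100km

    print(round(l100km_to_mpg(2.5)))  # ~94 mpg for the ADAC average motorcycle
    print(round(l100km_to_mpg(7.5)))  # ~31 mpg for the ADAC average car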


They might be worse for emissions that aren't CO2, but this whole discussion is moot. Ebikes could be the actual future.


When measuring the health hazards of motorbikes/mopeds, emissions are a small part.

Part of the reason is that it's so fun to go fast with them.


You can buy electric scooters nowadays that can drive a lot farther than any car with the same battery capacity. If one were big on emission reduction, one would ban all cars and only allow scooters.


With streets crowded with scooters during rush hour I think we'd have a lot more minor accidents/injuries, but far, far fewer deaths, right? What could be done to avoid the minor accidents?


> What could be done to avoid the minor accidents?

For one, more strictly enforce technical fitness and some form of age requirements. An awful lot of people don't care much about the roadworthiness of their vehicle, and many don't care about technological advances like anti-lock braking systems in newer models either.

The other major contributor to motorcycle (or bicycle) accidents is road conditions: potholes, dirt and especially oil contamination, bumpy surfaces... a car doesn't care much (unless it's one of those super-flat sports vehicles), but a cyclist can easily lose control.


You can drive 350+ miles on a single scooter charge? Also, I have a hard time imagining anyone taking 400+ mile road trips on a scooter.


> Also, I have a hard time imagining anyone taking 400+ mile road trips on a scooter.

Not on a scooter, but on a Harley or a Honda Gold Wing? People make trips across Europe on those beasts.


The parent is talking about electric scooters having more range than electric cars; I’m not sure how gas motorcycles fit in here.


e-scooters, ebikes, trains and LEVs are very much the sane answer.


Aerodynamic drag force scales with the drag coefficient (cD) multiplied by cross-sectional area. With roughly a quarter of the cross section of an average passenger vehicle, motorcycles are far more efficient. Don't even get me started on road wear, which scales roughly with axle weight to the fourth power; motorcycles effectively cause negligible road wear.
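For reference, the two relations being invoked (the first is the standard drag equation; the fourth-power wear rule is the classic result from the AASHO road tests):

    F_d = \frac{1}{2} \rho v^2 C_d A

    \text{road wear} \propto \left( \frac{\text{axle load}}{\text{reference axle load}} \right)^4

So at a given speed, a quarter of the frontal area A means roughly a quarter of the drag force (assuming comparable drag coefficients), and at a tenth of a car's weight (the figure cited upthread) a motorcycle causes on the order of 1/10,000th of the road wear.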

For what it's worth, I've never ridden a motorcycle and probably never will, because of the danger.


I don’t bike to work for the reason you mentioned, even in the bike lane. The health benefits aren’t worth the amortized risk. Cars treat bike lanes as passing lanes way too often. If it’s not physically separated it’s not for me.


Yes, and ban pedestrians. Walking belongs in a shopping mall. We cannot allow those pesky pedestrians to endanger our Full Self Driving.


> It will be interesting to see whether Tesla steps in with monetary support to prove the legal case that Tesla FSD is not at fault, or whether the Tesla driver (and his insurance) will be left to fend for themselves.

> In the short term I could see Tesla not supporting the driver and absolving themselves via fine print/ToS, etc.

> But the long-term effect of not legally supporting drivers in Tesla FSD accidents will be that new customers won't trust this $10,000 upsell, a product offering that's highly profitable for Tesla.

Tesla publicly disparages people who died relying on their products, and refuses to cooperate with the NTSB. I'd expect nothing less in this case. Somehow that hasn't been a big factor in sales.


Agreed. It seems like a great short term strategy to get out of trouble quickly but long term it doesn't seem smart to use the most ardent Tesla evangelists who are literally willing to put their lives on the line as scapegoats.

It seems the take rate of FSD for new Tesla purchases is not as high as it used to be - perhaps due to the increase in price and other Autopilot-FSD bundling/unbundling aspects - but also perhaps due to negative press from the accidents thus far. [1] Definitely something to watch as/if the accident incidents accumulate.

[1] https://twitter.com/troyteslike/status/1586356451639189504?


> refuses to cooperate with the NTSB

This is completely false. Tesla is legally required, and complies every time, to release crash data to the NTSB. This data includes whether or not self driving was enabled.

> Tesla publicly disparages people

Tesla refutes that self driving is enabled when people lie about it. I am sure there is an incident or two of someone being sassy about calling out these lies but there is no trend of "disparagement"


>> refuses to cooperate with the NTSB

> This is completely false.

https://finance.yahoo.com/news/tesla-won-apos-t-formally-140...

>> Tesla publicly disparages people

> Tesla refutes that self driving is enabled when people lie about it. I am sure there is an incident or two of someone being sassy about calling out these lies but there is no trend of "disparagement"

Does any other car company have a habit of responding in public like this?

https://www.tesla.com/blog/update-last-week%E2%80%99s-accide...

Given the deceased is not able to share their side of the story, I don't feel it's appropriate to air this in public like this. Keep in mind that while they say the hands were not detected on the wheel, they can't say whether hands were or were not on the wheel, and there were many contemporaneous reports of poor sensing. This should really just be in the accident report, not aired by Tesla, regardless of fault. Maybe that's not disparagement, but again, do other automakers push this kind of narrative, or is it only Tesla? What did Toyota say during the unintended acceleration issues a decade ago? As I recall, they mostly blamed floormats, not drivers.


> https://finance.yahoo.com/news/tesla-won-apos-t-formally-140...

You are completely misrepresenting what happened here by saying Tesla refused to cooperate. The NTSB tried to pressure Tesla into a gag order that would have prevented Tesla from disclosing information about a car accident to the public. The NTSB deciding that this was a mandatory part of the agreement was their choice; they could have gone forward without the requirement, and they didn't. Tesla still provided all the information it had on the situation, and continued to assist the NTSB in its investigation, just not as a "formal party". This is the NTSB making demands while also asking for help that Tesla isn't obligated to provide unless it wants to.

You're also misrepresenting the Toyota situation. Toyota actively denied fault for most of this situation, and was fined $1.2 billion for deceptive statements around it. They frequently referred to "pedal misapplication"


My gut feeling is that it's never been a factor because Tesla had no competitors in good-looking, UX-focused EVs. Tesla buyers had specific, unaddressed needs that competitors were either missing or ignorant of, just like how Nokia lost to the iPhone.


It’s very unlikely this is the first time FSD has caused an accident for others. People using FSD allow it to do reckless things just to see “if it’ll figure it out”. You can cause an accident without being involved in it, especially if you drive in an unpredictable way.

Like this one: even if the driver had stopped the sudden braking and moved forward, the cars behind him could still have crashed.

That’s the thing about testing on public roads: there are many ways you can affect other users.


Most of the fault in this video seems attributable to the other motorists. I was always taught that it was my responsibility to ensure that I have enough following distance, and am paying close enough attention, to stop if the car in front of me abruptly stops. Is this not the case in California? Because the first car seems at least partially at fault in this respect, and every other contributor to the pile up is significantly at fault.


> I was always taught that it was my responsibility to ensure that I have enough following distance, and am paying close enough attention, to stop if the car in front of me abruptly stops.

Brake checking (suddenly braking hard for no reason, which is what the Tesla did here) has always been grounds for the leading car to be the guilty party when they get rear-ended.

In times prior to dashcams, it was difficult to prove that the car ahead braked deliberately to cause the accident, so the rule of thumb was that the rear car is probably guilty. But if there was a way to show the lead car braked deliberately, they're the guilty party.

Nowadays, with dashcams, it's a lot easier to prove, so brake checkers don't get away with it as much.


This mostly applies if there is no sudden lane change. True, you should always keep enough distance from the car in front of you to react and brake in any case. But if that car was too close in another lane, then switched into yours and braked simultaneously, that's a bit different.


The whole scenario would look very similar if the Tesla was a car suffering from a breakdown and moving to the safest lane it can reach safely. In such a case we would also expect other drivers to pay enough attention to prevent an accident like this...


No, it wouldn’t in a breakdown situation in any other car, the driver would try to pull over to the right.

Most drivers develop an instinct for passing slower or stopped objects on the left, and a natural aversion to passing them on the right.

The Tesla pulled over and stopped in the far left lane. No “space” was left to pass it.


It's either changing 3 lanes to the right or 1 to the left. In case of loss of control, in this case with the car coming to a stop very quickly, it might be better heading for the left lane, instead of getting stuck in the middle or second lane from the right.


The first car to collide with the Tesla is least at fault, but obviously still partially at fault. None of the other cars in the pileup have that excuse though.


Was the first car speeding? If not, how were they at fault?


It’s almost impossible to rear-end another car without being at least partially at fault. The lane change from the Tesla was obviously dangerous, but the first car to collide with it had sufficient warning and time to react. You can see that they did react, but they obviously didn’t expect the Tesla to come to a full stop, which is why the first collision took place. I believe the legal term for this is contributory negligence.


Changing lanes and then slamming on the brakes means you are 100% at fault (assuming there is evidence). It's the most common car insurance scam and it is completely indistinguishable from what the Tesla did in this case


Breaking down is also completely indistinguishable from what the Tesla did in this case. The brakes weren’t slammed on either; 7 seconds elapsed between the Tesla indicating the turn and the collision. It’s very obvious that every participant in the pile up was at least partially at fault. The only confounding factor in this case seems to be that many HN commenters lose a handful of IQ points whenever any topic relating to Elon Musk comes up.


The video of the accident [1] clearly shows the Tesla start hitting its brakes 4 seconds in; the car that eventually rear-ends it is a few car lengths behind in the left lane and hits its own brakes less than a second later, and the collision occurs less than 4 seconds after that.

Taking into account reaction times, it takes approximately 5-6 seconds to bring a car going 60 mph to a complete stop. [2][3] The accident was completely unavoidable for the car that rear-ended the Tesla, because the Tesla made an extremely dangerous maneuver.

[1] https://theintercept.com/2023/01/10/tesla-crash-footage-auto...

[2] https://nacto.org/docs/usdg/vehicle_stopping_distance_and_ti...

[3] https://www.edmunds.com/driving-tips/keep-your-braking-dista...
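As a rough cross-check of that 5-6 second figure (my own assumed inputs, not numbers from the linked sources: ~1.5 s perception-reaction time and ~7 m/s^2 of braking on dry pavement):

    # Back-of-the-envelope stop from 60 mph; both inputs are assumptions.
    v0 = 60 * 0.44704        # 60 mph in m/s (~26.8)
    t_react = 1.5            # perception + reaction time, s
    a = 7.0                  # braking deceleration, m/s^2

    t_stop = t_react + v0 / a                    # ~5.3 s total
    d_stop = v0 * t_react + v0 ** 2 / (2 * a)    # ~40 m rolling + ~51 m braking
    print(f"{t_stop:.1f} s, {d_stop:.0f} m")     # -> 5.3 s, 92 m

which lands in the same ballpark as the references above.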


I applaud your self awareness


I overtake you; the instant my car is 1 m in front of yours, I switch into your lane and brake hard.

Who is at fault for you crashing into the back of me?


You, UNLESS THERE IS CLEAR EVIDENCE THAT PROVES YOUR INNOCENCE, as there is in this case and will be in such cases (like skid marks).

The fact that the trailing driver is USUALLY found at fault is just a philosophical razor applied to situations without enough evidence. Generalizing that into "trailing drivers are always at fault" is disingenuous, partisan, and malicious.


Key word is "almost". Perhaps this is a situation where the fault is primarily with the overtaking car. However, an attentive driver should have started braking as soon as the illegal lane change was started.


You're describing the exception situation that has led to many drivers to install dashcams.


It takes humans up to 5 seconds to even start braking in the presence of danger.

Even at 40 miles per hour your speed is 58 feet per second (or, at 60 km/h, your speed is 16 meters per second).

So it's either "the driver had sufficient warning" or "the lane change from the Tesla was dangerous", since you don't expect cars that are dangerously changing lanes to suddenly slow down.


5 seconds is nonsense.

The average driver’s reaction time is 0.2 to 0.3 seconds. This is before the driver’s foot moves.

Then there is approx 1 second before the brakes take effect (foot movement, applying the force, brakes responding)

So about 1.5 seconds before the car starts braking.

There is a rule of “be at least 2 seconds behind the car in front” which gives a safe distance to handle any emergency braking.

Of course everything depends on driver’s and car’s conditions.
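For scale (my arithmetic, assuming motorway speed): at 100 km/h the 2-second gap works out to

    d = v \cdot t \approx 27.8\ \text{m/s} \times 2\ \text{s} \approx 56\ \text{m}

which is why the rule is taught as a time gap rather than a fixed distance: it scales automatically with speed.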


Sweden has the rule of "at least 3 seconds". Because you keep ignoring the speed, and the braking distance as well. Edit: and assuming perfect conditions when the driver is perfectly alert and looking ahead. And is young and healthy. And...

However you count it: if the Tesla did a dangerous lane change and started braking, how is this the fault of the driver behind?


You seem to be missing a couple of points here.

1. Fault in an accident can be spread across every party that was involved, and no matter how negligent one part was, that has no influence over how negligent every other party also potentially was.

2. A driver is typically responsible for being aware of all potential hazards on the road, not just the ones immediately ahead in the same lane. For instance a car that has started indicating to move into their lane from another lane (as this Tesla did ~7 seconds before the collision), and a car in their lane that is coming to a stop (this Tesla had completed the lane change pretty much 3 seconds before the collision).

Obviously the Tesla is really pushing the limits of what would be considered a safe gap, and coming to a stop in that location without a proper reason is obviously negligently dangerous. But the negligence of both parties contributed to this accident, regardless of who was most at fault.


It's absolutely not nonsense; I suggest taking some basic high-school physics. Reaction time is the time for the vehicle to come to a stop, not for the driver to first notice an issue.

At 60 mph it will take many meters and seconds to decelerate, even assuming the driver takes exactly the correct action:

https://www.wolframalpha.com/input?i=time+to+brake+at+70+met...

The estimate here is about 4 seconds, under perfect conditions. 5 seconds is more than reasonable, considering you also need to realize the other driver is acting erroneously.


> Reaction time is the time for the vehicle to come to a stop

You seem to have your terminology confused. Reaction time doesn’t even necessarily have anything to do with driving. I suggest a quick web search next time before assigning people remedial physics.


And I suggest you look into the concept of contextual terminology


> Reaction time is the time for the vehicle to come to a stop, not for the driver to first notice an issue.

No, reaction time means the time from noticing the need to brake to the time brake pressure starts being applied.


I disagree - reaction time commonly refers to the reaction time of the vehicle in these events, the propensity of software devs to treat everything like a video game notwithstanding.


he said 5 seconds to START braking


I did not interpret their comment that way. That would be unreasonable; however, their comment mentioned physical braking distance, so I'm pretty certain they didn't mean raw human response time but rather how long it takes for a human to respond and slow the car, which is the metric of interest here. Even for a robot with millisecond response times, you will be braking for multiple seconds, not instantaneously. Observational "reaction time" really is completely irrelevant here.


> It takes humans up to 5 seconds to even start braking in the presence of danger.

That seems pretty clear. 5 total seconds is really reasonable. 5 seconds before you start is silly.


5 seconds?? Lol. Maybe if the driver is 86 years old.


> Is this not the case in California?

I don't think it is. I've never felt so unsafe in traffic as driving on highways in California. People keep an order of magnitude too little distance to the car in front. That density of cars would make any European highway slow to a crawl, but there they just practically touch each other while driving 60 km/h and doing crazy lane shifts left and right.


> I think what's interesting here is it's likely the first instance where Tesla FSD has been involved in an accident which affected other drivers.

In a pileup like this it's basically never the fault of the front car, unless maybe if they are purposely causing the accident for insurance fraud or something. Maybe the driver will get cited for failing to maintain the minimum speed, but legally this isn't much different than if someone backed into the Tesla while it was parked in a parking garage.


If I brake check someone on the highway, causing an accident, and it’s discovered that that’s what happened, I’d be pretty shocked* if the trailing driver was found responsible.

That video looks like a combined lane change and brake check on the part of the Tesla.

* and disappointed


Yes. Watching the second video, you can see the second car's brake lights come on quite quickly, given that the Tesla's actions were completely strange and unexpected. Less than a second elapses from when the Tesla cuts in front and starts braking to when the vehicle behind hits the brakes. They're definitely not at fault. Some of the cars that piled in behind were a different matter, though. Some were able to stop while others weren't; those must have been following too closely and/or not paying enough attention.


They didn't actually stop their car, though. And it really looks like they could have, had they braked a little bit harder. But that may be due to the unexpected nature of the Tesla coming to a full stop.


Yeah, it's possible they didn't fully slam on the brakes immediately. But you don't expect the car in front of you to do so for no reason either, so I still wouldn't consider the second car at fault.


Why would you be shocked and disappointed? It is the obligation of the driver trailing you to maintain enough distance to be able to stop safely if the car in front of them has to brake hard for whatever reason. The fact that your hard braking wasn't actually necessary may mean that you're also responsible, but it doesn't absolve the other driver.


As a matter of pragmatism, the person intentionally putting other drivers at unnecessary risk by being an asshole should bear the responsibility for the outcomes of that intentional and unnecessary risk they elected to subject roadway users to.


They should, and I said as much.

It does not absolve the other person from responsibility for not following the required safety protocol. The car in front of them might have had to brake for a perfectly legitimate reason.

I mean, suppose that the other driver was driving while drunk. They might have arrived to their destination safely without the asshole in front, but that's not an excuse against a DUI ticket.


How do you maintain a safe distance from those in adjacent lanes? Especially if they may come from behind you, suddenly change lanes in front of you, then brake hard?


You do not, but that wasn't the hypothetical posited by OP.

In the actual case with Tesla, the driver of the car that was in front of the pile-up was not at fault IMO. But there were a bunch more cars behind, and at least some of those slammed into each other because they didn't maintain proper distance, according to the police.


Are we looking at the same video? The turn signal comes on long before, the lane change starts long before... you don't see any reaction from the following driver even when the Tesla's tires finally cross the line, which is the latest point where the following driver must adapt speed and distance, but nothing happens?


I am not sure about the US, but in most countries I have driven in, putting a turn signal on is asking for permission to enter another lane, and there is no 'right' to go into the other lane or expect drivers currently there to slow down for you.

If there is an obstruction ahead, the safest scenario is to change to an empty lane ('empty' including safety distance) or brake in your lane to avoid hitting it. If you have some other problem you should probably brake slowly and change lanes to the outer edge of the road whenever there are safe gaps.

I would probably be off the gas or braking based on movement of the car in the other lane as part of driving defensively, though i don't think there is any 'obligation' for me to do so.

The behavior of the Tesla would also strike me as rather odd (assuming a right-hand-traffic country, unless there is an off-ramp coming up on the left), as it appears to be pulling over to the wrong side of the road.


You don't, but as soon as you see the car in front of you change lanes (or even just indicate via the turn signal), you should? And to be honest, this looks more like slowing down than a hard brake, and at least the first following driver slept quite a bit... Agreed, the stopping had no reason, so (as someone quoted German law) for that you would be to blame... but for crashing into the car, with that video footage, I'm sure the following driver would get at least 80% if not 100% of the blame.


Our auto laws are founded in a time when catastrophic failure happened much more often. If a suspension bushing randomly disconnects, or a wheel hub disintegrates, or a differential grenades, who is the asshole? That's right: the person following too closely.


No blame for the person driving unsafe equipment on a public highway?


Blame doesn’t have to be zero sum.

This seems to be 100% the fault of Tesla for selling unsafe equipment. Some amount of blame for the Tesla driver. 0% blame for the car that got cut off and hit the Tesla. And 100% blame for every car behind that crashed into the car in front of them.


Only half of the subsequent impacts in the Tesla video are the fault of the car who hit the car in front of them. I agree with the police conclusion that vehicles 5, 7, and 8 share fault for unsafe following. Vehicles 2, 3, and 6 do not, despite crashing into the car in front of them (subsequent to being hit from behind). Vehicle 4 I couldn’t develop a clear sense of what happened, but it seems was hit only from behind.


Some failures are sudden and unexpected.


I agree. But this should also apply to those drivers that experiment with FSD. After all, they were playing with tech on a public road, so deliberately putting others at risk.


As much as I agree with you, it seems that concepts like the 'brake check' and other road-rage behaviours make this matter a lot more complicated than common sense dictates.

Simply put, drivers don't use their common sense, and even their normal daily driving routine already endangers the people around them.


It does. And then also actually maintaining a safe distance can be difficult simply because people will immediately merge in there (and then brake in front of you, because of course now they're too close to the other car).

Maintaining a safe configuration of cars on the road is the collective job of everyone driving on it; conversely, it takes just one person to ruin it for everybody doing the right thing. So it really needs to be part of the culture of driving for that to work. And we don't have that in the US, unfortunately (though, to be fair, there are far worse places to drive).


Lane change + braking is extremely scary.

I rear-ended a woman who did that to me. When she cut in front of me in my lane there was not nearly enough distance between us for me to fully stop when she then suddenly braked.

I'm not sure what I was really even supposed to do in a situation like that — I suppose as soon as she cut over I should have just assumed the worst was coming and hit my brakes right away?


Yeah, in cases like this it's just physics. Probably not much you could have done. She cut into the space between you and the car in front; the normal reaction is to lift your foot to increase the distance again. If the new car hits the brakes, there's no chance of not rear-ending it...


Or at least keep your foot off the gas. I try to keep (or create, if I don't have it) distance between myself and aggressive drivers. But even with defensive/paranoid driving, sometimes there's just not much you can do; physics is physics.


In Germany you would be covered, because a lane change plus braking without reason is a deliberately dangerous action. This means that the one doing it is responsible for the ensuing crash.

For the multiple-car crash, if it's not clear cut and there's no video recording, all the insurers pool together and consider the drivers not responsible. Nobody can tell whether you stopped, got hit in the back and then the front, or the opposite.


Huh? In my (German) experience, responsibility for the crash (and thus for the sentence/fine/payout to each driver) is rarely assigned to a single driver in such a case. Instead, it is distributed among participants with e.g. the Tesla driver being 80% at fault for the second driver's damages and the others ramming the second car for 20% (or something like this).

I admittedly find it a little bit confusing how many people here assign complete blame to exactly one participant.


I asked my lawyer: the element that matters is the deliberate and dangerous nature of the action, done without reason. Normally you effectively have shared fault.

The question is whether in this case the action can be considered deliberate. The "software" did it; is that deliberate? Or is it considered a failure of the car, like the breaking of a mechanical part?

This is new, and the justice system will have to figure it out; I am very happy not to be the one who has to.


> I admittedly find it a little bit confusing how many people here assign complete blame to exactly one participant.

I'd say there is only one car which did a clearly illegal action (sudden lane change followed by brake checking), so the guilt is 100% on them.

It's also true that the car behind probably could've been more assertive in braking harder to avoid reaching the Tesla (I wish there was a dashcam to see it from a better angle). But they didn't do anything illegal per se.

Also, that second car's driver might have been worried about braking too hard and getting rear-ended themselves, which is a legitimate concern when having to brake hard in traffic. Of course, in the end they got rear-ended anyway, but they couldn't have known that before it happened.


I think the intention matters. If you did it to spite the guy behind you then you are at fault. If you did it because you thought something might cross the road and just wanted to be safe then it's not your fault even if nothing crossed the road or nothing was really there.


or lack of intention. If an engine throws a rod, if a driveshaft pole vaults, if a wheel falls off, or a suspension component catastrophically fails.

These are all realistic sudden stop issues that our laws mostly accommodate.


No, those are not "sudden stop" situations either.

In fact the worst thing to do is to stop suddenly.

Even if a wheel falls off, you do not slam on the brakes unless you want to lose even more control.


If a wheel falls off, the driver doesn't have to apply the brakes... A quarter of the car is digging in and melding itself to the asphalt.

If the differential fails catastrophically, it applies more braking force than the brakes do, so you'll have a brief period of being airborne and then at least one rear wheel will shear off (seen it).

If the driveshaft or trailing arms fail the wrong way, they result in a steel shaft digging into the roadway at a downward angle in front of the wheels, and the driveshaft will stop the differential as above, until it shears off.


> If a wheel falls off, the driver doesn't have to apply the brakes... A quarter of the car is digging in and melding itself to the asphalt.

That's not necessarily true. I've been in a car where the rear wheel parted ways with the car and passed us. For a few seconds we wondered where that wheel came from, until realizing it was from our car. The car was balanced just fine on three wheels; there was plenty of time to lift off the gas and make our way to the breakdown lane safely.

Even if it's a more heavily loaded wheel (e.g. the front wheel on a front-engine, front-wheel-drive car), it'll slide on the brake disc or the disc cover (whichever is lower; varies by car). A good amount of sparking, but you'll have enough steering to pull over.


Tesla’s FSD/autopilot does not act with intention as we understand it.


No intention, no fault. But definitely unsafe lane change, I think.


I agree with your original example ("good reason => OK", "bad reason => BAD"). But for the in-between we should default to "not OK" as well, since stopping on a highway is just dangerous.

But damn, I really have little sympathy for all the inattentive drivers who weren't able to stop in time for the pileup (and I hope damages have to be paid accordingly). Maybe Tesla FSD does not make the streets more dangerous than they already are, after all.


> No intention, no fault.

Nonsense.


Imagine a human driver saying that shadows on the road made him believe there was a person there, so he slowed down to make sure.

He wouldn't be at fault for slowing down (after having his eyesight checked).

The lane change is what makes the Tesla at fault here, not the unreasonable slowing down.


How convenient for Tesla. I doubt we would think this way if your wheel flew off your Civic because of a design flaw in the axle.


That's because we'd fully expect them to recall the affected models and fix the design flaw. Since Tesla has known about erratic lane changes and phantom braking for years, why haven't they recalled the vehicles and fixed the design flaw, or at least disabled the feature?


Why is this being downvoted? Intent isn't everything. While it is important, design flaws are worth getting the book thrown at you over.


In the video, the Tesla comes to a gradual stop and the car behind it doesn't change speed at all. The third car was able to stop without hitting the second car, and that was arguably more challenging. The second car wasn't paying attention. Maybe it was using cruise control?


Not a brake check; that implies cutting in and then dabbing the brakes hard enough to freak out the guy behind and force them to brake sharply, but without actually coming to a stop.

This was the Tesla braking as hard as possible to a complete stop. The following driver reacted quickly and had they not been paying attention the accident would have been much much worse.


That would still be your fault for following too closely.

Basically if you hit a car from behind, it’s your fault, every time.


> Basically if you hit a car from behind, it’s your fault, every time.

Incorrect. If you brake check someone (what the Tesla did), the front car is at fault.


From what I can tell from the video, there seems to be a grey area because the Tesla FSD is changing lanes and stopping simultaneously. If that's the case, it's not a tailgating issue but an unsafe lane change, for which the Tesla FSD (or Tesla driver) could be at least partially at fault.

The grey area will require some defense and it will be interesting to see if the Tesla driver is left high and dry by Tesla.


> In a pileup like this it's basically never the fault of the front car, unless maybe if they are purposely causing the accident for insurance fraud or something.

Quebec woman who stopped for ducks, causing fatal crash, loses appeal

https://www.cbc.ca/news/canada/montreal/emma-czornobaj-loses...


That’s a highway; you cannot randomly stop there. She didn’t crash into ducks.


The Tesla crash in question happened on a freeway in California, where stopping on the freeway is illegal.

It was also an unsafe lane change which caused the pile up, not just the braking.


> That’s a highway; you cannot randomly stop there.

Right, same as what the Tesla did, suddenly stopping on a highway (being in a tunnel makes it worse, as there's no runoff room to the sides).


In California, it is illegal for a car to stop on a freeway except in an emergency.



> it's basically never the fault of the front car

Because typically the car in the front stopped or slowed for a reason that does not violate any rules or responsibilities. But when they have neglected to follow rules, or uphold responsibilities, then they can share fault.

Generally speaking, drivers in the US have a legal responsibility to pay attention to what is going on and operate their vehicle with care.

Considering that the police report evidence includes the FAQ page from Tesla for the question “Do I need to pay attention while using autopilot?”, I think it’s clear what direction they’re going here.


It looks like the Tesla driver will be at least partly at fault for the collision with the car behind it. However, it seems the subsequent collisions were caused by fleshy human drivers driving unsafely. I don't know much about traffic law, so I'm unsure how responsibility for the overall pileup will be divided.

From the police report:

> V-1 made an unsafe lane change (21658(a) California Vehicle Code) and was slowing to a stop directly into V-2's path of travel. This caused the front of V-2 to collide into the rear of V-1 (A.O.I. #1). P-2 did not have enough time to perceive and react to V-1's lane change.

V-1 = The Tesla

> P-4 observed V-3 stopping and applied V-4's brakes. V-3 came to a stop to the rear of V-2. P-5 observed V-4 stopping and applied V-5's brakes. As V-4 slowed down, P-4 steered V-4 towards the #2 lane. Due to P-5's unsafe speed for stopped traffic ahead (22350 California Vehicle Code), P-5 failed to safely stop behind V-4 and V-3. The front of V-5 collided into the rear of V-4 (A.O.I. #2). V-4 moved into the #2 lane without colliding into any other vehicles. V-5 came to a stop in the #1 lane after colliding into the rear of V-3 (A.O.I. #3).

and it goes on from there...

https://www.documentcloud.org/documents/23569059-9335-2022-0...


Right so if there was a lane change right before the Tesla stopped then it makes sense that they might have at least partial liability for the second car. But they will probably have zero liability for all the other cars.


> Right so if there was a lane change right before the Tesla stopped then it makes sense that they might have at least partial liability for the second car. But they will probably have zero liability for all the other cars.

I know someone who had to suddenly brake due to a pedestrian jumping onto the road. So, for a very good reason in that case. Nonetheless, it was a sudden stop, so their insurance had to pay for the repairs to both of the two following cars which rear ended each other.


> In a pileup like this it's basically never the fault of the front car, unless maybe if they are purposely causing the accident for insurance fraud or something.

Brake checking (what the Tesla did) definitely makes the front car the guilty party. It's usually done for insurance fraud; here it was presumably just AI gone mad. But same result and same guilt.


If you change lanes to end up right in front of a car and then hit the brakes, you are at fault. The rear driver's responsibility applies only when they have control over the space between the two of you.


At least in the UK, it would very much be the fault of the driver in front, especially if there was that much CCTV footage available (there won't be, which is why dashcams are important - the US's obsession with blanket CCTV coverage scores a point here).

The second car had left a more than adequate stopping distance. The Tesla changed lanes close in front of it and then immediately braked as hard as possible, deliberately. The driver of the Tesla should lose their driving licence.

The drivers following the second car weren't leaving enough distance or paying enough attention.


the US's obsession with blanket CCTV coverage

Ha ha ha. London, UK?


No more and no fewer cameras than any city in the US.


That's just silly.


One of the things I found baffling about the US is the sheer number of cameras on everything.

People even put CCTV cameras inside their homes. What on earth?


I don't know what you mean by everything and I certainly agree that the U.S. trend is in the wrong direction with private video doorbells, license plate scanners, etc. It is also pretty crazy to put someone else's camera in your house. But I don't get how someone with a UK background could think that the situation here is somehow worse.

For example, this lists the top ten most surveilled cities. Nine are in China. Number three is London.

https://www.usnews.com/news/cities/articles/2020-08-14/the-t...


Well, mostly because the article that's always cited about the number of cameras in London is tabloid bunk, an utter fiction.

The guy that wrote it wanted a suitably shock-horror piece so he went to the main street of a shitty part of London, counted every CCTV camera he could see everywhere including ones inside all the betting shops, off-licences, pawn shops, cheque cashing places, and so on - all lovely totally-not-dodgy businesses I'm sure - and then multiplied up by the total amount of roads in the UK.

If the figures were accurate then every single-track road that's basically just a cow path with tarmac sprayed over it would have a CCTV camera every four car lengths, which is clearly not the case.


Well, OK. But how is that different than the huge number of cameras that you mention in the US? Aren't the US cameras similar - doorbells, shops, banks, ATMs, etc.


It will be interesting to see if most of the problems with FSD go away as soon as all cars have FSD as well as transmitters to signal to nearby cars what they are doing.

At that point humans will theoretically be the weakest link, and anyone driving "manually" will be a liability because they will lack the information and reflexes to deal with whatever is happening around them in a timely manner.
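
To make that concrete: a toy sketch of what such a status broadcast might contain, loosely modeled on the kinds of fields in the SAE J2735 Basic Safety Message (position, speed, heading, brake status). The field names and the UDP transport are invented for illustration; real V2V uses dedicated DSRC/C-V2X radios, not Wi-Fi broadcast.

    # Hypothetical V2V status message; everything here is illustrative only.
    import json
    import socket
    from dataclasses import dataclass, asdict

    @dataclass
    class VehicleStatus:
        vehicle_id: str
        lat: float          # degrees
        lon: float          # degrees
        speed_mps: float    # meters per second
        heading_deg: float  # 0 = north, clockwise
        hard_braking: bool  # lets followers react before they can even see it

    def broadcast(status: VehicleStatus, port: int = 47000) -> None:
        # One-shot broadcast; a real system would repeat ~10 times per second.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(json.dumps(asdict(status)).encode(), ("255.255.255.255", port))

    broadcast(VehicleStatus("car-42", 37.82, -122.37, 27.0, 285.0, hard_braking=True))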


That would be very interesting. Stand behind your product. If FSD becomes a public nuisance they will quickly become uninsurable or worse.

I was thinking today about the Southwest disaster, not only for customers but for the company’s reputation. But I know a great way to win it back: cash. Promise it won’t happen again, but if it does, offer best in industry cash compensation. Prove that your company gives a shit. I will be very disappointed if they expect time alone to heal this.


> In the short term I could see Tesla not supporting the driver and absolving themselves via fine line/TOS, etc

Imagine there is an 'autopilot' gun: you buy it, and it comes with a contract that says you take full responsibility for the gun.

Then it shoots me and kills me before you have a chance to react.

The prosecutor will go after the manufacturer. If the manufacturer wrote code that kills me, you and any contract you signed are not even relevant.

You cannot contract away criminal responsibility. Otherwise I could contract away all my responsibilities to a random homeless guy.


Unless there is precedent and tort "reform" by the given lobby.

https://en.m.wikipedia.org/wiki/Protection_of_Lawful_Commerc...


Tort is civil law. Criminal liability still can't be insulated.


FSDbeta is not enabled on highways. It's not clear to me that it's even possible to be on FSD in that tunnel.

Interestingly, the article is careful to say that the driver "claims" it was on FSDbeta.

More to this story.


FSD does not activate on freeways. This is not FSD. It's the same Autopilot that has been in use for many years.


Wrong. Here’s a video of FSD in use on a freeway. There are hundreds of these. https://m.youtube.com/watch?v=E-RHYbvIZYo


Wrong. You can plainly see in the video (and the hundreds of others) that as soon as the car enters the freeway it switches to Autopilot automatically. The easiest way to tell is the red lines in the FSD visualization disappear when it switches to Autopilot.

The transition is completely smooth and the driver doesn't have to know or care at all, so it's not surprising that they were mistaken. But it's relevant here because people are trying to blame this on FSD. On FSD cars the freeway Autopilot is still the exact same Autopilot that non-FSD cars have (with the "Navigate on Autopilot" feature which is more than four years old now). They plan to replace freeway Autopilot with FSD later this year, but it hasn't happened yet.


My take is that you are probably right in that it is pretty easy for Tesla to place all the blame on the driver of the Tesla and, in the short term, weasel out of any legally required support for the driver via fine legal print in the TOS.

But after the 10th (or maybe 100th) scenario where Tesla FSD is at fault but Tesla scapegoats responsibility onto the driver, who is branded by Tesla's legal team as dumb/irresponsible/clueless/reckless, it starts making less sense.

That driver is likely to have been a highly loyal Tesla fan/customer/evangelist/believer who paid $10,000 for FSD sight unseen. Long term, the evangelists might no longer evangelize and may in fact (correctly or incorrectly) spread the message to the general public that FSD is a useless/unfinished feature, leading to long-term damage to the Tesla brand.


It's too late; the Tesla fanboy subreddit has been on the FSD hate train for a while now. I'm shocked at how bad Autopilot is on my Tesla. My wife's 2018 Honda CR-V has better "autopilot" in the form of lane keep assist and adaptive cruise control: it can drive the car just like the Tesla, but it's less finicky and operates much closer to the way the writer describes a better Level 2 system, one that monitors you instead of you monitoring it.

The Tesla demands you pay attention and drive but penalizes you by disengaging if you so much as try to take a slightly different line around a curve. Meanwhile the Honda just bides its time until you let go and seamlessly takes control, keeping you in the middle of the lane.


> you are probably right in that it is pretty easy for Tesla to place all the blame on the driver of the Tesla

Whoa now, I didn't say that at all! I'm just talking about FSD vs. Autopilot here. It's entirely possible that Autopilot is at fault! I am curious to see what the investigation will uncover. My point is simply that this situation has nothing to do with FSD. It could have happened four years ago as easily as yesterday, and anyone trying to tie this to recent developments in FSD is being misleading.


If drivers are unlikely to notice the transition and the capabilities are significantly different that seems problematic.


It is not problematic because the capability of Autopilot on freeways is not significantly different to FSD on other roads. Drivers do not need to adjust their behavior based on the system in use. Of course they should keep hands on wheel and eyes on road at all times using both systems, which is enforced with regular nagging and a driver-facing camera.



Is there enough info yet to know if the lane change was initiated automatically? That's apparently possible. Tesla support site:

Auto Lane Change

To initiate an automated lane change, you must first enable Auto Lane Changes through the Autopilot Controls menu within the Settings tab. Then when the car is in Autosteer, a driver must engage the turn signal in the direction that they would like to move. In some markets depending on local regulations, lane change confirmation can be turned off by accessing Controls > Autopilot > Customize Navigate on Autopilot and toggle ‘Lane Change Confirmation’ off.


I believe the car was in "Full Self Driving" and not "Autopilot"?


It’s unclear because FSD is supposed to switch to autopilot on highways. The driver said it was in FSD mode but they may just not realize that switchover occurs.


And here is the whole problem. It’s not good UX.


Automatic switching between FSD and Autopilot is not any kind of UX problem. There is no reason the driver needs to be aware of it. The driver does not need to take any action at all. The only difference is the type of road. But which software was in use is relevant here because people are trying to blame this on FSD. Whenever you're on a freeway, Autopilot is used automatically instead of FSD, so FSD had nothing to do with this regardless of whether the car had FSD installed or not. This is the same Autopilot software that has been available and in use for many years.


> There is no reason the driver needs to be aware of it.

If the person operating the vehicle doesn't need to be aware of the difference, then the general public shouldn't need to either when analyzing an accident.

It seems baffling to me that we could imagine a scenario where it's less important for the driver of a vehicle to be aware of the difference, than it is for the public when reacting to a news article.

I don't think you get to have your cake and eat it. Either there is no relevant distinction, in which case the "well actually" response that "it's not FSD" seems unnecessary, or the distinction is important, in which case it seems problematic that drivers would be confused about it.


The difference comes when drawing inferences about the future development of Autopilot/FSD as these articles invariably do. This is completely different than day-to-day usage of the system. If you believe this was a failure of FSD, then maybe it's evidence we should stop the rollout of FSD or whatever. If it's a failure of Autopilot, then it is not evidence about the future behavior of FSD at all. FSD is a complete rewrite of Autopilot and is planned to soon replace it entirely including for freeway driving.


> The difference comes when drawing inferences about the future development of Autopilot/FSD as these articles invariably do.

I guess I just didn't take that away from this article at all.

It seemed like a more general complaint about Level 2 automation using this instance of a crash with FSD/Autopilot as an example. The distinction seems, to me, relatively irrelevant to the main point of the article. A key paragraph for me was this one:

> While, yes, Tesla’s system was the particular one that appears to have failed here, and yes, the system is deceptively named in a way that encourages this idiotic behavior, this is not a problem unique to Tesla. It’s not a technical problem. You can’t program your way out of the problem with Level 2; in fact, the better the Level 2 system seems to be, the worse the problem gets. That problem is that human beings are simply no good at monitoring systems that do most of the work of a task and remaining ready to take over that task with minimal to no warning.

So, I guess I don't see the point in distinguishing between FSD/Autopilot when the thesis of the article isn't specific to Tesla, but rather applies to Level 2 systems across all cars.


If FSD automatically turns into Autopilot on highways and the driver is not supposed to know, then it is the fault of FSD.


It matters not to the driver or the general public, but to audiences like this one, where knowing whether a 10+ year old mature system not based on ML did this, or a beta, in-progress neural-network-based system did this, makes a difference in thinking through the implications of such an error.


When responsible for overseeing a multi-ton vehicle, I like to know how it will behave. Cruise control and lane assist I understand. I need to know which one is in control and how it will behave if they can respond differently.


I like how my 25-year-old Range Rover has neither of those. Power steering and ABS is all you get.

Technically it does have cruise control, but I've never worked out what kind of road conditions it would be useful for and until I replace a 25-year-old dry-rotted vacuum hose under the bonnet it's not going to control a damn thing.


There is no difference in behavior when switching from FSD to Autopilot or vice versa. And there is never any ambiguity about which is active because it is completely determined by road type which you're obviously aware of while driving.


Then, other than pedantry, is there a meaningful difference?


Yes, FSD is essentially a complete rewrite of Autopilot which will soon replace it for freeway driving too. Autopilot has been available for many years and FSD is new. This accident could have happened years ago and is not evidence of any issue with the new FSD stuff.


But since there's no behavioural difference, it doesn't matter if it's a complete rewrite or not.


They currently operate on disjoint sets of roads, so they can only be roughly compared, and in rough comparison they do not behave differently. Of course if operated on the exact same road (which, again, is currently impossible) their actions would not be the same in every detail, just as different humans or even the same human driving the same road multiple times wouldn't. But that doesn't mean that there's necessarily any consistent difference that would enable a driver to use the information of which one is active to improve their predictions of the car's behavior.


I guess it matters because a full rewrite is much more likely to have serious bugs.


This sounds similar to Boeing's reasoning for not documenting the MCAS system on the 737 MAX.


I'm not familiar with Teslas, but to me the UX problem is that Autopilot and FSD sound like they should mean the same thing.


In any case, navigate on autopilot can change lanes on the highway without user intervention so it seems like an insignificant detail.


Not sure what you mean. If autopilot has a bug then there is a good chance a lot of other cars do too, since (afaik) autopilot is not at its core Tesla proprietary tech like FSD.


My point is that as an end user, there is basically no difference between autopilot and FSD if the FSD beta is enabled on the car. It's a technical detail that doesn't seem significant.

If the end user said they were driving on FSD, it likely means they are enrolled in the FSD beta and were using autopilot - in the car there is no distinction between autopilot and FSD. They are the same thing in the car if the FSD beta is enabled.


Sure, my point had nothing to do with the user experience but with the post-mortem on what to make of what actually happened and what needs to be done about it. In that context it is highly relevant. The person I was responding to seemed to be asking about which actual system was being used to navigate the vehicle.


What's the difference if you have a Tesla that is FSD capable but it is disabled? I bought into FSD because I had the cash but disabled it.


FSD is basically navigation on city streets. If you disable FSD beta, then you have plain autopilot on city streets (TACC + steering to stay in your lane).


I think it is proprietary. Other cars have similar features, but I don’t think they run the same system.


That this is even being discussed means it needed to be covered in ground school. Autopilot mode setting errors are a serious problem in aircraft operations. Read NTSB crash reports.

That's the trouble with this semi-automated mode. It re-creates a known problem in aircraft automation. And without the training pilots receive.


CHP also said it was in FSD mode.


This blind trust in technologies is very, very wrong. We are part of the system, and things should always be designed to keep us as a powerful fallback/system-degradation resource.


I'd agree, yet I proceed from green lights with little regard; I drink the water from the tap; I use Paypal; I take aspirin from a bottle from heaven-knows-where.

We've entered the age of 'blind trust in technology'. We can hardly get out of bed without it.


We can't verify everything ourselves; that's why there is a framework (green lights, laws, ...) and society (others do things you can't do).

In Tesla's case, the failure is that they are testing a new way of doing things without properly saying so; Muskito is pushing "it works" while it is actually only in the early stages.

This is a failure to society, like the 737 Max, and should be judged accordingly.


There is a "pass the test of time" factor involved. If natural selection approves, then you surely can be more confident and even pay less attention because time guaranteed that plenty of diverse cases didn't compromised the survivability of the system and the system needs you less to achieve its success rate as designed.

Then it comes innovation. Which necessarily starts with limited cases. And we use to launch business and products as soon as we can. Hence, with the minimum acceptable cases. This is fine as long as failures cause minor reparable degradations instead of catastrophic unsurvivable cases.

And still, we need systems designed with every layer having the feature of graceful degradations. That enables us to perform as Woody and Buzz Lightyear said: "Falling with style"


Disagree. We used coal to heat for generations, even though it killed thousands. We drive in traffic but 100 people die each day on the road in the US.

It's all about social acceptance, and hardly about safety at all.


Right, 100 in absolute numbers. But what fail rate would that be, as a percentage, when measured against all the cases of people pleasantly arriving at their destinations uneventfully?
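
Back-of-envelope, assuming the 100 deaths/day figure, roughly 3 trillion vehicle-miles traveled per year in the US, and an average trip of about 10 miles (all rough assumptions):

    deaths_per_year = 100 * 365            # parent's figure
    miles_per_year = 3e12                  # rough US VMT, assumption
    trips_per_year = miles_per_year / 10   # assuming ~10-mile average trip

    print(deaths_per_year / miles_per_year)   # ~1.2e-08 deaths per mile
    print(deaths_per_year / trips_per_year)   # ~1.2e-07 deaths per trip

So the "fail rate" is on the order of 0.00001% per trip, which is presumably why it reads as socially acceptable despite the grim absolute numbers.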


I have a VW Golf and it has what it calls "adaptive cruise control". What makes it different from my older (simple) cruise control is that it will slow down and speed up again as necessary without my intervention. For example, if I set the speed to 70 mph and it gets close to a car ahead that is only going 65 mph, it will slow down and maintain what the software believes is a safe distance. Similarly, if a car in an adjacent lane changes lanes in front of me, it will slow down if necessary.

I do use it, but less than I did the old system: I just do not find it relaxing because I cannot really grasp intuitively when I need to override it. On standard cruise control, it was obvious to me when I needed to take over. Therefore I am more rather than less vigilant than I was with the old system.

I don't want to be too hard on the Golf: it has other safety features I really like, such as lane assist and automatic braking. But I am not a fan of the adaptive control, and I think the article helped me understand why: it's a Level 2 problem!
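
(Aside: the gap-keeping logic behind ACC is conceptually simple. A minimal sketch, not VW's or anyone's actual controller; the time gap and gain are made-up illustrative constants:)

    # Toy ACC logic: aim for a gap proportional to speed (a "time gap"),
    # follow the lead car, and never exceed the driver's set speed.
    def acc_target_speed(set_speed, own_speed, lead_speed, gap_m,
                         time_gap_s=2.0, k=0.5):
        """All speeds in m/s; returns the speed to aim for this instant."""
        desired_gap = max(own_speed * time_gap_s, 5.0)  # keep >= 5 m at standstill
        gap_error = gap_m - desired_gap
        follow_speed = lead_speed + k * gap_error / time_gap_s
        return max(0.0, min(set_speed, follow_speed))

    # Set to 70 mph (~31.3 m/s), lead car doing 65 mph (~29.1 m/s) 40 m ahead:
    print(acc_target_speed(31.3, 31.3, 29.1, 40.0))  # ~23.5 -> ease off, open the gap

Part of the stress described above may be that the driver can't see the controller's desired gap or gain, so there's no way to predict when it will act.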


I do love the ACC on my car. One of the worst parts of road trips is following cars you don't want to or can't pass. The ACC totally solves it. I don't see how you were really able to use cruise to follow another car at all before? Even 1mph too slow is not great and if the car in front isn't using any cruise control at all it could be 5mph too slow and then 5mph too fast.


If you don't want to pass a car, you just drop down the speed on cruise control and widen the gap between you and the car you don't want to pass. It really doesn't matter if you are 10m or 100m behind them if you are not going to pass them.

And if you get too far behind them, you might bump up the speed a notch.

And if somebody gets into the gap, it's even better, because they are eventually going to overtake that car just like they overtook you, and overtaking one car at a time might be safer than two cars at once.


I can't say I've ever seen a disadvantage in leaving "too big" a gap between me and the car in front.

If someone wants to pass me and then sit up the other guy's arse, fine. I guarantee I'll be ten seconds behind them at the next road junction, or not much more. I don't think people realise that the relationship between road speed and journey time isn't linear - doing 90mph won't get you there appreciably faster than doing 60mph unless you're on a very long perfectly straight road with no junctions to turn off at.


So basically doing manually what ACC does automatically for you


The problem comes from having to mentally monitor an additional complex system (the ACC) in exchange for the meager attention savings gleaned from not having to closely adjust distance to the next car. That's "adjust", not "monitor". You'll still need to monitor distance to the next car either way, ACC is no excuse to drift off. For example, my non-Tesla ACC will randomly brake hard with no car ahead.

I personally use my ACC as an extra set of eyes. I set it and still manually adjust my speed using the +spd and -spd buttons on my steering wheel. If the car ahead changes speed and I somehow fail to notice it, my car will give me one more chance by automatically slowing down a bit. But every time that happens I treat it as an "oops" like airplane pilots do when GPWS (ground proximity warning system) alerts fire. It's great that automation averted the problem, but it became a problem due to the pilot's error.


Personally I tend to trust it and I only step in if I want to actually go faster and pass the slow vehicle (which usually happens, but sometimes I might wait 2-3 minutes "locked" to the other vehicle)


Yeah, just slower, while keeping a bigger gap. Still beats fiddling with the gas pedal.


I always have to write a novel on HN. This is a very common cross-country situation where I live: let's say I can't pass a car due to road conditions, the car's speed varies between 70 and 75 mph, and my desired speed is 75 mph. I know in 5 to 10 miles there will be a 1 or 2 mile passing lane I can use. In order to eventually pass that car I need to stick to within 150-250 feet of that vehicle so I can make use of the passing lane. If I just "drive slow", not only am I not going to be able to pass the car driving erratically, no one behind me is going to be able to either until the next passing lane in 10-20 miles.


I just did a 1500+ mile road trip over the holidays. During the drive, I was trying to figure out why old-tech (2005) cruise control worked so well.

Even a 1 mph differential in speed is a car length change in following distance every 10 seconds. A 1/2 mph difference is 3 car lengths of change every minute. That seems like enough that would be annoying, but it obviously wasn’t when two cars are following each other both on cruise control.
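
The arithmetic checks out (assuming a ~15 ft car length):

    ft_per_s_per_mph = 5280 / 3600               # 1 mph ~= 1.47 ft/s
    car_len_ft = 15

    drift_10s = 1.0 * ft_per_s_per_mph * 10      # 1 mph differential over 10 s
    drift_60s = 0.5 * ft_per_s_per_mph * 60      # 0.5 mph differential over 1 min

    print(drift_10s / car_len_ft)  # ~1.0 car length per 10 s
    print(drift_60s / car_len_ft)  # ~2.9 car lengths per minute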

I still don’t have a fully satisfactory explanation for something that I can easily observe (that it’s usually not annoying).


A 1 MPH pass doesn't take that long in absolute terms, but 15 car lengths (to maintain 7 car lengths in between at 70 MPH) still takes a solid 2.5 minutes to complete, during which you are nominally blocking the passing lane. (I say nominally because once there is a 2 car length gap between you and the other car, other drivers will be passing you on the right.)


Agreed/sorry. I was trying to communicate why it wasn't annoying to be the following car for long distances on cruise at X mph (always in one lane), where the car in front is also at "close to X mph", and where my cruise control has 1 mph increments via button pushes. Even being 1/2 mph off seems like it would be annoying, but it didn't seem to be.


I've puzzled over this as well. Surely, the other car's cruise control isn't calibrated to the same 1mph increments as mine, yet it somehow works out. I wonder if it's an interaction with the leading driver. Perhaps they consciously or subconsciously keep distance from the car behind?


In my car without ACC, I just hit the up/down buttons to adjust the speed for the car in front. Not a big deal when there's a small speed difference, but certainly not as convenient as ACC.


Some percentage of people (probably well into double digits) don't know about the up and down speed adjustment buttons.


I do. They aren't responsive enough and the people in front of me aren't maintaining a consistent speed, so I just switch back to the pedal unless I'm pretty much by myself. I rented a car with adaptive cruise control and really loved it.


In my case, the car without ACC is an RV and since driving it is like trying to push a small house through the air, it responds pretty quickly to speed reductions. Plus I tend to keep a pretty long distance from the car in front of me, so I have more time to respond. I like to pick a truck to pace behind since a lot of people don't want to get stuck behind a truck so they don't pass me then pull in behind the truck, even with 5 or 10 seconds of following distance. Another advantage of pacing a truck is that they usually keep a pretty steady speed unless they are going up a hill, but then on the hill I can usually find a truck that's light enough to keep a pretty steady speed uphill.


> One of the worst parts of road trips

this, so much

> how you were really able to use cruise to follow another car at all before?

not op, but you probably just don't, you disengage, cuss out loud and follow them manually until they come to a decision to stop hogging the lane.

PS: that's also why I hate the newer-style blocked-off HOV lanes here in sfbay, which seem to be a total trap and a magnet for idiots who drive 30 mph below the speed limit.


There is a reason that blocked-off HOV lanes are used: research has shown that slugging[1] will form spontaneously and achieve the goal of actually reducing the total number of car trips. It seems to need a bunch of things[2], and one of them is that the HOV lanes be blocked off rather than easily accessible. Barrier separation discourages cheating, which in turn rewards drivers who carry slugs rather than just cheat, and rewards slugs with faster, better commutes. The general consensus of road designers is that you need barriers on your HOV lanes to get slugging, and that without those barriers you don't really see as much of a reduction in car trips. You will see some increase in planned car-pooling, but the slugging is where much of the reduction comes from, because it's so much more convenient for everyone to do it anonymously at your own pace rather than having to tether your schedule to 1-2 specific other people. (One of you has a doctor's appointment? One of you has a sick kid? One of you oversleeps by 15 minutes? Now you are all screwed, one way or another. One of you can't get to work, and the rest of you can't take the HOV lanes. Planned car-pooling just doesn't seem to work long-term.)

Note that all of the research I'm aware of is US based, and I have no idea whether it generalizes to other cultures.

1: http://www.slug-lines.com/slugging/about_slugging.asp

2: Houston's Katy Freeway went from HOV-3 to HOT-2 (and from 1 lane to 2) and seems to have gutted its slugging culture. People weren't willing to get into a stranger's car anonymously by themselves (or have a single stranger in their car); it looks like 3 people, all strangers, is culturally different from 2. Also, HOT lanes seem to weaken slugging: even in DC, where it was most strongly entrenched, the switch to HOT lanes seems to have weakened the culture, because people say 'I'll just pay today' rather than pick someone up. So it can spontaneously form, and when it does it can be quite powerful, but it is very fragile.


Dallas has these blocked off lanes using essentially plastic bollards. I'm sure they've wasted more time and money via the crashes they produce from people darting between them and the unequal speeds on hov vs other lanes than any carpooling may have saved. I'm not saying that all blocked off lanes are bad but the implementation on US-75 in Dallas has been a horrible mistake.


I rented a RAV4 recently for not one but two cross-US trips (Seattle <-> VA). Its adaptive cruise control (like its lane assist) felt mostly familiar from previous rentals. It takes a lot of getting used to, if you’re from a time when cars just mechanically do what you tell them (failure modes aside).

That said, I have the almost opposite take! The adaptive CC wasn’t perfect, but it had a perfect failure mode all but one time: it just stopped working at all when its sensors got fudged in any way.

The one exception was a near pileup when drivers ahead of me came to a sudden stop in the middle of an urban ramp soup that’s always high risk. Even then it did the right thing: it screamed (beeped) at me to brake, because I needed to control that and it had no safe way of doing it for me.

I’m similarly more vigilant with assisted CC, but I’m more inclined to use it because it makes me more vigilant. I almost never used a plain CC without it because it’s too easy to get lazy at the wheel and make mistakes. Constantly checking on the known-limited computer kept me alert but let me rest my legs for ~10k miles over a month and a half.

And it saved my ass on a sudden ice patch in Nevada, which was the other time it didn't work exactly as expected. I don't drive with a computer in control, I only delegate to it for menial tasks like predictable speed adjustment. But I'm grateful I had the computer to help me regain traction coming around an icy bend, when the alternative was me, my pup, and everything we had in tow rolling off a mountainside.


On my Highlander the beeping is actually a warning that basically says "Handle this or I'm going to slam the brakes for you."


I also like it because without having to stay so heavily focused on my speed I can spend more time on situational awareness of the other cars around me than I would with traditional CC or no CC.


I use ACC even in short bursts when merging onto the highway or changing lanes. Turn on ACC, then look over shoulder to check blind spot without worrying about the car in front slamming on brakes while my head is turned.


This is the way. Driver aids should fill in gaps and give new capabilities, not replace driver attention. Sort of like DSC or antilock brakes.


I have an older BMW with ACC and had a Tesla Model 3 as well. The ACC on the BMW rocks: it reacts quicker than I do when a car slows in front of me, but never brakes vigorously. Phantom braking is unheard of. It only needs attention when the car in front leaves the highway and starts slowing down. But the manual specifically addresses this.

On the other hand, I found the dumbest 'cruise control' mode on the T3 downright dangerous; you always had to have your foot on the gas, ready to override it. They took something that was solved a long time ago and made it worse. I really do not get it.


I have adaptive cruise control on my Subaru Impreza and I love it. I use it on trips around town and on long road trips and it's great. I will never buy another car that doesn't have this feature.


My friend was using adaptive cruise control while towing a caravan. A car merged in front of him; his car applied heavy braking to maintain the desired distance (not necessarily to avoid a crash) but didn't compensate for the weight of the caravan. He almost lost traction, which led to heavy swerving. He's vowed to never use it again.


The use of ACC while towing is explicitly discouraged by my car's manual. There are also a number of other considerations that should be taken into account. These systems are practical but not magic, and users should try to understand them well before using them.


From the description it sounds like your friend was driving a car that could not safely perform emergency braking. Is that even road-legal anywhere?


To be fair most people who are driving cars aren't towing caravans, so this seems like an extreme edge case.

I heavily use adaptive cruise control and I've never had any issues with it, though I'm always ready to take over in case anything complicated starts happening. It's a nice feature because it saves a lot of energy when the driving is very easy, and then you just drive normally otherwise.


> this seems like an extreme edge case

Most ships don't sink, but we still have lifeboats. Most jet engines don't fail, but we still train pilots to land without them.

It's edge cases that kill.


In this particular edge case, it's the user's fault for using a technology that is not supposed to be used when towing.


The manufacturer could include a sensor in the towbar.

Did the manufacturer think about this and include a warning? If not, how should the user know?


> If not, how should the user know?

The standard move is to put this in the user manual.

Especially for something as complex as towing, reading the manual should be mandatory.


I think the new F-150 Lightning (and I would think equivalent ICE models) can handle that scenario.

I would think the manual likely said not to use it while towing. I'm not sure why one would be afraid of it, when used correctly, just because it didn't handle well something it wasn't designed for.


Doesn't his car have ABS? Even under heavy braking, the car's wheels shouldn't lock up and lead to swerving. Did the trailer have its own brakes?


As everyone is pointing out to you, ABS solves the problem of your car's wheels skidding, leading to loss of control.

It doesn't help with the trailer that you're towing failing to stop as fast as you do, trying to run into you, and trying to turn to the side when it can't.

The solution to the latter problem is to be aware that you have the trailer, and to not brake faster than it can handle. The fact that the trailer makes it harder for you to avoid an accident is exactly why many jurisdictions put extra restrictions on cars towing trailers (for example in CA they aren't supposed to go over 55 mph - even on the freeway).


If ACC speed adjustment led to the trailer losing traction and swerving behind him, then there's something not right with his towing setup. Even in a panic stop, I've never had my trailer swerve behind me. That's a serious hazard; ACC speed-adjustment braking should not cause any loss of control.

I've seen pickups with too little weight on the rear axle lock up the wheels and get pushed around by the trailer, especially when the trailer braking is adjusted too light, but with the advent of ABS that's not nearly as common.

The problem here isn't the ACC braking (even though many car makers will tell you not to use it while towing); it's the trailer setup.


At least in the UK, you're not supposed to have a trailer of such a weight that it would destabilise you if you emergency brake. That's dangerous and should be outside of the vehicle's/driver's specifications. Above a certain weight, you should be using a braked trailer.


Does ABS help if the car is braking faster than the trailer?


I believe you are thinking of skidding, not swerving.


I don't see how the trailer can force a car to swerve without any loss of traction through skidding.


Trailers normally have back-up brakes that engage as soon as there is pressure rather than tension on the tow bar. To allow reversing they also have a block that allows you to disable the backup brake (usually a lever that moves in to avoid the compression of the tow bar). I wonder if your friend had accidentally forgotten to re-enable the automatic brake after getting ready for the trip.


My Mazda has adaptive cruise control and it has absolutely revolutionised my highway driving. It is intuitive because the heads-up display gives you a distance to the vehicle in front, and you can set the distance you want to maintain. It tracks your progress/distance so you can see when it will slow/maintain, and it behaves gracefully if someone cuts in.


I don't know how the Mazda one works, but for Volvo you actually get 1-pedal driving if you want. Set the speed a bit lower than your expected speed (for example in crowded areas), then accelerate as you wish. Once you take your foot off the gas, the car is set at a lower speed, so it brakes for you if needed.

Ta-da! 1 pedal driving :-)


That’s interesting - haven’t seen that


I might be missing something but wouldn't it be perfectly fine to override it in the exact same cases you'd override regular cruise control? Sure, many of those might not be necessary; but if you do it, the experience shouldn't be any different from what you're used to, no?


I think their point being that, being adaptive, it lulls you into a sense that "oh, the system will see the car ahead slowing down and will slow down my own car for me."

Older "dumb" cruise control you had to be ready for the brake at all times — there was no need to second guess the system.


I have to do the drive from Portland to Seattle about once a month. ACC makes the drive so much more enjoyable. I can pretty much zone out until it is time to pass someone. Then, depending on speed differences, I move over and pass without intervention or I press the throttle a bit to pass more quickly.

When I get to the inevitable slowdowns, my car just slows without me having to think much about it unless it’s a sudden one.

I also have a Golf with standard cruise and I can’t stand using it since most drivers around me can’t maintain a constant speed, so I am always adjusting it or canceling it.


I've caught myself doing the same on occasion. I try not to do it anymore. If I may, I'd suggest replacing "zoning out" with "applying my freed mental bandwidth to monitor things that I would not otherwise have been able to pay attention to". IMO, driving aids should be used as emergency backstops and possibly a method of optimizing available attention. They are best avoided as an excuse to reduce overall attentiveness. For example, when ACC is on I can more reliably look a little farther ahead in traffic to preemptively react to developing risks, because I'm not micromanaging the throttle.


I can pretty much zone out

That seems bad!


In my driving style, ACC helps by letting me focus on only braking. Without ACC taking care of the throttle, I would have to micromanage my speed to keep it above an acceptable limit so I don't end up with annoyed followers.

With ACC I can decrease the amount of foot shuffling and just rest my foot next to the brake pedal in case something happens up front. My car model comes with emergency braking that can be toggled off, but it worked perfectly one time when a driver slowed down unexpectedly (it alerted me first, then braked).


I once rented one of those Golfs at the airport and specifically didn't like the Golf ACC (possibly it could be tuned/set differently). At least in my case, while driving in the leftmost lane it could "see" a car merging into the leftmost lane (to overtake someone going slower) at too great a distance, and it would immediately slow down even when it was unneeded, with really plenty of space in front of my car, then re-accelerate to the set speed as soon as the car in front returned to the right lane.

In slightly heavy traffic conditions on the highway, it was unpleasant/confusing.


This is the exact subject of the book The Glass Cage by Nicholas Carr. The airline industry avoids too much automation for this very reason.

Vigilance is hard to moderate when you start automating too much.


Don’t you just tap the brake to disengage adaptive cruise?


The problem isn't that disengaging the system is difficult. The problem is that deciding whether to disengage is difficult. If you're catching up to a slower car, maybe you should disengage the cruise control. Or maybe you shouldn't. You can't tell.

This means that cruise control goes from being something that relieves stress on you to something that causes additional stress.


Every single adaptive cruise control system I've seen allows the driver to set a follow distance that is far too short. For example, on my Subaru, 4 bars is really the minimum "comfortable, relaxing" amount. But if traffic starts to get a little heavy, 3 bars works a little better, but you do have to be more vigilant. If you are down to using 2 bars or a single bar, you had better be hyper vigilant. I can't imagine a safe way for the car to operate with those settings, unless perhaps they are useful for stop-and-go interstate traffic.

To make matters worse, every single car salesman I've encountered has the feeling that doing everything to the extreme is the best way to sell cars. Which means that it is totally, completely believable that the OP with his Golf had the salesman set it to a minimum follow distance and then not explain that they did this or how to change it!


I only have one car with ACC; it'd be really nice if it had half-size adjustments, because N bars is too close, and N + 1 is an engraved invitation for people to pass me up, but not immediately the car I'm following. But once they do pass that car, the ACC wants to speed up just to slow down again.


In my German vehicle, ACC reverts to max distance every time it's engaged. You have to manually shorten the distance every time with the steering wheel buttons. I hope that this behavior is standard, but wouldn't be surprised at all to learn different systems have a less safety-oriented behavior.


Neither of my ACC cars have this behavior! I agree that it would be safer to do it this way.


I found in a Tesla Model 3 that I borrowed that even the maximum follow distance was closer than I would normally drive. It didn't feel dangerously close, just slightly uncomfortable - closer than I would have liked for a maximum.


I'm not getting why you think you need to override the cruise control. I had that concern when starting out with ACC, but I set its follow distance to something generous enough that it would trigger a speed change before I got anxious about it failing to trigger the change. That let me learn to trust it.

Although now that I think about it, I was using the Subaru ACC, so the dash has a visible indicator if it's tracking a car and maintaining speed relative to that. Perhaps yours doesn't have that?


For me I think the issue is trust. It's not much different from what I feel sitting in the passenger seat of a car when anyone else is driving. I don't trust their driving, so if traffic gets slightly heavy, I start freaking out because I don't trust the driver. I also don't trust the ACC on my Subaru in denser conditions, so when it gets busy and I start approaching the rear of another vehicle, I go into a mode where I pay close attention to the approach, thinking "at what point do I cut it off and hit the brake?"


Got it. I definitely remember having these feelings. If you find them problematic you might, like me, benefit from talking with health professionals about anxiety. Feel free to email me if you'd like to discuss it further.


My car (Honda) tells me when it sees a car in front of me, I assume that if there's a car in front, the ACC will slow down accordingly, and so far, it always has.


Seems like the car should have a UI that says something like "Car ahead is 15km/h slower, I will brake to maintain 50 meters distance between us, current distance: 70 meters, approach speed: 4m/s, will start braking in 3 seconds.", so the driver knows that the computer is aware of what's going on. Use some pretty graphs/animations to make it easier to understand.
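
The numbers such a display would show fall straight out of kinematics. A sketch, assuming both cars hold their speeds until braking starts (the values mirror the parent's example):

    own_kmh, lead_kmh = 100.0, 85.0               # lead car is 15 km/h slower
    gap_m, target_gap_m = 70.0, 50.0

    closing_mps = (own_kmh - lead_kmh) / 3.6      # ~4.2 m/s approach speed
    secs_until_brake = (gap_m - target_gap_m) / closing_mps

    print(round(closing_mps, 1))        # ~4.2 m/s
    print(round(secs_until_brake, 1))   # ~4.8 s if braking starts exactly at 50 m

(It would be less if the controller brakes early to keep the deceleration gentle, which may be where the "3 seconds" in the example comes from.)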


I’ve had two cars with the feature.

Both have a very simple there/not-there car silhouette to let you know if it detects the car in front of you. Just below that they also have some sort of colored indicator to tell you how close the car is (good, close, too close). Last is a silhouette of your car. All three in a simple stack.

It’s an excellent design, perfectly glanceable. I did rent a car once that showed you how many feet away the car in front of you was. It was kind of interesting but I found it completely useless.

Realistically, when the car in front of you starts getting close enough that the indicator changes, the car is already gently slowing down. If it turns red, your car is slowing down faster.

You really don’t need any of the numbers, and they would likely take longer for a person to process than the simple visual imagery does.

I haven't had an issue of not understanding what the car is doing or is going to do. The only real problem I've experienced is that some cars are willing to accelerate/decelerate much faster than I want them to, largely for comfort reasons.


If you are catching up to a slower car, you don't have to disengage. That being said, I simply disengage whenever I feel like it, whenever I feel any doubt. It is easy to turn on again.


I would never want that though. It should change lanes and pass.


It’s very useful on long drives since you don’t have to constantly adjust your speed by little bits (manually or adjusting the cruise control setting).

Takes a moment to moment mental load off so you can focus on the other cars around you better.

If the cars ahead of me are slowing down significantly I disengage and go manual until I have a better idea what’s going on.

My car can supposedly stop completely that way. I've never tried it; as it gets very slow (relative to highway speed), it's willing to get way closer to the vehicle in front of me than I am. I trust it (in theory). But it's not worth testing.


I wonder why this is preferable to just taking the train for longer trips? Wouldn't it be nicer to sit back and let someone else do all the driving for however many hundred of you are all going to the same place anyway? Once you get past the long part of the trip you could always rent a car if it's rural or whatever.


The only train I could find that goes anywhere near my mother's house would take approximately 12 hours, with a 5 hour layover in a small town with ~nothing to do. I can drive it in maybe 1.5 hours. That's why.


Is this considered an acceptable situation to you and your community? What does your government representative say when you demand better public transit solutions?


The train would be nice if it was a reasonable alternative, but in the US outside of very specific trips, it's usually at least twice the travel time as a car, door to door, with two drivers and maybe 10-12 hours of daily driving.

And then you've still got to manage ground transportation from wherever the train station is to wherever you want to go.

It's great if you're a container though.


ACC is nice on freeways around town if traffic isn’t heavy (I like more control when it is). Long distances is where it really shines.

Really the train isn’t viable in most of the US. And then you have no car at the end, probably in a city with bad public transit unless you want to take taxis/Ubers or have friends/family drive you around.

So for long hauls, it tends to be a car or a plane. And that decision is heavily influenced by distance.


Kind of derailing things, but train usefulness varies by area. At least where I live, train journeys are usually more expensive and significantly slower than a comparable flight (maybe similar speed to driving, but still more expensive). And for a lot of rural areas it's not a viable option.


It can if they want it to. It’s a car.


FTA: > the system is called Full Self-Driving

Also FTA quoting Tesla (https://images-stag.jazelc.com/uploads/theautopian-m2en/repo...):

> It does not turn a Tesla into a self-driving car

Is it self driving or not?


Systems where we try to "outsource" the piloting of a vehicle seem to have a big problem with naming. "Auto Pilot" in planes does not mean that the plane pilots itself automatically; it just means you can pilot "less" than if you had 100% manual control.

Seemingly Tesla is copying this, where "Self-driving" doesn't actually mean the car will drive by itself, but that the driver can drive "less" compared to before.

Deceptive marketing at best, fatal at worst.


The thing that really bothers me is the 'full' part. Full self driving. There's nowhere to go, this is self driving to the maximum extent.

Oh and it's not self driving.


There is one additional part in play: the drivers following the Tesla clearly didn't keep their distance. Normally, at least here in Germany, drivers are supposed to keep enough distance from those driving in front of them that, even if something like a brake defect forces the car in front to a complete stop or a truck blows a tire, they do not crash into it.


They weren't following the Tesla; it moved into their lane.


There are 6 other cars that crashed, each into the one in front of it; it looks like nobody was paying attention.

If the Tesla was at the back or the middle I think it's highly likely it would have outperformed the other drivers and stopped faster or kept a safe distance.

A series of problems led to this accident. FSD Beta gets 5% of the blame here for stopping on the highway, which isn't behavior exclusive to Tesla: cars stop on roads all the time.

What this demonstrates is that 7/8 or 8/8 of the drivers here were driving unsafely.


> if the Level 2 paradigm was flipped, where the driver was always in control, but the semi-automated driving system was doing the monitoring, and was ready to take over [...] but would be less sexy in that the act of driving wouldn’t feel any different

I think this is the most important point of the article, and it is largely ignored here in the comments, which seem to focus mostly on who was to blame for this specific accident.

We know the strengths and weaknesses of both humans and tech at this point in time. Humans are overall better decision makers but aren't 100% focused 100% of the time. Tech gets confused a lot but is never tired or inattentive. So if your goal is safety you let the humans drive but take over in emergency situations when the human is not reacting. Which is what most car manufacturers do right now. Letting the tech drive and expecting the human to provide perfect reaction time every time the tech fails is playing on the weaknesses of both. This is focusing on cool marketing at the expense of safety.


> doing the monitoring, and was ready to take over

This isn't even uncommon. Almost every Honda sold for a while has been an L2 system that will take over in certain ways if the car believes a crash is imminent, such as a car suddenly braking.

> but would be less sexy

It's so less sexy people don't know that millions of vehicles are sold this way...


If there are fewer accidents per X miles with FSD, then I don't see how you can claim anything is at the expense of safety.


Tesla is primarily at fault for deceptively naming the function "Full Self Driving". It is indefensible mendacity.

I do not understand why the company has not already been sued into oblivion for an obvious lie that has killed people.


Video understanding engineer here.

Tunnels and underpasses are the worst. They are a pain in the ass, because shadows mess with all the edge detection and motion models and anything else visual. Humans compensate by thinking "I'm in a tunnel: things are weird." But without a reasoning model that can take context into account, the computer is stuck.

In the video from behind, you can see the shadow ahead of the car on the floor of the tunnel that it carefully stops just before it would hit. A person would notice that EVERY OTHER car had driven straight through the thing it thought was an obstacle, but that is also context this car isn't going to take into account.
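
To see why this is hard for a vision stack, run a plain edge detector over a frame with a hard shadow boundary. At this level of the pipeline, nothing distinguishes a shadow line from an obstacle edge. (A minimal sketch; 'tunnel_frame.jpg' is a placeholder path, and real systems are far more sophisticated than Canny:)

    import cv2

    frame = cv2.imread("tunnel_frame.jpg")           # placeholder image path
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Canny fires on any strong intensity gradient: object boundaries, lane
    # markings, and shadow boundaries all look alike to it.
    edges = cv2.Canny(gray, 50, 150)
    cv2.imwrite("edges.png", edges)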


Interesting observation.

I worked on autonomous vehicles (in vision) at Daimler in 1991. During one of our test sessions, on drying pavement, the vehicle abruptly slammed on the brakes and refused to proceed past a point where the vision system could see a set of horizontal edges, symmetric about a centerline, on the pavement. It tightly fit our (hand-coded, being 1991) model for a car ahead. We had to revert to manual control, drive back to our staging area, and wait for the track (a set of runways and taxiways at a disused airbase) to finish drying.

Obviously, the state of the art has significantly improved since then, but some fundamental risk of misinterpretations could easily remain.


Is this sort of problem an argument for lidar or similar? I'm assuming you're describing a camera only system.


There are so many tunnels in Switzerland, that using AP is a huge pain. But at least it keeps me attentive and ready to take over at all times.


This is an honest question, but why is the pileup so bad if driving conditions were ideal? Were the next 6 drivers not keeping a safe distance or paying attention?


Have you watched the video? It's freeway speeds, a decently busy freeway, at a tunnel entrance that's curving. Plus the car switched to the passing lane and abruptly stopped.

This wasn’t going straight on a highway stopping in the same lane the driver was already in.

I can understand how the conditions probably made it worse.


This is hardly the "passing lane"; it's just following a left lane exit onto Treasure Island. The entire bridge splits into three different freeways when it touches down in Oakland. You pass in whatever lane is moving fastest.

The tunnel is not curved. It's straight.


> Have you watched the video? It's freeway speeds, a decently busy freeway, at a tunnel entrance that's curving. Plus the car switched to the passing lane and abruptly stopped.

The Tesla switching lanes and abruptly stopping was a big problem, and I can understand that first car having trouble, but the cars behind the first one should ideally all be keeping a distance that leaves them ready to hit the brakes suddenly in an emergency. I find that, at least in Australia, a decent number of drivers do not keep an adequate distance from the car in front for emergency braking.


In the police report, out of the 8 cars, only 3 were noted to have been driving unsafely for the stopped traffic ahead. The rest of the cars were propelled into the car in front by the car behind them hitting them.


That makes a lot of sense. Doesn’t matter how careful I am if the person behind me only gave themselves 10 feet to react and stop.


Somewhat counterintuitively, and to the great frustration of the person behind, the safe thing to do when someone is tailgating is to leave more space in front of you, so that you can brake more softly and give them more time to react.

Or pull over and let them pass, as people often use tailgating to indicate that the car in front is too slow.

Edit: (tailgating = following too closely)
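
The physics of why the extra space helps, with some illustrative numbers:

    # Stopping from ~70 mph (~31 m/s) at a gentle vs. near-maximum deceleration.
    # The tailgater gets the whole stopping time to notice and react.
    v = 31.0  # m/s

    for a, label in [(3.0, "gentle"), (8.0, "near-maximum")]:
        t = v / a              # seconds until stopped
        d = v**2 / (2 * a)     # distance covered while stopping
        print(f"{label}: {t:.1f} s to stop over {d:.0f} m")

    # gentle:       10.3 s over 160 m
    # near-maximum:  3.9 s over  60 m

The gentle stop needs roughly 100 m more room in front of you, which is exactly the space the parent suggests leaving.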


I used to think the bumper stickers that said "the closer you get, the slower I go" were purely out of spite, but yeah, as it turns out they can be both for spite as well as for sound safety reasons.


Even the recommended 2 second following distance is insufficient when the vehicle in front of you comes to an immediate halt and there’s no room to switch lanes.


The recommended distance is not 2 seconds at highway speeds. 2s is a rule of thumb for <35 mph; for 50mph, it's somewhere around 4s.

But, in any case, it is your obligation to leave as much space as it takes for you to stop your car before it hits the car in front, taking into consideration the speed of both vehicles before braking and the condition of the road. The rule of thumb is there for you as a baseline; blindly following it does not automatically mean that you're in the right if an accident results.


The recommended following distance is "Never get closer than the overall stopping distance." [1], so even if the car in front somehow went from 70mph to 0 instantly, you should be able to stop.

[1]: https://assets.highwaysengland.co.uk/stay+safe+stay+back/HE_...


That is not possible. I know it is impossible, because I try. What happens when you keep that distance is that someone will change lanes into what they perceive as a huge space between cars. And they will get very close to you initially.


Expecting people to follow that far is like expecting people to read every TOS.


Dude, we’re talking about explaining away a Tesla failure here.

Pedantic analysis of DMV regulations is absolutely essential. Clearly any dolt practicing defensive driving would anticipate the self driving car cutting him off and hard braking.

Stories like this make people think that full self driving is actually a car that drives itself fully. The car is safer than any human, and by thinking critically about this we are preventing sales and essentially killing people, since Tesla has fewer deaths per mile travelled.


> Clearly any dolt practicing defensive driving would anticipate the self driving car cutting him off and hard braking.

Not sure if you are being serious. How can you tell whether a car is driving autonomously? Expect it from any Tesla?

I haven't consciously ever seen a car driving itself. And I like to think of myself being pretty defensive as a driver. I'm not even sure if the feature is available in Australia.


Of course I’m not being serious.

But if there was a video of a Tesla doing anything, the fandom of Tesla is such that defenders would appear.


> Pedantic analysis of DMV regulations is absolutely essential.

If you think following the most basic driving safety rule is 'pedantic', please don't drive.

> Clearly any dolt practicing defensive driving would anticipate the self driving car cutting him off and hard braking.

The Tesla was 100% at fault, nothing can be done when you get cut off like that. But that's not the case for every single car behind them that failed to stop in time.

> Stories like this make people think that full self driving is actually a car the drives itself fully.

In my opinion Tesla has been criminally misleading with their advertising. The level of gaslighting they've been able to get away with is unbelievable.


The difference is that not reading the TOS will not kill you.

It's also not just some arbitrary rule, it's a fundamental principle based on the rules of physics.


That doesn't reassign fault. It maybe adds more people to the "at fault" group.


It's very difficult for the cars behind to judge the level of braking. On a freeway everyone assumes the cars ahead are braking for a slowdown, not a full stop. On Italian highways there is a very practical custom: cars that see a full stop ahead not only brake but signal the full stop to the cars behind, so they know to brake fully and not just slow down.


That isn't the passing lane, as it's a bridge (not just a highway) and the leftmost lane before that area is an exit, so it's not guaranteed to be the "fast" lane.


Cars are traveling at 'freeway speeds' in the video, but they are breaking the law. The Bay Bridge is signed for 50 MPH. This issue is endemic. When I lived in the Bay Area, I'd set cruise control to 60 MPH on the Bay Bridge and would be practically run off the road by traffic speeding by.


I think it's mostly caused by folks making abrupt lane changes as the Tesla slows down. You can see people start to swerve into other lanes instead of just slowing down (they are looking in their side mirrors to avoid the slow car and sometimes even accelerate). I've had this happen: you are cruising at speed, the person in front of you swerves to avoid a slow car, and now you lose all of your time to react by changing lanes as well.

I’d even wager if that pickup hadn’t swerved and everyone just slowed down it wouldn’t have piled up

Side tangent, I love watching car crash videos. Really interesting to see how the system breaks down and people make mistakes. I spend hours on YouTube sometimes :)


> Side tangent, I love watching car crash videos.

Your and my YouTube feed algorithms....

They kind of creep me out when it is a crash that was not avoidable. But, boy, most of the crashes have me talking to the computer display, "What are you thinking? You have no stopping distance at all, I know exactly what's coming up in about 3 seconds!"


Exactly! Sometimes when I’m driving I get a weird sense of a terrible crash that I couldn’t see coming. I should probably give the videos a break


The car behind the pickup truck had almost 0 stopping distance. To me, it looks like the pile up could have been prevented there too. Good reminder to keep a safe distance.


> I've had this happen: you are cruising at speed, the person in front of you swerves to avoid a slow car, and now you lose all of your time to react by changing lanes as well.

It's worth anticipating this by watching the other lanes and what drivers are doing. If you can see a slow down happening in an adjacent lane then you might reasonably expect at least one of the drivers approaching to pull over in front of you.


You might like J.G. Ballard then.


Honestly, when you're driving, you sometimes only see the car in front of you. You might see it slowing down first, and you start slowing down, or continue because there is enough distance anyway. Then the car in front of you stops abruptly or crashes. Now you panic and brake hard. Maybe you still crash into the car in front. Even if you stop before crashing, this hard braking causes the car behind you to crash into you.

TL;DR - Sometimes the 3 second following distance is just not enough even if someone is paying attention because they can only see the car in front of them.


> You might see it slowing down first, and you start slowing down, or continue because there is enough distance anyway.

That's the thing, though - if you were already at the distance limit, and the car in front of you starts slowing down, you have to also start slowing down right away to maintain said limit. If you do not, then it's already "not enough distance" by definition at that point, and you're the one responsible for that.


> you have to also start slowing down right away to maintain said limit

Absolutely agreed. What I meant to highlight is, the reaction time can make a huge difference in situations like this.


>TL;DR - Sometimes the 3 second following distance is just not enough even if someone is paying attention because they can only see the car in front of them.

Yep... the usual advice locally is 5 to 7 seconds gap for freeways and similar, before accounting for weather and other conditions. Of course, that assumes you can leave a gap without some bastard deciding to sneak in and occupy it.


> without some bastard deciding to sneak in and occupy it

Unfortunately, this. It sucks to keep a larger gap, have some idiot sneak in, and now you have to brake to maintain the distance again, repeat. FWIW, I am very careful to maintain distance just for situations like the one in this post.


On this bridge you're always going to have someone occupying open space, but you can still maintain 5 seconds of visible leading distance. Alas, a lot of drivers on this stretch of freeway are insane.


A good analysis. The stated vigilance problem is well known, actually, but it is ignored, supposedly in the hope that humankind is on a progressive path to fully autonomous self-driving, and that we need this phase of experimentation to advance technology.

Completely autonomous self-driving cars (without any steering wheel, so even incapacitated or clueless people may 'drive' - drunk, in labour, or children) indeed seem like a good solution. (Except we need less individual traffic for environmental reasons.) Unfortunately, the problem is very hard, technologically, and the current interim solutions will stay for a while.


Vigilance would be hard in this scenario because the Tesla changed lanes and then suddenly braked. My vigilance would be focused on whether I'm about to hit something or make an inappropriate merge, but I wouldn't have expected a sudden brake.


I'm always curious how different insurers cover the use of hands-free driving. Would anyone still buy the FSD feature if people thought insurers would reject accidental damage claims? It feels like it's sitting on sketchy ground.


I think the insistence on having data for every possible situation for training purposes is indicative of the problem. Humans only require a small amount of training and can extrapolate this to many situations.


This is really very wrong. Human beings create exactly this kind of accident every day. There is absolutely nothing remotely "non-human" about cutting someone off and stopping in traffic.[1] Just go check /r/idiotsincars for an hour to see much, much worse.

[1] In fact, this is so unlike FSD's behavior that I still think it's more likely that it will turn out not to have been in use at all. The only evidence at hand is one sentence in a police report that the police themselves state was unvalidated. How easy would it be to blame the car as an excuse?


The bar for FSD should not be /r/idiotsincars. What doesn't add up to me, however, is the need to have training data for every possible situation that a vehicle could encounter (FSD should reason from more fundamental axioms). Also, the "cameras only" argument makes sense only to a point. Yes, we drive largely by 400-800 THz photons. However, the eye has incredibly good anti-occlusion abilities, whereas a stationary camera can be occluded by a speck of dust.

In any case, I am not taking an anti-self-driving stance by any means - we'll get there eventually. Tesla is leading the way and taking all of the negative press.


People lie about having FSD enabled all the time. And the internet loves shitting on Tesla so they'll treat unverified claims as fact and then perpetuate it forever as long as it fits their narrative.


I don’t disagree. I think what is happening is we all thought FSD would be available by now. Since it is not, we are now collectively dismissing it. The world will be much better with FSD however so let’s give it a chance.


Technically it is - it's just in beta. There has been a full rollout. Anyone who's purchased it can use it. And obviously needs work, but it's also not the death trap people like to make it out to be.


It literally drives me around every day. Seems pretty good for a product that is "not available".


If it is ever necessary to take over then it is largely pointless. The only value is helping us get to the point where you can chill in the back seat.


Don't you feel the goalposts moving here? You went from "Humans never make mistakes like FSD does" to "FSD doesn't work yet" to "FSD works, but I don't think it has value yet". And you got called out on all of those and shifted.

Maybe you could agree to settle on: FSD works pretty well, seems not to be unsafe vs. human drivers, but isn't finished, and is improving rapidly? Might have saved us all some time.


Moving goalposts and having a nuanced view are not the same thing. I agree with your statement above except that human drivers are still obviously better on average. Looking forward to when that changes.


Can someone please explain to me why the system cannot tell you why it stopped? What prevents the program from explaining why it performs certain actions?


I'm sure Tesla has logs. Also in the visualization when it's slowing for something it shows such object in blue or red.


Very common in machine learning.


Obviously, the Tesla FSD Autopilot made a big mistake, but didn't the driver react? If my car slows from 40 to 0 MPH in 5 seconds, and there is no obstacle in front of me, I most assuredly would step on the accelerator.

Not that FSD Autopilot is what it's marketed as, but this is the responsibility of the driver and not the car.


This story also highlights just how little Elon Musk actually supports free speech.

Since fighting for release of this video and publishing the story, Ken Klippenstein has been censored on Twitter through shadow bans and inability to find his profile through search.

https://twitter.com/stevanzetti/status/1613295292283236358


Ken was deprioritized from search weeks ago. Also he wasn't shadow banned - just deprioritized (you have to type his full name to see his result). Can we please stop with the unfounded hyperbole?


Why was Klippenstein deprioritized? Elon Musk is disingenuous in his support for “free speech” if Twitter is limiting the reach of a prominent Musk critic.


I don't work at Twitter - only someone who does would know the reason why. Being a Musk critic shouldn't give you a free pass to never be rate-, search-, or visibility-limited like anyone else when your account gets flagged for suspect behavior. His own colleague claimed it was likely due to bot-like behavior. Any speculation that it's Musk-related is completely unfounded unless there's literally any evidence beyond circumstantial speculation.


In other news, no car has ever seized up and stopped at an inconvenient moment before Tesla made this horrible mistake, and keeping a safe distance would not be necessary if it weren’t for those dang automated drivers. Now get off my lawn!


I had an engine seize up on the highway in college - the car doesn't just grind to a stop, you still have a lot of control and speed - and I didn't change lanes into the left lane; I moved to the right slowly with my hazard lights on.


It’s great that you were lucky, others are not. And it’s not a big deal, it’s a fact of life. Except when it’s a Tesla, of course.


From what I can tell, the car behind the Tesla didn't crash into the Tesla but stopped before it. And so did the car behind that one. It is the cars after that that crashed into each other.

While automatic cars doing random things is certainly problematic, clearly the cause of the crash here isn't the Tesla; it is the other cars not respecting minimum safety distances and not being able to stop when there is a traffic jam ahead.


In my opinion 90% of the reason we don’t have more accidents is because people are so predictable. Watch dangerous drivers weave through traffic - it’s only possible because everyone follows spoken and unspoken rules. Randomly stopping is not some “you need to be prepared, it’s your fault” moment.


A car switching into an empty lane and then stopping hard is even more so.

If a human driver did it deliberately as the Tesla did (since it was designed to pull over and stop) then I would consider it a criminal level traffic violation on the part of the stopping car.


I had an accident exactly like that. The car in front of me fully stopped all of a sudden as I was accelerating. This was right past the lights. No traffic or anything in front of him. I don't know why he stopped. My insurance told me he was 100% at fault.


Here in the UK, that happened to me on a roundabout. Elderly driver in front pulled out and then, for no visible reason, slammed on the brakes. I ran into the back of her.

It was deemed to be 100% my fault. Here, if you run into the back of the car in front, it is always your fault. No exceptions.

It makes sense: you should always leave enough room to stop, so I couldn't really complain. But, if everybody did that on that particular busy roundabout, it would result in gridlock.


I've heard a story from a friend that was in an accident like that. A police officer who did the paperwork smiled and said "It's the most common type of accident - people just stop paying attention to the car in front that started moving, and develop sort of a blind spot to that".

I've been driving for several years, and can say my brain relies heavily on this sort of predictable stuff, like "that car has a free intersection in front of it, it will move forward".


I crashed at an intersection into the driver in front of me. We started moving, and I looked to see whether there was traffic coming from the left; the driver in front of me had stopped for no reason. Still my fault, but in some situations, with multiple dangers, you rely heavily on people behaving predictably.


>It makes sense

If you think like a good citizen, yes.

In a country north of UK the law was once the same. People with worn down cars would find the perfect opportunity to slam their brakes, intentionally get hit from behind and claim insurance.


> Randomly stopping is not some “you need to be prepared, it’s your fault” moment.

Drivers are indeed expected to always be prepared for sudden stops even at the highest speeds.


The other day I was driving behind a car on a country road, he did an emergency stop to avoid a rabbit in the road. I avoided hitting him, I stopped about 2m from him because I was careful to leave enough space. If I had hit him I'm sure the insurance companies would have placed blame on me and it would have been my insurance that paid out. But ultimately it would have been his "fault" for doing something utterly unpredictable and dangerously stupid.

If it had been a large deer, of course he should have stopped, at that point it's the safety of the people inside the car.

In law, for insurance purposes, it needs to be clear-cut: the person behind is almost always at fault. But that doesn't mean they are the cause of the accident in all cases. There is nuance to these things, and part of that is that braking for a rabbit, or using Level 2 automation, increases the chance of an accident happening on the road.

If the way you drive increases the risk of someone driving into the back of you, even if they haven't left enough space, you are at fault in my mind.


This reasoning makes very little sense to me. Why drive assuming that it will be a rabbit most of the time and not a child or a deer or whatever you deem worthy of stopping for?

If the possibility exists that a driver in front of you may stop suddenly why not always leave a minimum amount of space for a safe stop considering there is virtually no downside to doing so?


> This reasoning makes very little sense to me.

The reasoning doesn't make sense because it is applied retroactively. The parent poster doesn't want to take responsibility for their unsafe driving practice and tries to place the blame on the other driver.

Cars suddenly stopping in front of you isn't some unpredictable rare event. You are told this as part of the licensing procedure, and this is the reason minimum safety distance exists. The solution is obvious and known to any licensed driver. As such, it is unreasonable and unacceptable, to not follow the minimum safety distance.


That's not my point. You should, and I do, always leave enough space to stop, but not everyone does.

Being a safe driver means being aware that other people, including the people behind you, may not be leaving enough space. Suddenly stopping for a rabbit from a speed of 50 mph is f-ing stupid and massively increases the risk of a serious accident on the road.

Using Level 2 automation that can randomly stop for no reason is not safe driving; you are increasing the chances that someone who is too close drives into the back of you.

Just because someone else is driving unsafely doesn't absolve you of the responsibility of being aware of that and driving safely yourself.


You should be able to stop at any time for any reason and not worry about the car behind crashing in to you. They should leave enough space. A rabbit is a very good reason. Maybe it's not a rabbit, maybe it's a rabbit-shaped rock that would cause serious damage to your vehicle, causing you to leave the road and die.


As long as there are valid reasons to do an emergency stop, the actual reason should be irrelevant for the person driving behind. You should always assume it is a child chasing their ball or something and it will be your car pushing the car that managed to properly stop into that child if you rear-end it. There just isn't a good reason to put yourself in a situation like that and it is perfectly in your power to avoid it.


I cycle a lot in an area that is quite touristy. There is a lot of shared infrastructure with pedestrians, rollerbladers, scooters and skateboarders. You get very good at spotting people who don’t know what they’re doing that are potentially dangerous usually because they’re oblivious to their surroundings.

For example, pedestrians crossing a bike path. Because a lot of people clearly don't walk often, they will just walk out without looking. People aware of their surroundings will look both ways. As soon as you see someone do that, you can pass close to them without spooking them, because you know they're aware.

My point here is a ton of this comes down to acting predictably. Even a simple act like looking at someone will alleviate a ton of uncertainty.

The barriers to fully autonomous self-driving are huge and not necessarily technical. Acting predictably, being able to explain actions, drivers driving differently because another vehicle is automated and cultural differences.


Suddenly stopping at high speed is generally considered (at least in the UK) dangerous driving and will get you prosecuted and probably mean you lose your license at the very least. If it results in a death then you can get a life sentence in prison for it.

Intentionally stopping inside a tunnel is a pretty clear cut case of dangerous driving over here.


However there can be many reasons why a car has to stop suddenly: for example, a child might suddenly cross the road, or the driver might feel he or she is having a heart attack.

My driving school teacher used to say: always remember that the car ahead of you could suddenly stop at any time for a reason that you might not know.


This happened on a highway. The Tesla switched lanes into the path of traffic moving faster than itself, and immediately started braking for no reason. On top of this, it all happened at a tunnel entrance.

Yes, you should remember that vehicles can suddenly stop. This means keeping a safe distance from the vehicle in front of you, and looking beyond that vehicle to anticipate what is going to happen. However, a vehicle illegally throwing itself in front of you and doing basically an emergency stop because it just feels like it is not something a reasonable driver is expected to predict.

If you look at the video, you can see that the first vehicle is almost at a full stop when it hits the Tesla - its driver performed as expected. The second vehicle comes to a full stop before hitting the first one, so that's also going quite well. The third vehicle (a rather large truck) doesn't brake but instead swerves into the other lane, almost hitting another car. This dooms vehicle number four, which suddenly finds itself mere feet away from a stopped vehicle. Five through seven are tailgating and never see it coming for the same reason. Number eight is able to stop in time.

The speed limit here is 50mph. Assuming they were driving the speed limit, the typical stopping distance would be 174 feet. I can assure you, nobody keeps a distance of 174 feet between them and the car in front.
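For reference, that 174 ft figure falls out of the standard two-part model (reaction distance plus braking distance); the reaction time and deceleration below are common chart assumptions, not measurements from this crash:

    # Stopping distance = reaction distance + braking distance
    REACTION_S = 0.68        # 'thinking time' assumed by many charts
    DECEL = 0.67 * 32.2      # braking deceleration, ft/s^2 (~0.67 g)

    v = 50 * 5280 / 3600     # 50 mph -> ~73.3 ft/s
    reaction = v * REACTION_S           # ~50 ft
    braking = v**2 / (2 * DECEL)        # ~125 ft
    print(reaction + braking)           # ~174 ft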


In this case the Tesla stopped because of what you might charitably call a breakdown. This would probably get you out of the dangerous driving conviction but if it happened multiple times would probably result in a recall on the vehicles to fix or disable the faulty part.


The longest delay on a motorway I've ever had (get out of the car, walk the dog, hang around for a couple of hours in 32 °C heat) was a case of the driver of a small car having a heart attack, slamming on the brakes, and getting rear-ended.

The car ended up half its length.


From what I gather from the article the car didn't suddenly stop.

I still suspect it would be classed as dangerous driving, as there wasn't a need to stop and they didn't find a safe place to stop.

If it did suddenly brake though and it was the car, that seems like something Tesla should be liable for. The time taken for any driver, when a car starts automatically braking, to assess the situation and override isn't going to be enough to avoid the dangerous situation.


The article links to another article with a video recording of the accident: https://theintercept.com/2023/01/10/tesla-crash-footage-auto...

The car does stop pretty suddenly.


Not only does it stop suddenly - it applies the brakes, signals into a new lane, changes into the new lane, and brakes hard. Actually, if you watch the video frame by frame, it starts to initiate a left-hand turn for the lane change before it even signals. If a driver did this to me, sure, I'd hit them.


> If it results in a death then you can get a life sentence in prison for it.

Can you provide an example of someone getting a life sentence because of an accident caused due to braking? That doesn't sound right.


I think they're referring to the offence "death by dangerous driving", which can now carry a life sentence[1], but you would really have to be doing something exceptionally bad to get one - it would have to be deliberate, likely a repeat offence, with other aggravating factors. Real sentencing in the UK is very lenient - famously, an American woman killed a teenage motorcyclist while driving on the wrong side of the road, claimed diplomatic immunity, and fled the country; she eventually returned to the UK to be sentenced to... an 8-month suspended sentence.

[1]:https://www.sentencingcouncil.org.uk/offences/crown-court/it...


She didn't return. She deigned to attend the trial via video link from the US.


This is completely untrue in almost every aspect. Drivers being prosecuted, let alone convicted, for sudden braking simply does not happen.


That's not the way the law works in the US. People are legally responsible for hitting something in front of them, but not for getting rear-ended after slamming the brakes. This is partially why the Tesla system is so trigger-happy about braking. Brad Templeton goes into the design/legality issues around the accident in some detail here:

https://www.forbes.com/sites/bradtempleton/2023/01/11/an-8-c...


Do you have a citation for that? I have always understood the rules to be that you should always remember the car in front of you can make an emergency stop at any time for any reason, and it is always your fault if you hit them (unless they just moved into your lane).

Conversely if you need to stop suddenly, e.g. something has crossed the road in front of you (or you think it has), you don't worry about the vehicles behind, you just stop.


This is a case of phantom braking, which is illegal.

People often forget about the most important traffic rule: it is not allowed to cause an unsafe situation.


If you crash into the car in front of you in your lane, the responsibility is yours both morally and legally. The only car with a valid excuse is the one immediately following the Tesla, since the Tesla inserted itself into its lane and then immediately stopped, not allowing it to build up a safe distance.


This is not the case in Germany. Unnecessary emergency braking will leave you at least 50% at fault.

You are putting other people into an emergency situation which is endangering everyone around you.


The video does indeed show that this happened.

The driver's account, as quoted here, doesn't even make sense. In the first sentence it says he was driving along in lane 1, the car slowed, and he "felt an impact."

The next part is contradictory, citing a different lane (2), a different cruising speed, and then a lane change... and no impact. WTH?


Are you surprised you can be guilty of an accident when you have not hit anyone?


No. Why do you ask?


I had an accident exactly like that. The car in front of me fully stopped all of a sudden as I was accelerating. This was right past the lights. No traffic or anything in front of him. I don't know why he stopped. My insurance told me he was 100% at fault.


Would he still be 100% at fault if he was stopping for a child you didn't see?


Why do you think that matters here?


> If you crash into the car in front of you in your lane, the responsibility is yours both morally and legally.

It's not that simple. If it were, it would be legal to brake-check someone, and it would be that person's fault for crashing into you.


The cars stopping because there is an accident ahead aren’t doing something unsafe. You running into them is your fault.


This is not true. If you cause an unsafe situation, then you are certainly to blame, regardless of other traffic rules that you think may be relevant.


This seems like a rather absolute take


Illegal? Intentional bad driving is illegal, yes.

But you would need to prove that incorrect automatic braking is intentional, which has not been proven in any court I know of.


> From what I can tell, the car behind the tesla didn't crash into the tesla but stopped before.

The second video here clearly shows it crashing into the Tesla:

https://theintercept.com/2023/01/10/tesla-crash-footage-auto...

> While automatic cars doing random things is certainly problematic, clearly the cause of the crash here isn't the tesla

The Tesla changed lanes, moving in front of the second vehicle, and immediately applied the brakes. That is 100% the Tesla's fault.


> clearly the cause of the crash here isn't the tesla, it is other cars not respecting minimum safety distances

The cause of the crash is the Tesla. You are not allowed to stop on bridges, in tunnels, and several other places. The crashes starting with the Nth car and not the 1st is normal: the reaction time of the first car eats into the reaction time of the second, and so on, until there's no more time to stop. Understand that cars further back do not see the car in front of them applying the brakes and slowing down; they see a car moving at normal speed instantly crash. The minimum safety distance is not as big as reaction time plus stopping to zero; that would be huge.


Not sure I understand your reasoning. The further you are from the first car, the more warning you get about what's going on, not the other way round. I typically watch not just the car in front of me but 1 or 2 cars ahead. If I see those cars braking, I start braking. It's the car immediately following the Tesla that got the least warning.


It is not always possible to see 1 or 2 cars ahead, that can not be the standard for who is at fault.

edit: to make it clear, we are debating whether the Tesla is at fault, not whether the other could have avoided it. Creating a dangerous situation still puts you at fault, you can not be allowed to do so at any time on the grounds that everybody else should avoid you.


And yet, I make it a point to achieve enough space for reaction time + stopping to zero as often as possible. Everyone should.


Good for you. It is not always possible though, and this is why we have laws against braking or driving at significantly lower speeds for no reason.

Creating a dangerous situation is not acceptable on the grounds that other drivers should be able to avoid it.


Wouldn't "acquiring an acceptable safety distance" be a good enough reason?


Also, since they don't expect the need to come to a full stop, they don't apply enough braking force.


It's not illegal to stop on a bridge or tunnel if required to not crash, surely?


Of course not, but the Tesla was not about to crash, so I don't understand your point.


> clearly the cause of the crash here isn't the tesla, it is other cars not respecting minimum safety distances

Try to park on the highway and claim the people who crash are at fault and see how it plays in a court of law


The drivers who crashed were obviously wrong, but stepping on the brakes in a 70+ mph tunnel is incredibly dangerous. The Tesla created this situation, and holds most of the blame.


I was thinking the same. One earlier article about this had a rather inflammatory photo of some twisted-up cars next to an empty baby buggy. But if you watch the video, that’s clearly where some human drivers ploughed straight into the back of the pile-up.

The Tesla autopilot failure is really bad, for sure, but those human drivers should be banned for life. There’s no excuse for ramming into a traffic jam because you weren’t paying attention.

Edit: it occurs to me that possibly I'm being overly harsh here. Is there something about the dynamics of traffic that puts the cars three or four slots back at greater risk when somebody unexpectedly stops? I would assume that the immediately following car is at the greatest risk, but after that everybody is at successively less risk, as they should all be slowing and should all see each others' brake lights.


I believe there is a dynamic effect.

The first driver needs some reaction time, so he'll start braking after the Tesla starts braking. Which means, assuming they were at the same speed initially, that the 1st car will have to brake a bit harder than the Tesla.

Then the second car will be in the same situation: it will have to brake a bit harder than the first. 5 cars later, you are at the hardest possible braking power, as it's ultimately limited by tyre grip.

So, if there's not a larger gap somewhere in the string of cars to allow for braking less hard than the car in front, it's more or less inevitable.
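A back-of-the-envelope simulation of that cascade (every number here is an illustrative assumption, not data from this crash):

    # Each driver reacts REACTION seconds after the car ahead brakes, so
    # with a short gap the required braking force escalates down the chain.
    V0 = 73.3          # initial speed, ft/s (~50 mph)
    GAP = 50.0         # bumper-to-bumper gap, ft (tailgating)
    REACTION = 1.0     # driver reaction time, s
    GRIP = 0.9 * 32.2  # max deceleration the tyres can deliver (~0.9 g)

    decel = 15.0       # lead car's braking, ft/s^2 (~0.47 g)
    for car in range(1, 7):
        stop_ahead = V0**2 / (2 * decel)         # braking distance of car ahead
        room = GAP + stop_ahead - V0 * REACTION  # space left once we react
        if room <= 0:
            print(f"car {car}: no room at all -> crash")
            break
        decel = V0**2 / (2 * room)               # deceleration now required
        if decel > GRIP:
            print(f"car {car}: would need {decel/32.2:.2f} g -> crash")
            break
        print(f"car {car}: needs {decel:.1f} ft/s^2 ({decel/32.2:.2f} g)")

With these numbers the required braking climbs from ~0.5 g to beyond tyre grip by the fourth or fifth car; with a 4 s gap (~293 ft at this speed), the required braking instead falls down the chain, since each car gains more room than its reaction time costs.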


That's a nice explanation, I'll buy that!

To mitigate it, each driver needs to look 2 or 3 cars ahead, not just at the car in front. Everybody should do that, but it's understandable if they don't (and it might be hard to see clearly, especially in a dark tunnel).


Good luck maintaining a safe distance on a busy highway when every time you slow to make a space, somebody moves into the space from a different lane.


Traffic jams happen. Do you really expect drivers to just ram into the back of them?


Traffic jams usually have some sort of sign, a slowing of traffic, a stream of brake lights ahead.

It's a different situation when someone is slamming on their brakes from highway speed with nothing ahead of them.


Exactly this. A traffic jam, by definition, is never running at highway speeds. If it were, it wouldn't be a jam.


Why not let them take the space?


Because then you have to slow down to keep a safe following distance again. Then it happens again, and again, and again.


Why is this a problem?


One car driving significantly slower than the rest of traffic can itself be a safety problem:

"To keep a smooth traffic flow, some highways also have minimum speed limits. If you drive slower than the minimum speed you can halt the traffic flow and create a dangerous condition. Even if there is no minimum speed limit, those driving too slow can be as dangerous as those who drive too fast."

https://dmv.ny.gov/about-dmv/chapter-8-defensive-driving#yre...


You don't need to drive significantly slower than others, just slightly slower so a new gap appears in front of you.


The conversation you are in:

1. It's the fault of the driver behind, for not leaving space

2. It's impossible to leave space, because cars will move in if there is a large gap

3. Why is this a problem?


It is not impossible to leave space. Granted, you need to do it every time a car moves in front of you. But such is life.


Is it legal to print out a "stop" sign, attach it to the back of my car, and watch all the Teslas doing full stops on highways?


I'm still waiting for autonomous-only lanes. For extremely long and boring highways (usually full of trucks), this just seems like the logical conclusion of all self-driving tech to me. Dedicate one lane to self-driving, have the vehicles all on autonomous mode, and have them keep enough distance so their braking performance prevents them from crashing in an unexpected stop. You should be able to literally just sleep until your exit comes up in 250 miles. For any unexpected problems, have the vehicle turn off onto the shoulder of the highway and stop.


We have this technology today. It's called trains.


Trains aren't really comparable to autonomous lanes.

An autonomous car can travel on both normal roads, and in an autonomous lane. Roads are everywhere -- there's a road leading up to most people's place of residence.

Trains rely on tracks which are not as common as roads. Trains are also not owned by individuals. Most people do not have a train station outside of their house.


Perhaps we could dedicate the rightmost lane on every highway to becoming a train line instead of an autonomous vehicle lane?

A train can move more tonnage (read: people) per unit of energy (kWh, gallon of diesel, etc.) than almost any other mode of transportation (gosh darn sailboats and barges being so efficient!).

So why have hundreds of electric motors and separate batteries to move people, when you can skip the batteries, use an electrified third rail, and run just an engine or two pulling a train with those hundreds of people?


The nearest thing that might be described as “highway” is 20 miles from my house. It’s the last (few dozen) mile problem passenger trains in the US have always faced for everyone who doesn’t live in the Northeast or SoCal.

Guess where the two passenger lines that come even vaguely close to profitability are?

The population of the US is very coastal, which doesn’t map well to a hub and spoke system like rail. Contrast with Western Europe where A -> Paris/Berlin -> B is probably reasonably close to a straight line.


This is a great idea, but wouldn't paving more road be significantly cheaper and quicker than building rail infrastructure?

I'm all for the US pouring money into rail, but the fact is that we're _very_ bad at big infrastructure projects right now. That's not to say that we can't get better though.


The problem we have seen over the last 100 years in highway infrastructure is that demand always increases to fill capacity. The greatest example of this is LA, which has some of the largest highways in the country but still has bumper to bumper traffic.

You need to think about it on two metrics:

(1) density of transport; how many people per unit volume

(2) power to # of passengers.

On both counts trains are vastly superior to almost all other modes of transit besides boats.


> Roads are everywhere -- there's a road leading up to most people's place of residence.

But the parent's comment was aimed at the one-lane-for-self-driving-cars idea.

So I don't see how your comment could apply to it. Yes, we have roads leading up to almost everyone's house. But in the case of a dedicated self-driving-lane proposal, it would be as impractical as giving everyone a railway track leading to their home.

> Trains are not owned by individuals.

Exactly. So instead of building infrastructure to improve moving a few people at a time, improving infrastructure that helps move a few hundred at a time seems like a better investment.


> Trains are also not owned by individuals.

Planes aren't either, what's your point? That people who need to go somewhere should be priced out of using _public_ infrastructure and require a personal investment currently in the tens of thousands for the cheapest autonomous cars?

> Most people do not have a train station outside of their house.

People have feet or a wheelchair, bicycles are a thing, trams and buses exist - have you ever even been outside of suburbia?


My overall point is that trains will not replace cars in the United States any time soon, so they would not be a very good replacement for autonomous driving lanes.

> Planes aren't either, what's your point?

My point is that the comment I replied to implied that trains can replace cars. Trains cannot replace cars because they are a form of public transportation, which cannot easily replace cars.

> That people who need to go somewhere should be priced out

This is quite the leap. My comment did not argue the merits of either, merely that trains would not be a very good replacement for autonomous vehicle lanes.

> have you ever even been outside of suburbia?

This isn't useful to your argument. Why are you resorting to a personal attack?


Autonomous Driving Lanes world-wide: 0 kilometres.

> so they would not be a very good replacement for autonomous driving lanes.

Railways world-wide: 1,400,000 kilometres.

> Why are you resorting to a personal attack?

Are you seriously considering it a personal attack that I pointed out your personal experience might influence your arguments to the point of not being credible?


> Planes aren't either, what's your point?

If it weren't because of price, I'm sure everybody would prefer flying private.

Flying commercial is pretty much as awful as it gets.


> If it weren't because of price, I'm sure everybody would prefer flying private.

Oh, OK, the only reason we shouldn't move the 10,000 daily passengers between LAX and SFO in 10,000 planes a day is because of the price.

/s


Plenty of privately owned planes, even privately owned airports. Not many privately owned railways beyond a few garden toys that are more like giant rideable models.


JR East has 12,500 bits of rolling stock and 50,000 employees. BNSF has 8,000 locomotives and 41,000 employees. They're listed on the stock exchange and clearly have more than a few garden toys.

Also, who implied that railways need to be privately owned?


It doesn't, as trains don't have the convenience of your own car when it comes to where the stops are.

If you had a train you could drive your car onto it would be more equivalent.

Also train lines have far less coverage than highways.


Not to mention, with a car you get to decide your own schedule. If I want to take the train where I am, I have to be there at 4:30am or... well, that's it. That's the only time.


Poor argument for underfunded and purposefully stymied train development in the early to mid 20th century.

Just look to Europe for what a large swath of the US could have if it invested in train infrastructure.

Train good, car bad.


I live in Europe and commute by train. I need to order tickets 10 days ahead to get the cheapest price, and they're non-refundable as well as non-changeable.

That means I have fixed times; there is no flexibility if I wish to leave earlier or need to stay longer. If I do, I lose the initial fare plus pay for a new ticket that is 3 times more expensive.

Ticket prices also vary depending on time, which means I have to leave really early and come back 13.5 hours later. And I'm exhausted.

1 out of 8 trains is cancelled or has severe delays, and passengers are often rude or annoying in some way that makes it difficult to relax.

Trains are not the blessing you think they are.

Edit: Forgot to mention that I am lucky to live 10 min away from the station. If I lived where I would like to, it would be 30-40 min.


Trains in Europe are not magical and the best they could ever be.

They could be a lot better in a lot of ways. In Europe there is still huge investment in car infrastructure. And while land use isn't as bad as in the US, it's still far from optimal.

I do dislike the selling of these specific tickets. I much prefer a system where you can just take the train.

When I used the ICE in Germany I would never get a reservation, just a general ticket. So if I missed one it didn't matter.


Sounds annoying, which country?

Where I live the national railway carrier lets you refund or exchange your tickets and seat reservations freely up until 15 minutes before the departure time, most ticket types are valid for a day (so you don’t have to buy one for a specific connection) and the prices are, I believe, more or less fixed. The delays are inevitable I guess in most larger networks.

I don’t take the train that often, but my experience is generally pretty good, especially as I’m starting to hate driving.


You have options that would give you more flexibility on travel times; you simply value the lower cost over the convenience. That's not a train problem, that's a choice you have made.


> Poor argument for underfunded and purposefully stymied train development in the early to mid 20th century.

If I had a working time machine, plus a magic wand to make people do what I want, that might be a good argument. I don't have either, though. The world (or at least country) we live in didn't do that, and the mid-20th century was quite a while ago. Now what? Even if funding for rail magically cranked up today, we won't have ride-able lines for a decade, at best. Meanwhile, I've got places to go tomorrow, and rail isn't a realistic option.

> Just look to Europe for what a large swath of the US could have if it invested in train infrastructure.

Out in my part of the country, we don't have anything like Europe's population density. Your "could have had" would be very uneconomical out here.

> Train good, car bad.

Spare us the thought-terminating cliches, especially when they're wrong. Your post isn't convincing, but this parting shot doesn't make it any better. It just makes you sound like you're a propagandist.


Europe has cars and highways.


Sure, we do, but taking the train into Copenhagen for work is a far better option for most here. Thing is, when you invest in rails instead of parking, you change the math more than "Europe has cars and highways" could describe.


> with a car you get to decide your own schedule

Not in heavy traffic.


Can a train pick me up directly at my home, and drop me off at pretty much any destination, with more luggage than I can carry at once, at any time of the day, at the exact moment I want to leave? No? Then trains are an alternative to self-driving cars, not a replacement.

Public transport is great, but you aren't going to convince others to support it by pretending it doesn't have any drawbacks compared to private transport.


Reminds me of how I used to build railways everywhere instead of roads in Sim City. But surely that wouldn't work out in real life?


In Tokyo, at least in the city proper, it's easy to feel you're fine with just JR + subways.


It is possible to move around Tokyo without cars to some extent, but there are still far more roads than there are railways. It’s nowhere near being a car-free zone, and many people who don’t drive would still need buses or taxis to get to the train station.


Possible? Millions of people do this daily in Tokyo; I'd say it's more than a possibility. I have always been able to get to Tokyo Station or Shinagawa/Shinjuku via JR rail or the local subway, so a car is definitely not needed either. Shaving 10 minutes via taxi, yes, that is a bonus.


Not everyone in Tokyo lives so close to train stations. There are places inside the 23 wards where the nearest train station is more than 3 kilometers away. You could walk, but it would take around 50 minutes.

I should also add that if you want to move around the west region of Tokyo, you're better off using a bus because there are no trains there that let you move north or south. The Tobu Tojo Line, Seibu Ikebukuro Line, and the JR Chuo Line all run in the east/west direction.


It works in most of the world’s cities.


Statistically speaking, the vast majority of the world’s cities have absolutely zero local rail transportation.


yes, but we already have a network of asphalt, not rails.


Here is a map of the rail network in North America (just to confirm that it does not exist /s)

https://www.arcgis.com/apps/mapviewer/index.html?webmap=96ec...


Clearly rails exist. I highly doubt you, or anyone, believes that rails reach the number of destinations that roads do. Yes, you could argue to move more by rail for most destinations and cover the last mile by truck. But with that change I still think we're running into contention for that left lane...


I agree that the rail network is really not sufficient in North America, but it's worth investing more in extending this network.


Maintenance cost of rail is significantly cheaper. Don't sink more into bad design.


It costs Amtrak over $300k a year to maintain a single mile of track on the Northeast Corridor. For comparison, it costs around $30k to maintain a single mile of US interstate, i.e. only 10% of the cost of maintaining rail track. In what sense is rail cheaper to maintain, exactly?


Citation needed. I can believe that we spend $30k a year filling potholes on a mile of interstate but every 10 years we have to resurface it and every 30-50 years we have to completely rebuild it.


Just Google it, it’s not hard. Are you really surprised that rail also costs money to maintain?


Good point! Let's dedicate a fraction of it, say the left most lane, to becoming railway lines!!!


It’s obvious at this point that the US gov doesn’t want to invest there, and it’s too hard for the private sector to go and build train routes through land.


All of Elon Musk's grand ideas for revolutionizing terrestrial transportation (that aren't laughably unworkable like miles long vacuum enclosures) are just 'trains but much smaller and with tires so that we lose all efficiencies and gain all of the drawbacks'. So this fits perfectly.


The difference to trains in Musk (and Silicon Valley's overall view) is that with automated cars, you don't have to share your personal space with other people (the poors!). It's easy to make fun of the overall project but there's elitism behind it - have your private space at the cost of the public.


Well, it's fun to make fun of Mr. Musk (and his fake physics credentials) for more or less copying a 100-year-old idea and design and making it flashy again.

Yes, the hyperloop is just a rebranded vacuum train/atmospheric railway... patented in 1799 in London... Yup... Let that sink in.


It's really quite amazing how similar things are today to 200 years ago. I've been reading some philosophical literature from the 19th century, and if you just change a few key words you'd imagine it was written today.


Ha!

Thanks for cutting through the meandering, pseudo-intellectual cruft with an answer that makes too much sense.


Trains are great for moving lots of people efficiently and quickly from one place where they don't want to be to another place where they don't want to be.


With good planning and a little time (unfortunately measured in decades) you can build infrastructure around the train network. Make sure you have a good connection from home to a hub, develop infrastructure around the hub for shops and offices, and add train lines to factories for factory workers as in Tokyo; combine that with an integrated schedule as in Switzerland and you get quite an attractive urban experience. But yes, that is a major shift from America's car-focussed infrastructure.


Said no one in Europe, ever.


In the US, 200 years is a long time, and the train doesn't go where you're going. In Europe, 200 km is a long distance, and it might.


In Singapore, 50 years is a long time and the train goes where you're going.

I'm not saying the US could pull off a Singapore, but there's no excuse for San Francisco, or LA, or any other large city not to have done so.


We have highways everywhere... We could definitely dedicate the leftmost lane of every highway and convert it into people-moving train tracks. Just because we haven't doesn't mean it isn't economical or feasible.


> I'm still waiting for autonomous-only lanes.

Lanes are really expensive; it's unreasonable to have an autonomous-only lane, so a shared lane is the baseline. I propose that vehicles in autonomous mode be restricted to the slow(est) lane, where it's safer to come to a sudden stop, and where the driver, manufacturer, and insurer have to contend with the probability of being flattened by an 18-wheeler.


There are dedicated lanes on highways all over the country. They exist solely to let people pay extra to avoid traffic. If that's reasonable, I think it's also reasonable to make them autonomous.


Paying extra[0] helps build and maintain the expensive roads - had I mentioned roads are very expensive? Why should autonomous vehicle be rewarded? To be very clear, I think they ought to be 'punished' due to the risks they present to others (read the fine article).

0. We also reward high-occupancy for reducing traffic, and no-/low-emission vehicles for reducing pollution, both results have clear public benefits.


No, you cannot sleep in the driver's seat of anything less than level 4 autonomy. This is independent of the presence of autonomous-only lanes or "smart" roads with NFC signaling of lanes or rules, or even additional sensors; these only extend the situations under which the current autonomy can function. They don't grant new capabilities.


Given how few SDVs there are and how expensive they are, this would basically be a Rich People Lane, and further incentivize the most wasteful, climate-impacting method of travel.

Lanes for busses and vans? Yes. SDVs? Maybe in several decades, and they should still be for HOVs only.


Makes sense. We have HOV lanes, which reward behavior that is good for the environment. Even better is an autonomous lane that rewards not adding latency to the prisoner's dilemma; it seems the ultimate altruistic move, someone wanting not to impede or slow others.


Isn't the primary motivation of HOV lanes to reduce traffic, not help the environment?


Let’s not forget that such lanes are just a tiny fraction of the road system, whereas lanes for Teslas would have to be nearly everywhere. And that’s skipping things like “Tesla safe” exits.


It's both. Single-person EVs could use HOV lanes in some locales until they've reached proliferation numbers.


Let's reward rich people who can afford self-driving electric cars that tend to crash into things when they aren't paying attention by giving them their own highway lane.


Put a high enough toll on it and it's a progressive tax. Hell, you could modulate the toll in line with registration fees, which are linked to vehicle cost.


I don’t think that would be a solution to the problem in this article. It would just concentrate the distracted drivers into one lane.


Easy solution: just use the truck lane.


This looks like the vehicle tried and failed to get the driver to engage, then pulled over as best it could. There is no shoulder there, so…

Let’s wait to get more info as to what the driver was doing and if they were incapacitated.

The FSD beta is pretty aggressive on making sure the driver is paying attention via both steering sensors, the in-cabin camera and the touch screen.


> if they were incapacitated.

You can see them get out of the vehicle and walk around a bit in the video[0], so I guess they were not incapacitated - see around the 46s mark in the video. The driver also claimed they were using FSD at the time the car performed the maneuver.

[0] https://www.reddit.com/r/teslamotors/comments/108kpgo/footag...


another angle: https://twitter.com/kenklippenstein/status/16128488720611287...

The driver was on I-80. Unless this is a Tesla employee, Tesla FSD is not enabled on interstate highways at this time; when you're on a major no-stops no-traffic-light highway, it defaults to Autosteer+TACC, with his subscription likely only enabling the auto-lane-change feature. All this means is that the likely culprits are either TACC stopping on its own as if there was a car there, or the AEB system performing an emergency stop to avoid hitting a (phantom) person or car; but based on the footage, it looks like a TACC failure since it doesn't look like an emergency stop you'd see in IIHS videos[0].

0: https://youtu.be/liQxaeGPHJg


FSD vs navigate on autopilot is a pretty small technical detail on the highway. Once you turn on FSD beta, from the driver's perspective FSD and autopilot are the same thing - turned on with the same input (double tap on the right stalk), with the same stuff shown on the display.

Navigate on autopilot can change lanes on the highway and follow highway exits without user intervention.

IMO when someone says they're using "FSD" they mean they're subscribed/paid for FSD, have the FSD beta turned on, and are letting the car drive itself (whether that's FSD on city streets, or navigate on autopilot on the highway.)


> turned on with the same input (double tap on the right stalk), with the same stuff shown on the display.

Small nit, but this isn't true. FSD will show the newer, much more detailed preview. When switching to highway mode it will revert to the old, plain grey-lines preview.


> pulled over best it could

If that is its best then FSD has no business being on the road.

It was erratic and didn't clearly signal its intent to the other cars.


IMO the car signaled, changed lane and stopped without slamming the brakes. It didn't collide with anything, the pileup was formed by human drivers driving too close and too fast for their reaction time. If it had been a human driver that saw a kitten or a rock in the road, it would probably slam the brakes.

I'm not defending Tesla's FSD, which I think is inferior because of Tesla's reluctance to use LIDAR in addition to cameras. Current car AIs are imperfect; they need as much info as they can get if we expect them to drive more safely than humans.


If it caused a crash, it stopped too quickly. Unless there was an immediate, life-threatening need to stop, it should not have stopped at all and instead continued until it was safe to do so. Any halfway competent driver could plainly see there is no safe place to stop there, and you should never stop there unless your car is literally on fire or the road bed is missing.


It also stopped in a bad spot for human eyes. From the rear-facing shot, it looks like it was just past the transition from a brightly lit stretch into a dark tunnel. Human eyes need a moment to adjust to that. It's quite possible, especially as the backup grew, that people literally couldn't see the brake lights.

That's not to excuse anyone, but it may have been a contributing factor.


I’ve seen videos of some other system (Mercedes? VW?) stopping when the driver had cruise control on without paying attention.

It put on its hazards and gradually slowed to a stop. Nothing fast or abrupt. Easy for any driver paying attention to recognize and be ready for.

It was nothing like what happened in this video.

Personally I doubt this was the car trying to stop safely due to driver inattention. Seems more likely an AutoPilot/FSD bug or the phantom braking all the way to a stop at the worst time.

Or terrible driving by the driver if they had been in full control and are lying about FSD.


A week ago, I Turo’d a Model Y for a week. We were driving around, 35 mph on a two way road with a bike lane.

We were going past a bicyclist in their lane appropriately. I had auto pilot engaged. As we passed the cyclist, they moved over towards us slightly, but never left the bike lane.

The car suddenly went into a full stop, which was a pretty quick stop.

I can understand that the car may have thought the cyclist was going to move in front of it, but it wasn't a scenario in which a human would have slammed the brakes. My wife commented that we're lucky there were no cars behind us, because they might not have reacted as quickly.

To me, this shows the potential danger of these safety systems' false positives. Not that I won't continue to use the systems, because I believe that overall they make me safer than driving without them.

But as always, the danger isn’t the car I’m in, it’s the one driven by another driver who’s not paying attention.


Stopping in the left lane of the Bay Bridge where the lanes are narrow and there is no shoulder is extremely dangerous. You can see the predictable results.

If the sole problem was that the driver wasn't paying attention, the least it could have done would have been to put on the hazards and very slowly come to a stop. It would have been even better if it simply continued until there was a safe shoulder to stop on. The chance of having an accident by stopping on the bridge is much higher than just driving on autopilot to the end of the bridge and then pulling over.

If the problem is that this would create a moral hazard where drivers stopped holding onto the steering wheel when on bridges with no shoulder, the car could play loud obnoxious noises inside the car when this happened.


Right, it's dangerous, but it's equivalent to a flat tire or something else causing any other car to stop. I only skimmed the video once, but it looked like the car slowed down quite a bit, and as others mentioned, a car slowing in front of you, even in a congested area, doesn't excuse others from following too closely, driving too fast, or not paying attention, because something could happen to any car that makes it stop fairly quickly.

I was watching the video and the whole time I kept thinking that if I saw my car slowing down for no reason (and my Autopilot, not FSD, has "ghost"-braked before), I would have immediately taken control. Why didn't the driver do that?

> the car could play loud obnoxious noises inside the car when this happened.

I'll be interested to hear what happened there, but I know that if you don't toggle the wheel or take action when on Autopilot, the car will flash blue and beep at you to take control, and if you keep ignoring it, it'll disengage Autopilot until you stop and park the car.

I haven't used "FSD" much at all so I'm not 100% familiar with it but there seems like a host of problems here and the software powering "FSD" is but one of those. It saddens me though that people aren't asking the best question here which is why are we driving everywhere in the first place?


> IMO the car signaled, changed lane and stopped without slamming the brakes

Which is erratic and reckless.

If it was an emergency situation it should have put on hazard lights, waited until all surrounding cars passed, then changed lanes and very slowly decelerated until it stopped.

All while aggressively alerting the driver what is happening.


Right, if someone is incapacitated, putting the life of that person and others on the road in danger is still not the right move. It doesn't matter what the driver is doing, any driving system that puts other people on the road in danger does not belong on the road.


Other systems will stop if no driver input is detected too. They slow down much more gradually and turn on the hazards so other drivers can safely react.


Because this is the first car pile-up we've ever seen.

Interesting how, because a computer was involved, we are incensed. "No business being on the road"... but for the millions of other car crashes, "oh yeah, well, humans make mistakes!"

Tesla's claim of being "safer than human drivers" is as much a testament to their technology as it is to how poor human drivers are.


Seems to me that the cars behind the Tesla are the ones at fault?


No, you cannot swoop into a lane and stand on the brakes. Nobody was following the Tesla until he put himself directly in front of an overtaking car with not enough room.


Yes, the Tesla was possibly at fault for the first collision. It was not at fault for the 7 subsequent collisions.


I’ll bet that’s not an argument his insurance company will try to make.


The Tesla had its turn signal on for a solid 5 seconds before impact. I blame the asshole in the far left lane who was going way too fast and didn't slow down when someone was trying to enter (just letting off the accelerator when the Tesla signalled would have prevented the accident). Then they braked way too late. Everyone behind that first vehicle is also to blame for following too closely at too fast a speed and braking too late.


Friend of mine was driving down the road one day, and someone in the lane next to him turned on her turn signal, then merged into him without looking. She explained the situation carefully to the adjuster, noting that "she had her turn signal on, therefore she had the right to merge." At this point, the adjuster contacted my friend and asked where to send the check.

A turn signal does not give you the right to the lane. You have to look, there has to be enough room.


There were multiple car lengths of room when the Tesla started merging. The other car was so far back it wasn't close to the blindspot. Americans do not adjust their speed or following distance to the conditions. They drive like they have some sort of right to a road free of obstructions and to drive at the speed limit or above, no matter the circumstances.


Takes about 8 to 10 car lengths for an average car to stop from 60 mph, from the moment the brakes are activated fully. A fast human reaction time would be 150ms, but that's just to react, not to get the foot onto the brake. So add another 3 car lengths.

There were not 11-13 car lengths between the car in the left lane and the Tesla when it brake checked them.
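A rough sanity check of those figures (a sketch with assumed values: 0.8 g of braking, a 4.5 m car length, and ~0.5 s from stimulus to full brake pressure):

    # Back-of-the-envelope stopping distance from 60 mph.
    G = 9.81    # gravity, m/s^2
    CAR = 4.5   # assumed car length, m

    v = 60 * 0.44704                  # 60 mph in m/s (~26.8 m/s)
    braking = v**2 / (2 * 0.8 * G)    # distance covered under full 0.8 g braking
    reaction = v * 0.5                # distance covered before braking starts

    print(f"braking:  {braking:.0f} m, {braking / CAR:.0f} car lengths")   # 46 m, 10
    print(f"reaction: {reaction:.0f} m, {reaction / CAR:.0f} car lengths") # 13 m, 3

Which lands right on the 10 + 3 car lengths above.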

Generalizing about Americans is tiresome, by the way. It makes you look petty and ignorant. Which might be true, but on the off chance you didn't intend to come across that way...


Note that the Tesla changed lanes, into the passing lane no less, and then immediately slowed down significantly.

So the first car to hit it may have had a very safe distance between it and the next car, but the Tesla cut that significantly with its sudden maneuver.


It's a popular opinion, but I think it's wrong. If they hadn't all been so close together, some of that could have been avoided. The Tesla indicated for a very long time, telling others what it was going to do.


The Tesla cut into the lane with 3 car lengths or less behind it and rapidly decelerated. The two cars behind it were going to collide because the first one, which was safely positioned before the Tesla changed lanes, didn't have time to react, and the second car was probably not able to either, because the brake lights in front didn't come on until too late. Then there were a couple of cars that were probably tailgating, and finally the whole thing was a snafu.

I can't tell if the people making these comments about the drivers behind the Tesla being responsible are such defensive drivers that they never leave less than 6 seconds in front of them, or they're tailgaters themselves, or just being mendacious.


It’s also just not realistic. If everyone was that defensive (5+ seconds distance) on such a busy road it would severely lower its capacity, leading to constant traffic jams. Leaving that gap will just mean others taking it.


Saying "driving safely lowers the capacity of roads" is like saying that not jamming pennies in a fuse box lowers the capacity of the wiring.


Is it like that because it's true?


I mean, it's true until your house burns down. Then your wires stop carrying any current.


In my opinion the optimal following distance for traffic flow is probably slightly higher than what people are doing.

There might be a "laffer curve" where increasing the follow distance of other cars and limiting overcorrections would increase capacity, rather than decrease it, at the current level.

If we're getting regular "phantom jams", we're probably following too closely.
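To put numbers on the static part of this (my own assumed car length and speed; and it deliberately ignores the shockwave damping that would create the optimum in the first place):

    # Static lane capacity: each car occupies (v * headway + car_length) metres,
    # so a lane carries 3600 * v / (v * t + L) vehicles per hour at steady speed v.
    def capacity_per_hour(v_ms, headway_s, car_len_m=5.0):
        return 3600 * v_ms / (v_ms * headway_s + car_len_m)

    v = 27.0  # ~60 mph in m/s
    for t in (1.0, 2.0, 3.0, 5.0):
        print(f"{t:.0f} s headway: {capacity_per_hour(v, t):.0f} veh/h")
    # 1 s: 3038, 2 s: 1647, 3 s: 1130, 5 s: 694

Raw capacity falls monotonically with headway; any gain from longer following distances has to come from fewer braking shockwaves, which this static model can't show.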


There definitely is such a curve. I just can't imagine its optimum is at 5+ seconds between cars, at least at the highway speeds we drive in dense, often congested areas (~60 mph / 100 kph where I live). Intuitively I'd put it around 3 seconds.


I agree there. There's a lot an individual can do to dampen the compression waves if they leave 3 seconds instead of 2. I was being facetious with 5 seconds, but in some situations the 3 second distance is still not enough time to react. For example, when there is a stationary obstruction and the car in front obscuring it swerves instead of braking.

Additionally, there's a lot to be said about vehicles not acting like they have human drivers. This Tesla basically acted like a psychopath.

I have one car with regular cruise control and one with adaptive cruise. I am more relaxed with the standard one because I get the signals I need visually to know if I need to pass the car in front or not. I can't tell if my other car is going to drop from 80mph to 60mph all of a sudden until it's basically happened. Better autopilot could help, but then would I end up in this situation?


> The Tesla indicated for a very long time

3 seconds is a very long time? Also: is 3 seconds of turn signal a clear indication of "I am going to change lanes and come to a full stop"? And then turning off all signals and coming to a full stop without hazards on is a clear indication of "I am stopping"?


Indicating to change lanes and changing lanes should have been enough for the traffic behind to slow down and create a safe distance between each other. People just drove too close to each other.


I wouldn't expect any normal car to behave like this, but the drivers following should have recognized it was a Tesla


Maybe a large "F" sign could help, similar to the "L" one for learners in the UK (or the "P" one)? [0]

[0] https://www.gov.uk/driving-lessons-learning-to-drive/using-l...


They are of course at fault for not leaving a safe distance.

But ultimately the behaviour of the Tesla FSD system is what caused the chain of accidents.


> They are of course at fault for not leaving a safe distance.

They weren't following the Tesla. And a reasonable safe distance does not usually account for the guy in front of you doing an assisted stop. That is far enough outside reasonable expectations.


> And a reasonable safe distance does not usually account for the guy in front of you doing an assisted stop.

A distance where a sudden stop results in a collision is not a reasonable safe distance.


As others pointed out in yesterday's submission, this could have been avoided if the cars behind had kept a more defensive distance from the other cars. And in all directions: you can't just look ahead, you've got to observe what's going on around you.


Exactly this. Asking American drivers to do that is just too much apparently.


Doesn't look like phantom braking, which is usually very sudden and involves no lane change. Most likely the driver had their hands off the wheel and was warned multiple times before the car pulled itself over.


The police report claims “Full Self Driving Mode Beta Version.”


The Bay Bridge is a freeway, so the FSD Beta stack should not have been active - instead the Autopilot stack, which does the hands-off warning and then slows down, though I think it also puts the hazards on, which isn't seen here.


Not sure if the police got this right. Currently only Autopilot is available on the highway; FSD is for local roads.


The car pulling itself over too close to the following vehicle and then stopping would still be a car-caused crash in this situation.


Why would it pull itself to the leftmost lane?


There is no video!



> You can’t program your way out of the problem with Level 2; in fact, the better the Level 2 system seems to be, the worse the problem gets. That problem is that human beings are simply no good at monitoring systems that do most of the work of a task and remaining ready to take over that task with minimal to no warning.

This is incredibly true and obvious.

It's likely people monitoring their car aren't doing anything at all: sleeping, reading a book, whatever.

But if anything, actual monitoring is harder than doing: it means actively watching what the machine does, understanding it, thinking about what one would have done, doing a diff and analyzing it, all in real time, all the time. It's exhausting.

That a situation like this is even legal is amazing.


This mirrors my experience of driving in America. People do not adjust their speed or following distance to the conditions. They drive like they have some sort of right to a road free of obstructions and to drive at the speed limit or above, no matter the circumstances.

It's as if they expect the speed limit sign to adapt to the circumstances. They have the attitude, "Well if I need to slow down, why does the sign say I can go 70mph?"

It could be freezing rain with black ice or fog with zero visibility, and the overwhelming majority of American drivers would not change their driving one bit. I have seen this in every corner of the country, and it is noticeably different from other countries.

There will always be unexpected obstructions in the roadbed of major highways. Stuff falls off. People lose control and roll and end up obstructing the lanes.

So while the Tesla and its self-driving mode were the proximate cause of the obstruction that led to the collision in this particular instance, they were absolutely unrelated to the actual cause of this entire category of accidents.

This attitude affects many other aspects of American life. Notably, gun violence. Exactly the same feeling of entitlement to go full speed ahead and damn the consequences dictates the occasional outcome of both highway driving and eating your lunch quietly in the school cafeteria.

There is little difference between an American in a pickup truck barreling down out of the Alleghenies at 80mph in fog and freezing rain and an American with a not-meaningfully-regulated gun barreling into a classroom and opening fire.

Both are the direct result of a sense of entitlement, and both regularly lead to mayhem and death.

You can't blame the Tesla for this.


>You can't blame the Tesla for this.

It's so frustrating how mind-bogglingly polar everything has to be. Nothing in your entire post precludes the Tesla from being partially at fault. No matter how long you go on about how reckless American drivers are (and they are), none of it serves as one iota of evidence as to whether or not the Tesla behaved reasonably or whether its features were advertised properly. Why is absolutely everything either an admit-nothing attack on our "enemies" or an allow-nothing defense of our "allies"?


I agree with what you said regarding American driving habits, but that doesn't absolve Tesla of its own share of responsibility. The car in the other lane was not following anything; the Tesla swerved into its path, so the resulting distance between the cars is on the Tesla. What speed would you expect the other car to maintain to avoid such an accident?


> You can't blame the Tesla for this.

Sure you can, because this was 100% the fault of the Tesla. The SUV did nothing wrong. Nothing.


[flagged]


Nationalistic flamewar is not allowed on HN, regardless of which country you have a problem with.

You've been breaking the site guidelines badly in other contexts too. If you keep this up, we're going to have to ban you.

If you'd please review https://news.ycombinator.com/newsguidelines.html and stick to the rules when posting here, we'd appreciate it.


Everyone claiming that the autopilot is at fault for stopping suddenly gets something very important wrong: stopping a car must be possible at all times. Any automation in a car should always be able to stop the car as the safe default. The same holds for human drivers, of course: consider the possibility that the vehicle in front of the Tesla had just lost some of its load. Of course the Tesla then has to stop suddenly. Considering this behaviour a safety problem is a lazy excuse. At worst, it is a nuisance.

As for the safety distance: this is a valid point. If the Tesla did not keep a safe distance to the following car when changing lanes, there is a problem with the automation. It should always be as defensive as possible. I don't see that from the police report, but the video shows the distance to be rather short, maybe two car lengths. Call it 10 m. The second car doesn't really seem to brake very hard (no emergency braking assistant?), and that crash is definitely on the Tesla as it cut the distance, unless the speed limit in that tunnel is around 20 kph or so. But now it gets interesting: the third car does stop successfully. The big pileup happens afterwards.
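A back-of-the-envelope check of that 20 kph figure (a sketch under my own assumptions: both cars brake equally hard, so the follower only has to absorb the distance it covers during its reaction time, and the reaction times are guesses):

    # If leader and follower brake equally hard from the same speed, the gap
    # shrinks by v * t_reaction (the extra distance the follower covers before
    # it starts braking). No collision iff gap >= v * t_reaction.
    gap_m = 10.0  # roughly two car lengths, as estimated above
    for t_reaction in (1.0, 1.5, 2.0):
        v_max_kmh = gap_m / t_reaction * 3.6
        print(f"reaction {t_reaction} s -> max safe speed {v_max_kmh:.0f} km/h")
    # 1.0 s -> 36 km/h, 1.5 s -> 24 km/h, 2.0 s -> 18 km/h

So with a realistic reaction time, the "safe" speed for a 10 m gap is indeed in the 20-35 km/h range, far below tunnel speeds.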

In conclusion: The big pileup happens because the third car suddenly stops - and rightfully so but the following cars don't keep their distance to that third car. The Tesla getting rear ended is just a lazy excuse for these drivers that simply drive recklessly.


The Tesla changed lanes quite close to the first car in the pile-up and then abruptly stopped. When you are changing lanes, you are supposed to do it in a way that gives the other driver space.

Absolutely nothing was in front of the Tesla; there was nothing that could have lost its load.


> The same holds for human drivers, of course: Consider the possibility that the vehicle in front of the Tesla just lost some of its load.

The difference is that the humans in the cars behind will be able to see that too and anticipate the reason for the braking. In this case, the Tesla just changes into the leftmost lane and then brakes very quickly to a stop for no apparent reason. A human doing the same thing would almost certainly be at fault too.

You can see in the video that it's not just suddenly stopping; it's changing lanes (with what appears to be very little room from the car behind) and then suddenly stopping.


I am convinced the Tesla apologists in this thread are more concerned about the value of their TSLA stock than they are about human life. Clearly, the public has the right to expect that other vehicles on the road will be driving in a reasonable manner. There was no reasonable necessity for the Tesla in that video to maneuver as it did. If a human driver did the same thing for no justifiable reason, the same Tesla apologists would be singing a different tune.


A beta that we are all helping to test. Simply unbelievable that the NHTSA allows this to happen.

The Tesla safety report which the company boastfully released wouldn't have counted this incident since the Tesla itself wasn't harmed.


Instead of finger pointing and blaming, the fact remains that several drivers crashed into an obstacle that they weren't prepared for, did not anticipate, and could not deal with. Maybe because they were driving too fast, weren't keeping their distance as they should have, or simply weren't paying enough attention.

Stuff like this happens all the time. Pile ups on highways have been a regular thing for as long as highways have existed. The pattern is always the same: something unexpected happens in front of somebody and they fail to respond or they over react. The problem recurses behind them as more vehicles become part of the problem. It doesn't matter if the problem at the front is real or not. The actual problem is that drivers further back are not prepared for a thing that should be considered something that can happen at any time.

The car in front of you might have all sorts of reasons to suddenly slam the brakes. You have no way of telling when they will do that because they are blocking your view to what's in front. There might be an unseen obstacle, engine failure, fog, a hole in the road, whatever. You have no way of telling until the brake lights in front of you come on. It doesn't actually matter why they slam their brakes or whether there even is a good reason for them to do that. You have to be ready for that eventuality and the appropriate response is going to be slamming your own brakes. And the car behind you had better be ready for that too.

Obviously we had a cascading failure of multiple drivers not being ready for that here. And they are dodging their own responsibility by pointing to the problem in front of them instead of the idiot behind them, or themselves. The second car obviously wasn't the problem as they avoided crashing into the Tesla. Good job. The Tesla is fine. It was the driver behind them that was the real problem.

The irony of human failures like this is that taking them out of the loop might make things safer. The Tesla stopped. It doesn't matter why. Human drivers behind it failed to act. Would that have happened if they all had FSD on or would we just have had a weird traffic jam occurring for no good reason at all? That too happens all the time.


If autopilot starts braking even slightly and it shouldn't, I immediately take control. For this to happen, the driver clearly was not prepared to take control in any way. It's like driving on cruise control with your legs crossed.

The automated cruise control and lane-keep on my previous car, an Audi, would occasionally do crazy things too. The hyper-focus on this being Tesla's fault continues to baffle.


Tesla has much more of an issue with phantom braking than any other manufacturer because they chose to rely only on cameras, and that system simply does not work as well when it comes to this particular issue.

Additionally, how can you blame the driver? Did you watch the video? It's not as if the car slowly stopped. It slammed on the brakes. There wasn't time for the driver to correct for this before the car behind -- which Autopilot had just moved in front of -- slammed into them.

Tesla's Autopilot and Full Self Driving get a lot of focus and criticism for two big reasons: 1) their product names clearly and deliberately paint a picture that the features do not live up to, and disclaimers do not fix that; 2) they are unsafe on public roads as they operate today. They are a menace to public safety.


Driving in traffic (or in a city in general) with my foot off the pedals sounds borderline insane. I know this is anecdotal, but I've experienced multiple brands slamming on the brakes. You must be prepared to intervene.

It's not a Tesla problem. I think we hyper-focus on Tesla because their system is the best. This is a problem for all automated driving systems (even the basic ones found in almost all brands now.)


> If autopilot starts braking even slightly and it shouldn't, I immediately take control

I think this is the thing that makes the most sense. But if I were to bend over backwards giving the benefit of the doubt - is it possible the driver feared resuming the previous speed because they were worried the car saw an obstacle that they couldn't?

Like the article suggests - if you're not fully engaged, it's as if you're driving impaired. And there's so much automated that it's difficult to be so vigilant. Once the car starts braking, how much latency can I afford just assessing what caused it to slow down? How hard is it braking and how closely is traffic following me?


This is anecdotal, but I'm always looking at the road. If the car starts braking, (and this isn't exclusive to Tesla,) I already know if there is an obstacle or not, and my foot is always hovering over the pedals.

I feel like such a shill responding to these comments, but based on my experience, these same drivers would get into accidents in other brand's cars as well. We only focus on the story because it's Tesla.

I'm not sure what the solution is.


Well it’s the only company that charges 20k for “full self driving capability”


Since it's a highway, the car would have been operating on autopilot, not FSD. The two stacks had not, and I think still have not been merged.


In fact, all Teslas sold today come with basic lane keeping (but not automatic lane changes).


Because the Audi doesn't charge users $10,000-20,000USD for "Full Self Driving," that's why we talk about it so much! If you throw it in as a safety feature in your deluxe package, and describe it as such, we aren't having this conversation. On the other hand, I strongly suspect that if they didn't do that, TSLA never gets to worth more than all other car companies in the entire world combined in market cap terms, and Elon Musk is never able to afford to buy TWTR, so one can see why he would do it.

What is the reason I should pay all of this extra money and still have to maintain full focus and attention on the road around me at all times? What is the market for it unless it's for me to be able to zone out more while driving?

Umpty-years ago I worked on robot tanks for the US Army, and we had a devil of a time answering this question. If the driver still has to be fully monitoring a single vehicle at all times, why isn't s/he driving it directly? What benefit does this not-quite real self-driving provide? Our soldiers told us that it mostly got in their way, because they wanted it go over here and the computer wanted it to go over there, and so they felt like they were fighting the computer more than anything else. It is possible that someone will come up with a better adjudication than we did, but it really made me question whether any level between 2 and 5 is actually useful to anyone.


To be fair, Autopilot, which this driver was using, is included with all Teslas at no extra cost. Personally, I find that road trips are significantly less draining with Autopilot engaged. In the same way that moving from basic cruise control to radar cruise control with lane-keep reduced some load, Autopilot takes it a step further. The difference is noticeable. Remaining attentive and ready doesn't negate the benefits.


This is a bug in the meat, and is not going away. The problems are well understood, thanks to decades of examples in semi-automated trains, boats, airplanes, and industrial monitoring systems.

Until "self driving" is good enough that you're legally allowed to ride it drunk, it should not be allowed to do anything more complicated than radar cruise and lane-keeping.


This is a bug in Tesla's specific meat.

Other self driving implementations e.g. Cruise, Waymo have LiDAR which is an additional and in my opinion necessary sensor for detecting objects.

Because in this case it sure looks like a phantom braking issue.


Is it really a Tesla specific issue? Tesla's definitely phantom brake quite a bit, but it seems like many automated cruise control implementations do this today.

https://www.motorsafety.org/us-government-investigates-1-7-m...

https://www.macheforum.com/site/threads/phantom-braking-expe...

https://www.chevybolt.org/threads/ok-lets-talk-phantom-braki...


And to me the crux of the matter is whether the systems are safer despite any phantom braking.

The focus on perfection rather than improvement is surprising to see on HN.


It is Tesla specific in the sense that they refuse to use LiDAR.

All those examples just reinforce my point that it's a necessary sensor.


Tesla doesn't sell meat, the meat is what's weighing down the front seat.


Yeah, it looks like one of those cases where Elon is being stupid - self-driving cars with LiDAR are a lot safer.

Anyone know how much a LiDAR system costs nowadays?


Heck, my $400 vacuum robot has lidar, and its advantages when navigating are immediately obvious to anyone paying attention. Rather than cost, it feels more like laziness - sensor fusion is hard, so let's just delete the problem.


It depends on what the priorities are. If self-driving is allowed now, it may lead to a few accidents, but the tech will progress faster and we will have very safe FSD cars sooner.

On the other hand the sooner we get FSD cars, the sooner we will also get ASI and with it potentially the end of human kind.


Just ignore the name Tesla and think for a minute: how would you feel if this were any Japanese or Korean automaker's car? Would the NHTSA still be as forgiving? We should feel exactly the same in that case.


Now, as much as I wanna take the piss on Tesla, I'm more interested in this American tradition of simply ramming into anything in front of you that's not moving at your current speed or faster..

Like, how do people reason about this? It seems uniquely American to me.. These videos of people simply driving into stuff that's in plain sight right in their lane? What do they do if there's a fallen tree, tire, shopping cart? Simply pile into it instead of braking?

I'm wondering because, in my country, when you drive a car, you're supposed to pay attention to the road and always be ready to brake if something gets in your way. But in the USA, the thinking seems to be "I'm not driving too fast, I have the freedom of way here"?

As for the article, all its points are entirely valid! Systems need to either be fully autonomous or require some level of constant engagement. This is analogous to the earlier writing about the deskilling of labor and automation. It's an impossible position to put someone in: "this works in every easy case, almost never fails, and when it fails, it might get arbitrarily complex and require exactly the skills that the operator has very rarely had any opportunity to practice".


Nearly everyone else in America (and probably Italy) is going to think this sentiment is insane, but you're right on here.

The way this comes about has to do with the suburban model. When you're in an environment where traffic is commonplace, you can drive defensively or aggressively. During traffic crunches, defensive drivers are completely walked all over by aggressive drivers, and will easily double their time sitting in traffic. Leaving a safe distance means leaving room for people to dangerously merge.

This leads to a Nash equilibrium of aggressive driving, and now you literally have Americans criticizing the idea of "brake-checking" tailgaters as a way of signaling displeasure at being tailgated (literally just tapping the brakes to slow down).

It's a completely messed up system, and it terrifies me that it seems to be spreading to other countries.


Your take is very exaggerated, to say the least. Note that the Tesla was not part of the pile-up; the car behind it braked in time. But it takes only one driver being a little too close to the car in front of them, or reacting a little too slowly, and this is what happens, everywhere in the world.


The Tesla did not give that driver much space. It changed lanes so that it was suddenly very close in front of the other car.


Perhaps "Semi-Automated Driving" is a more accurate term than "Full Self Driving", but I'm guessing FSD is a better acronym to use than SAD.


It's also probably a lot easier to convince people to drop 15k on "Full Self Driving" as opposed to "Semi-Automated Driving"


You know what else is "semi-automated driving"?

A six year old behind the wheel sitting in your lap.

That's also known plainly as "illegal".

I have no fucking idea why this is being tolerated on our public roads, nor why SAE has even classified these child-equivalent levels of autonomy as if they're acceptable.


> You can’t program your way out of the problem with Level 2; in fact, the better the Level 2 system seems to be, the worse the problem gets. That problem is that human beings are simply no good at monitoring systems that do most of the work of a task and remaining ready to take over that task with minimal to no warning.

Well, you could have extremely visible/audible signals for the driver to take over when the driving system is failing (internal lighting becoming red and blaring alarms), but that wouldn't be very popular I guess, especially with half-assed self-driving systems.



Seems like the driver is a litigation lawyer (per the full police report, link in the article) with a notable portfolio (IANAL, so I can't judge, but that's what it looks like). Wonder what's gonna happen...


Tesla: we have an experimental self driving system that we need to test.

Person: lol how are you gonna pay people enough to test something that might endanger them?

Tesla: no they will pay us for the privilege


Articles like this are really annoying; they're based on feelings and don't really prove anything. To be clear, I'm not saying this accident isn't terrible or that it shouldn't be investigated. And of course Tesla should do better and fix these issues.

> then I’ll see a crash like this, [...] and I realize that, no, people still don’t get it.

That's just an example. You can't draw conclusions from a single example; you need lots of data. Your feeling that this is less safe than regular driving is not enough to justify the conclusion that L2 is less safe than regular driving. I have the opposite feeling (I think it reduces the number of accidents), but I don't know whether this is true. Accidents that were avoided by L2 don't make it into the news; there's a huge selection bias here.

I think this kind of article might be very dangerous, because it makes people more afraid, and if L2 is actually safer on average, then reducing its usage will just increase the number of accidents.


Hopefully Tesla will be paying out 7 digit settlements for their grossly negligent behavior in exposing the public to this broken technology.


Whatever they pay, they'll likely do it as an out-of-court settlement and make people sign an NDA. They do this for simple repairs; you can bet it would be the case for more serious offenses.


I'm kind of surprised that the feds allow it since the vigilance issue is pretty much a given.


I think, like much of technology’s history, this is one of those things where something got rolled out before anyone thought it was necessary to write legislation about it.


There doesn't really need to be any legislation. It's handled via NHTSA regulation (or whatever alphabet soup it is).


Cars fail and need to pull over regularly. It's hard to understand why this particular incident is all that notable.


If you change into the left lane when there is insufficient room to do so, while standing on the brakes ... you are not pulling over, you are causing a wreck. Hope your insurance is paid up.


IMO cars should be very very good at what they try to do, and not pretend to do anything more. Tesla has way too many flaky features (that customers have to pay for) for a product that can kill people. I'm fine with what Waymo and Cruise are trying to do where the car is expected to never make dangerous mistakes in the circumstances it will run (with consequences for the company), but Tesla's target of doing everything fine most of the time but still doing stupid stuff frequently is not OK.


Because you get ample notification for when your car is about to fail.

So you can switch on your hazard lights, gradually slow, let cars past and then pull over.

No normal, competent driver would act like this incident.


If you're pulling over into the left lane, you're a moron and going to cause accident.


Hazard lights or a turn signal seem to be missing. You'd generally want to move over to the shoulder. Emergencies definitely do happen, but parking in the fast lane, in the tunnel, is _incredibly_ dangerous.


Because a computer was driving.


[flagged]


Yeah because other car companies aren’t selling “full self driving”



If it wasn't a Tesla the driver would probably be sitting in jail right now for intentionally causing a wreck.

But because Tesla's pretend attempt at self-driving engaged in the reckless driving, the driver is protected despite clearly not reacting in anything even slightly resembling a timely manner.

Disclaimer: I work in the AV space... but no one in the space should consider Tesla a competitor.


The best defense of Tesla I’ve seen for this accident was “maybe it wasn’t FSD, it was a cause-a-rear-end insurance scam gone wrong”.

That’s a REALLY bad look if that’s the other plausible explanation.


[flagged]


Wouldn't it be easier to stay quiet since you don't have a meaningful rebuttal?


There was no tailgating.


The question that is never answered in this kind of story is:

1. How many fatalities per mile does tesla FSD have, compared to its best alternative if we ban it?

2. Focusing on specific failure cases is not relevant since the political decision to allow it is all or nothing.

3. It'd be perfectly logical to accept road testing FSD, even if it's significantly worse in some areas, as long as this is not exploitable, AND the net gain from allowing it is still overall positive.

I'd like to hear reasonable disagreements like "Tesla isn't actually all-in net positive" or "Here's why we should judge policy by something more than just net lives lost/saved".

Edit: the discussion fragments into two versions depending on the state of facts, which isn't clear:

A) Tesla FSD is not actually safer per-mile. If this is the case, I and most people would probably agree not to allow FSD on public roads. That's not really what this is about.

B) Tesla FSD is actually, on net safer per mile, but we should not allow it anyway.

I'm welcome to hear other options, too, but while replying I'd like to hear which one you think.


I think every discussion like this should start with each commenter counting verbally to 3,700. That's the number of people killed on roadways worldwide every single day. [1]

And that's daily. It's only every few months that we see a story like this, and this one had no fatalities.

It's totally fair to criticize Tesla, Autopilot, and/or FSD, but these discussions always implicitly accept that the status quo is acceptable, and whether or not this technology should be allowed is debatable. The status quo is an unimaginable amount of human death and injury that we accept only because it's been normalized.

Delaying self-driving cars by one year comes at a cost of up to 1.35 million human lives [1]. That doesn't excuse failures like these, but the context should be understood.
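For scale, the daily and the yearly figures are the same number (a trivial consistency check, not new data):

    # 3,700 deaths/day scaled to a year matches the ~1.35M annual figure:
    print(3_700 * 365)  # 1,350,500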

[1] https://www.cdc.gov/injury/features/global-road-safety/index...


> Delaying self-driving cars by one year comes at a cost of up to 1.35 million human lives [1]. That doesn't excuse failures like these, but the context should be understood.

Citation needed. There is nothing so far proving that self-driving cars currently have fewer crashes than humans in comparable conditions. And they disengage in difficult conditions, when most crashes happen.

Don't just start with the assumption "self-driving is currently safer" and then go on to blame people for not trusting it.


> Dont just start with assumption "self driving is currently safer" and then go on blame people for not trusting it.

My claim doesn't depend on that assumption. My claim depends on it being the case someday.

If the US tightens regulations on ADAS today, and that results in the day self-driving cars fully replace human-driven ones being 11 years away instead of 10, it has been delayed one year, at a cost of up to 1.35 million human lives.


Delaying self-driving cars does not come at the cost of human lives if those cars are not safer than human drivers. If they are less safe with current tech, delaying them saves lives.

If they tighten regulation, they are saving lives. Also, mandating more testing would force companies to test. Without that, they are economically motivated to make big claims but not to invest in reproducible quality.


The current tech is irrelevant. It is not safer than a human driver, but that's not relevant.

If regulation delays the day in the future that it is safer than human drivers and that human drivers are replaced, then it is costing lives. And it would.

The only way it wouldn't is if the regulation saves more lives in the meantime than would have been saved by having self-driving cars ubiquitous sooner, but that's tough to argue when we're already losing 3,700 per day.


The excess people killed by self-driving cars matter too. Self-driving is more dangerous until it is not, and it is the actual task of regulators to delay it until then.

And if regulators won't act, it will never be safe, because it is cheaper to produce unsafe products. It earns more to lie about safety of a product, unless you have functional regulations.


So if an automaker can make a system that is capable of handling any driving task safely but occasionally ran itself off the road on purpose for no reason, as long as it’s a net positive in lives saved it should be allowed?

I don’t agree with that assessment.

It needs to make fewer mistakes than humans at all times, not just most of the time. "Great, with a disaster here and there" is terrible.


Firstly, human error in at least 7 of the 8 cars caused this. FSD didn't take control away from the human; it encountered something outside of its limits and returned full control to the human.

Secondly, that's what planes are. A lot of new plane safety comes from very automated, very complicated systems.

In some situations these planes fail in a way that leaves the pilot in a more dangerous plane than one with no electronics at all.

Why do we tolerate this? Because it doesn't matter how poorly something handles failure; what dominates is the base rate of the failure occurring.

If you ban these extra-automated driver aids, and people drive Teslas less, more people die. Same as if it causes a delay in developing real FSD.

I would rather live next door to an RBMK-1000, than drive a car 10 years older than my current one.

I caution you against changing automation policy based on an accident caused by human error.


> FSD didn't take control away from the human, it encountered something outside of it's limits and returned full control to the human.

We don’t know if that’s what happened.


My thought was that if the cars following had had adaptive cruise control engaged, perhaps the accidents would have been avoided.


So you categorize the 737MAX accidents as human errors too and thus they should not affect policy?


That's a very different situation. Namely, I don't own Boeing stock.


I see where you're going, but how do you actually justify your choices?

A: occasional random suicide (100 deaths/year), no other deaths

B: no random suicide, 10k deaths per year from normal driving

You would be saying you choose B?

It's not really about the exact numbers here - just trying to get on the same page - would you really weigh your discomfort with "random deaths" highly enough that you would accept more actual deaths overall?

Another possibility is that the random death situation really is a signal of "actually hard-to-detect badness". If so, it should come out in statistics and we should easily be able to come to agreement in that case.

But, if the choice really is between accepting "weird mistakes+lower deaths" or "status quo+many deaths" I feel pretty uncomfortable accepting the latter knowing the real-world consequences.
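One way to make that weighting question concrete (the death counts are the hypothetical ones from this thought experiment, not real data):

    # How heavily would a "random" machine-caused death have to be weighted
    # relative to an ordinary driving death before option B beats option A?
    a_deaths = 100      # option A: occasional random failures
    b_deaths = 10_000   # option B: status-quo human driving
    print(f"B wins only if one random death outweighs {b_deaths // a_deaths} ordinary ones")

That is, you'd have to value avoiding a machine-caused death a hundred times more than avoiding an ordinary one.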


This is a human comfort thing, not a reason thing. As someone else in this discussion said, if computers do something that looks stupid to people, they won't trust it.

What if the investigation here showed the Tesla was 100% correct to do what it did?

What it did is so odd I wouldn’t want to trust it. Maybe the problem is it needs to fail in a more human-like way?

If the death numbers get low enough, I think it will likely speak for itself and people will trust it. But it can’t be a 30% reduction in deaths. It needs to be like 95%.

Enough that it feels safer instead of questionable/dangerous. Even if it was actually safer the entire time.


I heard an interesting philosophical discussion many years back, where it was stated that we are morally obligated to outlaw non-self-driving cars as soon as it is proven that self-driving cars kill fewer people. It is not so easy to argue against that, because if you do, you rate your own right to choose higher than another person's right to live.


The problem with point 1 is that it's just not good at convincing people. If you have a situation where automation fails but a human would have done better, then it doesn't matter unless the statistics are overwhelmingly in your favor, which I doubt they are in this case.

This is a discussion that comes up a lot in medical technology, and if I had to guess why this form of rhetoric fails, it's not just because it's easier to empathize with a human, but also because when the failure case seems "simple", the issue looks much more like a systemic oversight. That in turn makes the probability of failure look much higher than it really is, while also implying further undiscovered oversights.

That also kinda addresses point 2, in that, specific failure case or not, it implies a weak system with obvious oversights. That definitely doesn't help the political case for complete FSD approval.

I'm generally skeptical of this utilitarian, rationalist form of rhetoric, if only because it's overly optimistic about the ability to overcome issues in some amorphous future. Sure, a future with full FSD is probably a net good, and we can even say it's probably in within our lifetimes, but the claim that future mitigated harms outweighs all current harms of live-testing FSD won't win enough people over, and drowns out other possible policies like advocating for adapting our infrastructure to support FSD rather than having cars attempt to read signs and signals designed for humans.


I agree. People take obvious failures as a meta-sign of "badness" and throw out the system along with the statistical information we have that proves what we should do.

That said, at some point FSD could have been banned; it wasn't, and we are living in a world where it's likely saving lives on net. The only problem is if naive arguments to ban it are accepted despite not actually challenging its net safety gains.


Other people have said it before, but the issue is that the Tesla crashes we are seeing are ones that a human would trivially avoid.

Most people, especially non-technical people, aren't going to care about numbers that say it's safer or better on net when the car still occasionally does things that humans view as crazy or stupid.


I'm interested in understanding why you would want every subdivision of "type of crash" to strictly improve, rather than being content that the combined crash rate goes down?

i.e. if Tesla causes 100 new crashes of type A but saves 200 of type B, isn't that a gain? Even if type A crashes are one which humans don't normally make.

A way in which I agree with you is if type B crashes are more avoidable by conscientious drivers, and you feel that your personal base rate of B is therefore already low, and that introducing new type A risk is actually NOT an all-in improvement for you.


Is there any evidence that you can point to that Teslas are saving more crashes than they are causing, compared to some other vehicle with basic collision avoidance features?


I've googled and found some very partisan sites. It seems hard to tell what's going on.

My overall analysis is: Tesla has some very serious enemies in other automakers, lawyers, etc. And yet, there are no clear studies or statistics saying they're worse, and Tesla claims they are safer. This isn't a very good analysis and I'd love to have a clearer more well-defined group for statistical comparison, but it at least seems plausible to me.


If you were hiring a human chauffeur, would you insist they be better than your current driver in all conditions (rain, night-time, snow, off-road)? Or would you more likely ask that they be minimally competent in all likely conditions, and that their driving style in aggregate, for your current driving profile, be an improvement?

Insisting on not replacing a driver until every subcategory of driving type is strictly improved seems too stringent.

You still have an out here by saying that you're willing to accept more net risk due to "meta" factors like "driver not meeting minimal competence factors". But it would still be more honest in that case to fully admit that this reasoning requires knowingly accepting likely real-world increased fatalities.

And as an aside: for humans, "minimal competence" testing is reasonable, but at large scale, in the statistical realm, such metrics offer less value as protection against false claims, since we already have millions of miles of real-world performance data.


You and I actually agree completely. I was attempting to steelman an argument for why a lot of people remain hostile to the concept of self driving all together.


> i.e. if Tesla causes 100 new crashes of type A but saves 200 of type B, isn't that a gain? Even if type A crashes are one which humans don't normally make

By that logic, professional racing drivers should be allowed to drive drunk and stoned.


There was a human driving the car.

That a Tesla crashes in ways a human wouldn't doesn't seem true. From what I've seen, even the most common accident type for a Tesla occurs at a lower rate than non-Tesla cars.

Do you have more info that's more specific about the types of crashes?

The problem here is a lot of the worst crashes are caused by humans behaving crazy and stupid. It's not a good criticism.


I guess people have no business being on the roads then, because all of the actual crashes in that video were vehicles driven by humans. If they had been following the rules of the road with proper spacing between vehicles, they wouldn't have gotten in this collision after all.

I think you're overestimating human drivers frankly. Collisions like this one happen between humans every day, multiple times per day across North America, you just don't hear about it because normal human error is at fault.


> It'd be perfectly logical to accept road testing FSD, even if it's significantly worse in some areas, as long as this is not exploitable, AND the net gain from allowing it is still overall positive

So are all 20 manufacturers going to get the right to endanger my life just so they can train their self-driving? Or is this specific to Tesla?

Do I and my family get some compensation for this danger, or in case we are killed? Do we get a share of the profit this self-driving car will generate for Tesla?

What if Tesla never produces a good self-driving car and goes bankrupt - then I died for nothing?


It hinges on whether you think it's net safe or unsafe today. If unsafe, I'd support not allowing it, of course. If it's on net safer, BUT with a different distribution of harm, then it's quite hard to make a decision. That is what most of the discussion is about.


Besides the fact that you shouldn't just flat out stop on the highway, how are 8 drivers stupid enough to be close enough to not be able to stop in time? Is there no limit on how close people drive to the car in front of them in the US?

Defensive driving was one of the first thing I learned, before even getting my drivers license, how come others seem to flat out ignore things like that, even for their own safety?


It's a real problem. If you leave a gap for safety (I like a 2 sec gap at least), the cars behind will take that as evidence that you don't care about keeping up with the car in front of you. Then they'll often tailgate extremely (communicating "get out of my way") for a bit, and then sometimes try to quickly maneuver around you and in the process getting dangerously close to you and/or the car in front of you.


I guess, this is the vigilance problem cascading. Drivers just don't expect cars in a straight and clear lane in a tunnel to stop at random. (At least, not without warning lights and other signs of alarm.) It's not in the expected range of behavior and nobody is prepared for it, as they follow the dull and monotone flow of traffic.


> Drivers just don't expect cars in a straight and clear lane in a tunnel to stop at random

That's just it - why not? I always expect that something unexpected can happen around me in traffic, at any time, and I position myself accordingly. If I'm too close to the car in front of me to handle them braking suddenly, that's telling me I'm too close.

I've been close to a lot of accidents, but never actually in one. Being vigilant at all times when driving and taking breaks when I stop being vigilant, I'm sure have played a huge role in being able to avoid these accidents.


> Being vigilant at all times when driving and taking breaks when I stop being vigilant

I guess, you're not part of the target audience. :-)

(More seriously, congratulations and many thanks for being a responsible adult.)


It is maddening, and puts the burden on people who do try to maintain a reasonable following distance. If you leave sufficient room, there's an endless line of cars that will pop into that space, and now you're adjusting again...in a loop.


So what? If you happen to go faster than the cars that go into the space between you and the one in front, change one lane to the left to pass them. If there is someone in front of you even in that lane, continue at the speed they are driving, leave space in front. Someone goes into the space? Adjust again.

Are you seriously avoiding leaving any sort of space in front of you just because other cars might move into that space? That seems like a dangerous game to play, not just for yourself but for others too.


Just making an observation about the group dynamics in play. I was not trying to rationalize anything about me personally.


This. Additionally, the Tesla driver didn't seem to be aware of what was going on with his car if he was unable to react in the 5-6 seconds it took it to come to a complete stop.



