I don't understand why it's legal. How can it be allowed to send out uncertified software to cars on public roads? Aren't there safety standards that need to be met? I thought that safety-critical products were better regulated than this.
Welcome to being me three or four years ago. I've literally met with MPs and ministers trying to get regulations and other things. They're coming. But they don't happen without civic engagement. Write to your elected representative and call for legislation. It works. It takes time, but it works.
In Europe, most of the Tesla features are disabled because they were deemed hazardous. Only lane assist and adaptive cruise control are enabled. The others are severely limited or disabled (Summon etc.)
While I'm not a big Tesla fan, I don't think legislation is any indication of the actual safety that can be provided. European legislation tightly follows what German carmakers can deliver. Once they can offer the same features, it will be legal in no time.
For the pollution aspect of engines, I have some sympathy with this view, even though it's not entirely correct.
When it comes to safety though, I disagree. One of the decent things to come out is the Euro NCAP rating. The manufacturers are not part of the testing process, apart from supplying the cars. Each car is then given a rating.
For autonomous driving, from what I can see it's still down to individual states.
>Europe legislation tightly follows what German carmakers can deliver.
Do you have a source for this? Cause it seems to me that they could easily deliver something like summon, seeing as parking-assist/automatic-parking already exists.
Wouldn’t it make sense that “hardware” gets tested before it can go on the roads? Why is that not the case with software?
And if they find a bug, then disable the software wholesale until it is tested by regulators again (since a fix can introduce new bugs as well).
And they make driving with the remainder of AP extremely unsafe in the EU, sadly. For example, they limit the steering angle, meaning the cars cannot drive safely around a lot of non-highway corners without drifting into the oncoming lane (upon which the car brakes and beeps due to another feature, lane assist). Admittedly the car could slow down before the curve, but that would get you into trouble with cars behind not expecting you to slow down for these kinds of curves.
It's heavily regulated, just ask geohot [1]. Most of the players in this space do their self-driving vehicle testing carefully under controlled circumstances with some approval from local agencies. Tesla seems to be cheating by punting all the responsibility onto their customers, because instead of testing Self-Driving Cars they're just shipping software to customers and letting people run that on their cars.
Well, software that controls a car is a new thing.
So I would imagine that it hasn't really been regulated yet in most jurisdictions. For some reason we have a tendency to write laws so that they are specific to individual things rather than general and future-proof.
Software in cars is not "a new thing". Safety has increased immeasurably in cars over the last few decades, in no small part due to software. Complexity and opacity are a problem, but cars are much safer and more efficient today due in large part to software. I don't think that would have happened if there was some regulatory committee in place to audit the software for safety. We have liability for that, a much better model than a regulatory one.
> However, as the importance of electronics and software has grown, so has complexity. Take the exploding number of software lines of code (SLOC) contained in modern cars as an example. In 2010, some vehicles had about ten million SLOC; by 2016, this expanded by a factor of 15, to roughly 150 million lines. Snowballing complexity is causing significant software-related quality issues, as evidenced by millions of recent vehicle recalls.
I don’t get why it’s not covered under existing laws; clearly every vehicle has to be operated by a driver with a valid license, correct? You would think letting a vehicle drive without a licensed operator at the wheel would be negligence.
I agree, but when it came to bullying, the anti-bullying laws apparently didn't cover online bullying, which didn't really make sense to me.
So I assume that this is a case of the person in the driver's seat must have a driver's license, but there is no law saying how much of the driving they must do.
Nobody is doing that. In the locales where truly autonomous vehicles are being tested, it’s happening under specific legislation and regulations put in place by the states. For Tesla, behind all their bluster, the terms and conditions you accept to use their “self-driving” features make it clear that the human driver is always responsible for safe operation of the car, and all of these features are just to assist the human driver.
Kinda hard to square that with a product called "Full Self Driving". Terms of service are generally worthless as a legal shield against catastrophic harm to customers.
Unless you're driving a Flintstones car, much of the driving you do in any modern car is done by software: from fuel injection, to power steering, to anti-lock brakes.