
But it turns out that self-driving cars must save the people inside at any cost. If they don't, the whole purpose of self-driving cars, reducing accidents caused by human error, is defeated, because people will prefer driving themselves over buying self-driving cars.

https://www.technologyreview.com/2015/10/22/165469/why-self-...




This is wildly overstated: Teslas have been selling better than ever despite many documented instances of their self-driving mode failing to "save people inside at any cost." A self-driving car occasionally killing an occupant is seen as roughly the same sort of low-odds risk as a human-driven car occasionally killing its occupant.

A self-driving car that doesn't watch out for non-passengers is likely to run afoul of the same sorts of laws that already exist to prevent drivers from prioritizing themselves over everyone else. There's a case in SoCal now trying to assign manslaughter liability for this to the driver of a Tesla that had Autopilot enabled. It'll be interesting to see where that goes.


Unless self-driving cars are more convenient and, despite prioritizing pedestrians, still safer than a human driver.


Vehicles that don't prioritise their occupants are not going to sell well against ones that do. It's very hard to imagine the default being anything but "protect occupants" in a free market where cars are privately owned. Fleet operators have slightly different incentives: to minimise economic damage to the service, a combination of liability/damages and PR.

To make anything else happen, you'd need to regulate. But a "self-driving altruism act", mandating that the car you just bought must kill your family in order to save pedestrians you don't know, would be really difficult to get passed. You might be able to make some headway with fleets.

IMHO markets, human nature, and politics constrain the solution space for "self-driving moral dilemmas" to a small subset of what's theoretically possible.


> Vehicles that don't prioritise the occupants are not going to sell well versus ones that do.

There are plenty of cases of people trusting the existing automated systems that specifically disavow being good enough to trust anyone's life to, even in light of news that other people have died doing so.


Fine, but I thought we were talking about trolley-problem style ethics here?



