The difference is that today, when an accident happens, it's normally clear that it isn't the car's fault--the cause is some combination of humans, pedestrians, bicyclists, and plain acts of God. If the car's brakes fail despite proper maintenance, or an ignition switch fails, the manufacturer gets sued. The wrinkle with the scenario in question is that we don't currently have a norm where a machine's programmed instructions can play a role in an accident without that being considered the fault of the company that programmed the machine.
I assume there would still be insurance. But if my self-driving car causes an accident, I certainly shouldn't be "at fault" from the perspective of my insurance premiums nor should I be able to be sued--as would be the case today.