Yes, good point. Let me add some fuel to the fire: what if the machine learns by itself? The base program is still man-made, but everything beyond it is developed by the machine. At what point does one stop calling it programmer error and start saying the machine itself made a mistake?
If a human engineering team creates a car with adaptive software, and that software does not contain sufficient safeguards to ensure it at all times directs the vehicle in accordance with laws and applicable regulations, then the engineering team is liable. That is how engineering works, outside of the software industry. It doesn't matter how 'smart' the car is; it is not a force of nature or a human being. It is a product, and any catastrophe engendered thereby is the responsibility of the organization that produced it.