Yes, good point. Let me add some fuel to the fire: what if the machine learns by itself? The base program is still man-made, but the rest is developed by the machine. At what point does one say the machine made a mistake rather than the programmer?



If a human engineering team creates a car with adaptive software, and that software does not contain sufficient safeguards to ensure it directs the vehicle in accordance with laws and applicable regulations at all times, then the engineering team is liable. That is how engineering works outside of the software industry. It doesn't matter how 'smart' the car is; it is not a force of nature or a human being. It is a product, and any catastrophe it causes is the responsibility of the organization that produced it.
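
For what it's worth, those "safeguards" have a concrete shape in engineering practice: a fixed, auditable envelope wraps the adaptive component and overrides whatever it proposes before it reaches the actuators. Here is a minimal Python sketch of that pattern; all names and limits are hypothetical, chosen purely for illustration:

    from dataclasses import dataclass

    @dataclass
    class Command:
        speed_mps: float      # requested speed, meters per second
        steering_rad: float   # requested steering angle, radians

    # Fixed, reviewable limits derived from law/regulation, not learned.
    SPEED_LIMIT_MPS = 13.4    # e.g. a 30 mph zone
    MAX_STEERING_RAD = 0.5

    def learned_policy(sensor_state: dict) -> Command:
        """Stand-in for the adaptive, self-modifying part of the system.
        Its internals may be opaque; here it just proposes something unsafe."""
        return Command(speed_mps=20.0, steering_rad=0.8)

    def safety_envelope(cmd: Command) -> Command:
        """Deterministic guard: clamps any proposed command to the legal
        envelope. This layer is plain, auditable engineering; a fault here
        is unambiguously the producing organization's error."""
        return Command(
            speed_mps=min(max(cmd.speed_mps, 0.0), SPEED_LIMIT_MPS),
            steering_rad=max(-MAX_STEERING_RAD,
                             min(cmd.steering_rad, MAX_STEERING_RAD)),
        )

    if __name__ == "__main__":
        raw = learned_policy({})
        safe = safety_envelope(raw)
        print(f"learned policy proposed: {raw}")
        print(f"envelope actually sent:  {safe}")

However smart the inner policy becomes, the outer clamp is ordinary, reviewable code, and that is the layer the engineering team answers for.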


But no other engineering industry faces this issue; they don't build intelligent machines. The question still stands.


You appear to be confusing 'engineering self-driving cars' with 'creating a sentient being'. Programmers are only capable of one of these tasks.



