
> purely out of retribution, which is ultimately pointless.

If a mad scientist were working on a better bomb design and accidentally blew up a million people through negligence, would your perspective be different? What about a bridge that collapses and kills people? Engineers go to prison for that. There needs to be accountability for what is, demonstrably, one of the most dangerous acts that can be committed.

I would be perfectly OK with the lab being shut down/redesigned if a flaw in its design were found, and with any negligent scientists and safety officers being removed/prosecuted.




Blaming process failures on individuals rarely yields results. Can you give me one example of a time it worked: where merely punishing an individual, without making any process changes, has prevented a class of accidents from that incident up to now?

My take is that any process relying on a single individual is ultimately a failure. All sorts of things happen to individuals, including losing the ability to fear the consequences (think degenerative brain diseases: cancer, Alzheimer's, etc.). With the fear of consequences eliminated, your process fails open and retribution accomplishes nothing. If some scientist at a bomb lab wants to kill 1 billion people, including themselves, no punishment is going to deter them. "An eye for an eye", while appealing to our reptile brains, can't scale beyond a single murder. So your process for preventing mass murder can't include it if it's to be effective.

Punishment doesn't bring your loved ones back from the dead. All we can hope for is a system that doesn't fail so easily next time.


I actually assume the policy is fine. Part of any policy is, necessarily, enforcement. In any safety-critical policy, someone holds final accountability. That accountability is the pressure that guarantees the policy is followed. This is why we enforce laws rather than merely write them. Criminal negligence around safety-critical systems is a real possibility; it happens frequently and is prosecuted often. If there's no enforcement, then the policy is fiction.

I highly recommend you look into accountability in safety-critical systems. There are many, many decades of policy and data showing that accountability is a necessary component. It's what keeps them honest/careful.

(I’ll try to be back with references and a good example)


> It's what keeps them honest/careful.

A better way to phrase it: it's what ensures that the words of the policy become actions.



