
I don't see these as opposed principles.

16. They're not smart.

19. You're not smart, either.

Assume everyone is an idiot, including yourself. Verify everything (your own work and other people's work) using robust testing protocols. Don't use your own judgement; you're not smart enough for that. Gather evidence. But test your evidence-gatherers to make sure they're doing their jobs, too. Assume every component of your system-of-people is half-fallen-down and just barely limping along at all times. Make a system that works in the face of everyone coming to work with a concussion.
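A minimal sketch of "testing your evidence-gatherers" (hypothetical names, not anyone's actual tooling): deliberately break the thing under test and confirm the test actually notices, a poor man's mutation test.

    # Test the tester: if a test can't detect a deliberately
    # broken ("mutant") implementation, the test itself is the
    # limping component. Hypothetical sketch.

    def add(a, b):             # the real implementation
        return a + b

    def add_mutant(a, b):      # deliberately wrong
        return a - b

    def test_add(impl):
        # The "evidence-gatherer" we want to verify.
        return impl(2, 2) == 4

    def weak_test(impl):
        # Passes for both implementations: useless evidence.
        return impl(0, 0) == 0

    assert test_add(add)                    # passes on real code
    assert not test_add(add_mutant)         # kills the mutant: good
    assert weak_test(add_mutant)            # fails to kill it: bad tester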

(And, if some component isn't barely limping along, then people are probably being too naive about its potential failures, so inject some chaos [monkeys] into it to get its reliability down to a level where routing around it has to be operationalized, too.)
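In that spirit, a toy fault injector (a sketch of the idea, with hypothetical names; not Netflix's actual Chaos Monkey): wrap a dependency so it fails some fraction of the time, which forces callers to build and keep exercising the fallback path.

    # Toy chaos injection: make a dependency fail at a configurable
    # rate so that routing around it runs often enough that it
    # can't silently rot.
    import random

    def chaotic(fn, failure_rate=0.1):
        def wrapper(*args, **kwargs):
            if random.random() < failure_rate:
                raise RuntimeError(f"chaos: injected failure in {fn.__name__}")
            return fn(*args, **kwargs)
        return wrapper

    def fetch_profile(user_id):
        return {"id": user_id}              # stand-in for a real remote call

    fetch_profile = chaotic(fetch_profile, failure_rate=0.3)

    def get_profile(user_id):
        try:
            return fetch_profile(user_id)
        except RuntimeError:
            return {"id": user_id, "degraded": True}   # operationalized fallback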




In WWII, analysts studied the bullet holes on returning planes to decide where to add armour. The damage wasn't uniform - some areas were riddled with holes, some were never damaged. Abraham Wald's insight was to armour the undamaged areas, because the planes that got shot there never came back.

Similar idea from another angle: superficial design flaws obscure fundamental design flaws (Douglas Adams on the Sirius Cybernetics Corporation Complaints Department)


Or: they’re smart and you’re smart, too



