It's an attempt to make an abstraction concrete. Think of it as the trolley problem in real life.
Stalin is famously supposed to have said, "one death is a tragedy, 100,000 is a statistic". Cynical or not, it is how humans think.
> If you believe we should never use nuclear weapons, then don't have them at all.
Strategic game theory and Mutual Assured Destruction depend on the possibility that the other guy will use them if you do, and may be the only way to prevent their use. Interestingly this is one reason why you want the other guy to know your procedures, capabilities, deployments etc. Secret weapons have no deterrent value.
> Think of it as the trolley problem in real life.
Well exactly... doesn't that show you that it's a bad idea? People don't know whether they could bring themselves to throw the switch, even when everyone agrees it makes rational sense.

You're taking a rational, well-considered, strategic decision... and making the interlock a messy, personal, emotional one unrelated to the actual issue at hand. That sounds like the wrong way around to be doing things?
> Well exactly... doesn't that show you that it's a bad idea?
I don't think so, no. Sometimes we think too abstractly and make what turn out to be poor decisions. Emotions are really valuable heuristics and should be harnessed at a time like this.
Absolutely not, mutually assured destruction only works if both sides know that the other is committed to carrying out a retaliatory strike in the minutes before their death. It’s essential that the person in the position to order a retaliatory strike be someone ready to kill hundreds of millions of people for no reason other than the fact that they said they would. Putting emotional barriers between that person and the codes they need to carry out that enormous responsibility just makes it less likely that they will be able to follow through. If there’s sufficient uncertainty about whether there will be a follow-through then the nuclear arsenal loses its deterrence factor and we’re back to having to live with the fear that our rational enemies may carry out a first strike on us.
> Absolutely not, mutually assured destruction only works if both sides know that the other is committed to carrying out a retaliatory strike in the minutes before their death.
Not really. For deterrence to fail, you would need to be absolutely certain that the other party won't carry out a retaliatory strike before they're destroyed.

The only thing that matters is that the other party is capable of indiscriminate destruction, not the certainty that they'll actually do it.

It's like punching someone in the face while they're holding a gun.
Trolley Problems are themselves a bad idea... the Kobayashi Maru is a similar exercise. I, like Kirk, don't believe there are situations that can't be worked around if there is time to think and resources to act.
Isn't the Trolley Problem, by definition, time-sensitive? If you had more time to think and resources to act, it wouldn't be a Trolley Problem.
If the answer to launch-nukes-by-cutting-a-human-aide is "well, I need more time to think" then maybe that's a good outcome?