I posted a comment saying "a common mistake is thinking 10% = No" and you've responded with something that looks a lot like "you're making a mistake thinking 10% = Yes".
My suggestion is you don't have enough information to claim I'm making any specific mistake.
You prepare for a 1 in 10 chance event which other people don't prepare for; i.e. you act exactly as if you think it will certainly happen, but you tell yourself you don't think it will certainly happen?
This sounds like a distinction without meaning. Arguing "people mistake my preparation for a belief that it will happen, so they are dumb" is weird. You're acting as if you think it will happen, so of course people think you think it will happen. Why else would you bother preparing? You're not buying 10% of the sandbags you'd need, you're buying 100% of them, because nothing else would do.
"Oh but I'm smarter" but your belief is leading you to take the same actions as the person who thinks 10% == yes, so in what way is it smarter? In what way is it different?
> You prepare for a 1 in 10 chance event which other people don't prepare for; i.e. you act exactly as if you think it will certainly happen, but you tell yourself you don't think it will certainly happen?
"Prepare for" is not the same thing as "think it will certainly happen". It's a case of making your cost-benefit calculations appropriately. You don't do all the things that you'd do if you thought it was 100% going to happen, you do a cheap subset.
This doesn't answer my question. If you prepare for it, then you are acting as if you think it will happen. That is, if you thought it would definitely happen you would do the same thing you are doing now, so the two states are indistinguishable except for the story in your head. And the main point of the story seems to be "I don't think it will happen, therefore I'm better than the people who do think it will happen, even though we are both expending the same effort, doing the same preparations, to avoid the exact same problem".
> If you prepare for it, then you are acting as if you think it will happen.
That isn't the case. Preparation isn't a binary state either - people can be more or less prepared for an occurrence. The estimated likelihood of an event changes how much time and effort should be spent preparing for it.
There is a big difference in how people behave if they think there is a 0%, 0.05% or 98% chance of being mugged when they go outside, for example. The 0.05% person would rationally start avoiding people late at night or not carrying valuables. The 98% person would not go outside at all.
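A quick sketch of that scaling, with a made-up loss figure: a precaution is roughly worth taking only if it costs less than the probability times the loss it avoids, so the rational response looks very different at those three probabilities.

```python
# Rough break-even spend on precautions at different mugging probabilities.
# Rule of thumb: spending more than probability * loss avoided is not worth it.
# The loss figure is a made-up stand-in.

loss_if_mugged = 2_000  # cash, phone, hassle

for p in (0.0, 0.0005, 0.98):
    break_even = p * loss_if_mugged
    print(f"p = {p:8.4%} -> worth spending up to ~{break_even:,.2f}")

# p =  0.0000% -> 0.00     : do nothing
# p =  0.0500% -> 1.00     : only near-costless habits make sense
# p = 98.0000% -> 1,960.00 : staying home looks cheap by comparison
```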
Of course, if you persist in dismissing everything that makes a difference, then you will never figure out what the difference is.
You are being inconsistent in the way you presume (mostly mistakenly, I believe) various opinions and motives on the part of Scott Alexander, while attempting to reject opinions, motives and rational thought as being relevant to the behavior of people who prepared for C19.
Well, I agree that "chanting" (i.e. explaining to someone who will not listen, apparently) will not change anything (and least of all your understanding of the issues), but actually practicing cost-benefit analysis effectively does make a difference - as we can see, for example, in the difference in how C19 is progressing in South Korea and in Britain.
Are you absolutely sure you are across all the flaws in my character and decision-making process after my two short comments? You've got practically no idea at all how I approach planning, and you're already telling me a bunch of things about myself that I never said.
> "Oh but I'm smarter" but your belief...
Case in point.
The article says Nate Silver had Trump's chances at a little worse than 1 in 3. The average person is going to sit through 17 presidential elections.
I don't have to be very clever to say that Trump winning is totally consistent with the model, and anyone who was caught by surprise has failed to understand probability. 29% is not "no chance of victory", and it isn't really that surprising. It isn't even a forecast of defeat, if you think about it: it is evidence that a defeat was likely, but a modeled probability is not a forecast - it is evidence used in forecasting.
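As a rough back-of-the-envelope check (treating elections as independent, which is a simplification), events in the ~30% range should happen several times over those 17 elections, so seeing one land is entirely unremarkable.

```python
# How often should a ~29% upset happen over 17 elections?
# Treats elections as independent, which is a simplifying assumption.

p_upset = 0.29
n_elections = 17

expected_upsets = n_elections * p_upset
prob_never = (1 - p_upset) ** n_elections

print(f"expected upsets in a lifetime of elections: {expected_upsets:.1f}")  # ~4.9
print(f"probability of never once seeing an upset:  {prob_never:.2%}")       # ~0.30%
```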
I'm really claiming that most people will hear "10% chance of disaster!", and after it doesn't pan out nine times they'll start talking about the boy who cried wolf or stopped clocks when the disaster strikes the tenth time. I have a massive advantage over them, because that is exactly the wrong attitude towards probabilities.
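The arithmetic behind that claim, assuming independent 10% events (an idealisation rather than a model of any real disaster):

```python
# What a repeated "10% chance of disaster" warning actually implies,
# assuming the years are independent (an idealisation).

p_disaster = 0.10

nine_quiet_years = (1 - p_disaster) ** 9     # ~0.387
hit_within_ten = 1 - (1 - p_disaster) ** 10  # ~0.651

print(f"chance the first nine warnings all come to nothing:  {nine_quiet_years:.1%}")
print(f"chance of at least one disaster within ten warnings: {hit_within_ten:.1%}")
# A string of misses is entirely consistent with the 10% figure being right.
```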