I feel like probabilities are best stored via the inverse logistic function (the logit), because the closer the probability is to 0 or 1, the less you care about significant digits: the events are going (not) to happen anyway. (That's why I am a proponent of LNS.)
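Not part of the original comment, just a minimal sketch (in Python, assuming plain doubles) of what storing a probability as its log-odds, i.e. the output of the inverse logistic function, looks like:

    import math

    def to_logit(p):
        # Encode a probability as its log-odds: logit(p) = ln(p / (1 - p)).
        return math.log(p / (1.0 - p))

    def from_logit(x):
        # Decode with the logistic (sigmoid) function: p = 1 / (1 + e^-x).
        return 1.0 / (1.0 + math.exp(-x))

    # Each extra order of magnitude of (un)certainty moves the stored value
    # by roughly the same step at either tail:
    print(to_logit(0.999), to_logit(0.9999))   # ~6.91, ~9.21
    print(to_logit(0.001), to_logit(0.0001))   # ~-6.91, ~-9.21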
I feel that it's actually the opposite. There is a world of difference between 0.0...01 and 0.0...001, where ... represents the same number of 0s in both cases. The same applies at the other end.
edit:
Do I really want more resolution between 0.41 and 0.42 than between 0.01 and 0.02?
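(Not from the thread, just a quick check of why the two ends behave differently in a plain double: tiny probabilities near 0 stay distinct, but the same gap is invisible next to 1 unless you store the complement or the log-odds.)

    # Near 0, a double easily tells 1e-17 from 1e-18:
    print(1e-17 == 1e-18)                # False
    # Near 1, the same-sized gap vanishes: both round to exactly 1.0.
    print(1.0 - 1e-17 == 1.0 - 1e-18)    # True
    # Storing log(1 - p), or the logit, keeps the distinction at both ends.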
You're dead on. Metals are sold by how fine they are: 99.9%, 99.95% (which should end in a 7 numeral, not a 5; the whole industry is fucking up right there, but at any rate), 99.99%, and so on. More nines, more purity, better experimental results. When they talk about 4 nines or 5 nines, the actual measurement is of
f(x) = -log10(1 - x). That's what the purity figure really measures: the negative log of the complement. And that's why 4.5 nines should be 99.997%, not fucking 99.995%, so stupid (it's almost exactly 99.997%). Mining engineers, come on! Such impressive degrees! PhDs all over the place! Professors even! I'm sorry, don't mind me with my math.
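(My own arithmetic check of the 99.997% claim, not the commenter's numbers; it is just the nines formula and its inverse.)

    import math

    def nines(purity):
        # Number of "nines": -log10(1 - purity).
        return -math.log10(1.0 - purity)

    def purity(n):
        # Invert: the purity corresponding to n nines.
        return 1.0 - 10.0 ** (-n)

    print(purity(4.5))      # 0.9999683..., i.e. 99.997%, not 99.995%
    print(purity(3.5))      # 0.9996838..., i.e. 99.97%, not 99.95%
    print(nines(0.99995))   # ~4.30: what "99.995%" actually buys you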
Some early experiments, like the first liquid crystal work around 1900, required purities so extreme that the chemical company, Merck I think, complained to the inventor of liquid crystals as if they felt insulted, because it was an absurd amount of purity to demand just to get them actually working. But look at them go! Right in front of your very eyes!
The logistic curve gets denser close to 0 and 1, i.e. the logit gives you more resolution there. Which makes sense: you want to tell apart 1 defect per million from 0.01 DPM, and a 5-sigma process (99.977%) from a 6-sigma one (99.99966%), far more than you want to tell apart 30% from 30.001%.
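(Putting numbers on that, my own arithmetic on the same -log10(1 - p) "nines" scale: the two sigma levels are a hair apart as raw probabilities but clearly separated once transformed.)

    import math

    five_sigma = 0.99977     # ~230 defects per million
    six_sigma  = 0.9999966   # ~3.4 defects per million

    # As raw probabilities the gap is tiny:
    print(six_sigma - five_sigma)          # ~0.00023
    # As "nines", -log10(1 - p), they sit almost two full units apart:
    print(-math.log10(1 - five_sigma))     # ~3.64
    print(-math.log10(1 - six_sigma))      # ~5.47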