It's great that you use significant digits, but then you have to be explicit about it and display exactly that many digits, in this case including all the zeros (1.00000).
> It's great that you use significant digits, but then you have to be explicit about it and display exactly that many digits, in this case including all the zeros (1.00000).
Yes, I completely agree from a statistics/physics viewpoint. The reason I currently display it as `1` is that I only support a fixed number of significant digits (not the sophisticated solution you propose, which would be great!). If I always displayed all six digits, one would get `3 - 2 = 1.00000`...
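To illustrate the two display modes (a sketch in Python, not the site's actual code): the `g` format trims trailing zeros, which is why the result shows as `1`, while the alternate form `#` keeps all six significant digits.

```python
# Six significant digits, trailing zeros trimmed (current behavior):
print(f"{3.0 - 2.0:.6g}")   # 1
print(f"{1 / 3:.6g}")       # 0.333333

# The '#' alternate form keeps trailing zeros for the 'g' presentation,
# i.e. the "explicit" display the comment above asks for:
print(f"{3.0 - 2.0:#.6g}")  # 1.00000
```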
> I would say that six sign. digits are more than enough for most applications
A better solution would be to not use a fixed number of significant digits, but to display the result with the precision of the least precise number the user entered. If you could follow the propagation of uncertainty through the calculations, that would be even better.
https://en.wikipedia.org/wiki/Propagation_of_uncertainty
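A minimal sketch of what that could look like, using the standard first-order formulas from the article above for two independent quantities a ± da and b ± db (function names are illustrative, not from the site):

```python
from math import hypot

def add_unc(a: float, da: float, b: float, db: float) -> tuple[float, float]:
    # For a sum (or difference), absolute uncertainties add in quadrature.
    return a + b, hypot(da, db)

def mul_unc(a: float, da: float, b: float, db: float) -> tuple[float, float]:
    # For a product, relative uncertainties add in quadrature.
    p = a * b
    return p, abs(p) * hypot(da / a, db / b)

val, unc = add_unc(3.0, 0.1, 2.0, 0.1)
# val == 5.0, unc == sqrt(0.1**2 + 0.1**2) ≈ 0.14
```

The calculator would then round the displayed result so its last digit matches the magnitude of the propagated uncertainty.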
PS: Great job anyway, bookmarked, thanks!