
The output is currently rounded to six significant digits. In this case, the result is 1.000004..., which is rounded to 1.00000 and displayed as 1. I'm not sure what the best default behaviour would be. Typically, I would say that six significant digits are more than enough for most applications.
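
Roughly what happens under the hood, sketched in Python (round_sig and the sample value are just for illustration, not the actual implementation):

    from math import floor, log10

    def round_sig(x, sig=6):
        """Round x to `sig` significant digits."""
        if x == 0:
            return 0.0
        return round(x, sig - 1 - floor(log10(abs(x))))

    result = 1.000004            # the result mentioned above
    rounded = round_sig(result)  # -> 1.0
    print(f"{rounded:g}")        # prints "1"; trailing zeros are dropped on display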



It's great that you use significant digits, but then you have to be explicit about it and display exactly that many digits, in this case including all the zeros (1.00000).

> I would say that six significant digits are more than enough for most applications

A better solution would be to not use a fixed number of significant digits, but to display the result with the precision of the least precise number the user entered. If you could follow the propagation of uncertainty through the calculations, that would be even better.

https://en.wikipedia.org/wiki/Propagation_of_uncertainty
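
To sketch the idea for sums and products of independent inputs (the U class and the numbers are made up for illustration; the Python `uncertainties` package does this properly, including correlations):

    import math

    # Toy value-with-uncertainty type using first-order (Gaussian)
    # error propagation, assuming independent inputs.
    class U:
        def __init__(self, value, sigma):
            self.value, self.sigma = value, sigma

        def __add__(self, other):
            # absolute uncertainties add in quadrature for a sum
            return U(self.value + other.value,
                     math.hypot(self.sigma, other.sigma))

        def __mul__(self, other):
            v = self.value * other.value
            # relative uncertainties add in quadrature for a product
            rel = math.hypot(self.sigma / self.value,
                             other.sigma / other.value)
            return U(v, abs(v) * rel)

        def __repr__(self):
            return f"{self.value:g} ± {self.sigma:.1g}"

    a = U(2.00, 0.01)   # 2.00 ± 0.01
    b = U(3.0, 0.1)     # 3.0  ± 0.1
    print(a + b)        # 5 ± 0.1  (dominated by the less precise input)
    print(a * b)        # 6 ± 0.2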

PS: Great job anyway, bookmarked, thanks!


Thank you for the feedback!

> It's great that you use significant digits, but then you have to be explicit about it and display exactly that many digits, in this case including all the zeros (1.00000).

Yes, I completely agree from a statistics/physics viewpoint. The reason I display it as `1` right now is that I only have a fixed number of significant digits (and not the sophisticated solution you propose, which would be great!). If I always displayed all six digits, one would get `3 - 2 = 1.00000`...
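
To make the trade-off concrete, sketched in Python (illustrative only, not the actual formatting code):

    x = 3 - 2          # exactly 1
    y = 1.000004       # rounds to 1.00000 at six significant digits

    # Always keep six significant digits (what you suggest):
    print(f"{x:#.6g}")   # 1.00000  <- looks odd for an exact integer result
    print(f"{y:#.6g}")   # 1.00000

    # Strip trailing zeros (current behaviour):
    print(f"{x:.6g}")    # 1
    print(f"{y:.6g}")    # 1        <- hides that this was only ~1.000004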

> A better solution would be to not use a fixed number of significant digits, but to display the result with the precision of the least precise number the user entered. If you could follow the propagation of uncertainty through the calculations, that would be even better.

That would be fantastic. I'll look into it.

> PS: Great job anyway, bookmarked, thanks!

I'm glad you like it!



