
Why? If something was being divided by 0, I'd want to know; otherwise I'll be using wrong calculations.

Because "0" often means "show none", so when dividing by 0, I'm fine with "showing none".

I'm sure it doesn't work for everybody, but I've never had a case where dividing by zero didn't end with "actually, let's count it as 0". A sketch of that convention follows below.
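For illustration, a minimal Python sketch of that convention (the helper name div_or_zero is hypothetical, not any particular library's API):

    def div_or_zero(numerator: float, denominator: float) -> float:
        """Divide, but treat a zero denominator as "show none" and return 0."""
        if denominator == 0:
            return 0.0
        return numerator / denominator

    print(div_or_zero(5, 10))  # 0.5
    print(div_or_zero(5, 0))   # 0.0, instead of raising ZeroDivisionError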


There are several domains where 0 is different from none, for example, most computations involving rates.

Imagine that you are running some A/B tests and you want to track conversions. If one of the experiments received 10 users and had 5 conversions, you want to show 50%. If it received 10 users and had no conversions, you will show 0%.

However, if it has received 0 users, while you could show zero conversions, the correct answer is to say that you don't know the conversion rate. Because maybe, if you had had 10 users, they could have all converted, and the rate would be 100%. You simply don't know.

The same logic applies to computing ROI, interest, velocity, etc.
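A minimal Python sketch of that distinction (the names are illustrative, not from whatever library is under discussion): returning None keeps "no users yet" separate from "a measured 0% rate".

    from typing import Optional

    def conversion_rate(conversions: int, users: int) -> Optional[float]:
        """Return the conversion rate, or None when it is undefined."""
        if users == 0:
            # 0 users means the rate is unknown, not 0%: with 10 users it
            # could have been anywhere from 0% to 100%.
            return None
        return conversions / users

    print(conversion_rate(5, 10))  # 0.5  -> display as 50%
    print(conversion_rate(0, 10))  # 0.0  -> display as 0%
    print(conversion_rate(0, 0))   # None -> display as "n/a", not 0%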


Agreed. I'm not saying there aren't counterexamples; I'm just saying that returning zero when dividing by zero isn't an insane call, and there are valid reasons for doing it. It's a judgment call (and they do provide a function that does the right thing).


