Hacker News new | past | comments | ask | show | jobs | submit login

> Give it a decade and I bet the ~585 year time frame is reduced to a small enough interval where bugs appear.

I'm skeptical that much changes in the next decade regarding where 64-bit integers are useful. It's been a long time since there have been meaningful clock frequency increases in commodity CPUs. Even if we had a 100 GHz CPU incrementing a register once per clock cycle, it would still take ~6 years for the register to overflow. And that's assuming a 20+ times increase in clock frequency, when clock frequencies have been mostly flat for the last 10 years.
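The arithmetic behind those figures is easy to check. A minimal sketch, assuming a counter incremented once per clock cycle (the 100 GHz CPU is the hypothetical from above, not real hardware):

```python
# Back-of-the-envelope check of the overflow estimates above.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def years_to_overflow(bits: int, hz: float) -> float:
    """Years until a `bits`-wide counter, incremented `hz` times per
    second, wraps around."""
    return (2 ** bits) / hz / SECONDS_PER_YEAR

print(years_to_overflow(64, 1e9))    # ~1 GHz: roughly 585 years
print(years_to_overflow(64, 100e9))  # 100 GHz: roughly 5.8 years
```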

> a programmer in the future commenting that 128-bits should have been used, and 128-bits will break in the future of that future, and so on.

I disagree with this statement. With 128 bits, it becomes difficult (though not impossible) to even come up with quantities numerous enough to exceed what that many bits can count. That was never the case for 64-bit or smaller integers.
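To put a number on how different the 128-bit case is, here's the same back-of-the-envelope calculation for 128 bits, reusing the hypothetical 100 GHz counter (the ~13.8-billion-year age of the universe is an assumed reference point for scale):

```python
# Scale of a 128-bit counter at 100 GHz increments (hypothetical).
SECONDS_PER_YEAR = 365.25 * 24 * 3600
AGE_OF_UNIVERSE_YEARS = 1.38e10  # roughly 13.8 billion years

years_128 = (2 ** 128) / 100e9 / SECONDS_PER_YEAR
print(years_128)                           # on the order of 1e20 years
print(years_128 / AGE_OF_UNIVERSE_YEARS)   # billions of universe-ages
```

Even at that implausible increment rate, the counter outlives the current age of the universe by a factor of several billion, which is the sense in which 128 bits is qualitatively different from 64.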


