
Whenever I do embedded work with counters, every time I assign a variable that has a possibility of overflowing I do a quick mental estimate of how likely that is. That's part of the software development process.

They may not have had a meeting about it, but I think it's exceedingly unlikely that whoever decided to assign a 32-bit int to store time didn't give some consideration to the date range it could represent. Otherwise how would they know not to use a 16-bit int?
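To put rough numbers on that trade-off: a signed 16-bit counter of seconds wraps in about nine hours, while a signed 32-bit one lasts roughly 68 years from the 1970 epoch, which is where 2038 comes from. A minimal C sketch of the arithmetic, just for illustration:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Seconds in a Julian year, good enough for a rough estimate. */
        const double SECONDS_PER_YEAR = 86400.0 * 365.25;

        /* A signed 16-bit counter of seconds wraps after 32767 s (~9 hours). */
        printf("int16_t wraps after %d seconds (~%.1f hours)\n",
               INT16_MAX, INT16_MAX / 3600.0);

        /* A signed 32-bit counter wraps after 2147483647 s (~68 years),
           which counted from the 1970 epoch lands in January 2038. */
        printf("int32_t wraps after %ld seconds (~%.1f years)\n",
               (long)INT32_MAX, INT32_MAX / SECONDS_PER_YEAR);

        return 0;
    }

The nine-hour figure is why a 16-bit counter was obviously out, and the 68-year figure is what made the 32-bit choice look safe in the early 1970s.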




They didn't design it in a vacuum. They had already worked on other OSes that used 32-bit ints with a 1-second quantum, and they (probably subconsciously) thought that if it had been good enough for those systems, it was good enough for Unix.



