
Yeah, maybe. I've worked on code where there were constants defined for minutes-in-hour, hours-in-day, etc., though, and that was just annoying; it's conceivable that our culture will one day decide that the Babylonians were idiots and we should use the French revolutionary clock instead, but I'll take my chances.



How many seconds in an hour? Most of the time, 3600, occasionally 3601, and very rarely 3599. Hours in a day? Mostly 24, but 23 once a year and 25 once a year.

These all seem like good reasons to make them functions (taking a timestamp as an argument) rather than mostly-correct constants.
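A rough sketch of what that could look like (Go here just for concreteness; hoursInDay and the Berlin example are my own invention, and note that Go's time package, like most standard libraries, pretends leap seconds don't exist):

  package main

  import (
      "fmt"
      "time"
  )

  // hoursInDay returns the length of the civil day containing t, in hours.
  // It is 23 or 25 on DST transition days, 24 otherwise.
  func hoursInDay(t time.Time) float64 {
      y, m, d := t.Date()
      start := time.Date(y, m, d, 0, 0, 0, 0, t.Location())
      end := start.AddDate(0, 0, 1) // midnight of the next civil day
      return end.Sub(start).Hours()
  }

  func main() {
      berlin, err := time.LoadLocation("Europe/Berlin")
      if err != nil {
          panic(err)
      }
      // DST begins in Berlin on 2024-03-31, so that day is an hour short.
      springForward := time.Date(2024, 3, 31, 12, 0, 0, 0, berlin)
      fmt.Println(hoursInDay(springForward)) // prints 23
  }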

I swear, the more I learn about calendars and timekeeping, the more I realise I never ever want to deal with it.


Tell the reader that this code is prepared for leap years, leap seconds, and leap hours:

  SecondsPerStandardHour = 3600  // or NormalHour, TypicalHour
  HoursPerStandardDay    = 24    // or NormalDay, TypicalDay


More realistically, software will run on other celestial bodies that have different time periods (e.g. the Moon, Mars, asteroids).

For me, the use of those annoying constants is a form of mental relief: I don't have to remember why I'm dividing by 60 or 100 or whatever in this specific spot; it's written out in a way that provides the context as I read it.
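A minimal illustration of the difference (Go for concreteness; the constant name just follows the convention upthread):

  package main

  import "fmt"

  // Hypothetical name, in the SecondsPerStandardHour style above.
  const SecondsPerStandardMinute = 60

  func main() {
      elapsedSeconds := 4980
      // Magic number: the reader has to work out what 60 means here.
      fmt.Println(elapsedSeconds / 60)
      // Named constant: the context is written at the point of use.
      fmt.Println(elapsedSeconds / SecondsPerStandardMinute)
  }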



