Yeah maybe. I've worked on code where there were constants defined for minutes-in-hour, hours-in-day, etc., though, and that was just annoying; it's conceivable that our culture will one day decide that the Babylonians were idiots and we should use the French revolutionary clock instead, but I'll take my chances.
How many seconds in an hour? Most of the time, 3600, occasionally 3601, and very rarely 3599. Hours in a day? Mostly 24, but 23 once a year and 25 once a year.
These all seem like good reasons to make them functions (taking a timestamp as an argument) rather than mostly-correct constants.
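As a minimal sketch of the "function taking a timestamp" idea, here's a Python hours-in-day helper using the stdlib zoneinfo database; the function name and example timezone are just illustrative, and seconds-in-hour would additionally need a leap-second table, which the stdlib doesn't ship:

```python
from datetime import date, datetime, timedelta
from zoneinfo import ZoneInfo

def hours_in_day(d: date, tz: ZoneInfo) -> float:
    """Length of the local calendar day d in hours (23, 24, or 25 around DST)."""
    start = datetime(d.year, d.month, d.day, tzinfo=tz)
    nxt = d + timedelta(days=1)
    end = datetime(nxt.year, nxt.month, nxt.day, tzinfo=tz)
    # Aware-datetime subtraction is done in UTC, so DST gaps/overlaps are handled.
    return (end - start).total_seconds() / 3600

# US spring-forward day in 2024: only 23 hours long.
print(hours_in_day(date(2024, 3, 10), ZoneInfo("America/New_York")))  # 23.0
print(hours_in_day(date(2024, 3, 11), ZoneInfo("America/New_York")))  # 24.0
```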
I swear, the more I learn about calendars and timekeeping, the more I realise I never ever want to deal with it.
More realistically, software will run on other celestial bodies which have different time periods (e.g. the Moon, Mars, asteroids, etc.).
For me, the use of those annoying constants is a form of mental relief: I don't have to remember why I'm dividing by 60 or 100 or whatever in this specific spot, it's written out in a way that reading it provides the context.
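For what it's worth, a tiny illustration of that readability point (hypothetical names, Python):

```python
SECONDS_PER_MINUTE = 60

# The named constant documents why we're dividing by 60 here.
elapsed_minutes = elapsed_seconds / SECONDS_PER_MINUTE
# versus the bare magic number:
# elapsed_minutes = elapsed_seconds / 60
```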