Insane complexity of calendrically correct date and time operations (yourcalendricalfallacyis.com)
256 points by mtmail on Sept 20, 2018 | 140 comments



I've encountered just about every item on this list working on my Apple Watch complication, Better Day[1], which supports 11 calendar systems in 21 languages. The funny thing is, the last item on this list — always use ICU through the NSCalendar API — is the exact conclusion I arrived at, and one that I preach to anyone who will listen.

Especially with modern Swift syntax, complex questions like "what's the current day of the year" end up having simple, almost poetic answers like "calendar.ordinality(of: .day, in: .year, for: date)". And you can trust that they are correct for every calendar and every locale. Usually. [2]
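
For contrast, here's the closest plain-C answer to that question. A minimal sketch, and Gregorian-only, which is precisely the argument for going through ICU instead:

    /* Day of the year via struct tm: works, but only for the
       POSIX/Gregorian calendar, unlike Calendar/ICU. */
    #include <stdio.h>
    #include <time.h>

    int main(void) {
        time_t now = time(NULL);
        struct tm local;
        localtime_r(&now, &local);
        printf("day of year: %d\n", local.tm_yday + 1); /* tm_yday is zero-based */
        return 0;
    }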

[1] https://itunes.apple.com/us/app/better-day-a-complication/id...

[2] https://www.joeycastillo.com/notes/2016/09/18/of-crescent-mo...


I only skimmed your article, but are you sure that's right? There must have been at least several date adjustments before your app was released, yet the system time maintained the correct date. Presumably it'll correct afterwards, invalidating the date offset.


The thing to keep in mind is that before the sighting, the months are pre-calculated for when the first crescent moon should rise over Mecca. So a change only affects the single month where the observation differed. For example, if you're expecting to see the first crescent moons on October 10, November 9 and December 8, but you didn't see the first one until October 11, that doesn't push the moon sightings out for the rest of the year. It just means that the month you thought would end on October 9 will have an extra day, and the month you thought would start on October 10 will start a day later (and as a result will be a day shorter).

Future months are still totally fine; the date offset is just a temporary fix that the user remembers to turn on when the announcement is made, and to turn off at the end of the month.


The leap second has a history of breaking computers. Pretty much every Linux server running Java broke in 2012, for example. 2009 saw a bunch of Linux kernels crashing in logging code. 2016 was mostly better except for Cloudflare: their systems had an assumption time never runs backwards. https://blog.cloudflare.com/how-and-why-the-leap-second-affe...



Imho that's no better. Smearing a leap second changes the second lengths. It means introducing changes at some other scale, e.g. a few micro- or nanoseconds per second. Software needs to be able to cope with this.


I had the same thought but...

The smear technique results in seconds that are 11.6μs longer than nominal, and Google makes the case this is within the "manufacturing and thermal errors of most machines' quartz oscillators". So at face value, I'd say software already copes with discrepancies of that order.

However, I would infer the maximum delta actually seen by an NTP client is larger and depends on the client's polling interval. E.g., if a machine runs a normal clock locally and synchronizes every 20 minutes (which is probably on the high side), it would see about a 14 ms jump on each sync during the 24-hour smear window.
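
A quick back-of-the-envelope check of those two figures (numbers from the published 24-hour linear smear, nothing measured):

    /* One extra second spread over 86400 seconds, sampled every
       20 minutes; reproduces the 11.6 us and ~14 ms figures above. */
    #include <stdio.h>

    int main(void) {
        double smear_window = 24.0 * 3600.0;      /* 86400 s */
        double stretch      = 1.0 / smear_window; /* extra s per s */
        double poll         = 20.0 * 60.0;        /* 1200 s NTP poll */
        printf("extra length per second: %.2f us\n", stretch * 1e6);
        printf("jump per 20-min sync:    %.2f ms\n", stretch * poll * 1e3);
        return 0;
    }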

That's assuming the NTP client doesn't already use a smear algorithm locally to correct for drift (Windows doesn't, but I think Red Hat Linux might? Per https://access.redhat.com/solutions/39194 "NTP polling does not directly synchronize the local system clock to the server clock; rather, a complex algorithm calculates an adjustment value for each tick of the local system clock")

Maybe someone more well-versed can elaborate further.


Not wanting to be a complete BSD bigot, I can't actually remember if this broke as badly in the {Free,Net,Open}BSD world. The 2015 docs for FreeBSD don't help (oh, for a time machine), but they do boil down to "you can test what we do", which is good.

https://www.freebsd.org/doc/en_US.ISO8859-1/articles/leap-se...


BSD is actually worse: it doesn't handle leap seconds at all. It simply ignores their existence and treats the resulting offset as clock inaccuracy.

That's fine as far as it goes, but make sure never to use a BSD machine as an NTP server or you will cause a lot of confusion for any servers downstream.


I don't know about OpenBSD or NetBSD, but I've witnessed FreeBSD handling a leap second correctly. I also witnessed OpenNTPD ignoring it, and that's documented, so I suspect OpenBSD doesn't handle leap seconds at all.



Not to jump on the bigotry, but two of the three bugs mentioned in that comment were in proprietary software (Java and RRDNS) that merely runs on Linux. The hrtimer issue in 2012 was a real kernel bug though.


The 2012 bug was a bad interaction between Java and thread locking in the Linux kernel. It affected MySQL as well. The 2009 bug was in the Linux kernel, in the logging code. The 2016 bug I mentioned was in Cloudflare's proprietary code and only had a big impact because so many websites use Cloudflare. There are zillions of other smaller leap second related bugs, these are only three of the most visible.

(btw, the word "bigotry" has a real meaning and this isn't it.)


Java was open sourced in 2006.


There's now "Google time".[1] This handles leap seconds by making each second slightly longer, starting 12 hours before the leap second and ending 12 hours after it.

[1] https://developers.google.com/time/smear


While I wouldn't want to be the one responsible for designing, testing, implementing and converting to using said system, it does seem like the easiest way to handle the problem for the majority of cases in this day and age, where measuring 1/86,400th of a second is trivial. If only we were deciding how to implement leap seconds for the first time right now...


And there's also UTC-SLS.

But a more important distinction than whether you smear over +/-12 hours or +/-1000 seconds is whether you broadcast a bogus time signal to all your hapless servers, which is what Google does, I think, or whether you fix your kernel and C library so that the system knows the real time but reports UTC-SLS to those programs which use the old APIs and are therefore presumed to be incapable of understanding the real time. For example, clock_gettime(CLOCK_REALTIME, &res) might provide UTC-SLS (or whatever) for naive programs while the same syscall with CLOCK_TAI or CLOCK_UTC might give you the actual real time correctly.
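
On Linux you can already peek at half of that idea: CLOCK_TAI exists, while the CLOCK_UTC above is hypothetical. A minimal sketch; it only reports real TAI if something like ntpd or chronyd has set the kernel's TAI offset (otherwise the difference prints as 0):

    #define _GNU_SOURCE   /* CLOCK_TAI on older glibc headers */
    #include <stdio.h>
    #include <time.h>

    int main(void) {
        struct timespec utc, tai;
        clock_gettime(CLOCK_REALTIME, &utc); /* POSIX time, leap-second-blind */
        clock_gettime(CLOCK_TAI, &tai);      /* Linux-specific clock id */
        printf("TAI - REALTIME = %ld s\n", (long)(tai.tv_sec - utc.tv_sec));
        return 0;
    }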


Old APIs won't be the only problem; it's easy to introduce bugs like this, new API or not (it's much easier to say "use monotonic timers" these days, but it's still easy for devs not to use them).


Christ Church College at Oxford still makes limited use of "local time", offset 5 minutes and 2 seconds from the time zone.

https://greenwichmeantime.com/info/oxford/


Jolly good, the whole railroad mania with their cum tempore timetables never fulfilled their fantasy of a unified time zone for Old Blighty.


This could very well have been called “Falsehoods programmers believe about date and time calculations”. It fits very well into the pattern of other similarly titled lists which have been published in recent years.


https://infiniteundo.com/post/25326999628/falsehoods-program...

edit: which cites https://infiniteundo.com/post/25326999628/falsehoods-program... (from HN's own patio11) as inspiration, as the original "Falsehoods programmers believe about X".



You're right, thank you. I missed the edit window.



Here's one such list for anyone interested:

https://github.com/kdeldycke/awesome-falsehood


Those lists make a point of not backing up any claims made.

I much prefer this format.


This is actually a lot simpler than it looks. There are two concepts.

Calendars almost never have crazy holes in the middle of them. When you say "I'm in Chicago", you pretty much can schedule an appointment at 1 or 2pm on "Sunday" or "9/20" or "next Wednesday" and everybody knows when that will happen. Leap years cause a little bit of trouble.

Where it gets tricky is when you have to interact with wall clock times, say, calculating the number of hours until your calendar event happens, or looking for all the events in a given range of wall clock dates. This is the world of timestamps, offsets, and "the global timeline" that everyone in the world experiences together.

Calendars also don't really care about daylight saving time.

If you keep these two separate (the calendar versus arithmetic/wall-clock times) and are specific about which one you're dealing with, 90% of this goes away. Not all of it, but a surprisingly large amount.

Source: working on https://www.interval.org/ and using the excellent NodaTime library. Thank you, Jon Skeet.


> Calendars almost never have crazy holes in the middle of them.

The comment right above yours at the moment talks about effectively that very problem (in this case, an unforeseen date moving a calendar back a day).

https://news.ycombinator.com/item?id=18036403


The Problem with Time & Timezones - Computerphile https://www.youtube.com/watch?v=-5wpm-gesOY

A fantastic overview of the headaches you'd have if you wanted to code all this.


> Weeks start on Sunday in the United States, Monday in Europe, and a couple of places start on Saturday.

Wow. I live in Europe, and find it weird that people can consider the week to start on Sunday.

I guess everything that may appear normal within a cultural frame of reference can be off in another, even the most basic of things!


It's not a U.S./Europe dichotomy. The first day of the week being Sunday is a Judaeo-Christian, and Islamic, thing.

For countries with these as official religions you will find that officially Sunday is still the first day of the week for many purposes. For example: in the U.K., officially still Protestant, a week for the purposes of statutory sick/maternity/paternity pay or national insurance begins on a Sunday. Sunday trading legislation was talking about weeks commencing on Sunday as recently as 1950.

* http://www.legislation.gov.uk/uksi/2014/1640/article/2/made

* http://www.legislation.gov.uk/ukpga/Geo6/14/28/section/22/en...

If you think that millennia-old traditions, which you can still find observed in Europe if you look, are weird then academic weeks beginning at noon on Wednesday, examples of which can also be found in both the U.S. and Europe, will probably blow your mind.


> I live in Europe, and find it weird that people can consider the week to start on Sunday.

You are almost certainly neither Portuguese nor Maltese.

In these two European languages, the days of the week are literally numbered as if Sunday were the first day of the week.

Portuguese: Monday is "segunda-feira", which means something like "second market day".

Maltese: Monday is "it-Tnejn", which AFAICT means "the second [day]".

In both languages, most of the other days have similar names, representing their position in a week that starts on Sunday.


It's the same in Greek: Monday means "second", but still everybody considers it the first day of the week.


What is the relevance of saying that a week starts on a particular day? In the most common usage, the "week" starts on Monday in the United States, and concludes on Friday, to be followed by the "weekend". A calendar week of 7 days does start on Sunday, but who cares? If they changed how the calendars were printed, life would be different in no way.


In parts of the world, weeks of the year are numbered, so that you can talk about things happening "in week 43", and so on. For such a system, it's of course quite important to agree on which day a new week begins. Also (again, as a European) it feels very odd to not have the weekend (Saturday and Sunday) be at the end of the week, but instead "straddle" between two weeks.

I found the original article a bit strange in its partitioning of the world into { US, Europe, some places }. I feel that is giving ... undue weight to the US convention, but I haven't studied this to see if I'm right.


In the US, weeks are not numbered, and the way to refer to a week is e.g. "the week of the 10th". (If you're talking about a different month, "the week of September 10th".) You would usually name the Monday of the week because the context is usually work-related.

I question whether "Thursday of week 37" is really that much clearer than "Thursday, September 13".


Perhaps it's not, but it is a bit more compact, and even more so in my native language, Swedish. The names of months are longer: "September", which is the same word in Swedish, has four syllables, while the word for week ("vecka") has two.

And sometimes you really want to speak about the entire week without naming a day ("we'll have the first boards week 40, smoke-test and bring-up during week 41, and then start proper development by week 42"), since it's times in the future and you don't want to be too specific but still align work tasks with weeks.

Also (re: a sibling comment) Google Calendar shows week numbers, or can be configured to. For quickly finding out the week number, there's always https://vecka.nu. :)


> Also (re: a sibling comment) Google Calendar shows week numbers, or can be configured to. For quickly finding out the week number, there's always https://vecka.nu. :)

KDE and i3 desktops can be configured for pretty much arbitrary date formats; I always have the week number (KW) in there.


Indeed, it isn't. I'm living in a country where my company as well as our (foreign) customers use week numbers all the time. It's painful. My watch doesn't show the week number. A normal calendar doesn't. The company calendars do, but there's never one nearby when I need one. So I have a bookmark in my web browser to a 'week calendar' on the net. I can use a 'date' command, but the actual +%<something> syntax always disappears from my mind, so I have to look it up. Week numbers are just a pain. You can't even be sure about week 1 or week 52; it depends on the weekdays that year.
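
For the record, the specifiers do exist, both in strftime(3) and as date +%V and friends, and they're easy to mix up. A minimal C sketch (the same letters work with date(1)):

    /* %V = ISO 8601 week number, %G = matching ISO week-based year
       (can differ from %Y around New Year), %U = Sunday-based week,
       %W = Monday-based week. */
    #include <stdio.h>
    #include <time.h>

    int main(void) {
        char buf[64];
        time_t now = time(NULL);
        struct tm local;
        localtime_r(&now, &local);
        strftime(buf, sizeof buf, "ISO week %V of %G (Sunday-based: %U)", &local);
        puts(buf);
        return 0;
    }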


I wrote a date parsing library and spent a fair bit of time trying to wrap my brain around week numbers. Definitely messy.

It never occurred to me to check my Apple Watch for a week number complication. A bit disappointed it's not there, even if I'd never use it.

Update: you can install an app on Apple Watch that supplies a complication for it: "Current Week".


So for all intents and purposes, the week actually starts on Monday in the US, if I understand what you're saying.

For instance, if I were to tell you "Beginning of next week, I'm gonna X", you'd understand Monday, correct?


If you said that to me at work, I'd understand Monday, because Sunday is not part of the work week.

I'd likely understand it as Monday in other contexts too, because it's very rare for something to start on Sunday. Monday and Saturday are the transition points and thus the logical places to start doing something.

But I might understand it as Sunday if context seemed to point that way. What I'm saying is that nothing is organized according to "the beginning of the week", so the question of which day is the beginning of the week doesn't have any implications.


No hard implications, but it might influence how you number the days: C99's tm_wday uses 0-6 with 0 = Sunday, which suggests a week starting with Sunday (particularly since tm_mon uses 0-11 with 0 = January), but it is also compatible with the way the days are numbered in Chinese, with Monday = 1. And it might be related to how you work out which months have 4 weeks and which have 5: it's unusual, but some part-time employees get 4 weeks' pay at the end of most months but 5 weeks' pay at the end of some months, bizarrely.
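
Since the two numberings trip people up, a one-line conversion sketch from tm_wday (0-6, 0 = Sunday) to an ISO 8601 weekday (1-7, 1 = Monday):

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        time_t now = time(NULL);
        struct tm local;
        localtime_r(&now, &local);
        int iso_wday = local.tm_wday == 0 ? 7 : local.tm_wday;
        printf("tm_wday = %d, ISO weekday = %d\n", local.tm_wday, iso_wday);
        return 0;
    }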


> it's unusual, but some part-time employees get 4 weeks' pay at the end of most months but 5 weeks' pay at the end of some months, bizarrely.

Why would this be unusual? I'd expect pay-by-the-week to be more common for part-time employees, but it's not rare for FTEs either.


Well, for me as a dev it changes how I render a week calendar to a user.


I mean … I was aware of the fact that the week starts on Sunday in a LOT of other places. But still.

So Sunday, the first day of the week, is still part of what's called the weekend, right?


> I mean … I was aware of the fact that the week starts on Sunday in a LOT of other places. But still.

> So Sunday, the first day of the week, is still part of what's called the weekend, right?

I've always interpreted "the weekend" as a shortening of "the weekends." Sunday is at one "end" of the week and Saturday is at the other "end." Sort of like if you hold a stick out, there's a "near end" and "far end."


Correct. Which makes sense, if one thinks about it: the start of something is one end of it. (Said tongue in cheek.)


> Wow. I live in Europe, and find it weird that people can consider the week to start on Sunday.

Netherlands reporting in. My parents always taught me that the week starts on Sunday.


Funny, I'm Dutch and I learned the week starts on Monday. Well, except for strongly religious families.


My parents were pretty religious in their youths, and were from conservative parts of the country. Perhaps that has something to do with it.


Many of these make the weird assumption that you want to support multiple calendars. For better or worse, this is rarely the case.


I think that's a bit naive.

Properly supporting calendrical calculations serves two major purposes:

1. Serve the billions of people who don't use the Gregorian calendar or who live in areas that don't match your idea of time zones.

2. Function as a human form of "platform independence". Properly handling dates/times quickly exposes places where you've made silly assumptions or accidentally broken things. This protects you from actual bugs and ensures you avoid whatever the future version of the Y2K problem is. Such bugs are always easier to fix when you have a smaller database and fewer users.*

* In general bugs and breaking changes _never_ get easier. Every day you delay increases the pain imposed on future you and users. Obviously there is a balance - a startup needs to worry about getting to "default alive" first and foremost - but don't delay too long and where it doesn't impose an inordinate cost do it right the first time.


> Serve the billions of people who don't use the Gregorian calendar or who live in areas that don't match your idea of time zones.

There isn't that much use of non-Gregorian calendars in software, even in places that do make use of them. And if you do support them, it's another source of bugs, as you have to deal with new Japanese era names and the like: https://blogs.msdn.microsoft.com/shawnste/2018/04/12/the-jap...


I didn't say you would never want to support multiple calendars, but like I said: for better or worse, most software written will not.


Yes, or that you care about historical aspects.

If we go to a far enough time scale, all calendars are constructed abstractions, so theoretically nothing about them is true, but I'm not sure what the point of that would be.


That was my thought reading this list too. I am sure every calendar system will have a list of calendrical fallacies like this. But if you do decide to support the Hebrew calendar, you will not get “The current year is 2018” wrong. Or the number of digits in a year, if you choose to support the Japanese calendar. Actual fallacies in those systems will be less obvious.


I invented this very regular calendar while I was writing recurring-scheduling software for a scheduling tool:

https://calendars.wikia.com/wiki/6*6*10_regular_calendar

It was my fantasy escape from the crazy irregularities of the Gregorian calendar.


Can you imagine the transition to that calendar? People wanting their two days off after every five working days is just one of the problems.


It will never happen. The US can't even get on the metric system. The UN couldn't even reform the calendar to have uniform quarters: https://en.wikipedia.org/wiki/World_Calendar http://strangeside.com/time-sabbath-on-tuesday-364-day-year/


I think one of the best attempts to fix the calendar system was made by George Eastman (the Kodak founder), who presented the "International Fixed Calendar", created by Moses Cotsworth, to the League of Nations in 1923. At that time the League of Nations was trying to redesign the calendar system and was accepting different proposals.

Unfortunately they failed to come to a consensus between the different calendar designs :(

https://en.wikipedia.org/wiki/George_Eastman

https://en.wikipedia.org/wiki/Moses_B._Cotsworth

https://en.wikipedia.org/wiki/International_Fixed_Calendar

https://en.wikipedia.org/wiki/International_Fixed_Calendar_L...

https://gizmodo.com/how-the-quest-for-a-perfectly-rational-c...


That calendar's weeks start on Sunday and I would rather have hours vary by length of daylight throughout the year than start weeks on bloody Sunday.


Here's my personal favorite, GMT +0h 19m 32.13s, which was used in the Netherlands until July 1, 1937: https://en.wikipedia.org/wiki/UTC%2B00:20


This really should be an operating system service, so all the applications get it right. Instead, everybody rolls their own, with erratic results.



Only Microsoft's seems to be part of the operating system.


"False. The UNIX epoch is January 1, 1970 in UTC, but is Dec 31, 1969 in Los Angeles."

This isn't specific to Los Angeles, right? It's just referring to the fact that anyone with a negative UTC offset would see an epoch date of Dec 31, 1969 in their local time (I hope).
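
A minimal C sketch to check it (assumes a system with IANA tzdata installed):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(void) {
        time_t epoch = 0;
        char buf[64];

        strftime(buf, sizeof buf, "%F %T", gmtime(&epoch));
        printf("UTC:         %s\n", buf);  /* 1970-01-01 00:00:00 */

        setenv("TZ", "America/Los_Angeles", 1);
        tzset();
        strftime(buf, sizeof buf, "%F %T", localtime(&epoch));
        printf("Los Angeles: %s\n", buf);  /* 1969-12-31 16:00:00 */
        return 0;
    }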


One related wrinkle that can catch out UK people (who tend to conflate local non-summer-time with UTC) is that the epoch is not midnight 1 Jan 1970 in UK time -- we were in the middle of a "UK time is GMT+1 all year round" experiment in 1970...


Yes.

This reminds me of an anecdote from a talk on timezones. Shortly after Stephen Hawking died, the Google search snippet for "When did Stephen Hawking die" said "Tomorrow" for people in the US.

It was pulling the date from Wikipedia, but it was still the previous day in the US.


Hm. No. The UNIX epoch is in UTC, so it's always January 1, 1970. The tick (at this moment: 1537511644) is always relative to January 1, 1970. The timezone (TZ) only comes into it when that epoch time is translated to a local time with 'date' and other tools.


True.


False. It’s the year 5779 in the Hebrew calendar. שנה טובה


שנה טובה


I’ve heard that Perl 5 has the best/most complete DateTime library. Is that true?


It cares about leap seconds so yes, it's quite accurate.


Bonus problem: in my company, the business calendar (which week is the first of the year, when quarters end) is sometimes shifted by one week with respect to the ISO definitions, for historic reasons.

I have implemented a large application which does a lot of time calculations, and ran into most of the issues listed in the article, except that I didn't use Hebrew calendars :).


For quick scripts, I do all my date math with seconds-since-the-epoch, retrieved with "date", and then convert back at the end. The final date benefits from the latest timezone package update, so mostly I do OK.


This mostly works without problems for past times, but it can be wrong for future times, since timezone/DST rules may change. When you want 2pm next Sunday, you want 2pm next Sunday local time, not whatever 2pm next Sunday is supposed to be as of now.
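
To illustrate in C: bumping the calendar fields and letting mktime renormalize respects a DST change in between, while adding 7 * 86400 seconds to a time_t does not. A sketch, with a made-up helper name:

    #include <time.h>

    /* "2pm next Sunday, local time" done in calendar space. */
    time_t two_pm_next_sunday(time_t from) {
        struct tm t;
        localtime_r(&from, &t);
        int days = (7 - t.tm_wday) % 7;   /* tm_wday: 0 = Sunday */
        if (days == 0) days = 7;          /* "next" Sunday, not today */
        t.tm_mday += days;                /* mktime renormalizes the date */
        t.tm_hour = 14;
        t.tm_min = t.tm_sec = 0;
        t.tm_isdst = -1;                  /* let mktime decide DST then */
        return mktime(&t);
    }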


I seem to remember Tom Scott doing a really good little video about this.



The whole problem results from people using the same time for user display and for time calculations. Display time is a user-facing feature that Should Not Influence system time. If software used https://en.wikipedia.org/wiki/International_Atomic_Time, none of the problems related to calendars would occur. TAI has no leap seconds or any calendar corrections at all.


Just learned Google Calendar uses the timezone you're currently in by default, even if you set the location back to your home. When I got back from travel, the notification was decidedly off. :)


Google’s “helpful” treatment of time zones decidedly isn’t. I wish there were a way to tell Google Calendar to ignore timezones altogether.

E.g. I am in Vietnam. I make a dentist appointment at 2 pm for when I am back in the US. I put the appointment on the calendar. I get to the US on the day and find the appointment is at 1 am the day before. Worse still, it’s in the opposite direction, because I won’t get the alarm until the event has passed by 11 hours.

Stop being clever, Google. If I put an event in my calendar for 2 pm, just leave it there. Yes, yes, I could hunt down the timezone menu item every time. Better yet, I should finally get around to transitioning out of that terrible company’s services.


Is it true that Unix time is as simple as it seems? Is there exactly one second between adjacent integers? Does every Unix time value occur only once, simultaneously around the world?


What you want is probably TAI [1]; Unix time is affected by leap seconds.

[1] https://en.wikipedia.org/wiki/International_Atomic_Time


The one mistake with UNIX is that UTC is used. It should have been TAI, and leap second adjustment should have been handled by the timezone package (but at the time people weren't worried about leap seconds, so it's understandable). As it is, the system clock has to change. This is as bad as MSDOS time (and other systems which put localtime into the actual RTC). Using UTC as the time reference, independent of local time, was nearly genius. With TAI it would have been pure genius. No, wait: using an epoch of the Big Bang, and a step size of the Planck time, now that would have been pure genius. It doesn't take as many bits as you may think.


Frankly, I'm a bit confused why civil time in general is based on UTC instead of TAI. Except for astronomy, who benefits from observing leap seconds?


Civil timekeeping is based on UTC because we want clocks and calendars to remain synchronised with the sun. It's not just astronomers who care about whether the sun is above the horizon. If you think about it, the solar day is the one unit of time that almost everybody cares about and pays attention to. If you screw up the day, you also screw up the week and anything else that is based on counting days. Why would you want to do that?


But the offset of UTC from TAI is like 37 seconds or something poxy like that. Nobody cares about this; the sun moves too slowly for that to make any difference.

There's been talk of setting UTC=TAI for quite a long time and I think it'd make sense. It'd take so long for TAI to drift from the actual rising and setting of the sun that we'd probably all have standardised on Swatch Beats by then anyway.


Yes, the difference is 37 seconds now, but it's growing quadratically. It's quite possible that we'll change the way we express times and dates, but it's almost certain that we'll want them to stay in phase with the sun, and we'll probably want to stick with the SI second and with something like TAI. My guess is we'll continue to want a simple relationship between TAI (or its successor) and the civil clock time, something like the current relationship according to which they are a whole number of seconds apart. That implies that we'll need something like leap seconds (or leap minutes perhaps). There doesn't seem to be a reasonable alternative.


In 1950 astronomers pointed out that there would have to be two kinds of time, one to agree with calendar days and one to be as uniform as possible. Arguments over subsequent decades inexplicably decided that there could only be one kind of time specified by international agreements, and we ended up with a choice of two out of three characteristics in what we now call UTC. https://www.ucolick.org/~sla/leapsecs/picktwo.html


Your website is a true hidden gold nugget; thanks for putting in the time to write all that up.

As an expert (as far as anyone around here is), what would your pick be for common civil time? Personally I feel like "precise time and simplicity" is almost the obvious choice, but apparently it is not quite that clear cut.


"Simplicity" is rather subjective, even if you answer the question: simple for who?

Civil time has to stay in phase with the sun. As I see it, that's not negotiable. Inserting leap seconds, so that the nanosecond field of UTC remains the same as the nanosecond field of TAI, and the jumps that occur are negligible for ordinary people, seems to me overall the simplest solution, though I can see that UTC-SLS would be simpler for some people in some situations, and switching to leap minutes or leap hours would be simpler for people living now, who could then just ignore the problem. (Pollution and global warming and lots of other things can be treated in the same way, of course. Perhaps some of these things really will be easier to solve in the future, but I'd rather not rely on it.)


> "Simplicity" is rather subjective, even if you answer the question: simple for who?

I used simplicity in the meaning provided by the link in the parent comment, referring to the three desirable properties of time systems: "Every "day" has 86400 "seconds" (60·60·24)."

> Civil time has to stay in phase with the sun. As I see it, that's not negotiable.

I don't see why that needs to be the case on a seconds level.

> switching to leap minutes or leap hours would be simpler for people living now, who could then just ignore the problem. (Pollution and global warming and lots of other things can be treated in the same way, of course. Perhaps some of these things really will be easier to solve in the future, but I'd rather not rely on it.)

Considering that the need for a leap hour would appear in over 500 years, I feel like trying to predict the situation then is really borderline arrogant.

Also, a leap hour would basically be a timezone shift, and I bet we will be doing timezone changes anyway in the next 500 years.


Not at all. Just because it involves an "hour", rather than a "second", doesn't mean that it resembles a "timezone shift". It's totally different.

With a timezone shift, all that happens is that "09:00:00 +0900" is the same as "10:00:00 +1000". We can cope with that. But if you make UTC jump back an hour, then we have "09:59:59 +1000" followed by "09:00:00 +1000", and then the whole previous hour happens again. The internal timestamps in computer systems (typically expressed as a number of seconds since some epoch) repeat themselves for an hour. Causality is violated. Most computers stop working. You would probably have to switch them off beforehand to prevent data loss and even longer interruptions to service. You could shut down all the servers and desktop machines, stop all public transport and so on, but you can't just turn off the computers in embedded systems and satellites and so on...

So let's not try to arrogantly predict the situation in 500 years. Let's carry on with the established system of leap seconds, at least until someone comes up with a sensible alternative. Then there won't be a "situation" in (less than) 500 years.


I feel like throwing in a leap hour (basically a tz shift) once in a millennium would be the more reasonable solution, if for nothing else than letting future generations deal with it instead of trying to futilely pre-empt problems that are not really problems yet.


What do you mean by growing quadratically, exactly? The rotational drift of the Earth is growing that fast? Where can I read more about this?


The length of the day is increasing roughly linearly at roughly 2 ms (per day) per century, because of tidal effects. The difference between TAI and UT1 is the integral of that, so grows quadratically. If we assume that the mean solar day was "correct" around 1900, then a hypothetical atomic clock that was synchronised to the sun around 1900 is, after x centuries, out by about 36525·x² ms. TAI was in fact synchronised to UTC around 1960, when atomic clocks were widely used, and the slowing down of the Earth's rotation is rather unpredictable in the short term (decades), perhaps being affected by climate change, so the numbers are all imprecise, but the long-term (centuries, millennia) quadratic growth of the TAI-UT1 difference is inevitable.
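
Spelled out as an integral (same numbers as above, nothing new):

    % excess length of day at t centuries: about 2t ms
    % days per century: about 36525
    \Delta(x) \approx \int_0^x 2t \cdot 36525 \, dt = 36525\, x^2 \ \text{ms}

which gives about 36.5 seconds after the first century.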


Yes, roughly quadratic increase of LOD over the long term, but over the short term more like a random walk. Right now the Earth's crust is rotating faster than it did a century ago because things have sped up. For a view over 2 millennia see plots of LOD at https://www.ucolick.org/~sla/leapsecs/dutc.html


You need to learn about Arthur David Olson's "right" timezones.


Or GPS time, which is UTC w/o the leap seconds.


Thanks! That's what I was looking for.


No, because Unix times can repeat in the case of leap seconds (this is necessary to be able to represent future times correctly).
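
To illustrate, here's the 2016 leap second under the POSIX rule (time_t = days × 86400 + seconds of day; real kernels may instead step back or freeze rather than literally repeat):

    2016-12-31 23:59:59 UTC  ->  1483228799
    2016-12-31 23:59:60 UTC  ->  1483228800   (the leap second)
    2017-01-01 00:00:00 UTC  ->  1483228800   (same value again)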


You can avoid repetitions by using clock_gettime(CLOCK_MONOTONIC, …).


But that's no longer Unix time, it's an arbitrary value that's usually something along the lines of "number of seconds that the CPU has been awake since boot".
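
That's its purpose, though: measuring durations unaffected by clock steps or smears. A minimal sketch:

    #include <stdio.h>
    #include <time.h>
    #include <unistd.h>

    int main(void) {
        struct timespec a, b;
        clock_gettime(CLOCK_MONOTONIC, &a);
        sleep(1);                          /* the work being timed */
        clock_gettime(CLOCK_MONOTONIC, &b);
        double s = (b.tv_sec - a.tv_sec) + (b.tv_nsec - a.tv_nsec) / 1e9;
        printf("elapsed: %.3f s\n", s);
        return 0;
    }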


Unix time depends on the wall clock time, so it is a wall clock time, so it can jump around arbitrarily when the computer clock is adjusted, and its seconds can be stretched as needed for slewing adjustments. Applications need to cope with that.


It depends on the Unix, and on decisions by the local system administrator.

* https://unix.stackexchange.com/a/327403/5132


As the ephemeral Tom Scott also demonstrates [0], even IF you account for all the listed factors (and many more) you are STILL sometimes out of sync, just due to the nature of the power grid itself.

As Albert said: time is what clocks measure. The more you dig into it, the more you realize how true that is!

[0] https://www.youtube.com/watch?v=bij-JjzCa7o


Wisdom and experience from the senior hacker developer: "Oh, you're working with time and dates? That is going to get 'complicated'."

Let's propose a new time format: we could count the number of exoseconds since the beginning of the universe, then convert that number to whatever native time format we want. Store that as a number and use it for conversion. Age of universe: 4.415×10^17 seconds, 4.415×10^26 exoseconds.


Not universal enough: areas near a black hole have experienced less time, the CMB has experienced 0 exoseconds, and even earth and space will slowly diverge. This will just make a lot of future programmer cyborgs very unhappy.


You are reinventing TAI64NA badly.

* http://cr.yp.to/libtai/tai64.html
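
The format is tiny, for the curious. A minimal encoder sketch per the spec above (the TAI = UTC + 37 s offset in the example is an assumption, valid as of 2018):

    /* TAI64N: 8 big-endian bytes holding 2^62 + seconds, then
       4 big-endian bytes of nanoseconds (TAI64NA appends 4 more
       bytes of attoseconds). */
    #include <stdint.h>
    #include <stdio.h>

    void tai64n_encode(uint64_t tai_sec, uint32_t nsec, unsigned char out[12]) {
        uint64_t label = ((uint64_t)1 << 62) + tai_sec;
        for (int i = 0; i < 8; i++)
            out[i] = (unsigned char)(label >> (56 - 8 * i));
        for (int i = 0; i < 4; i++)
            out[8 + i] = (unsigned char)(nsec >> (24 - 8 * i));
    }

    int main(void) {
        unsigned char buf[12];
        tai64n_encode(1537511644 + 37, 0, buf); /* the tick quoted upthread */
        for (int i = 0; i < 12; i++) printf("%02x", buf[i]);
        putchar('\n');
        return 0;
    }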


Choose a non-Earth-centric time standard. People speculate about how a space-faring society would be able to synchronize time and location under relativistic physics. The answer is an intergalactic GPS using faraway quasar locations and frequencies. The Voyager probes located Earth for a future viewer using a pulsar map.


still not stable over long time periods unless you account for stellar drift (which gets more and more uncertain over time)


From the list: All the important years are four digits long

This left out the Long Now Foundation's advocacy for five-digit years, to deal preemptively with the Y10K bug. And, less tongue-in-cheek, to promote a view of time that is not conventional in this day and age.



Reminds me of another great post on the topic by Zach Holman: https://zachholman.com/talk/utc-is-enough-for-everyone-right


> Weeks start on Sunday

> False. Weeks start on Sunday in the United States, Monday in Europe, and a couple of places start on Saturday.

In some countries weeks start on Mondays.

> All the important years are four digits long

> False. It’s the year Heisei 30 in the Japanese calendar.

Or 13.0.5.15.0 in the Maya calendar.


The only question I've got is: why can't we as humans fix this by having a proper calendar system that actually reflects the world and doesn't have these flaws?


The world is irregular (Earth's rotation and revolution durations vary) but we want regularity in our calendars and clocks. That's the source of the problem.


What do you mean by "actually reflecting the world"? If you mean using objective criteria of some sort, how would that help deciding if the week starts on Sunday or Monday, or what days of the week are to be business days?


What surprised me the most on that list was that the Unix epoch is December 31, 1969 in Los Angeles.


Don't let that answer confuse you. It seems to be phrased in an intentionally confusing way, but it's really only talking about time zones. At midnight 1970-01-01 UTC, Los Angeles, North America, and roughly half of the world still had their calendars on the previous day, because they have a negative time zone offset (Los Angeles was UTC-8:00).


Summer of love has echoes down the years...


Yours in calendrical heresy, JS


Just as well this article doesn't contain a technique for locally making P=NP. I've never trusted Vidona, and I never will.


After I read to the end, I half expected it to load more calendar fallacies.


A properly programmed petaflop computer should be able to handle this.


I really dislike the tone these sort of things tend to take, which is “here’s some stuff you think but you’re wrong about.” The reason people get stuff like this wrong (and names, and addresses, and and and) is because it’s really hard to get right. And it’s added difficulty on top of writing code which is already pretty challenging.


Especially when it's all stuff that's already been covered in very similar essays.

Actually introducing good libraries and methods for handling these issues properly would be much more useful than a smug collection of fallacies and a tacked-on link to ICU at the end.


I don’t understand this objection... the method to handle these issues is to use ICU / NSCalendar.


Or a runtime equivalent, e.g. the java.time package gets this stuff right.


The main point is “don’t be naive, since it’s more complicated than you think.”

Knowing it’s complex, the next step is to get help, not hack up your own solution.

I liked the tone... sort of a “can you believe this?!”


programming in a nutshell


Imagine we did not have names for months and days. Would you call the tenth month October or December? It does not make sense. Then, the other months at the start of the calendar, would you name them after phases of a military campaign of preparation for war, conquering, looting and pillaging? Would that be deemed politically correct?

On the weekday names we are not doing any better. Tuesday - named after the god of war? Doesn't make sense to me.

As for the lengths of the months, imagine if you were trying to make the case for the different lengths of months and you were trying to get buy in from your friends in accounts. Surely they would want equal amounts of days in each quarter? Imagine trying to persuade them that different lengths were better, what plausible possible reason could there be?

We learn the names of days and months by rote at a very young age. We get encultured into accepting the status quo and never questioning little details - nobody asks at school why 'Dec'-ember is month twelve.


>"Then, the other months at the start of the calendar, would you name them after phases of a military campaign of preparation for war, conquering, looting and pillaging?"

Do you have a reference for this? Besides Mars, how are the others related? Also, it would make a lot of sense to name periods of time after what you are doing during that time.

Edit:

Also, I remember reading that the one thing the pharaohs of ancient Egypt had to pledge before taking power was not to change the calendar again, since changing it according to whatever the current whims were caused even more chaos. At some point the "deep state" of the time just wouldn't stand for it anymore.

Edit2:

Found the ref. Actually I mentioned it here a few years ago:

"It must have been, then, that there were local attempts to retain the coincidences between the true and the calendar year — intercalation of days or even of months being introduced, now in one place, now in another ; and these attempts, of course, would make confusion worse confounded^ as the months might vary with tlie district, and not with the time of year.

That this is what really happened is, no doubt, tlie origin of the stringent oath required of the Pharaohs in after times, to which I shall subsequently refer.

[...]

When the year of 365 days was established, it was evidently imagined that finality had been reached ; and, mindful of the confusion which, as we have shown, must have resulted from the attempt to keep up a year of 360 days by intercalations, each Egyptian king, on his accession to the throne, bound himself by oath before the priest of Isis, in the temple of Ptah at Memphis, not to intercalate either days or months, but to retain the year of 365 days as established by the Antiqui.^ The text of the Latin translation preserved by Nigidius Figulus cannot be accurately restored; only thus much can be seen with certainty.

To retain this year of 365 days, then, became the first law for the king, and, indeed, the Pharaohs thenceforth throughout the Avhole course of Egyptian history adhered to it, in spite of their being subsequently convinced, as we shall see, of its inadequacy." https://archive.org/stream/dawnastronomyas00lockgoog/dawnast... https://news.ycombinator.com/item?id=11017726#11020076


Simple: call them by their ordinal/cardinal names. That's what East Asian calendars do. October is "month ten".

https://en.wikipedia.org/wiki/Japanese_calendar#Months

https://en.wikipedia.org/wiki/Names_of_the_days_of_the_week#...


That's what we did in Latin! October is month 8! And then some dude messed it up.


Yeah, the dude changed the definition of where the new year starts, moving it back two months. It's still arbitrary; it should have been at the (slightly fuzzy) winter solstice...


> As for the lengths of the months, imagine if you were trying to make the case for the different lengths of months and you were trying to get buy in from your friends in accounts. Surely they would want equal amounts of days in each quarter? Imagine trying to persuade them that different lengths were better, what plausible possible reason could there be?

The year is 365 + epsilon days long. The epsilon means we need to periodically add or remove days to synchronize solar time with the seasonal calendar; in the Gregorian calendar, we sprinkle 97 such days every 400 years. Armed with this complexity, you have several options:

1. Accept that a month is going to need an extra day about once every 4 years.

2. Have the extra day be not part of the regular cyclical rotation (i.e., be intercalated). This is probably even more complex for bookkeeping than option #1.

3. Insert an extra month's worth of days on a longer period of time (this is essentially what lunar calendars do, but the effect of leap days themselves are less noticeable since the lunar month is quite out of whack with the solar calendar).

As for months themselves, you run into the problem that 365 and 366 are not amenable for evenly dividing the year into months: their factors are 5×73 and 2×3×61, respectively. 360 is much nicer (2³×3²×5, i.e., is divisible by every number between 2 and 10 but 7); unsurprisingly, a few cultures opted to use 360 as their basis for month divisions. The Mesoamericans used a calendar that was 18 months of 20 days, with 5 extra days that were intercalated.

As it turns out, the lunar month is about 29.5 days long, and the moon is a very obvious cyclical timekeeper (much more obvious than keeping track of days than the sun's seasonal positioning changes). Using a lunar month is going to give you a natural 29/30 alternating month period. However, you end up short by around 11 days when you do this. If you sprinkle them around the months, you find yourself naturally having 7 months of 30 days and 5 months of 31 days.
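
A quick check of the arithmetic in this subthread, the 97-in-400 leap-day count and the 7×30 + 5×31 split:

    #include <stdio.h>

    int is_gregorian_leap(int y) {
        return (y % 4 == 0 && y % 100 != 0) || y % 400 == 0;
    }

    int main(void) {
        int leaps = 0;
        for (int y = 1; y <= 400; y++)
            leaps += is_gregorian_leap(y);
        printf("leap days per 400 Gregorian years: %d\n", leaps); /* 97 */
        printf("7*30 + 5*31 = %d\n", 7 * 30 + 5 * 31);            /* 365 */
        return 0;
    }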

In terms of calendrical complexity, no one beats the Mayans. They opted for a calendar that has three different lengths of the year: a ceremonial calendar that counts 13 months of 20 days, a solar calendar that is 18 months of 20 days with a 19th month of 5 days, another calendar for long-term record keeping that counts from an epoch in base-20, except that one digit is base-18 instead (for a "year" of 360 days). On top of this, they have a "week" of 9 days, and they counted the current day of the lunar cycle, the current lunar cycle number, and whether the current lunar cycle was 29 or 30 days long. And there's another 819-day cycle that we still don't know what it corresponds to. Oh, and you count months and days like "January 1, February 2, March 3, ..., December 12, January 13, February 14, July 31, August 1, ..."



