Leap seconds: Causing bugs even when they don’t happen (berthub.eu)
228 points by ahubert on Aug 3, 2021 | 172 comments



What we'd like to have:

1. Use SI seconds

2. Keep in sync with earth rotation

3. Avoid leap seconds

Pick any two:

- UT1: 2, 3 (not 1) [https://en.wikipedia.org/wiki/Universal_Time]

- TAI: 1, 3 (not 2) [https://en.wikipedia.org/wiki/International_Atomic_Time]

- UTC: 1, 2 (not 3) [https://en.wikipedia.org/wiki/Coordinated_Universal_Time]
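
For concreteness, a rough Python sketch of how the three relate numerically (the 37 s offset is exact as of 2021; the DUT1 value changes daily, and the ~183 ms magnitude is the figure cited in a comment below):

  # Offsets between the three timescales (mid-2021 values):
  TAI_MINUS_UTC = 37       # SI seconds; exact, changes only at a leap second
  DUT1 = -0.183            # UT1 - UTC in seconds; measured by the IERS,
                           # kept within +/-0.9 s by leap seconds

  def utc_to_tai(t_utc):
      # TAI runs ahead of UTC by the accumulated leap-second offset
      return t_utc + TAI_MINUS_UTC

  def utc_to_ut1(t_utc):
      # UT1 needs the empirical DUT1 correction; it cannot be predicted
      # far in advance, only measured
      return t_utc + DUT1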


Who is the "we" who wants to keep "in sync" with the Earth's rotation to such high precision? Obviously having 3:30 pm suddenly become the middle of the night would be bad, but we're talking a delta of 18 seconds spread over almost 50 years... hardly big enough to be called "out of sync" in any noticeable sense. Maybe in a millennium or two we'll lose the cultural context of what "it's 5:00 somewhere" meant, but I'm sure the historians could add a blurb.


Astronomers and navigators. These were also the first people who needed accurate time and the first capable of measuring it accurately, so much of the expertise is in astronomical or naval institutions. This was still very relevant in the 1950s when the modern time systems were being developed.

There is a growing movement in the timekeeping community that agrees that keeping UTC close to UT1 is no longer important and that leap seconds should be abandoned.


I tend to agree. Let astronomers use a rotation-synchronized time. (Aren't they bothered by jumps anyway? They probably need a more gradual skew, which is to say, not SI seconds.) Let everyone else use TAI, and in a few millennia when high noon isn't noon anymore, we'll worry about it then.

Oh by the way, unless you live smack in the middle of a timezone, solar noon isn't clock noon anyway. Plus it gets all jacked up with daylight savings (which should also be abandoned), so the whole notion of leap seconds is farcical on its face.


This is the kind of half-assed solution that could be called procrastination.

It's obvious the only solution is to slow the Earth's rotation down so we have no need for leap seconds while maintaining high noon precision for millennia.


The article addresses that nicely in its 4th sentence:

" ... one day I would like to do the math on the “great leap second gyroscopes” that we could mount near the poles to steady the Earth’s rotation, so we can stop talking about this. We may occasionally have to desaturate these gyroscopes with huge rockets also."

I have an alternative solution, which is that we stop the Earth's rotation and get it tidally locked to the sun, so it's always 00:00:00.000 midnight at Greenwich UK. This'll screw up the ski resorts in New Zealand, but anyone who's spent a winter in Invercargill on the South Island there will agree that it'd be much much nicer as a tropical paradise.


Or just change the orbit of the earth and moon. You'd solve the leap seconds problem && climate change!!!


Yeah, I think the real mistake was when everyone decided to use UTC instead of TAI. I can't actually hold the astronomers to blame for this. More likely it was telephone network engineers or someone at a stock exchange.


> Let everyone else use TAI, and in a few millennia when high noon isn't noon anymore, we'll worry about it then.

Or when we exist as a society on the moon and Mars and in the asteroid belt - and the relevance of the rotational angle of earth will be no more interesting to almost everybody the same way local time in Greenwich UK doesn't matter to 23/24ths of the planet.

(Hopefully that's well before the "few millennia" it'll take for TAI to be radically different to what we use UTC for these days, and "soon enough" that it's worth taking into account already.)


I don’t think this actually benefits navigators and astronomers. You just can’t predict local area noon based on clock time without additional data. Even when you have that data, it won’t be accurate to the second anyway.

Even if you are at the exact center of the time zone, clock noon is only rarely at solar noon. See https://en.m.wikipedia.org/wiki/Equation_of_time


If you’re on a ship and want to get longitude, you don’t predict local noon. You measure local noon and compare it to a clock that was set to noon at a known longitude (such as Greenwich), and correct the offset using the equation of time.


That doesn't actually work. You can't measure local noon accurately enough for it to give a useful longitude. My celestial navigation textbook estimates the accuracy of this method as 15' under the best possible circumstances, and 40' under more realistic circumstances.

Even if you ignore this issue and assume you could somehow measure local noon exactly, you still would need almanac data in order to do the correction. Once you are involving almanac data, it stops mattering if local noon has any connection at all to 12:00 on your clock.


> Obviously having 3:30 pm suddenly become the middle of the night would be bad, but we're talking a delta of 18 seconds spread over almost 50 years... hardly big enough to be called "out of sync" in any noticeable sense.

And the Julian calendar was 'good enough' for a long time… until it wasn't and Pope Gregory XIII had to announce the skipping over of 10 days so that (e.g.) the Spring Equinox actually lined up with the season of spring more accurately.

I understand the sentiment of wanting to simplify things, but kicking the can down the road so it's someone else's problem is wrong in my eyes.


Large parts of the world shift civil time by an hour twice a year. Drifting 1 day every 200,000 years doesn't sound like kicking the can down the road.

If human civilisation still exists by the time we've reached just 1 hour of difference we'll be occupying multiple bodies throughout the solar system, if not further afield, and have to have come up with a new time system anyway. Even the length of a second varies because of relativity.


Yeah, we already do the Feb 29 thing every 4 years. We make exceptions to that rule every 100 years, and an exception to the exception every 400 years. Those leap years make larger oscillations than any leap second ever tries to correct for, and we're totally fine with that. Adding another exception every 200,000 years doesn't sound unexpected at all.


They do not. Those adjust the calendar, not the clock.


A calendar is just a clock that measures longer durations of time.


But one of these adjustments aims to keep midday at noon, while the other aims to keep the solstice around 21 Dec.

You could skip leap years and still have the sun up at noon, while your solstice date drifts away. Or, you could skip leap seconds and still have the solstice on 21 Dec, but have your noon drift away from midday.


Very few locations have the sun directly above at 12 (based on a timezone offset from UTC), and just as many places have it directly above at 12 based on an offset from TAI.

The sun is overhead at Greenwich at 12:06 GMT today; it varies by about half an hour, from 11:43 to 12:14, through the year (ignoring daylight saving).

I'm not sure if varying from 11:50 to 12:21 would be any better. Even after a minute of leap seconds (say 100 years), the "sun is overhead at 12:00:00" point at UK/Ireland latitude would have moved about 10 miles from its current position. In a few millennia the timezone would be wrong, but that's easily corrected by skipping either a "spring forward" or a "fall back" shift.
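
A quick back-of-the-envelope check of that 10-mile figure (Python):

  import math

  deg_per_sec = 360 / 86400                # Earth turns 1 degree per 240 s
  shift_deg = 60 * deg_per_sec             # one minute of leap seconds = 0.25 deg
  km_per_deg = 111.32 * math.cos(math.radians(51.5))  # degree of longitude at ~51.5 N
  print(shift_deg * km_per_deg)            # ~17 km, i.e. roughly 10 miles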


Irrelevant in this context.


> Yeah, we already do the Feb 29 thing every 4 years.

Except it's not every four years.

It's every four years, but it is skipped if the year is divisible by 100… but not skipped if it is also divisible by 400—which ended up biting people in 2000 which was a leap year:

* https://slate.com/technology/2016/02/the-math-behind-leap-ye...


> > Yeah, we already do the Feb 29 thing every 4 years.

> Except it's not every four years.

> It's every four years, but it is skipped if the year is divisible by 100… but not skipped if it is also divisible by 400—which ended up biting people in 2000 which was a leap year ….

Unless they edited, that's exactly what your parent said, after you cut off the quote:

> Yeah, we already do the Feb 29 thing every 4 years. We make exceptions to that rule every 100 years, and an exception to the exception every 400 years.


> Large parts of the world shift civil time by an hour twice a year.

Yes, and how many things break every time this common thing occurs? Or every four years when February 29 rears its ugly head.

And these are code paths that should be well tested because of their frequency.


Point is we can cope with the sun directly overhead at 14:20 (as it is today in Madrid), or at 11:20 (as it is in Warsaw in November).

A drift of a minute over a lifetime makes no difference.

We should just use TAI (with timezone offsets) for general time and be done with it.

Even today UT1 is noticeably different from UTC (183ms), so it's not like using UTC is accurate. Why are 183ms (no leap milliseconds), a couple of hours (timezone sizes), and one hour (DST) all acceptable differences, but we have to correct for 40 seconds each century?


In 100,000 years when 12pm is in the middle of the night, no one is going to care. It will have been that way for all of living memory, and the fact that noon used to be midday will be nothing more than a historical curiosity. The drift is far too slow for anyone to notice.


I've met grown adults that think 12 PM is the middle of the night already. (They don't understand PM vs AM.)


> I've met grown adults that think 12 PM is the middle of the night already. (They don't understand PM vs AM.)

That's because it doesn't make any sense to describe the middle of the day by '12 PM', since 'PM' should mean "after midday". To be sure we have conventions about this, but not knowing those arbitrary conventions is no reason to think that people don't understand the concept. I think better usage is 12 noon and 12 midnight (https://en.wikipedia.org/wiki/12-hour_clock#Confusion_at_noo... )—though that still requires further disambiguation at midnight (is "midnight on August 3" between August 2 and 3, or between August 3 and 4? I don't know whether there's a common disambiguation here, as with 12 n versus 12 m).


The inconsistency is in using 12 instead of zero, which results in four discontinuities per day (12:59 AM → 1:00 AM, 11:59 AM → 12:00 PM, 12:59 PM → 1:00 PM, 11:59 PM → 12:00 AM). The times should be divided into 0:00 AM through 11:59 AM and 0:00 PM through 11:59 PM so that the hours wrap around at the AM/PM boundary, not one hour later.
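
The special case shows up directly in display code. A sketch (Python); the "12 if h == 0" branch is exactly the discontinuity being described:

  def to_12h_today(hour24):
      # current convention: 0 -> "12 AM", 12 -> "12 PM"
      h = hour24 % 12
      return f"{12 if h == 0 else h} {'AM' if hour24 < 12 else 'PM'}"

  def to_12h_proposed(hour24):
      # proposed: hours wrap at the AM/PM boundary, 0 -> "0 AM", 12 -> "0 PM"
      return f"{hour24 % 12} {'AM' if hour24 < 12 else 'PM'}"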


> The times should be divided into 0:00 AM through 11:59 AM and 0:00 PM through 11:59 PM so that the hours wrap around at the AM/PM boundary, not one hour later.

Sure, although, if we're re-defining how people tell time anyway, we might as well use a 24-hour clock and so avoid the need for AM/PM at all; and, even if we do that, there is still, inevitably, the confusion about on what day 00:00 AM falls. My point was just that not knowing that "12 PM" means noon is a matter of not being aware of an arbitrary convention, rather than of not understanding the concept of AM vs. PM. (Indeed, arguably it is a matter of understanding their meaning all too well!)


> the confusion about on what day 00:00 AM falls.

Not so much actually: 00:00 is on the day that is starting, 24:00 is on the day that is ending.


> Not so much actually: 00:00 is on the day that is starting, 24:00 is on the day that is ending.

That makes a lot of sense!


Time goes 00:00:00 through 23:59:59

Except when leapseconds come along and make it 23:59:60

And daylight saving skips an hour, or repeats it.

AM and PM are really anachronisms though; I don't remember the last time I saw "4pm" written down in normal life.


Every major OS displays time w/ AM/PM?

I just checked my OS X machine, it has it. My Linux machine has it, though I grant that among the many DEs for Linux, some might not use it. (I use MATE, a fork of GNOME 2, for reference.) And I've Googled a screenshot of Windows, which seems to still use it.

My phone, microwave, and stove, I guess, seem to presume I know whether it is morning or not.


If you use US/UK localisation, yes. I don't use language localizations (since translation is often poor/missing so you get a mix, and I have to learn new terminology all the time) but you bet I want my 24-hour clock and not that old Latin stuff.


The average person doesn't use 24 hour/military time. Maybe in the tech world.


Perhaps in your country. In the vast majority of countries I've been to, opening hours on shops, timetables, TV schedules, phones, etc. are 24 hours. Have been for decades.

UK TV schedule: https://www.bbc.co.uk/schedules/p00fzl6p

France opening hours: https://www.malathronas.com/wp-content/uploads/DSC_6720.jpg

Thai train timetable: https://teakdoor.com/images/north.gif


Yes, it is different in the US.


> The average person doesn't use 24 hour/military time. Maybe in the tech world.

True in the US (and IIRC the Anglophone world more generally), but AFAIK 24-hour is more commonly used outside of military/technical domains in many other countries.


> That's because it doesn't make any sense to describe the middle of the day by '12 PM', since 'PM' should mean "after midday".

I think that explanation fails because, IME, people who know that “PM” means “post meridiem” which means “after midday” are more, not less, likely to also know that 12:00 PM is noon than those to whom PM is just an arbitrary marker.


Midnight can be confusing. Some folks work around that by stating "12:01 AM on <date>..." instead of midnight.

12 PM = noon is something I remember learning in kindergarten, and should be common knowledge for most adults in the US.


> 12 PM = noon is something I remember learning in kindergarten, and should be common knowledge for most adults in the US.

Well, maybe, although assuming a commonality between others' education and mine is always dangerous; but I've got way more respect for someone who understands a concept but doesn't know about an arbitrary exception to it, than for someone who understands a pile of arbitrary rules and doesn't know how to fit them to any conceptual framework.


"I think better usage is 12 noon and 12 midnight"

Or maybe 12:00 vs 00:00


What about leap milliseconds? There are plenty of applications that need millisecond accuracy.


If you need millisecond accuracy then you definitely don't want UT and so you'd want to avoid leapseconds, not add "leap milliseconds".

TAI is very smooth. TAI gives the sort of predictable monotonic clock you might appreciate if you care about milliseconds. One second after another, with a perfect 1000 milliseconds between them, forever.

UT is based on a large rock (the planet Earth) spinning. It varies a bunch 'cos the rock is a weird shape and is seismically active so it spins differently over time. If you care about "millisecond accuracy" then you do not want to base that on the spinning of a big rock when you can get a perfectly nice atomic clock and use TAI.

If your thought is "But I need millisecond accuracy for my astronomical observations" what you've got there is a big misunderstanding of your context. While UT might seem convenient for figuring out where and when to point telescopes at things, it's useless for actual time. You need TAI.


Why is it obvious that 3:30 PM as the middle of the night would be bad?

I'm all for a universal world-wide time. Then we'd just have to learn what time is the middle of the night for what location, but we have digital aids for that. It solves all timezone bs.

My guess is that we'd whine for about a month and then it becomes normal for e.g. 14:00 to be the middle of the night depending on your location.


I personally tried this for a few weeks: I set my clocks to GMT, and converted everyone else's silly little timezones into one unambiguous number. All I accomplished was memorizing the offset between GMT and local time and getting marginally quicker at the required arithmetic to convert between them. It didn't even solve any problems, because a) nobody else was doing it, so, shrug, and b) I immediately lost all context for the earth's rotation, and that turned out to be a massive pain in the ass. I think the very least we can do, though, is get rid of DST.


I have done exactly the same, but I am happy to continue like this after switching to GMT about 20 years ago.

Adding/subtracting the local time offsets when necessary is easier for me than trying to think in local times and DST.

Moreover, I have not lost the context for the earth's rotation; I am better aware of it, by remembering what the GMT time of noon is where I live.

During summer (i.e. with DST), noon here is delayed by about 75 minutes from 12:00, so keeping in mind the correct UTC time of noon makes me more aware of the Sun's position. There are many places where the time difference between noon and 12:00 local time is much larger, making the official local time pretty useless for determining the Sun's direction.


Just curious: How far east/west from Greenwich are you? Depending on that, the meaning of colloquialisms can become weird (what is today/tomorrow? What is "around noon"? etc.)


Noon is when the Sun is at maximum height, so it is approximately halfway between sunrise and sunset.

The time of the noon varies from place to place depending on the longitude.

In the beginning, every major town had its own local time, with noon at 12:00. After time zones were introduced, noon should have fallen everywhere some time between 11:30 and 12:30.

With DST, you can have a difference of up to 90 minutes between noon and the official 12:00, and there are countries that occupy more than one time zone but do not bother to define as many zones as necessary to minimize the differences between noon and 12:00, so there are places with more than 2 hours between the true noon and 12:00.

It does not matter how far east/west you are but only how the time zones are set in your country, together with the DST.

If you would want to use the traditional boy scout technique of finding the south using a watch and either the Sun or the Moon, you would need to know the hour of the true noon in that place, otherwise the angular errors would be excessive.


In the town where I live, there is a belltower, with a clock that used to be set to local time. It is West of Greenwich, so the bell tolled the hours several minutes after GMT.


Yeah, UTC/GMT is not intended for personal use, but try managing a SaaS product that has two user shards, US and EU, and keeping aggregated incident reports on your favorite monitoring system sane without UTC.


Swatch tried to do this ages ago: https://en.wikipedia.org/wiki/Swatch_Internet_Time


We already have universal world-wide time: UTC and TAI. You can use these today, and nobody can stop you.


> It solves all timezone bs.

So when does Monday start? 14:00 UTC in your location?


Why not?


Monday 13:30 and Monday 14:30 are now a week apart for you. But for someone in the next timezone, it is 1 hour apart.

Basically you are replacing Time Zones with Day-of-Week Zones.


Because you'd get up on Monday and then book a table for after work on Tuesday. The ambiguity of defining "today" would be problematic.

Now sure people who work night shifts have that, but it's a small number of people and a small number of services that are open 24/7.


> we have digital aids for that

Like a clock that you can adjust so that 0:00 is the middle of the night?


> I'm all for a universal world-wide time. then we'd just have to learn what time is middle of the night for what location,

There is one thing that would be highly inconvenient: having the day boundary fall during working hours.


Isn't 3:30pm in the middle of the night a regular occurrence during parts of the year for people at certain latitudes?


Out of interest, have you read https://qntm.org/abolish?


Interesting, but I feel that all the questions are answerable with another definition of time relative to "solar noon" (some kind of icon with the sun on an arch comes to mind; my Amazfit smartwatch uses this). Also the article does not take into account that our phones may warn us "that Uncle Steve in Melbourne is probably asleep", for example.

The date line? Yeah, let's put that in the center of the Pacific.


I don't specifically care about earth's rotation, but as a diurnal creature whose internal body clock synchronizes to the sun, I definitely care about the rhythms of natural light. The whole reason clocks were invented was to help people more precisely measure the day-night cycle.

So the question really becomes, "Who is the 'we' who wants to have a more theoretically pure timekeeping system that gets worse and worse at its primary purpose?" Turns out that's a pretty small audience.


If the drift is 18s over 50 yrs then it doesn't affect your issue to any noticeable degree throughout your life.

Nor that of your children, grandchildren etc etc, as the change is gradual enough that nobody would notice. The only way to get a significant difference is by somehow sleeping for thousands of years (1k would be 6 minutes, which would still be impossible to tell with human senses).

It really is a pretty pointless solution to a non-existing problem.


If you're saying it is only a little worse in the short term, I agree. But it is still worse at its primary purpose. And in exchange for what exactly? Making it easier on a few programmers to not write a few bugs?

If these were the only bugs that ever got written, I might buy it. But we all know that's not the case. Adopting a less correct timekeeping system will reduce the total number of bugs written by 0.000000001%. That is putting the cart firmly before the horse. Computers are here to serve people, not the other way around. Whenever we find ourselves making systems worse due to engineer convenience, we should think real hard about why we get paid the big bucks.


> adopting a less correct timekeeping system

Precisely the opposite, and that's the root of the problem. We started with astronomical clock ticks and have gradually transitioned to progressively more accurate measurements. The attempt to maintain synchronization with astronomical units is for backward compatibility, not because they're more correct.


Again, I disagree. Humans invented timekeeping to aid their internal sense of time, which is based on how our planet moves. That's still by far the primary use case for clocks.

I agree that for a narrow set of science/technology use cases, a more abstract notion of time is useful. In those narrow realms, it makes sense to stick with the more abstract system. E.g., the way "these three navigation systems broadcast a continuous monotonic clock signal that is not influenced by leap seconds" is reasonable in context. But once we leave those narrow contexts, we need to translate back to the actual human purpose that kicked all of that off. Which again, is what happens: "navigation satellites do transmit when leap seconds happen".

Insisting that we make the actual system worse from the perspective of most humans for the convenience of technologists is a mistake. Even more so when we do it in the name of abstract ideas.


The oldest people will have a difference of maybe 40 seconds from the point they were born to their death.

There is nobody who can tell the difference, so I call bullshit on the thesis that the average person profits from this.

I haven't had to implement anything like this, but as an observer it just feels like a pointless effort.

But it's already decided, so we will have to live with it no matter how pointless it is.


I didn't say "the average person profits from this". And I also don't believe it. So if you'd like to call bullshit on that, you've replied to the wrong comment.


We don't even have to go through that situation. As soon as the delta reaches 30 minutes you just shift every timezone by one hour.


>Who is the "we" who wants to keep "in sync" with the Earth's rotation to such high precision?

The military is a big one. ICBMs still use celestial navigation to determine their position and adjust their trajectory in flight.


Agreed, 2 seems way lower priority than the other two. You already put your clock "out of sync" with the Earth's rotation every time you move East or West. Nobody actually cares about this. 1 hour timezones are generally granular enough for our needs.


+1, seems like leap seconds could be covered via a leap hour every 10k years or so. Implemented as a daylight savings change that gets skipped. Implement it at the level of time zones, which is complexity that we already have to deal with most of the time when we want to interface with humans.


> seems like leap seconds could be covered via a leap hour every 10k years or so.

The increase is quadratic. Without leap seconds, the difference will be an hour in 1000 years, but already a full day in 5000 years.

https://www.ucolick.org/~sla/leapsecs/dutc.html -> Effects of disconnecting time from the sun
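
A rough model of that quadratic growth (Python sketch, assuming the commonly cited long-term figure of about 1.7 ms of extra day length per century, which works out to roughly 31 s of accumulated offset per century squared):

  def drift_seconds(years, ms_per_day_per_century=1.7):
      # day-length excess grows linearly, so the accumulated clock
      # offset is its integral: 0.5 * rate * t^2
      t = years / 100                      # centuries
      days_per_century = 36525
      return 0.5 * (ms_per_day_per_century / 1000) * days_per_century * t**2

  print(drift_seconds(1000) / 3600)        # ~0.9 hours after 1000 years
  print(drift_seconds(5000) / 86400)       # ~0.9 days after 5000 years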


Ah, that does complicate things.

It's not totally clear to me why they think "Given that the first leap hour would not happen for centuries, it is not clear that any systems (legal or technological) would build in the necessary complexity for handling it." It seems like that infrastructure is already in place— the IANA time zone database, and associated technology. UTC would become disconnected from sun-time, but all the other zones could shift relative to it, and it seems like most things would continue to handle it as expected.

But, I also trust that other folks have thought about it in more depth than I have. I'm not seriously trying to solve the problem (If I were, I'd get involved in standards committees or similar), and I'm happy to concede that my solution misses key details :)


I tend to agree with what you’re saying. My impression is that the main concern is for current technology used by astronomers, satellites, navigation and so on, which require leap seconds to be taken into account and have a relation to the length of day, and which would stop working correctly if the meaning of broken-down time would suddenly change. It maybe wouldn’t be a major problem if all such systems could be redesigned from scratch.


> The increase is quadratic.

In that case UTC has the same problem (steadily increasing numbers of leap seconds till we run out of places^Wtimes to put them), so we may as well have a less defective timekeeping system in the mean time.


Just stick with TAI and convert times to UTC when displaying them. The real crap here is the C and POSIX time_t type and its related functions, which are massively out of date (like 70% of the C standard library). The ISO C standard committee is way too scared of breaking stuff, and when they add something it is often utter garbage (look at the whole _s functions fiasco in C11).


> Just stick with TAI and convert times to UTC when displaying them.

That may not be suitable when you want to represent an exact point in time years into the future. You’re usually not interested in the exact number of elapsed seconds (TAI) in that case, but want to represent e.g. 2031-08-03T00:00:00Z exactly.

The solution is to use different data types for elapsed time and for calendrical/time-of-day time. Software developers need to learn that those are not equivalent, and libraries, databases etc. need to provide appropriate data types that make the distinction clear.
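
A minimal sketch of that separation (Python, hypothetical type names; the point is only that a physical duration and a civil-time label should not be interchangeable):

  from dataclasses import dataclass

  @dataclass(frozen=True)
  class Elapsed:
      # physical duration: a count of SI seconds, what TAI-style time measures
      seconds: float

  @dataclass(frozen=True)
  class CivilTime:
      # a calendar/clock label like 2031-08-03T00:00:00Z; its distance from
      # now in SI seconds depends on leap seconds not yet announced
      year: int
      month: int
      day: int
      hour: int = 0
      minute: int = 0
      second: int = 0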


If you want a UTC time in the distant future, more than a few years away, then you cannot know how much time will pass until then.

Leap seconds are inserted irregularly; you cannot predict when one will happen.

The UTC time is known only for the past, more precisely for the past 60 years.

For the future beyond a year, the UTC time is unknown.

On the other hand TAI, which is a real time, not a conventional quantity determined by political decisions, is known both for the past and for the future.

You are right that e.g. for scheduling meetings in the future, a different type must be used, whose meaning is "the time moment that will have this official name by then". This time should not be used for computing time differences with a resolution smaller than a day, before it becomes close enough to the present.


I think you're misunderstanding the proposal. Civil time (UTC) is still known well in the future, just as well as TAI is known well in the future. It is the relationship between them that is unknown.


UTC is known in the future only as labels for time moments.

You cannot compute the difference between 2 UTC times, at least one of which is a future time, with resolution of a second or less.

So this does not have anything to do with TAI.


And if a negative leap second ever happens, there will be moments that can be labeled with a UTC time which never end up happening.


The problem with rendering TAI as UTC is you can't do it for future dates because you don't know how many leap seconds will have been added by then. For future dates you can either:

- store TAI: you'll always know how many SI seconds in the future it is, but you won't be able to accurately render UTC,

- store UTC timestamp: you'll always know what the time on the clock will be when the event happens, but you won't be able to accurately calculate how many seconds in the future it is.

The alternative is to ditch UTC. Make all wall clocks tick in SI seconds and render TAI dates in local time. This will always be correct. The only downside is you won't be able to accurately predict what position the Sun is in the sky when the event happens. But is this actually important?


Rendering future times as local time already has a problem where the timezone (DST) could potentially change.


The real answer is you should store the timezone with the timestamp

So you store "the meeting is at 10am London time".

This will work, always... except for when you repeat an hour due to the DST shift, but that's usually in the middle of the night on a Sunday.
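
With Python's zoneinfo, for example, that looks like keeping the civil fields and the zone name together and resolving to an absolute instant only when needed:

  from datetime import datetime
  from zoneinfo import ZoneInfo

  # store "the meeting is at 10am London time" as civil time + zone,
  # not as a precomputed epoch offset
  meeting = datetime(2031, 8, 3, 10, 0, tzinfo=ZoneInfo("Europe/London"))

  # resolve with whatever tzdata rules are current at use time
  print(meeting.astimezone(ZoneInfo("UTC")))

  # the repeated hour after a fall-back shift is the one ambiguity;
  # fold=1 picks the second occurrence (01:30 happened twice that night)
  dup = datetime(2021, 10, 31, 1, 30, fold=1, tzinfo=ZoneInfo("Europe/London"))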


Another problem here is that DST is political, and under local political control. Politicians don't get why changes in timescales have to be announced well in advance; there have been cases where they were announced as little as a month before they came into effect.

Some commenters seem to wonder why people care about the relationships between timescales, and how to convert between them. Well, contracts seem to be a good example. It could be vitally important to know, to the second, when a 200-year-old contract came into force.


Assuming London exists in the future ;)

Yes... this way leads to madness. Time seems to be in a special class of problems that is easy to get wrong and hard to get right. I think we just have to aim for "least wrong" and try to keep our sanity intact.


or…

if you want the future time stamp to be “exactly x seconds” in the future, you can store a tuple of the TAI time stamp of when you create the data point and the SI seconds into the future from that time, and then render it as appropriate for the audience’s level of understanding

And if you want the future time stamp to be not actually a time stamp but a time of day on a calendar date, then you store a tuple of the TAI creation time and the UTC date you want it to be in the future.

Because you can never ever be sure of the future, by keeping the time you make the assumptions you can at least either calculate the correct answer or fix the data later when you find out something has changed.

TAI is for timekeeping.

UTC is for calendars.


They should be scared of breaking stuff! If newer standards break things, those newer standards are unlikely to get wide adoption, which prevents actual progress.

In any case, while the POSIX and C standards need to accommodate atomic time, that's not enough. The software on top needs to switch to atomic timestamps too.


C++ has been adding basically the kitchen sink for years now and still it hasn't broken anything. If anything, lots of warts have been removed. This whole reasoning doesn't hold when the ISO C standard adds half-assed, optional rubbish to the C standard only to deprecate it the next release.

The ISO C committee can't create a newer, saner C string library or a more useful time library, but it finds the time to devise and release crap like VLAs, threads.h or the _s functions. Meanwhile, C++20 has stabilized std::chrono::tai_clock, which works on all major operating systems and does not break anything.


> and still it hasn't broken anything.

Well, when you explicitly throw ABI backwards compatibility out the window it's far easier to make that claim.


There's plenty of good additions in C and bad additions in C++. (Also I object to the assertion that VLAs are bad! They are an improvement to function signatures.)


> I object to the assertion that VLAs are bad! They are an improvement to function signatures.

The main issue is that they are not restricted to that. They went overboard and basically made alloca() part of the language, causing all kinds of bugs when a newbie (or a senior, too) mistakenly uses variables instead of defines for an array size.

If they were just an annotation, I would not be complaining, even though they would still remain quite unusable in practice due to C++ not supporting them (that's why they're seldom used in public headers).


C++ is getting timezone database access in stdlib soon too


As far as I remember from earlier discussion, the reason that problems arise is that the posix standard defines a "day" as 86400 seconds, so leap seconds need to be erased from history after they've happened. This doesn't make a lot of sense; on March 1st, we don't start pretending that February 29th never happened.


The problem is that posix needs the conversion between 'posix time' (seconds since 1970) and the split out year/month/day/hour/minute/second format to work for the future as well.

Given that we don't know when there will be leap seconds in the future, this conversion is impossible if we want to take leap seconds into account.

So the solution is to either ignore leap seconds (as POSIX currently does) or have a regular rate of leap seconds (just like leap years). Unfortunately, the rotation of the earth is not regular enough for a fixed rate.

At the scale of a human life, leap seconds are completely irrelevant if you want to know the position of the sun.

So it is bizarre that our civil time has leap seconds. Looking back, that seems to be a historical mistake.


> The problem is that posix needs the conversion between 'posix time' (seconds since 1970) and the split out year/month/day/hour/minute/second format to work for the future as well.

> Given that we don't know when there will be leap seconds in the future, this conversion is impossible if we want to take leap seconds into account.

The conversion is impossible anyway. We don't know what the calendar will look like in the future. That isn't a problem that can be solved by any means.


As always, the problem can at least be made easier by using the right data structure.

Counting time as just "seconds since epoch" is like, really convenient, and by convenient I mean lazy. It may have even been excusable when computers had kilobytes of RAM and if they had hard disks at all, it wasn't more than a megabyte. When "assembly" was high-level programming.

But now that we have infinite disk, processors that are idling most of the time they aren't rendering 3d video, and programming languages which almost don't suck, we could stand to use richer representations of dates that include the semantic concepts people use when talking about time, like "years" and "days" and "hours". You don't need to know how many leap seconds -- or even leap days -- are between 2021-07-01 and 2022-07-02 to say that one is a year and a day after the other.

There was a time and a place for "GOTO"s; there still is, just much less often than back when that was your only real option for flow control. Likewise, the time for storing dates and times as seconds since Jan 1 1970 was much closer to that date than today.
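
In Python terms, that's arithmetic on the semantic fields rather than on an epoch count:

  from datetime import date, timedelta

  d = date(2021, 7, 1)
  # "a year and a day later": bump the year field, then add one calendar
  # day; no leap-second or leap-day bookkeeping needed (though a real
  # library must still decide what Feb 29 plus one year means)
  later = d.replace(year=d.year + 1) + timedelta(days=1)
  print(later)   # 2022-07-02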


That is a different type of future. For a while leap seconds would come at a rate of around one every 18 months. It is safe to say that the current calendar will be with us for the next couple of years.

Long term, who knows. But we would like to be able to do date calculations a couple of years into future.


> Long term, who knows. But we would like to be able to do date calculations a couple of years into future.

On the fairly safe assumption that you don't care about being off by a few seconds, leap seconds aren't a factor there. Assuming one leap second every month (!), your computed date two years out will be off by less than half a minute. No one will ever even notice; the entities that consume dates in calendar format -- people -- aren't capable of meeting time tolerances that tight.

If you need to coordinate something down to the millisecond, calendar dates aren't for you.


That's the point, if you need to coordinate things down to the millisecond, leap seconds are incredibly harmful, and if you don't, they are irrelevant.


> if you need to coordinate things down to the millisecond

If this coordination needs to happen > 6 months in the future, then UTC is the wrong time standard to use. Use TAI.


There is no use case for UTC.


Timezones and timezone offsets do change more frequently though, and sometimes without any heads up. Being able to correctly format future dates is only possible for UTC, even with leap seconds.


Well, POSIX time is an additional layer of stupidity, but it's certainly not the only source of problems; notably, these GPS/GNSS-related problems do not have anything to do with POSIX time.


There is no problem with GPS time, only with converting from GPS time to POSIX time or some other leap-second affected time.


>on March 1st, we don't start pretending that February 29th never happened.

Speak for yourself.


TAI sounds perfectly reasonable for most systems and should be the default IMO - only when displayed to humans do we really need to care about syncing up with the earth's rotation.


Smear it! You sacrifice (1) on days with leap seconds (seconds run long by about 0.001%), but otherwise have all three.

https://developers.google.com/time/smear
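
A minimal sketch of the linear noon-to-noon smear that page describes (each of the 86,400 smeared seconds is 1/86400 longer, so the smeared clock falls a full second behind across the window, absorbing the leap):

  SMEAR_WINDOW = 86400.0   # noon before the leap second to noon after

  def smear_lag(seconds_into_window):
      # how far (s) the smeared clock lags a clock that ignores the leap;
      # grows linearly from 0 to 1, and once it reaches 1 the smeared
      # clock agrees with post-leap UTC again
      s = max(0.0, min(seconds_into_window, SMEAR_WINDOW))
      return s / SMEAR_WINDOW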


Kind of, but differences in time will forever deviate from the SI second so I don't think that really counts.

Also 0.001% of the distance to a GPS satellite is on the order of 200m. Not impossible to work with, but not great either. Though I have no idea what the exact ramifications would be.


Let's call the Google proposal UTC_smear. You can simply convert between UTC and UTC_smear and vice versa just fine. You just need to be extra careful to store each timestamp with their corresponding time format.

Just like you can convert between UTC, TAI and UT1.


> Kind of, but differences in time will forever deviate from the SI second so I don't think that really counts.

I don't follow. Outside of days with a leap second, you are using the SI second.

If you're measuring skew via raw-seconds-since-epoch or something similar, that's already an error-prone measurement. It's a non-goal for smeared UTC to be synchronized with TAI, just as it's a non-goal for unsmeared UTC to be synchronized with TAI.


Right but as long as GPS satellites are synchronised with each other it doesn't matter all that much how synchronised they are with anyone on the ground, the position they return will still be correct.


Take a good hard look at the GPS interface specification. The GPS satellites aren't synchronized with each other*. They are synchronized with the ground control segment.

https://www.gps.gov/technical/icwg/IS-GPS-200K.pdf

* recent satellites have limited autonomous navigation capability as a backup.


You still end up with a bunch of messages that end up saying "at time T I was at position X" where T is off in absolute terms (which you indeed can solve by using an additional satellite to synchronize your clock) and is measured at a different rate (which you cannot solve, unless you know a leap smear is occurring, or by using yet another extra satellite and some mathematics that I don't think anyone's ever bothered working out yet).


Ah yes, for people who want to support leap seconds, but who can happily ignore a clock error of 0.5 seconds :)


... I wouldn't call it an error? It's an offset between smeared UTC and non-smeared UTC, like between any two other time standards.

It's internally consistent, and for communicating with external entities that expect unsmeared UTC you convert as needed as much as possible. No different from if you were running TAI and communicating with unsmeared UTC, or communicating with TAI while running unsmeared UTC.


I like to think of it as UTC being off by 1s just before the leap second. :)


Clearly the most realistic solution is to slow down the rotation of the Earth when necessary.


I think you mean speed up.


You need to be able to do both!

Since the start of atomic time the day has generally been about 1ms longer than 24x60x60 seconds; at the time of the last leap second in 2016 it was 1.5ms longer. But since last year it has been slightly less than 24x60x60 seconds! For the last few months it has been around 0.25ms shorter.


Well either way would work I suppose, but yeah let's speed it up a bit. I propose we move the moon closer.


AFAIK an additional problem with UT1 is that it's near impossible to tell UT1 time in real time, because you need to do astronomical observations. So it's kinda impractical as civil time.


The rotation of the earth is predictable enough in the short term that it is forecast a year into the future: see IERS Bulletin A https://www.iers.org/IERS/EN/Publications/Bulletins/bulletin...


One of the things we want is a timescale that allows us to put a timestamp on a contemporary event, with the hope that in 2,000 (or 200,000) years, they will be able to work out exactly how long ago it was we were referring to.

GMT failed disastrously at that; there have been about 12 mutually incompatible definitions of GMT over the years.

So, UTC should retain leap-seconds; that's what UTC means. A new timescale, without leap-seconds, MUST have a new name.

I'm betting the ITU will just change the definition, without changing the name.


I think you're looking for TAI. UTC = TAI minus the accumulated leap seconds (unless I'm missing some minor detail).

You don't know how many seconds lie between arbitrary UTC timestamps in the future, but you do know how many seconds lie between arbitrary TAI timestamps in the future.
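
As a sketch (Python; the single constant below is only the current tail of the real table, which has 27 leap seconds since 1972 on top of an initial 10 s offset):

  from datetime import datetime, timedelta

  TAI_MINUS_UTC = 37   # seconds, in force since the 2017-01-01 leap second

  def utc_to_tai(utc: datetime) -> datetime:
      return utc + timedelta(seconds=TAI_MINUS_UTC)

  def tai_to_utc(tai: datetime) -> datetime:
      # only valid for instants after 2017-01-01; anything earlier (or
      # after a future leap second) needs the full table
      return tai - timedelta(seconds=TAI_MINUS_UTC)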


Exactly.

The position of the sun doesn't matter that much. Many locations already do daylight saving and would actually prefer to switch their clocks to "saving time", which moves the sun to a point that isn't highest at noon. (As if it actually mattered to keep the sun fixed.)

We can always calculate where the sun and stars were positioned if we care.


Hence, the giant leap second gyroscopes! :-)


This is neither here nor there, but "leap second gyroscopes" has turned up some of the weirdest google search results I've seen. A few examples:

> Downward movement of work quality? He drew in step a leap second? Gyroscope and accelerometer. Gunfire this weekend outdoors and enjoy shopping here!

> Successful troll was a leap second? Gyroscope and accelerometer. Blissful for you. Both inconsequential from a class. Miracle whip classic macaroni salad?

> Had lent thee all a leap second? Gyroscope and accelerometer. Shah is threatening hearing or sound. Been fired from their roost and lay siege to the renovation ...

Now it's not unusual for webpages to pad their text in order to trick search engines, but these are downright weird.


UTC with only leap hours will satisfy all three, assuming 2 can be slightly weakened.


You can even avoid leap hours if you are willing to have Greenwich not be in the +00:00 timezone.


We had a leap second bug on the options trading desk I worked on that brought down our system. As market makers, we had an obligation to keep providing liquidity so this was a serious issue and exposed us to fines.

Our exchange co-located data centers used GPS for precisely synchronized timestamp generation but the firmware on some of the GPS hardware had a bug and failed to take into account a leap second. When that leap second was to be inserted, they became out of sync by a second.

We generated spot price feeds from each location and a component that consumed these feeds would check to ensure that they were not stale (any data more than 0.5 sec old could not be used as an input for pricing and trading). Well, a lot of exchange feeds started looking stale, and our system stopped quoting on said exchanges.

First there were some murmurs from the traders and within minutes the entire room hit a crescendo of panic. It took, I think, the better part of an hour to debug.


Can you provide the model of the GPS hardware? I work in the same industry and have set up multiple NTP / PTP time servers that have never failed to account for leap seconds. It's almost always the downstream software that crashes. See the 2012 leap second insertion as an example.


How did you resolve it?


They just needed to add leap seconds to the hardware timestamp after they debugged it.


Worth mentioning, there's an ongoing debate about removing leap seconds, as they seem to cause more trouble than the benefit they bring (keeping UT1 and UTC in sync).

Personally I've dealt with some unit tests around simultaneous use of multiple timezones (we had to show the timestamps in the local timezone of the respective event). One day a good chunk of them stopped working for no apparent reason.

After a lot of head scratching, I figured out an update of tzdata was responsible for it, since it added new planned leap seconds. Our underlying library was properly taking the leap second into account, but the test conditions weren't.


At my work we have a system where dates are in a local timezone, where the timezone depends on the data ingress point, but times are in New York local time.

So many bugs occur when juggling 2 or 3 timezones simultaneously.


I won't name it, but I can guess where you work, as the place I once worked had the same problem and surely not many places have made the same mistake.

Worst is where the NY local time jumps back due to DST and you have two actual points in time represented by the same numerical NY local time. That manifests itself as an hour long gap in the data for the Australian stock exchanges!


It shouldn't be problematic as long as the exchange stops trading for at least 2 out of 24 hours, since no DST shifts can bridge the gap and cause ambiguity.

24 hour exchanges are hosed though

Where do you work now and what mistakes have they made? :P


> It shouldn't be problematic as long as the exchange stops trading for at least 2 out of 24 hours, since no DST shifts can bridge the gap and cause ambiguity.

Actually one hour would do, but it has to coincide with New York 1am - 2am, not its own DST shift. If I remember right, only one or two exchanges had that problem in practice even though there are lots of Asian exchanges. In fact it might've been the New Zealand stock exchange rather than the Australian one. But yes all 24 hour exchanges are in trouble. This was a good 10 - 15 years ago now, so it's possible they've added a suitable hack to "solve" the problem.

I work for a small consultancy now where much of the software design choices were made by me. So naturally there are no mistakes ;-)


> so it's possible they've added a suitable hack to "solve" the problem.

No, it's still a problem.


Offered for perspective/interestingness, not as an argument:

For anyone wondering why it matters so much that time be precisely linked to the rotation of Earth, I'll note that time is a fundamental component of navigation. When you do celestial nav you make corrections down to the second, including accounting for the number of seconds your watch gains or loses. So it's not just about putting timestamps in a database. There's a straight line (a rhumb line? har har) from "what time is it" to "where are we" and in that context losing a second means you get a different longitude. Because time in this setting isn't really time--it's an indication of how far east or west of the prime meridian the sun is.
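
To make the link concrete: the Earth turns 360 degrees in 24 hours, so each second of clock error is 15 arcseconds of longitude, about 460 m at the equator. A sketch of the noon-sight arithmetic (the equation-of-time correction is taken as an input, since it comes from an almanac):

  def longitude_from_noon(local_noon_utc_s, equation_of_time_s):
      # local_noon_utc_s: observed local apparent noon, as seconds after
      # 00:00 UTC on the chronometer
      # equation_of_time_s: almanac correction from apparent to mean
      # solar time for that date
      mean_noon = local_noon_utc_s + equation_of_time_s
      # 43200 s = mean noon at Greenwich; Earth turns 1 degree per 240 s
      return (43200 - mean_noon) / 240.0   # degrees, east positive

  # a 1 s chronometer error shifts the answer by 1/240 degree,
  # i.e. ~0.46 km of east-west position at the equator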


Is there any practical navigation use case that needs enough precision to not be off 27 seconds from Earth's rotation and unable to use GPS?


Thanks, this helped me adjust my mental model!


So many comments on here asserting that leap seconds are bad and should be abandoned and that it would be simpler if they didn’t exist, but…

You realize this is about the GPS system, right? The thing about GPS satellites is that they are in space, orbiting around the earth. If the earth’s rotation speed changes their orbital speed doesn’t (not immediately, anyway).

GPS absolutely needs to adjust for the variable rotational speed of the earth - otherwise the GPS coordinate grid would gradually move relative to the surface of the earth.

So GPS doesn’t exactly need leap seconds but it really does care about how long a rotation of the earth takes which… amounts to the same thing.


> So GPS doesn’t exactly need leap seconds but it really does care about how long a rotation of the earth takes which… amounts to the same thing.

I don't think this is right. GPS explicitly uses a timebase that does not include leap seconds [1].

On the subject of the article: my modest and very reasonable proposal is that we apply a leap second every six months without fail, dithering between positive and negative leap seconds so as to remain close to earth-rotation (UT1) time. That way we would flush out bugs every six months and wouldn't have them accumulate and hit us all at once.

Or we could be boring and use TAI or GPS time as the system clock everywhere and apply leap second corrections when we go from the system clock (currently UTC) to local time.

[1] https://en.wikipedia.org/wiki/Global_Positioning_System#Time...


I mean, the article is literally about GPS satellites transmitting leap second data as part of their messages. So yes, the basic clock is just counting seconds, but leap seconds are a pretty important component of the GPS model.


So, GPS involves a few things: precise clocks that can be used to triangulate position; a model of where the satellites containing the clocks are in space; and a model of where the earth is in space.

UTC has basically the same inputs (the clocks are on the surface instead of in orbit, but one of the main time labs [NIST] is in Colorado and its altitude needs to be taken into account), but the calculations and logistics, the timebase of the model, are all very different.

So you can’t derive UTC from the GPS ephemeris, because there’s a human in the loop who decides when leap seconds happen. So GPS needs to include UTC as a separate signal, alongside the ephemeris and everything else.

(In fact, if I remember correctly, GPS includes nanosecond-level corrections between GPS time and UTC, because atomic clocks are not completely perfect.)


> GPS absolutely needs to adjust for the variable rotational speed of the earth

Sure, but that's a spatial thing, not something that's relevant to timekeeping. (There's technically a change in relativistic time dilation, but it's less than a rounding error - about five millimeters per second at the equator makes 1.000000000002388402 versus 1.000000000002388457 if I've got my math right. The relativistic time dilation from the earth's rotation is itself a rounding error.)


And when you’re using GPS to find your location on the surface of the earth… spatial things matter, right? GPS is for figuring out your location, not what time it is.

You can use it to figure out what time it is, but that’s a side effect not the goal.


Sorry, but your comment is just nonsense. :( GPS time doesn't include leapseconds: It's effectively TAI with a different offset.

GPS indeed has to correct for all sorts of orbital stuff, but that is done with equations on top of a stable atomic timebase. (And it has to be that way-- each satellite's orbital parameters are different).


Knowing your location relative to the satellites is only so useful if you’re actually standing on the surface of the earth (or trying to fly to a particular point on its surface). You also need to know which way the earth is facing right now. And to do that you need to know how many times it’s gone round since some reference time - which varies, but amounts to ‘number of days taking into account smeared leap seconds’


That isn't at all how the ephemeris data is represented in GPS. The satellite locations/headings are specified in earth-fixed geocentric coordinates. So, it's as if the earth was sitting still and the satellites were swarming around it.

The effect of the earth's very slow changes in rotation speed would affect the accuracy, but the ephemeris data is periodically updated based on observations of satellite locations by ground stations; there are many larger sources of positioning error than changes in the earth's rotation. This is also why first fix from a cold (memory wiped or long offline) receiver takes a long time even when it knows your location, unless you have AGPS (e.g. via the cell network): it doesn't know what satellites are overhead and it has to do a brute force search before it can find the signals and download the ephemeris data.

(There is a side channel that carries info on leap seconds, so that civil timing stuff can derive UTC from the GPS time... but that isn't used by GPS itself at all.)

Here is a nice page that shows the current GPS time and notes that it does not include leap seconds: http://www.leapsecond.com/java/gpsclock.htm


(One of) the biggest pains with leap seconds is that they are so infrequent and so I/O-bound that they are almost impossible to test. What is less functional than relying on a random change in a radio signal below the noise floor coming through dedicated hardware, which is finicky and doesn't work well indoors (E-coated glass windows? Fuggetaboutit).

I think one thing that could easily have been done to help is to emit leap-sub-seconds (100-500ms) more frequently, so we would have more chances to detect bugs per year. Alas, leap seconds are baked in as int8, so that's never gonna happen. That's also totally not the mindset when these systems were designed, which is kind of my point. The right kind of forethought could have avoided much of this pain.

Option 1: We have 2 time standards out of sync. Let's make an announcement yearly-ish and add 1 second of offset that everything has to obey.

Option 2: We have 2 time standards out of sync. Let's continuously track the offset to (whatever precision) and this offset is just always part of the correction (like we do time zones), rather than this value which you just assume is static 99% of the time.


This could lead to an interesting situation, where in June of 2029 a multitude of double-spending attacks are initiated on various banking/financial software across the world, with the attack lasting a few seconds in total. What a headline that would be... Of course this depends on specific implementation, but I can see how this could happen to a wide array of implementations.


Looking at how banks handled these kinds of issues in the past, they'll shut down all operations for those few seconds and throw away anything that still came in during them as invalid.

It sucks for their customers and partners, but that would be a decent conservative option I think.


Somebody should have told the financial institutions in Batman: The Dark Knight Rises that throwing away very obvious unwanted transactions was an option!


On one side it might not be a good idea to have actual actionable and efficient criminal plans enacted in popular fiction.

On the other side, it seems to be a disservice to the community to have stupid ideas widely spread around. It's hard to balance for sure.



I am still in favor of replacing the annoying one hour jumps when switching to and from DST by having a leap second every hour for 150 days. That would also force people to fix their leap second bugs.


Is there somewhere to read more about this? It sounds very interesting, are there any places/systems that are considering using this, or even talking about it?


One of the interesting corner cases of leap seconds is in traffic enforcement. If you are building a section control traffic enforcement system, you measure t1 at the point of entry and t2 at the exit. Because you know the length of the section, distance over time difference becomes your speed. But when you have a leap second, you might just get a fine for driving at jet-fuelled speed in your grandpa's car.


Supposing the speed-controlled road spans 1km, then if you go at 100km/hour, it will take you 36s:

  You have: 1km/(100km/hour)
  You want: s
   * 36

With a second less, your speed would be miscalculated to:

  You have: 1km/(36s - 1s)
  You want: km/hour
   * 102.85714

So not really something to worry about.


Unless the road span in question is a 10m intersection. Now if you went through at 30km/hour it took you 1.2s, but lose that second and now you went through at 180 km/hr.

Go through at 60km/hour and you left the intersection before you entered it. Does that mean the government owes you the fine?


> Does that mean the government owes you the fine?

no it just means the fine is immediately overdue :)


In the UK, speed traps (that aren't "average speed check" traps) operate over a distance of ~10-30 meters, depending on the road's speed. That second is very important.


I really hope that traffic enforcement systems don't use timers that know the date.


It is not just a timer; as part of the evidence you also have to capture when the event occurred, and there it does capture the date.


You should record the event time separately [1] from the beginning/end of the measurement period - the former as a time instant in UTC ('wall clock'), the latter as a difference between two values of a guaranteed monotonically increasing clock.

This is something you need to do regardless of leap seconds to handle things like NTP kicking in to adjust your local wall clock time.

[1] - Depending on the programming environment, these might be either different timestamp types recorded separately or converted appropriately to reflect different uses, or a smart combination type like time.Time in Go.
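
In Python, for instance (time.monotonic is the guaranteed monotonically increasing clock; the wall-clock stamp is recorded once, separately):

  import time
  from datetime import datetime, timezone

  event_time = datetime.now(timezone.utc)   # evidence: when it happened
  t0 = time.monotonic()                     # start of measurement period
  # ... vehicle traverses the measured section ...
  elapsed = time.monotonic() - t0           # unaffected by NTP steps or leap jumps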


Every time I read an article about time handling in software, it makes me glad that’s not how I earn my paycheck.


Leap seconds are a bad idea and should be abandoned. Computing the time between events by subtraction should work. And while we are at it, dropping time zones and daylight saving time would also make sense.



