"On Unix systems we measure time as the number of seconds since "the epoch": 00:00:00 UTC on January 1st, 1970.... this definition is not based on something sensical such as, say, the objective frequency of vibration of a Cesium-133 atom, but on a convenient fraction of the time it takes a particular large rock to complete a full rotation around its own axis."
Well, seconds have not been defined as "a convenient fraction of the time it takes a particular large rock to complete a full rotation around its own axis" for quite some time, and the origin is set to an abstract event in the past, which is not (as far as I know) subject to retroactive revision as a consequence of the vagaries of planetary or celestial motion (if it is, I would be fascinated to know more).
> seconds have not been defined as "a convenient fraction of the time it takes a particular large rock to complete a full rotation around its own axis" for quite some time
That is true.
> origin is set to an abstract event in the past
That is also true.
> which is not (as far as I know) subject to retroactive revision as a consequence of the vagaries of planetary or celestial motion
I'm afraid you are wrong on that. Unix time is synced with UTC. UTC has so-called "leap seconds" scheduled at irregular intervals by the International Earth Rotation and Reference Systems Service to keep it in sync with the Earth's actual movements. So in effect the Unix timestamp is wrangled to sync with the Earth's motion.
Thanks for the correction, this is quite important. I had taken statements that Unix and POSIX time excludes or ignores leap seconds to mean that it increases monotonically in a continuous manner in sync with whatever atomic clock is the standard; on the contrary, it recognizes and implements leap seconds with a discontinuity deviating from the steady tick of that standard.
I also learned that 00:00:00 UTC on 1 January 1970 is a proleptic time, as UTC was not in effect then, though I am not sure that makes it subject to subsequent planetary motions.
"Unix time numbers are repeated in the second immediately following a positive leap second. The Unix time number 1483142400 is thus ambiguous: it can refer either to start of the leap second (2016-12-31 23:59:60) or the end of it, one second later (2017-01-01 00:00:00)."
> Well, seconds have not been defined as "a convenient fraction of the time it takes a particular large rock to complete a full rotation around its own axis" for quite some time
Seconds have not ever been defined that way, because the time it takes for the earth to complete a full rotation around its own axis (the "sidereal day") was never a question of much interest. It's mostly relevant to astronomers.
Seconds were always defined in terms of the synodic day, the time it takes for a point on the earth that is aimed directly at the sun to rotate with the earth until it is once again aimed directly at the sun.
They still are defined that way, in the sense that the only purpose of other definitions is to match the traditional definition as exactly as possible. If cesium started vibrating faster or slower, we'd change the "official definition" of a second until it matched the traditional definition. Given that fact, which definition is real and which isn't?
That's an interesting point of view, but is it not the case that the length of the synodic day varies throughout the year, on account of Kepler's second law? IIRC, this is the first-order effect in explaining the analemma. The length of the sidereal day is also variable, but more slowly (at least if we put aside atmospheric and earthquake effects, which I guess affect both equally).
I think we need both a way to reckon our everyday experience of passing time and a constant measure, and one cannot be just a special case of the other. Personally, I feel that introducing leap seconds into Unix time just muddied the waters.
As to the sidereal day being of little interest outside of astronomy, James Watt's patron/partner Matthew Boulton built a sidereal clock and tried hard to find a purchaser (even having it shipped to Catherine the Great in hopes of sparking some interest) without success. It was on display in his Birmingham home when we visited a few years ago.
> but is it not the case that the length of the synodic day varies throughout the year, on account of Kepler's 2nd. law?
I don't see the relevance; the length of the sidereal day isn't constant either.
Variation in the length of the synodic day is the reason a day may contain other than 86400 seconds. If days are not of constant length, you could vary the length of a second on a day-to-day basis, or you could define the length of a "reference day" and then define the second as a convenient fraction of that. We have taken the second approach (though the first was used historically).
But we don't care about the vibration of cesium; if that were to change, we would adjust by changing the definition of a second, not by accepting that seconds were now of a different duration than before. Thus, the fact that cesium is referenced in an "official" definition of the duration of a second is meaningless. The officialness of that definition is illusory; in reality, seconds continue to be defined as a convenient fraction of an average day.
> Variation in the length of the synodic day is the reason a day may contain other than 86400 seconds.
Indeed; so when the second was defined by a fraction of a day, that day was an abstraction which approximates any given synodic day. I take your point that the author's quoted statement was incorrect, as the synodic day is a function of the earth's orbital period as well as its rotational one.
I also take your point that we don't care about a specific transition of a cesium atom per se, but I am not so sure that any more than a rather small fraction of the population care that a second is 1/86400 of what was an average day at some point in time (even though it is a good approximation for today's days). What fraction, I wonder, have ever performed a calculation on seconds using that number, other than, perhaps, for pedagogical purposes? Other than that, is it not just a convenient short interval where its constancy (a property that cesium delivers better than does the Earth's motions) is paramount? One might point out that such calculations are frequently done on their behalf, but from that perspective, the recent large increase in the use of GPS, with its heavy computational demand that is dependent on the constancy of atomic clocks, should equally be taken into account.
> so when the second was defined by a fraction of a day, that day was an abstraction which approximates any given synodic day
Commenting separately for this - I'm not sure it's true. To my understanding, the historical European division of the day was that the day contained 12 hours of equal length, and the night contained 12 other hours of equal length. They were obviously familiar with the fact that the day and the night were themselves not equally long, so an hour of day and an hour of night would almost never have been equal durations.
This willingness to vary the length of an hour over the course of a year suggests that it probably wouldn't have been a problem to vary the second along with it.
I had heard that of the ancient Greeks, though Wikipedia gives the origin more vaguely as the ancient Near East. It is a practical way of thinking when you are outdoors most of the time, you are not very active at night, and a sundial is your timekeeper. It also helps to be at a relatively low latitude, where the seasonal variation is not so great.
This makes sunrise and sunset the important moments in daily timekeeping, and this is, of course, the case in the Jewish definition of a day (I would not be surprised to learn that Jewish scholars also worked with intraday fractions of the lunar cycle.)
If this did lead to a willingness to vary the second along with the length of the hour, this would seem to me to argue against the specialness of the definition of the second as 1/86400 of the mean synodic day. I doubt, however, that anyone but perhaps a few scholars at that time ever contemplated such a small division of time. The second only became a practical measure with the development of accurate mechanical clocks, but by then, mean time had largely taken over from sundials and other forms of intraday ephemeris timekeeping, as that is what mechanical clocks measure. (This transition, I suspect, had a greater impact on the general population than did the adoption of the atomic standard, if only because the latter had no practical impact.) At that point, the second was defined by sexagesimal division of the hour (and thus, obviously, transitively as a fraction of a particular mean synodic day, given the definition of an hour in such terms).
> I am not so sure that any more than a rather small fraction of the population care that a second is 1/86400 of what was an average day at some point in time (even though it is a good approximation for today's days). What fraction, I wonder, have ever performed a calculation on seconds using that number, other than, perhaps, for pedagogical purposes?
It would be unusual to make a calculation directly using the factor of 86,400 between seconds and days.
But people make calculations using the conversion factors of 60 seconds per minute, 60 minutes per hour, and 24 hours per day all the time. If you have those, you can't avoid the conversion factor of 86,400 seconds per day.
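(Chaining those: 60 × 60 × 24 = 86,400 seconds per day.)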
You can't deny the implication, but whether that elevates it to a position of special relevance is another matter, at least as I see it. Personally, unless I have been thinking about it recently (which is very rarely), I would have to do the calculation before answering the question of how many seconds are in a day (and if I were being more than usually pedantic, I would have to ask "which day?")
In my view (which, admittedly, is somewhat subjective), people are not even tacitly concerned with the number of seconds in a day unless they are contemplating the number of seconds in an approximately day-length or longer period. On the other hand, I can imagine some of my relatives, who were conservative in all matters almost by reflex, responding to the SI definition by saying something like "Nonsense! there are sixty seconds in a minute, sixty minutes in an hour, and twenty-four hours in a day, and that's that!" - once they had learned something, it was set in stone. I am pretty sure that the annual variability of the synodic day was not among the things they had learned, and the idea of a leap second would be deeply troubling.
I appear to have talked myself into agreeing with you that 60 × 60 × 24 is what matters to most people by default, as they don't know the minutiae of celestial timekeeping or the increasing importance of atomic timekeeping (let alone relativity!) in their everyday lives.
That’s a very recent change - and it’s not like 9,192,631,770 Hz (which is hilariously self-referential!) is some obvious, natural value that ISN’T chosen to match the historic ‘typical’ length of the day, which itself comes from our rotation relative to the sun.
A second being 1/86400th of a day (24 hours * 60 minutes * 60 seconds) is still essentially true, and is still essentially based on the seasons and our movement around the sun (or on the relative movements of the various bodies in the solar system, depending).
Since that is a chaotic natural system, we of course need to add fudge factors here and there (like leap seconds) to simplify the math day to day while keeping it aligned with observed reality, at least where it intersects with reality in a material way.