Cool, but now I am looking up the cost of a “Kvarz active hydrogen maser”
When the Apple Watch first came out they made a lot about how precise it is. I wonder if that's still true; they've kind of dropped it from the marketing.
Eh I'm guessing it's just not interesting to most people. Your phone/laptop is also a very precise timepiece but it's a feature you get "for free" in a modern OS, so not usually worth mentioning.
Your phone/laptop is not especially precise by modern standards. Its timekeeping often comes from nothing more than an ordinary crystal oscillator, with frequency errors somewhere around 10 or 20 ppm (very roughly, sometimes worse, sometimes far worse--the Raspberry Pi is awful). You may intuitively think of this as "very precise" but these are not really much better than quartz watches made in the 1970s.
Note that this is nowhere near precise enough that you would worry about leap seconds. Your computer might gain or lose 5 or 10 minutes every year, so leap seconds just get lost in the noise. Of course, you won't notice this due to NTP... but at that point, you're not really relying on your phone or laptop, you're relying on atomic time standards elsewhere in the world.
If you had a TCXO (temperature-compensated crystal oscillator, relatively common in watches and RTCs), you could get better accuracy, maybe down to 0.5 ppm or about 16 seconds per year. These basically work by adjusting the oscillator in response to changes in temperature. This is on par with the best quartz timepieces available today (0.1-0.5 ppm), but still not accurate enough to worry about leap seconds. You can check the actual drift on Linux systems with NTP by running "cat /var/lib/ntp/ntp.drift", which shows you how many ppm your particular clock is off by. For me, I see a value with magnitude around 1.2, which is ~38 s per year. I can conclude that my motherboard probably has a TCXO because otherwise the drift would be larger.
If you had an OCXO (oven-controlled crystal oscillator; note that the "C" now stands for "controlled" rather than "compensated"), you could get down to 0.005 ppm, or something on the order of 200 ms per year. At this point a leap second is larger than your annual drift, so observing leap seconds becomes mandatory. These work by holding the crystal at a predefined temperature inside a small oven. However, this takes up more space and requires more power, so it is out of the question for laptops or phones.
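If you want to play with those ppm figures, here's a small Python sketch that converts a frequency error in ppm into drift per year, and also tries to read the drift value ntpd has estimated for the local machine. The /var/lib/ntp/ntp.drift path is distro-dependent and won't exist at all under chrony or systemd-timesyncd, so treat that part as an assumption.

    # Convert a ppm frequency error to drift per year, and (optionally) read
    # the drift ntpd has estimated for this machine.
    # Assumes /var/lib/ntp/ntp.drift exists (stock ntpd; path varies by distro).

    SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~31.6 million seconds

    def drift_per_year(ppm: float) -> float:
        """Seconds gained or lost per year for a given frequency error in ppm."""
        return ppm * 1e-6 * SECONDS_PER_YEAR

    for label, ppm in [("plain crystal", 20), ("TCXO", 0.5),
                       ("my motherboard (ntp.drift)", 1.2), ("OCXO", 0.005)]:
        print(f"{label:28s} {ppm:>7} ppm -> {drift_per_year(ppm):8.2f} s/year")

    try:
        with open("/var/lib/ntp/ntp.drift") as f:  # written periodically by ntpd
            measured = float(f.read().strip())
            print(f"ntpd-estimated drift: {measured} ppm -> "
                  f"{drift_per_year(measured):.1f} s/year")
    except OSError:
        pass  # no ntpd drift file here (e.g. chrony or systemd-timesyncd instead)

Running it reproduces the numbers above: 20 ppm is about 10 minutes per year, 0.5 ppm about 16 seconds, 1.2 ppm about 38 seconds, and 0.005 ppm roughly 160 ms.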
(As a minor note, the very best modern mechanical "chronometer certified" watches are somewhere around 10ppm, making them worse than all but the worst quartz watches.)
Perhaps of interest: the Accutron 214 (released 1960) uses a tuning fork's ringing for both regulation and mechanical power, and was advertised to be accurate to 1 minute per month. They are reasonably easy to obtain, and are a great example of some really elegant engineering.
I think a significant portion of that precision came from synchronizing with network time, did it not? Or was it unusually precise even when disconnected from the network?
This _almost_ seemed driven more by a desire to make sure the animated watch faces stay beautifully in sync when customers view the Apple Watches together in stores, which is a pretty cool little trick; that the watch itself is accurate is nearly a side benefit. I recall various Apple exec interviews at the original launch drawing attention to this.
You can get accuracy well under 100 ms using ordinary NTP over the public internet, and I assume that since the Apple Watch has GPS, this is also used to synchronize the watches.
All you need is an ordinary crystal oscillator and a GPS receiver; stitch them together with a little software and all the watches in the store are synchronized.
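For what it's worth, the "little software" part can be sketched in a few lines: free-run on the crystal, periodically measure the offset against a reference (a GPS pulse or an NTP reading), trim the rate, and slew out the offset. This is purely illustrative and not how Apple actually does it; ntpd and chrony do the same job with much more careful filtering and jitter handling.

    # Toy clock discipline: a local clock on an imperfect crystal is periodically
    # compared against an external reference and corrected. Illustrative only.

    class DisciplinedClock:
        def __init__(self, crystal_error_ppm: float):
            self.crystal_error_ppm = crystal_error_ppm  # the hardware's true error
            self.correction_ppm = 0.0                   # software rate correction
            self.offset_s = 0.0                         # seconds ahead of the reference

        def run(self, seconds: float) -> None:
            """Let the clock free-run for `seconds` of true (reference) time."""
            residual_ppm = self.crystal_error_ppm + self.correction_ppm
            self.offset_s += seconds * residual_ppm * 1e-6

        def sync(self, interval_s: float, gain: float = 0.5) -> None:
            """Compare against the reference, trim the rate, slew out the offset."""
            drift_rate_ppm = self.offset_s / interval_s * 1e6  # how fast we drifted
            self.correction_ppm -= gain * drift_rate_ppm       # cancel part of it
            self.offset_s = 0.0                                # offset slewed out

    clock = DisciplinedClock(crystal_error_ppm=20.0)  # ordinary uncompensated crystal
    for hour in range(1, 9):
        clock.run(3600)
        print(f"hour {hour}: drifted {clock.offset_s * 1000:6.2f} ms since last sync")
        clock.sync(interval_s=3600)

Even with an ordinary 20 ppm crystal, the per-interval drift shrinks from ~72 ms to a few ms within hours, which is the whole trick: the oscillator can be mediocre as long as the reference is good.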
Some people on the watchuseek HAQ forum determined that, even permanently disconnected from an iPhone and any network, it is accurate to ~10 seconds per year, which is extremely impressive.
That's about 0.3 ppm, which is pretty good. Based on that ppm value, my guess is a temperature-compensated crystal oscillator. I imagine it was kept at room temperature for the duration of the test?
Towers do use GPS to sync clocks, and typically they also have rubidium clocks so that they can last a considerable time without GPS. Why feed and wait for your hungry hippo of a GPS chip when you can do almost as well (in an absolute sense) or better (in a relative sense) by just observing tower timestamps?
To be more complete and precise, they use GPS to condition a rubidium clock. For several reasons it isn't a good idea to take timing directly from GPS without a highly stable local oscillator if you are using the output over long or continuous periods.
And to be even more precise: in the pre-CAE days the main source for disciplining was the SDH/SONET clock of the network interfaces. That is a higher-stratum clock source than GPS, but because the frequency is higher it ends up in a more accurate PLL lock.
At least in 3GPP networks (ie. GSM/UMTS/LTE, all of which depend on a precise 27 MHz clock in the handset), the PLL lock to the network's time reference gives a more precise time reference than you can reliably get from GPS, and essentially for free. The MS (ie. your phone) needs to maintain that PLL lock in order to stay associated with the network (which itself has traceable time/frequency calibration to some kind of national frequency standard and GPS clock).
AFAIK in 3GPP2 (ie. US-style CDMA) the handsets actually have a GPS receiver that gets used in the initial association phase to speed up the timing PLL lock (which for the initial cell acquisition has to be significantly more precise than in 3GPP systems).
Initial GPS signal acquisition is power intensive, but if it is already tracking, then there should be no incremental cost to read the time information.
GPS time tends to be very precise as well. In the NTP world, (simplifying greatly) GPS can serve as a reliable “stratum 0” time source which feeds stratum 1 servers, which go on to feed the rest of the NTP network.
Building an accurate stratum 0 GPS time source for NTP is a somewhat nontrivial endeavour, which explains why most stratum 1 NTP timeservers (typically GPS-based) give the wrong time (with error on the order of 500 ms).
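If you want to check that kind of claim yourself, here's a quick sketch using the third-party ntplib package that prints the stratum and offset a few public servers report (the server names are just examples); querying several and comparing them makes disagreement of that magnitude easy to spot.

    # Query a few public NTP servers and compare reported stratum and offset.
    # Requires the third-party "ntplib" package (pip install ntplib);
    # the server names below are just examples.
    import ntplib

    client = ntplib.NTPClient()
    for host in ("pool.ntp.org", "time.nist.gov", "time.cloudflare.com"):
        try:
            r = client.request(host, version=3, timeout=5)
        except Exception as exc:
            print(f"{host:22s} query failed: {exc}")
            continue
        print(f"{host:22s} stratum {r.stratum}  "
              f"offset {r.offset * 1000:+8.2f} ms  delay {r.delay * 1000:6.2f} ms")

Note that the offset you see also includes asymmetry in your own network path, so a single reading doesn't tell you which side is wrong; agreement across several servers is the more useful signal.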
But what value would that add? How often would you need to sync your phone clock while not connected to a network (which will be providing time sync anyway)?
I own a watch that uses GPS as a time source and it is not without its limitations. It only works outdoors, and it takes several seconds to get a sync, or over a minute if you also need to sync your position (to adjust the time zone).