Originally Posted by turtlebay777
I thought NASA were being extreme when they insisted their clocks had to be accurate to +/- ten seconds!
Yes - This is more closely related to NASA than I thought to mention.
The usual requirement for network file transfers under heavy traffic is to keep clocks within about 128 ms of each other.
Getting the time from network time servers can bring the clock to within a few milliseconds.
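As a rough sketch of how an NTP client obtains that millisecond-level time: it sends a 48-byte request to UDP port 123 and reads the server's transmit timestamp, which counts seconds from the NTP epoch (1900) rather than the Unix epoch (1970). This is a minimal SNTP example, not a full ntpd implementation, and the server name is just a placeholder.

```python
import socket
import struct

NTP_UNIX_DELTA = 2208988800  # seconds between NTP epoch (1900) and Unix epoch (1970)

def parse_ntp_transmit(packet: bytes) -> float:
    """Extract the server transmit timestamp (bytes 40-47) as Unix time."""
    secs, frac = struct.unpack("!II", packet[40:48])
    return secs - NTP_UNIX_DELTA + frac / 2**32

def query_sntp(server: str = "pool.ntp.org", timeout: float = 2.0) -> float:
    """Send a minimal SNTP v3 client request and return the server's Unix time."""
    request = b"\x1b" + 47 * b"\0"  # first byte: LI=0, version=3, mode=3 (client)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        s.sendto(request, (server, 123))
        packet, _ = s.recvfrom(512)
    return parse_ntp_transmit(packet)
```

A real client would also use the originate/receive timestamps to estimate and subtract the network round-trip delay; this sketch only reads the raw server clock.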
My need is different. Machinery can be synchronized to move with millisecond-accurate timing, and to register shaft position angles to within fractions of an arc-second. If the machinery is pointing antennas at moving satellites, you then have to raise the game to a whole new level, because now you have to make it happen at an exact celestial time, counted forward as microseconds after an "epoch" tied to the various Julian-date conventions with reference dates back in 1950, 1970, etc.
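To make the epoch arithmetic concrete, here is a small sketch converting Unix time to two of those astronomical day counts: the Modified Julian Date (epoch 1858 Nov 17) and the Truncated Julian Date (MJD minus 40000, epoch 1968 May 24). It assumes plain UTC is good enough and deliberately ignores leap seconds, which real pointing software cannot do.

```python
SECONDS_PER_DAY = 86400
MJD_OF_UNIX_EPOCH = 40587  # 1970-01-01 00:00:00 UTC expressed as a Modified Julian Date

def unix_to_mjd(unix_seconds: float) -> float:
    """Unix seconds -> Modified Julian Date (fractional days since 1858-11-17 UTC)."""
    return MJD_OF_UNIX_EPOCH + unix_seconds / SECONDS_PER_DAY

def unix_to_tjd(unix_seconds: float) -> float:
    """Unix seconds -> Truncated Julian Date (MJD - 40000, epoch 1968-05-24 UTC)."""
    return unix_to_mjd(unix_seconds) - 40000
```

The offsets are fixed constants, so the only source of error here is the input clock itself, which is exactly why the clock accuracy matters so much.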
Longitude is still, as it always was, a matter of knowing the right time. It is when the stuff is not attached to Earth that it gets difficult. The Earth is slowing down, its axis wobbles, and that wobble itself has a "wobble". The Moon pulls things about, the Earth's centre of mass is not at its geometric centre, and the Earth is not a sphere. The NASA-published algorithms have to be used with care, and the input data updated daily, or even hourly.
Time is controlled in layers.
NIST-F1, the cesium fountain atomic clock: accurate to about 3 parts in 10^16, i.e. less than 1 second of drift in 100 million years, with a goal of 1 second in 300 million years.
It is run several times a year to calibrate hundreds of "lesser" atomic standards.
These in turn are used to set the network time servers, again arranged in "levels" (strata).
"Unix Time" counts from 0h 0m 0S Jan1, 1970. For this work, we will only trust a Linux PC running ntp.