Table of Contents

  • Technological development in general, and that of information-communication technologies and applications in particular, requires the increasingly precise timing and synchronization of different electronic devices.

  • The Radiocommunication Study Group 7 for the Science Services (SG 7) was created through a structural reorganization in 1990 at the Düsseldorf CCIR Plenary Assembly. At that time, the Space Research and Radio Astronomy Study Group (SG 2) was consolidated with the Time and Frequency Standards SG 7 to form the new SG 7 on Science Services.

  • The work of the Rapporteurs for the various sections of the Handbook was outstanding, and we would like to express our appreciation to them and to the many others who contributed their time and effort to this Handbook.

  • Long ago, people marked time by the movement of the sun and other stars, by the phases of the moon, by the changing of the seasons, and by the passing of generations. Time intervals were measured by sand-filled hourglasses, water clocks, and mechanical devices. The frequencies of musical instruments were tuned by comparison with tuning forks and pitch pipes.

  • Over the last few decades, atomic clocks have moved from laboratory novelties to large-scale use. Improvements in quartz oscillator technology and in satellite timing systems have augmented decades of improvement in atomic clocks. Navigation, communication and power systems have all benefited greatly from these improvements. Precise timing has moved from a novelty to a necessity, and many applications now depend on precise timing elements.

  • Time and frequency transfer play a critical role in the functioning of the Global Positioning System (GPS) and in turn are also greatly enhanced by its worldwide acceptance. This chapter highlights the role of clocks, time and frequency in the operation of GPS and the capabilities and limitations of GPS in providing a mechanism for their dissemination.

  • Historically, navigation systems have depended on time. This was clearly demonstrated by the voyage of Harrison’s chronometer aboard HMS Deptford in 1761, which proved that the instrument allowed navigators, for the first time, to determine longitude accurately and reliably. Because of this relationship between navigation and time, the timekeeping community has always had a keen interest in the use of navigation systems for the distribution of time. Even today, the heart of GPS rests on a highly evolved clock technology. Unlike navigators, who need four GPS satellites to determine their position, timekeepers who know their position need only one satellite to determine time. Observations of a single satellite could allow timekeepers to remotely synchronize clocks around the world.
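The single-satellite case above can be sketched numerically. This is an illustrative simplification (hypothetical positions and pseudorange, no atmospheric or relativistic corrections): with the receiver position known, the geometric range is known, and the remaining discrepancy in the pseudorange is the receiver clock offset.

```python
# Illustrative sketch, not an operational algorithm: a timekeeper at a
# known position recovers the receiver clock offset from one satellite.
import math

C = 299_792_458.0  # speed of light, m/s

def clock_offset(receiver_pos, satellite_pos, pseudorange_m):
    """Receiver clock offset (seconds) from a single satellite.

    pseudorange = geometric range + c * (clock offset), ignoring
    atmospheric and relativistic corrections for simplicity.
    """
    geometric_range = math.dist(receiver_pos, satellite_pos)
    return (pseudorange_m - geometric_range) / C

# Hypothetical numbers: a 20,000 km range and a pseudorange longer by
# c * 1 microsecond should yield a 1 microsecond clock offset.
offset = clock_offset((0.0, 0.0, 0.0), (20_000_000.0, 0.0, 0.0),
                      20_000_000.0 + C * 1e-6)
print(round(offset * 1e6, 3))  # clock offset in microseconds
```

Navigators, lacking a known position, must instead solve for three position coordinates plus the clock offset, which is why they need four satellites.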

  • GPS operates on real-time predicted information. The five U.S. Air Force monitor stations maintain continuous real-time links to the Master Control Station (MCS), so that current tracking and status information can be combined with that from additional monitor stations established by the National Geospatial-Intelligence Agency (NGA) to provide additional coverage of the GPS satellite constellation. The increased observational data then allow prediction of new system variables. These predictions of system performance are subsequently uploaded into the satellites’ on-board memories for transmission in the satellite navigation messages. The errors remaining in this process are considered to be random, with the dominant systematic errors (biases) removed.

  • The Global Navigation Satellite System (GLONASS) is a government-operated global navigation satellite system (GNSS) designed to provide continuous, all-weather support of an unlimited number of aeronautical, maritime, terrestrial and space-borne users with high-precision position-fixing and timing information at any point on the Earth and in near-Earth space.

  • The principal function of communications satellites is to transfer RF signals for communications purposes. Therefore, time and frequency transfer via communications satellites is usually accomplished by piggybacking on the communications functions. Some satellite systems, especially those using time-division multiple access (TDMA), require precise time for their communications functions [ITU-R Handbook, 2002; Ha, 1990]. In the majority of TDMA cases, the satellite simply passes through precise timing signals from the ground, and the satellite payload itself does not require precise time for its operation. Time transfer capability is required in the satellite bus telemetry, tracking, and command system for ranging and time-tagging purposes; however, this part of the satellite system is generally not available to the leasing user. Frequency accuracy and stability requirements for commercial satellites are in general limited to what is required for maintaining band allocations and for performing ranging functions. Government communications satellites tend to have much more accurate on-board frequency sources, because they are also used for navigation or other purposes that require precise time.

  • As described previously, UTC is established at the BIPM by a post-processing computation based on data from about 350 atomic frequency standards maintained in about 60 timing laboratories distributed worldwide.
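The core idea of such a post-processed computation is an ensemble average. The sketch below is only a minimal illustration of a weighted mean of clock offsets with invented numbers; the BIPM's actual algorithm (ALGOS) is far more elaborate, involving weight limits, frequency prediction and steering.

```python
# Minimal sketch of an ensemble time scale: a weighted mean of clock
# offsets. Weights would in practice derive from each clock's observed
# stability; the values here are purely illustrative.
def ensemble_offset(readings, weights):
    """Weighted mean offset of an ensemble of clocks.

    readings: per-clock offsets from a common reference.
    weights:  per-clock weights (higher = more stable clock).
    """
    total = sum(weights)
    return sum(r * w for r, w in zip(readings, weights)) / total

# Three hypothetical clocks, offsets in nanoseconds from a reference:
offsets_ns = [12.0, -5.0, 3.0]
weights = [1.0, 2.0, 1.0]
print(ensemble_offset(offsets_ns, weights))  # 1.25 ns
```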

  • The largest potential error sources in the transfer of precise time via satellite are the propagation delays of the Earth’s neutral atmosphere and of the ionosphere. In this section these range delays are outlined, and various techniques for compensating for them are presented and discussed. The range delays of the troposphere and of the ionosphere differ in several important respects. The range delay of the Earth’s troposphere is not dispersive; that is, it is not a function of frequency, at least not over the normal radio frequency range used for ranging to artificial Earth satellites. The range delay of the ionosphere, on the other hand, is dispersive; it varies inversely with the square of the frequency. Thus, by measuring the relative range delay at two suitably spaced frequencies, the absolute range delay along the satellite-to-user path can be computed directly. The range delay of the Earth’s troposphere cannot be measured directly, but several models, or indirect measurement techniques, can be used to infer the tropospheric delay contribution to high accuracy for time transfer via satellite.
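The dual-frequency correction above follows directly from the 1/f² scaling of the ionospheric group delay. The sketch below uses the well-known GPS L1/L2 carrier frequencies, but the ranges and the 5 m delay are invented for illustration; real processing must also handle measurement noise and inter-frequency hardware biases.

```python
# Sketch of the dual-frequency ionospheric correction: delay scales as
# 1/f^2, so pseudoranges at two frequencies determine it directly.
F1 = 1575.42e6  # GPS L1 carrier, Hz
F2 = 1227.60e6  # GPS L2 carrier, Hz

def iono_free_range(p1, p2, f1=F1, f2=F2):
    """Ionosphere-free range from pseudoranges at two frequencies."""
    return (f1**2 * p1 - f2**2 * p2) / (f1**2 - f2**2)

def iono_delay_f1(p1, p2, f1=F1, f2=F2):
    """Ionospheric delay on the f1 measurement, same units as p1/p2."""
    return (p2 - p1) * f2**2 / (f1**2 - f2**2)

# Hypothetical: true range 20,000 km, 5 m of ionospheric delay on L1.
true_range, i1 = 20_000_000.0, 5.0
p1 = true_range + i1
p2 = true_range + i1 * F1**2 / F2**2  # L2 delay larger by (f1/f2)^2
print(round(iono_free_range(p1, p2), 3))  # recovers the true range
print(round(iono_delay_f1(p1, p2), 3))    # recovers the 5 m delay
```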

  • GPS methods have been the basis for most high-accuracy time and frequency transfers for more than two decades. The usual approach for maintaining Coordinated Universal Time (UTC) has relied primarily on single-frequency pseudorange (C/A-code) data and simple common-view (CV) data analyses that assume cancellation of most systematic errors [Allan and Weiss, 1980]. With improved data yields thanks to the widespread replacement of the earlier single-channel receivers by multi-channel units, intercontinental CV comparisons have achieved uncertainties of a few ns averaged over five-day intervals [Lewandowski et al., 1997]. In contrast, the parallel development of high-accuracy geodetic methods using dual-frequency GPS carrier-phase observables has demonstrated positioning repeatabilities at the cm level for one-day integrations [Zumberge et al., 1997]. Since such positioning results can also be expressed as equivalent light travel times (~33 ps), the potential for GPS carrier phase-based geodetic techniques to permit sub-ns global time comparisons is evident, as widely recognized by the 1990s. In fact, the method has been shown to have a precision approaching ~100 ps at each epoch in favorable cases for one-day analysis arcs [Ray and Senior, 2003]. The absolute time transfer capability remains limited to >1 ns, however, due to instrumental calibration uncertainties [Petit et al., 2001]. In addition to higher precision (equivalent to frequency stability), the geodetic approach lends itself readily to global time and frequency dissemination. This is consistent with the basic GPS operational design (albeit with replacement of the GPS broadcast message with more accurate information), unlike the point-to-point nature of CV, which furthermore degrades as baseline distances increase.
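The ~33 ps figure quoted above is just the light travel time over 1 cm, which a one-line check confirms:

```python
# Verify the equivalence quoted above: 1 cm of positioning
# repeatability corresponds to roughly 33 ps of light travel time.
C = 299_792_458.0        # speed of light, m/s
cm_as_time = 0.01 / C    # seconds
print(round(cm_as_time * 1e12, 1))  # ~33.4 ps
```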

  • In time transfer, one of the primary goals is to compare clocks and/or frequency standards separated by large distances. There are many reasons for making such comparisons. One may simply be interested in making instantaneous measurements of widely separated clocks in order to monitor the performance of time scales through the intermediary of the clocks being compared. One may also be interested in comparing advanced frequency standards, such as caesium fountains, and/or checking the long-term stability of such standards. The precision and accuracy with which these measurements can be made is of interest for metrology.

  • Time and frequency can be transferred by a number of techniques, depending on the accuracy required. The primary means of accurate time transfer is the Global Positioning System (GPS). GPS uses a constellation of satellites, each containing atomic clocks. These spaceborne atomic clocks, combined with the monitor-station caesium standards, establish GPS Time, the system synchronization time. Using GPS for time and frequency dissemination relies upon the stability and precision that GPS Time provides for positioning. Simultaneous passive reception of multiple GPS satellites requires the satellites to be synchronized to each other with less error than that expected from an individual satellite pseudorange measurement with the user receiver. The stability of the individual satellite clock between updates, or re-synchronizations with GPS Time, determines the system synchronization error.
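The link between clock stability and synchronization error between updates can be sketched with a back-of-the-envelope estimate. This deliberately simplified model assumes a constant fractional frequency offset y (real satellite-clock error growth also involves frequency drift and random noise); the 1e-13 stability and one-day upload interval are illustrative values, not GPS specifications.

```python
# Simplified model: a clock running at constant fractional frequency
# offset y accumulates a time error of y * tau over an interval tau.
def accumulated_time_error(y_frac_freq, tau_s):
    """Time error (seconds) accumulated at fractional frequency y."""
    return y_frac_freq * tau_s

# Hypothetical: a clock holding 1e-13 over a one-day interval between
# uploads accumulates about 8.6 ns of error.
tau_day = 86_400.0
err = accumulated_time_error(1e-13, tau_day)
print(round(err * 1e9, 2))  # accumulated error in nanoseconds
```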