
Figure 5-First commercial atomic beam frequency standard. National's Atomichron [33]. © 1972 by the IEEE.

This was National's Atomichron, developed by Daly and Orenberg [33] in collaboration with Zacharias and further improved by Holloway and McCoubrey. This device used Ramsey's separated oscillatory field method for increased precision, a special design of cesium oven that could be operated for several years without exhaustion, titanium pumping to permit permanent sealing off of the evacuated beam tube, and many other features generally necessary for an effective commercial device. The first commercial Atomichron is shown in figure 5. The development of the Atomichron was supported financially largely by the U.S. Signal Corps at Ft. Monmouth, NJ, and the Office of Naval Research, although some support came from the Air Force. A purchase order by the Signal Corps for a relatively large number of Atomichrons made possible the development of mass-production techniques and improved engineering to permit sufficient reliability and reductions in price to assure commercial success.
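The precision gain of the separated oscillatory field method can be summarized roughly as follows: if the two oscillatory-field regions are separated by a drift length $L$ traversed at mean velocity $v$, the width of the central Ramsey fringe is set by the free transit time $T$ between the two regions rather than by the time spent in either field region,

```latex
\Delta\nu \;\approx\; \frac{1}{2T}, \qquad T = \frac{L}{v},
```

so lengthening the drift region narrows the resonance without requiring a uniform oscillatory field over the full path of the beam.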

The early atomic beam frequency standards were subject to various frequency shifts dependent on the amplitude of the radio-frequency power and on other variables. To account for these results, Ramsey, with the aid of computational analysis supported by the National Co., investigated the various possible distortions that would occur in an atomic beam resonance [34]. The elimination of radio-frequency phase shifts and other sources of distortion made possible the marked increases in accuracy that have been obtained with the atomic beam frequency standards.

From 1956 on, atomic beam frequency standards developed rapidly. Mockler, Beehler, Barnes, Hellwig, Wineland, and others [33,35,36] developed an atomic cesium frequency standard at the National Bureau of Standards in Boulder, CO. Other commercial organizations such as TRG, Bomac, Varian, Hewlett-Packard, Frequency and Time Systems, Inc., and Frequency Electronics, Inc. became involved. Many laboratories throughout the world either constructed or purchased atomic beam frequency standards, including those in Canada, France, Germany, the USSR, and China, and the laboratories of Kartaschoff [35] and Bonanomi [35] in Switzerland. Reder, Winkler, and others [35] at Ft. Monmouth and Markowitz at the Naval Observatory sponsored various worldwide studies of the comparison of atomic clock frequencies and the synchronization of clocks. Extensive studies were made of other atoms, such as thallium, for use in atomic beam tubes, and various molecular resonances were studied for possible use in a molecular beam electric resonance apparatus for frequency control purposes. A Tl²⁰⁵ frequency measurement accurate to 2 parts in 10¹¹ was reported by Bonanomi [37]. However, atomic cesium remains the most widely used substance in molecular or atomic beam frequency control devices. Particularly effective atomic beam cesium clocks were developed and sold by Hewlett-Packard, which also developed a "flying clock" particularly suitable for the intercomparison of atomic clocks in different laboratories. A typical beam tube for an atomic cesium frequency standard is shown in figure 6. Accuracies as high as 1 part in 10¹³ have been claimed for some laboratory cesium standards [35].


Microwave Absorption Spectroscopy

Microwave absorption spectroscopy had an early start in the experiments of Cleeton and Williams [38,39], who observed the absorption of microwave radiation at the NH₃ inversion frequency. However, research on microwave absorption was inhibited at that time by the lack of suitable microwave oscillators and circuits, so there was no further development of microwave absorption spectroscopy until after the development of microwave oscillators and waveguides for radar components in World War II. Immediately following World War II there was a great burst of activity in microwave absorption spectroscopy. Although there were no publications on experimental microwave spectroscopy in 1945, in the single year of 1946 there were a number of important publications from many different laboratories, including reports by the following authors [40]: Bleaney, Penrose, Beringer, Townes, Dicke, Lamb, Becker, Autler, Strandberg, Dailey, Kyhl, Van Vleck, Wilson, Dakin, Good, Coles, Hershberger, Lamont, Watson, Roberts, Beers, Hill, Merritt, and Walter. In 1947 there were more than 60 published papers on the subject, including a number by Gordy and Jen, by authors who had reported the previous year, and by others. A typical microwave absorption experiment of this period is shown schematically in figure 7.

Microwave absorption techniques were quickly recognized to be of potential value for frequency standards. In 1948 a group of workers [25] at the National Bureau of Standards began building an ammonia clock that was completed in 1949 and is shown in figure 8. It eventually achieved an accuracy of 1 part in 10⁸. Rossell [25] in Switzerland and Shimoda in Japan devised improved ammonia absorption clocks good to a few parts in 10⁹.

In 1951 Townes [41] analyzed frequency stabilization on microwave spectral lines and its application to atomic clocks; in the references he lists other early contributors to the field. At the 1951 Fifth Frequency Control Symposium, Dicke [42] reported on microwave absorption molecular frequency standards. In the seventh, eighth, and ninth symposia, Dicke, Carver, Arditi, and others described the continuation of this work at both Princeton and the Radio Corporation of America (RCA), with the financial support of the Signal Corps and the Office of Naval Research [35,42]. The microwave absorption studies soon merged with the optical pumping techniques described in the next section, since the intensities of the resonances were greatly enhanced by the use of optical pumping.

Optical Pumping

The starting point of all research on optical pumping was a paper by Bitter [43] in 1949, which showed the possibility of studying nuclear properties in optically excited states. Kastler [44,45] showed the following year that this technique could be effectively combined with the double resonance method he and Brossel [44] had developed. Both optical pumping and optical detection techniques served to increase the signal-to-noise ratio of the resonator output signal: optical pumping greatly enhances the population of certain states, so the signal is not weakened by stimulated emission nearly cancelling the absorption, and optical detection increases the signal-to-noise ratio because optical detectors have a lower noise level than microwave detectors.

The combination of optical pumping techniques with the buffer gas method for reducing the Doppler shift, developed by Dicke [33,42], provided gas cells of real value as frequency control devices. Although many


Figure 7-A typical microwave absorption experiment using a radio-frequency bridge and heterodyne detection.


abroad. Figure 9 shows a typical optically pumped rubidium frequency standard.

The optically pumped gas cells have the advantages of simplicity, relatively low cost, large signal-to-noise ratio, and good spectral purity. Unfortunately, the relatively large frequency shift due to the numerous buffer gas collisions depends on the gas purity, pressure, and temperature. Light shifts due to variations in the pumping lamp intensity or spectrum may also be a problem. As a result, the stability of rubidium gas cells over a period of several months is ordinarily no better than a few parts in 10¹⁰. These shifts prevent the optically pumped gas cells from serving as primary time and frequency standards, but the gas cells are used as frequency control devices when their accuracy is sufficient. Research is currently in progress in a number of laboratories to improve the stability of optically pumped gas cells; Bouchiat, Brossel [46], and others, for example, have eliminated the buffer gases, as was done earlier in the hydrogen maser [47], and have used suitably coated walls to retain the atoms and reduce the effect of the first-order Doppler shift.


Molecular Masers

In 1951, Pound, Purcell, and Ramsey [48] studied nuclear spin systems with inverted populations and noted that such systems were in principle intrinsic amplifiers rather than absorbers. The first suggestions actually to use systems with inverted populations as practical amplifiers or oscillators were made in the early 1950's by Townes [49], and independently but somewhat later by Weber [50] and by Basov and Prokhorov [27]. The first such amplifier was successfully constructed in 1954 by Gordon, Zeiger, and Townes [49] and called a maser (Microwave Amplification by Stimulated Emission of Radiation). The device used inhomogeneous electric fields to focus the higher-energy inversion states of ammonia molecules in a molecular beam. These molecules then emitted coherent stimulated emission radiation in passing through a cavity tuned to the 24-GHz ammonia inversion transition. A schematic diagram of the first ammonia maser is shown in figure 10. A report by Gordon on the new ammonia maser was a major attraction at the special meeting on atomic and molecular resonances sponsored by the Signal Corps Engineering Laboratory in 1956. In that year Bloembergen [29] proposed the three-level solid-state maser, and in 1958 Townes and Schawlow [30] pointed out the possibility of masers at infrared and optical frequencies.
