that work was concerned with helping users locate the scientific literature that reported research results. If users were to take advantage of the power of computers, the entire range of scientific content including the data resulting from research, which are usually in numerical or graphical format, would have to be available in electronic form. The remainder of the paper covered all major aspects of scientific data storage, retrieval, and dissemination.

After defining what is meant by the term "data" (repeatable measurements, observations, and statistical results), Lide discussed the importance of quality in data activities. As science and engineering were confronting increasingly complex problems, from depletion of the ozone layer to shortage of critical materials, important decisions required data of the best quality to be available. With the advent of computer modeling as a major technology, data quality was even more important because of the potential for modeling results whose dependency on input data would be difficult to trace. The National Standard Reference Data System, as established by NBS, was a concerted attempt to involve expert scientists in the effort to assess the quality of reported measurements. As already mentioned, NSRDS maintained a series of continuing data centers with expertise in a well-defined discipline. Lide described several of these data centers and their current approach to evaluation of laboratory data on well-defined substances and materials. He also described parallel activities of the World Data Centers, which had been established by the International Council of Scientific Unions to perform a similar data quality assessment task for observational data in fields such as geophysics, oceanography, and atmospheric physics.

The paper then turns to the challenges of the electronic revolution and, somewhat surprisingly, defines the two paradigms for computer delivery of data that are still used today: installation on one's own computer and access via networking to remote data collections. At the time the paper was written, the first recognizable personal computers were just beginning to appear. Yet the essential features of scientific databases stored on PCs are all addressed, namely local control, heavy use, inclusion of search software, and the facility to transfer data to computational software and other applications. The description of online data services is equally prescient, even though it was based on a model of subscription services in which the user connects directly to a remote computer rather than through the World Wide Web of the year 2000. What is particularly interesting is the recognition that users routinely require a multitude of data resources to solve real-life problems, so that online data resources need to be integrated to provide maximum impact and benefit.

Fig. 1. A sample of current Standard Reference Data products.

The last major section of the paper describes a time of rapidly evolving cooperation to handle the flood of data and to take advantage of the electronic revolution. The cost of data evaluation is high, the volume of data too great, and the change in technology too rapid for any group to go it alone without ample resources. Further, in many cases, the need for consistency among collections of recommended data can be a prime motivator to work together rather than in competition. Lide mentions CODATA, the Committee on Data for Science and Technology of the International Council of Scientific Unions, as a major factor in bringing together data experts on an international basis.

The paper concludes with a prediction of three trends: the need for reliable data becoming more pressing; computer-based data dissemination methods, especially online systems, growing in use; and coordination in the development of computer-based systems being essential.

Unlike the majority of papers in this Centennial volume, this paper does not present original research results. Instead, it describes one of NBS's largest and best-known programs, the Standard Reference Data Program. More importantly, the paper describes quite accurately what challenges the NBS data programs were facing in the last three decades of the NBS/NIST first century. At the time of publication, NBS was beginning a series of intensive and large-scale cooperative programs that changed the availability and quality of data in many areas of interest to industry. The names of the programs suffice for description:


• American Society for Metals-NBS Alloy Phase Diagram Program

• American Ceramics Society-NBS Phase Diagrams for Ceramists Program

• National Association of Corrosion Engineers-NBS Corrosion Data Program

• Design Institute for Physical Property Data (DIPPR), in cooperation with the American Institute of Chemical Engineers

• NIST fluids property data programs, in cooperation with the Gas Producers Association, the Compressed Gas Association, and the Supercritical Fluids Extraction Consortium.

The trend continues today with NIST recently forming the Research Collaboratory for Structural Biology with Rutgers University and the University of California at San Diego to operate the Protein Data Bank.

In the years following the publication of Lide's paper, NBS/NIST expanded its data activities even further to include engineering data and even data used to "calibrate" statistical and other kinds of software. At the same time, the proliferation of personal computers, as envisioned by Lide, transformed forever the dissemination of NBS/NIST data. By 1995, over 70 PC databases were available for sale from NIST. Parallel efforts on traditional online services and the Internet/World Wide Web brought even greater availability. As of May 2000, NIST was already providing online access to its evaluated data via 15 web-based systems.

The basic components of the NIST data activities in 2000 remain essentially the same as defined in Lide's paper. Quality is the defining feature. The hallmark of NIST data work remains the evaluation by recognized experts. Easy availability is equally important so that the return on the taxpayer investment in data is maximized by the widest possible dissemination of data. Finally, cooperation ensures that limited resources are not wasted on duplicative efforts and that users are not confronted by competing claims for best quality.

Data evaluation remains an important component of the overall NIST measurement portfolio, providing a snapshot at a given time of the quality of measurement technology in different fields. NIST is the world's leader in the evaluation of physical science and other data, a tribute to the foresight and vision of leaders such as David Lide, Edward Brady, and many others. The paper Critical Data for Critical Needs remains a classic in defining the scope and importance of data in the advancement of science and technology and of the impact computers have had on scientific data work.

David R. Lide was hired by NBS in 1954 to set up a microwave spectroscopy laboratory in the Thermodynamics Section of the Heat and Power Division. In the early 1960s he led the integration of NBS research programs in infrared, microwave, and ultraviolet spectroscopy into a single Molecular Spectroscopy Section, which he headed until he became Director of the Office of Standard Reference Data in 1969. He was active in various national and international organizations, including stints as Secretary General and later President of CODATA and President of the Physical Chemistry Division of the International Union of Pure and Applied Chemistry. He received Department of Commerce Silver and Gold Medals and the Samuel Wesley Stratton Award of NBS for his research in spectroscopy, as well as the Herman Skolnik Award and the Patterson-Crane Award of the American Chemical Society for contributions to chemical information. After leaving NIST in 1988, he became Editor-in-Chief of the CRC Handbook of Chemistry and Physics and has published several other books and electronic databases.

Prepared by John R. Rumble, Jr.

Bibliography

[1] David R. Lide, Jr., Critical Data for Critical Needs, Science 212, 1343-1349 (1981).

[2] Edward W. Washburn (editor-in-chief), International Critical Tables of Numerical Data, Physics, Chemistry, and Technology, Vols. 1-7, published for the National Research Council by the McGraw-Hill Book Company, New York (1926-1933).

[3] J. D. H. Donnay and Helen M. Ondik (eds.), Crystal Data Determinative Tables, Vol. 1-4, Third Edition, National Bureau of Standards and Joint Committee on Powder Diffraction Standards, Swarthmore, PA (1972-1978).

Materials at Low Temperatures

An offhand comment by a long-forgotten colleague provided the impetus to develop one of the most comprehensive books available on the study of materials at low temperature. As the 30-year era of the NBS Cryogenics Division was coming to a close, it was remarked, perhaps more than once, that the accumulated expertise of the division should be gathered and preserved. Thus, the idea was born to pull together a tutorial text and apply the resident expertise to a critical evaluation of the existing data on cryogenic materials. It is rare enough that such an opportunity presents itself, but rarer still that 13 authors should work together to create such a text. That the result had widespread impact and influence on the low-temperature community, however, was no surprise.

The book, Materials at Low Temperatures [1], consists of 14 chapters, each a combination of tutorial text and critical data analysis for 14 different properties of materials at cryogenic temperatures. It was written during the years 1980-1982 by the staff members of the former Cryogenics Division while the Division was being disbanded during a major reorganization, with the staff being distributed throughout three different Centers that have since evolved into three laboratories: Electronics and Electrical Engineering Laboratory (EEEL), Materials Science and Engineering Laboratory (MSEL), and Chemical Science and Technology Laboratory (CSTL). The 590-page book represents the consolidation of an estimated 600 staff-years of experience accumulated by the Division staff while it led the world in research and development of cryogenic technology. The history of that experience can be traced from its beginning with nuclear weapons, through the rapid growth of the space age, and into the world of low-temperature physics and superconductivity.

The book contains nearly 3000 references to extensive collections of theoretical and experimental work, much of it data for the critical analyses. The book is organized into the following 14 chapters:

Elastic Properties, H. M. Ledbetter

Thermal Expansion, A. F. Clark

Thermal Conductivity and Thermal Diffusivity, J. G. Hust

Electrical Properties, F. R. Fickett

Magnetic Properties, F. R. Fickett and R. B. Goldfarb

Fig. 1. Practical superconducting wires often are composite structures consisting of many fine superconducting filaments embedded in a matrix of normal metal to give the composite stability. When an ac current is passed through the superconductor, substantial energy losses can occur due to eddy currents. These losses must be minimized so that they do not heat the superconductor above its critical temperature. Subdividing the copper sheathing with sheets or fins of low conductivity Cu-Ni helps to reduce those losses. The figure shows a multifilamentary superconductor with cupronickel barriers (white areas) to prevent coupling currents from flowing around filament clusters and the outer copper stabilizing sheath. Copper areas are dark and the Nb-Ti filaments are gray. [Figure courtesy of Imperial Metal Industries.]


layman while remaining a valuable reference to even the old hands of cryogenics. This coherent presentation is exemplified by the chapter on thermal expansion by A. F. Clark. He begins with a simple observation: "Warming a solid body from absolute zero requires energy. In a free body, this energy manifests itself in two ways: an increase in temperature and a change in volume. Both of these are directly related to the additional vibrational energy of the individual atoms; the former simply because more atomic energy states are excited, the latter because the mean interatomic distance changes with energy. The ratio of the change in energy to the change in temperature is the specific heat. The ratio of the change in volume to the change in temperature is the thermal expansion." Thus, in this remarkably lucid introduction, Clark establishes the significant conceptual relation between two extremely important properties, specific heat and thermal expansion. He goes on to explain, "... the volume expansion is due to the anharmonic behavior of atomic vibrations and the specific heat is due primarily to the vibrations themselves..." From there, one quickly proceeds to learn about the wide-ranging importance of understanding the thermal properties, from the dimensional stability needed to maintain the critical alignment of a large telescope operating in the very cold environment of space while being warmed by the sun, to the earthbound commercial consequences of the thermal expansion or contraction of a low-temperature storage tank. In a commercial storage tank for liquefied natural gas (LNG), the tank itself cools as the LNG is added. The volume of the fully cooled storage tank may be as much as thirty thousand liters smaller than it was at the ambient temperature! Having explained the concepts and the consequences, Clark then proceeds to review the essential theoretical elements and the principal measurement methods, and concludes with an evaluation of the pertinent data.
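The two ratios Clark describes have compact thermodynamic definitions; in standard notation (a sketch of the usual textbook forms, not taken from the book itself):

```latex
C_V = \left(\frac{\partial E}{\partial T}\right)_V ,
\qquad
\beta = \frac{1}{V}\left(\frac{\partial V}{\partial T}\right)_p ,
```

where $C_V$ is the heat capacity at constant volume and $\beta$ is the volume coefficient of thermal expansion. The connection Clark draws between the two properties, both rooted in atomic vibrations, is made explicit by the Grüneisen relation $\beta = \gamma \kappa_T C_V / V$, with $\gamma$ the Grüneisen parameter and $\kappa_T$ the isothermal compressibility.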


Fig. 2. An interior view of a liquefied natural gas tank lined with a low thermal expansion alloy. [Figure courtesy of McDonnell Douglas, Astronautics Laboratory and Gaz-Transport.]

The approach exemplified by Clark's chapter is followed throughout the book by a collection of authors who, at the time of writing, were considered by many to be the world's best experts on virtually all aspects of cryogenic technology. To have some of the best researchers in the field outline and explain the low-temperature behavior of materials creates an unusually complete reference. To have those same experts evaluate the available data, explain them, and indicate future research directions creates a sound, thorough, and scientifically excellent work.

The book has been used as a text in courses on cryogenic technology and as a reference by cryogenic engineers and low-temperature physicists throughout the world. Book reviews of that time used such words as "extraordinarily useful," "extensive," "understandably written," and "indispensable reference." Personal feedback to the authors was, and in some cases still is, always very positive. Two categories of users were often more lavish in their praise: those new to the field and those in the depths of low-temperature experiments, responding respectively to the tutorial text and the data evaluations, the two primary objectives of the book.

The volume, published by the American Society for Metals, sold out rapidly. Of the approximately 1000 books printed, none are still available for purchase. Copies are widely distributed in university, government, and industrial laboratories throughout the world. These books have been seen on shelves from New England to New Zealand, from China to Scotland, and from Argentina to Finland. Visits to the many laboratories that have copies show them all to be both dog-eared and treasured. But probably the strongest indication of use has come from personal feedback to the authors, all of it anecdotal and all of it very positive. Typical are "It's always within reach," "I never let it out of the office," and "It's always researched whenever we start something new."

The extent of the compilation and evaluation of data is truly comprehensive. Even today, newly measured data fit within the ranges and predictions of the authors. The explanatory text is, of course, still valid and just as relevant as when the book was published. For anyone attempting to design, build, or interpret an experiment at low temperatures, even 17 years later or 17 years hence, the book is, and will be, an extraordinarily useful reference.

Richard Reed received his Ph.D. in physical metallurgy in 1966. At NBS, he served as Chief of the Fracture and Deformation Division through 1986. Officially retired now, he still consults on the low-temperature properties of materials. Reed and Clark are credited with founding the International Cryogenic Materials Conference, and each has served as editor of many conference proceedings and journals.

Alan Clark received his Ph.D. in physics in 1964. With a specialty in low-temperature physics and superconductivity, he has served as Group Leader of both the Superconductor and Magnetic Measurements group (NBS-Boulder) and the Fundamental Electrical Measurements group (NIST-Gaithersburg). He also has served as a Liaison Scientist for the Office of Naval Research, London. He is now Deputy Chief of the Optoelectronics Division (NIST-Boulder).

Of the thirteen authors, six are still employed at NIST, three in EEEL (Al Clark, Jack Ekin, and Ron Goldfarb) and three in MSEL (Fred Fickett, Hassel Ledbetter, and Dave Read). Five are retired from NIST (Jerry Hust, Bud Kasen, Harry McHenry, Dick Reed, and Larry Sparks); several of these now have small consulting businesses. One (John Moulder) is deceased.

Prepared by Fred Fickett.

Bibliography

[1] Richard P. Reed and Alan F. Clark (eds.), Materials at Low Temperatures, American Society for Metals, Metals Park, OH (1983).
