
sands of three-dimensional volumes, each having linear dimensions of about 250 miles in the north-south and east-west directions, and about a mile in the vertical direction. The task of making these boxes smaller is severely limited by the speed of even present-day supercomputers. For example, decreasing the horizontal grid size of a GCM from 250 to 25 miles would increase the required computer running time roughly a thousandfold, from about 2 weeks to more than 30 years of run-time to compute the resulting change in the equilibrium climate of the model!

The Uses of Climate Models

Until the advent of supercomputers, our attempts at climate modeling were rudimentary. That situation changed roughly 25 years ago. Much of the recent attention by the public and decision-makers on climate change has been due to measurements indicating that warming has been occurring near the Earth's surface over the last century and to relatively recent projections from GCMs.

Climate varies naturally over both short and long time scales, sometimes rather dramatically over a few years or decades. This variability was experienced in Europe during the Little Ice Age of 1400–1850. To understand climate change, scientists must understand the detailed nature of the extremely complex climate system. While we have learned a great deal, there is still much we do not know. Climate models today can give us insights into what might happen under various assumed situations.

Currently, there are about 30 GCMs being developed and/or used by research groups around the world. Many of these models are related, with the differences among the models lying in the natural processes they include and how they integrate and treat these processes within a specific model.

As discussed above, computer models are necessary in the study of climate change because of the extraordinary complexity and number of the physical processes that are embodied in the climate system. Some of the factors that affect climate include:

• the concentrations of gases and aerosols;
• interactions between the atmosphere, the biosphere, and oceans;
• volcanic activity; and
• interactions of components within the atmosphere and ocean themselves.

The growth of computing capacity has allowed scientists to integrate complex climate-system processes into single computational frameworks. These frameworks can be used to develop an increasingly comprehensive, but still incomplete, overall picture of the global climate system.

The Roles of GCMs

The general uses of GCMs are:

First, the building and running of a model is a process by which theory and observations are mathematically evaluated, codified, and integrated in a computer program. Models can thereby be used to identify needed refinements in theory and observation. Model building is a long process of back-and-forth comparison between analytical description (“theory”) and field studies (“observational data”). These comparisons include end-to-end efforts to correlate observational findings with improvements in model representations.

Second, climate models are used to identify and then assimilate observational measurements that are initially incomplete. These measurements can then be used to derive more consistent, spatially specific estimates of meteorological quantities. Such model-assimilated data have proven to be of great utility to the research community in better understanding the observed and potential variability of the climate system.
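A minimal sketch of the statistical blending at the heart of such data assimilation, for a single scalar quantity (say, temperature at one location); the error variances below are illustrative, not taken from any real assimilation system:

```python
def assimilate(model_estimate, model_var, obs, obs_var):
    """Blend a model background estimate with an observation,
    weighting each by the inverse of its error variance."""
    gain = model_var / (model_var + obs_var)   # weight given to the observation
    analysis = model_estimate + gain * (obs - model_estimate)
    analysis_var = (1.0 - gain) * model_var    # the blend is more certain than either input alone
    return analysis, analysis_var

# Model background says 15.0 degrees (variance 4); an observation says 16.0 (variance 1).
value, var = assimilate(15.0, 4.0, 16.0, 1.0)
print(value, var)  # the analysis lies nearer the more certain observation
```

Real assimilation systems apply this idea to millions of coupled variables at once, but the principle is the same: each estimate is weighted by how much it is trusted.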

Third, models can be used to focus observational activities. In regions where data are sparse, models can be used to define the frequency, coverage, and type of measurements that may shed the most light on the physics, chemistry and the composition of the atmosphere.

Fourth, climate models have recently predicted a few climate anomalies up to a year in advance. These model predictions, which are increasing in accuracy, incorporate information on the current state of the oceans and atmosphere. Predictions of El Niño and La Niña events and climate anomaly patterns associated with these phenomena have proven reasonably accurate and there is potential for this type of model prediction to be extended out beyond a year.

Fifth, climate models can be used to develop scenarios of possible future states of the climate system, given a specified set of assumptions (e.g., the future quantities of greenhouse gases, including ozone trends and aerosols). Such climate scenarios can then be used to develop projections of possible climate-related impacts on human and natural systems. Models currently show large-scale climatic responses to increased greenhouse gas levels: for instance, (1) there may be some warming at the surface, warming of the troposphere, and some cooling in the stratosphere; (2) there may be greater warming at high latitudes than at low latitudes; and (3) there may be an increase in low-level humidity over the oceans. Such fingerprints of human-induced climate change have been compared with the observed climate to help detect its changes and attribute its causes.

From the GCM-based projections of climate change, analysts can begin to evaluate the potential impacts on market and non-market sectors of society. As these impact models become more sophisticated, increasingly better pictures of what might happen under different scenarios will develop. More research on impacts will help countries identify the seriousness of possible climate change and allow them to study the cost-benefits of various response options.

In addition, models can be used to facilitate an understanding of the lag time between causes and effects associated with human as well as natural causes of climate change. It is essential to keep in mind that model projections depend on the sophistication of the model: the estimates in the model, the assumptions used by the model, and what in nature is not yet understood and therefore not covered in the model. This is why the climate research community generally places so much emphasis on verifying model results with actual data. By exploring sets of these model projections, the policy community can begin to discuss the effects that policies aimed at reducing greenhouse gases might have on climate, humans, and economies.

Models & Decision-Making

Existing GCMs can make "what if" projections of future global climate possibilities because they are the best available tools, even though they are currently limited in resolution and completeness. Regionally specific information is ultimately needed because, for example, while U.S. citizens have interest in what happens to the planet as a whole, they are especially interested in what happens to the U.S. and to their own neighborhood. Global climate projections from different models show a range of effects. The range of effects is largest for smaller regions. Partly, this is due to the natural local variability of climate and partly this is due to scientific uncertainties.

Just as global climate models have advanced, so have global economic impact models for estimating costs and benefits. Integrated assessment models, which take into account chains of events (if “A” happens, then result “B” could occur, but if “A” does not happen, then “C” will occur), are a tool to help understand long-term costs and benefits.

Limitations of Models

Having discussed the uses and strengths of GCMs, one should not assume that they have no weaknesses. In fact, some scientists would argue that the weaknesses are so great as to call into question the models' value in near-term decision-making. Some features of the GCMs are less robust than others, in part because the models disagree about the climate changes they predict. Furthermore, even if the models agreed, agreement would not necessarily make them correct.

Phenomenological Feedbacks

Much of the uncertainty in current climate models is associated with “feedbacks”: how various phenomena interact with one another. Feedback mechanisms are clearly important. Climatologists agree that, without these feedbacks, a doubling of CO2 would give about a 1.8°F (1°C) rise in global-average temperature. Many phenomena have large impacts on others, some amplifying and some damping their effects. Some extremely important phenomena, the feedback consequences of which we do not fully understand, are the following:

• Clouds;
• Ice;
• Land surface processes;
• Ocean effects;
• Biological processes;
• Physical and chemical reactions in the atmosphere;
• Particulates;
• Solar cycle effects; and
• Tropical convection and rainfall.

These phenomena are not yet adequately understood in isolation, let alone in combination with other factors. Thus, scientists must use approximations, estimate aggregate regional effects, or ignore some phenomena altogether for the time being. Other suspected feedback mechanisms are yet to be described or modeled.
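The arithmetic of feedback amplification can be sketched in a few lines. This is an illustrative calculation only: the roughly 1°C (1.8°F) no-feedback doubling response is the figure cited above, but the feedback factors below are invented for illustration, not measured values.

```python
def equilibrium_warming(delta_t0, f):
    """Net warming when a fraction f of the temperature response is
    fed back into the system (f > 0 amplifies, f < 0 damps).
    This simple linear-feedback formula requires f < 1."""
    if f >= 1.0:
        raise ValueError("f >= 1 would imply a runaway response")
    return delta_t0 / (1.0 - f)

delta_t0 = 1.0  # degrees C: the no-feedback warming for doubled CO2 cited above
for f in (-0.5, 0.0, 1.0 / 3.0, 2.0 / 3.0):
    print(f, round(equilibrium_warming(delta_t0, f), 2))
```

Note that modest disagreement about f produces large disagreement about the net warming: factors of about 1/3 and 7/9 span the 1.5–4.5°C sensitivity range quoted later in this document, which is one reason the feedback uncertainties listed above matter so much.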

For example, the role of clouds and water vapor in climate models is not well understood; yet water vapor is the most significant greenhouse gas in the natural (unperturbed) atmosphere and dramatically affects cloud cover and the transfer of radiant energy to and from the Earth's surface.

Also, modeling the impact of clouds is difficult because of their complexity and compensatory effects on both weather and climate. Clouds can reflect incoming sunlight and therefore contribute to cooling, but they also absorb infrared radiation that would otherwise leave the Earth, thereby contributing to warming.

Parameters

Models utilize observational data to adjust various model parameters and so make those parameters more realistic. “Tuned” models, however, cannot be validated against the very data to which they were adjusted; they must be validated against independent data.
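A toy illustration of this tuning-and-validation distinction, assuming a single adjustable parameter in a trivial stand-in “model”; all numbers here are invented for illustration:

```python
def model(x, param):
    return param * x  # a one-parameter stand-in for a full climate model

def rmse(param, data):
    """Root-mean-square error of the model against (input, observed) pairs."""
    return (sum((model(x, param) - y) ** 2 for x, y in data) / len(data)) ** 0.5

tuning_data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # used to adjust the parameter
independent_data = [(4.0, 8.3), (5.0, 9.8)]         # withheld for validation

# "Tune" by brute-force search over candidate parameter values.
best = min((p / 100.0 for p in range(100, 300)), key=lambda p: rmse(p, tuning_data))

# A small error on tuning_data is guaranteed by construction; only the error
# on the withheld independent_data says anything about the model's validity.
print(best, rmse(best, tuning_data), rmse(best, independent_data))
```

The point of the sketch is the last line: agreement with the tuning data is built in, so a tuned model earns credibility only from data it was never adjusted against.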

As previously mentioned, the equations governing the climate are so complex that they can be solved only at specific horizontal and vertical locations, and only over specific time intervals. The limit on horizontal grid size imposed by present-day supercomputers also limits the physical processes that can be explicitly included in a GCM. As discussed above, GCMs running on today's supercomputers explicitly include physical processes having horizontal sizes of approximately 250 miles and larger. Worse yet, physical processes smaller than 250 miles cannot be ignored, because their effects can significantly influence climate and climate change. Thus, climate modelers face a dilemma: their models cannot resolve the small-scale physical processes, yet they cannot ignore their effects. This is one of the greatest difficulties, if not the greatest, in modeling the Earth's climate.

The approach taken to overcome this problem is to estimate the effects of the small-scale physical processes on the larger scales that can be included in a GCM, using information on those larger scales together with statistical relationships. This approach is called "parameterization." The principal differences among GCMs lie in their approaches to parameterization, particularly for cloud and precipitation processes. These parameterization differences have a significant influence on differences in climate sensitivity (the change in the equilibrium global-mean surface temperature resulting from a doubling of the CO2 concentration) among the various GCMs.

Testing Models

One
way that models are tested is to use them to reproduce past events and variations. The Earth's climate has been changing for millions of years, but we do not have detailed data on those changes because humankind began acquiring relevant data only relatively recently. As such, we cannot accurately truth-test climate models over past periods extending much more than a hundred years. Thus, we are asking these models to assist us in decision-making in an environment of considerable scientific uncertainty. There is, however, a significant effort underway to compare the general nature of model simulations of prehistoric time periods against data from proxies (e.g., tree-ring widths, borehole temperatures, and oxygen isotopes in sediments) of past climates.

Human Resources

Compared to intermediate and smaller modeling efforts, such as those aimed at understanding the behavior of a particular climate process over a single locality, insufficient U.S. and international resources for research and computer hardware are being devoted to high-resolution global climate modeling.

Data

Instrumental temperature measurements of varying quality exist for about 135 years. Relatively crude but useful information before then has been obtained from proxy data such as the width of tree rings and the abundance of certain isotopes trapped in ice cores taken from the ice caps and glaciers and in sediment cores taken from the deep sea and lakes.

Climate data are routinely collected for weather prediction. Much of this data gathering was not designed to detect subtle trends that occur on decadal or longer time scales. For climate modeling, we need data that are more accurate and extensive than even those currently used in weather prediction. There is also a need for better organization and long-term archiving of climate data.

Advancement of Models

Model development has progressed considerably in the past decade. However, though there have been downward modifications in estimates of future climate change (e.g., through the inclusion in models of the effects of aerosol cooling), the limits of uncertainty in possible global-average warming for a future doubling of CO2 have not been narrowed; that uncertainty has remained in the 2.7–8.1°F (1.5–4.5°C) range for the past 20 years for most GCMs.

While the capacities and speed of supercomputers have progressed dramatically in recent years, climate models remain constrained by current computational capacity. In fact, the leading climate models are no longer in the United States, because U.S. researchers do not have access to the more powerful Japanese computers that other nations (e.g., Canada, Japan, and the United Kingdom) are using. Current computer capabilities applied to climate modeling are modest compared with what is needed to run high-resolution simulations using GCMs. Current computer limitations require that we settle for grid sizes much larger than those needed to model some important phenomena, such as tropical convection and precipitation.
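The back-of-envelope cost of refining a model's grid, as in the 250-mile-to-25-mile example quoted earlier, can be sketched as follows. The scaling assumption (cost grows with the number of horizontal cells times the number of time steps, and the time step must shrink in proportion to the cell size to keep the solution stable) is a common rule of thumb, not a property of any particular GCM.

```python
def relative_cost(old_size_miles, new_size_miles):
    """Approximate cost multiplier for refining the horizontal grid."""
    refinement = old_size_miles / new_size_miles
    cells = refinement ** 2   # more cells in each of the two horizontal directions
    timesteps = refinement    # proportionally shorter time steps for stability
    return cells * timesteps

# Refining from 250-mile to 25-mile boxes: a thousand-fold cost increase,
# consistent with the 2-weeks-versus-30-years comparison quoted earlier.
print(relative_cost(250, 25))
```

This roughly cubic scaling is why even dramatic advances in supercomputer speed buy only modest gains in resolution.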

The participants in the discussion agreed with the “summary results” of the National Research Council's 1998 report, Capacity of U.S. Climate Modeling, if not with all the details of that report.

Conclusions

There are significant uncertainties in predicting future climates as a consequence of (a) natural climate variability; (b) the potential for uncertain or unrecognized climatic forcing factors (e.g., explosive volcanism, or new or unknown anthropogenic influences); and (c) inadequate understanding of the climate system. We must expect that new observations or results from studies of global climate processes may yield information that causes us to re-evaluate and improve the capability of climate models. Our estimates of the credibility of climate-system models must, of necessity, be consistent with known facts and based only on the best current knowledge.

Projections vs. Predictions

Thus, it was the consensus of the experts convened by the Annapolis Center that climate models may never be able to make greenhouse-warming PREDICTIONS with certainty, because of the enormous number of variables involved and the uncertainty inherent in the future. On the other hand, models of greenhouse warming are essential to the learning process. Climate models can be used for making PROJECTIONS based on various assumptions, which in turn may be useful in understanding the consequences of various human activities and policy alternatives. How well such projections represent possible real climate futures is difficult to judge because of the enormous scientific uncertainties involved.

CLIMATE PROJECTIONS are “what-if" scenarios about what might happen under a set of ASSUMED conditions. Projections may change as more knowledge is acquired.

When weather forecasters make predictions one day or a week in advance, they can verify their predictions soon thereafter. Climate projections for the next century cannot be verified so easily.

Continued climate warming year after year is not likely to occur. Periods of apparent cooling, however, would not necessarily mean that the Earth was not slowly warming over the long term. Similarly, if we were to experience warming year after year, we should not assume that man-made climate change was the primary or only cause.


Though Better Understood, We Still Have A Long Way To Go

Global climate science has progressed significantly in recent years but our lack of knowledge is still great. A major vehicle for understanding the enormously complex global climate system has been computer modeling. Today's GCMs have developed rapidly relative to earlier models and provide improved estimates of what may happen in the future. Many believe that such models are still in a relatively early stage of development. Nevertheless, GCMs are important research tools that can help to focus the research and measurements needed to better understand climate change. Climate modeling will be increasingly more valuable as models and our understanding of basic processes are improved.

Models of climate changes are still evolving because we do not yet completely understand or model everything that can or will affect climate. Scientific uncertainty will always be a component of modeling climate change. Our challenge is to reduce this uncertainty.

Because of the Uncertainty, Care Must Be Used in Decision-Making

Care must be taken when using the results of climate models for major public policy decisions because of the existing uncertainty, as well as our lack of knowledge about important physical and chemical reactions in the atmosphere and oceans.

Adapt Via Act-Learn-Act

Because man-made greenhouse gas emissions are likely to continue to increase in the future, the workshop participants endorse adaptive and affordable management strategies, such as "act-learn-act," that are robust against what we do not yet know. We will surely be learning more about climate change over time. As we learn more, we must revisit greenhouse-related policies and adjust them accordingly.


UNIVERSITY OF WASHINGTON

Climate change takes on real force when it combines with human activity. It produces multiple and compounded changes of the physical environment and of ecosystems. The U.S. feels these impacts from beyond national boundaries, from the global atmosphere and ocean.

There are many points of contention: between modification of our environment and accommodation to it; between natural and human-induced climate change; within the scientific debate, between the need for prediction and the need for diagnosis. Improved observation and understanding of the current and past states of the environment (the atmosphere, ocean and land surface) may be just as important as attempts to predict its future.

As Dr. Schmitt has earlier this morning described, the ocean plays a particularly interesting role in climate: it dominates the storage of heat and carbon and water; it also contains a significant fraction of global biological activity: photosynthesis and respiration. It is a well-spring of diversity, harbors newly discovered forms of life, and in the search for natural pharmaceuticals it is richer than the land.

Large-scale oscillations of climate: El Niño/Southern Oscillation (ENSO), centered in the tropics, is an 'argument' between ocean and atmosphere that radiates across North America. With its enormous impact on temperature, rainfall, storms, flooding, and drought, there is some good news in an El Niño winter, and much bad news.

In the far northern Atlantic Ocean, the paths followed by intense storms over the ocean have moved north since the early 1970s. These storms intensify as they suck heat from the ocean. This is a part of the so-called North Atlantic Oscillation (NAO), which can switch regimes from one month to the next, or from one 30-year period to the next: it has an element of unpredictability. It is intimately related to the jet stream and polar vortex, a 'tall' mode that reaches to the stratosphere. The NAO is one of several important patterns of oscillation of the atmosphere outside of the tropics (others include the north-south 'annular' oscillation of the jet-stream system in the Southern Hemisphere, and a great wave round Antarctica that appears to be coupled between ocean and atmosphere).

In addition to its many impacts on weather, drought, and flooding, the NAO is involved in the great, deep overturning circulation of the ocean. The temperature and salinity of the oceans both condition the water's fluid density, and hence its ability to sink. It is at high latitude that the ocean is chilled by the atmosphere, and in rare and small regions water sinks to the abyss. This global system fulfills the need for heat to be transported from the warm latitudes to the cold, where it radiates to space.

Nearly horizontal layering of the oceans, with dense waters sinking beneath buoyant surface waters, is the result of this 'heat engine', and it is of great consequence to the distribution of ocean life. Photosynthetic life needs sunlight and nutrients. By controlling the upward flow of nutrients from their rich store at depth to the sunlit surface, the patterns of the ocean's up/down, north/south circulation determine the distribution of its life. This 'meridional overturning circulation' poses a severe challenge to computer models because of its small yet essential features and the complex shape of the solid Earth. While current computer models have many inaccuracies, they are increasingly being subjected to the acid test of focused, small-scale seagoing observational programs.

ENSO and NAO are examples of the possible expression of global warming in 'modes', that is, patterns of ocean and atmosphere response with warm and cold, wet and dry phases. The Titanic sank in 1912, during a cold period that encouraged icebergs to reach southward into shipping lanes. There followed two major periods of global warming this century, the 1930s–40s and 1970s–90s, which in fact correlate with phases of the NAO. These modes are good tests of computer models of climate, and indeed are the subject of intense simulation work at present.
