
Summary of Reconstructing Past Climate from Noisy Data by Hans von Storch et al. (2004)

In attempting to measure anthropogenic effects on the Earth's climate, it is necessary to reconstruct past climate variations. Most studies have identified warm intervals in the 11th and 12th centuries, followed by secular cooling periods in the mid-16th, 17th, and early 19th centuries. These cooler intervals were followed by a warming that continues today. The amplitude of these preindustrial variations is debated; the most notable and most frequently cited study on the subject, Mann et al. 1998 (MBH98), as well as the Intergovernmental Panel on Climate Change (IPCC), report that these variations were of small amplitude. However, recent studies have suggested that centennial variations may have been larger than previously thought. This study uses a coupled atmosphere-ocean model simulation of the past millennium as a surrogate climate to test the reconstruction method of MBH98.

Using this model as a virtual world to determine the skill of regression-based reconstruction methods like that of MBH98, von Storch et al. found that the method reproduces short-term variations reasonably well but substantially underestimates long-term variations. On an inter-annual scale, the reconstruction has a calibration reduction-of-error statistic of 0.7 for perfect pseudo-proxies and 0.3 for pseudo-proxies with a higher degree of noise. However, only 20% of the 100-year variability is recovered when the noise level is approximately 50%. Similar results were obtained using the third Hadley Centre coupled model (HadCM3), indicating that the results are not dependent on the model used.

Von Storch et al. also tested a number of other hypotheses. They found that including more instrumental data in the proxies does not improve results, that expanding the proxy set in sparse areas improves results marginally, and that expanding the range of temperature variability present in the pseudo-proxies greatly improves results. Additionally, von Storch et al. questioned the validity of linear regression models in general for estimating climate. Using pseudo-proxies to estimate local temperatures, which were then spatially averaged to derive a Northern Hemisphere temperature, they found the same problem that occurs in MBH98: underestimation of low-frequency variability for a given amount of noise. The authors conclude that climate simulations of the past millennium are burdened by model limitations and uncertainties in external forcing, and therefore the output must be considered with care. Additionally, linear regression methods as used in MBH98 suffer from marked losses of centennial and multidecadal variations.

Summary of The M&M Critique of the MBH98 Northern Hemisphere Climate Index: Update and Implications by Stephen McIntyre and Ross McKitrick (2005a) (MM05a)

In an extension of their 2003 paper (Corrections to the Mann et al. (1998) Proxy Database and Northern Hemispheric Average Temperature Series), McIntyre and McKitrick further detail their critique of Mann et al. (1998) and respond to its subsequent update, Mann et al. (1999). In response to McIntyre and McKitrick (2003), Mann et al. published new information regarding the original research that MM03 had attempted to replicate. While the new information did not include the source code used to generate the original results, it did include an extensive archive of data and supplementary information on the methods at the University of Virginia FTP site.

In their article, M&M argue that the individual data series (proxies) used to reconstruct the temperature index are important, and that errors within these series do not get washed out in a multi-proxy study. Specifically, MM05a found that the differences between MBH98 and MM03 can be almost fully reconciled through variations in the handling of two distinct series: the Gaspe "northern treeline" series and the first principal component (PC1) from the North American proxy roster (NOAMER). In MBH98, the first four years of both of these series were extrapolated. The extrapolation has the effect of depressing early 15th century results, and was not disclosed by Mann et al. until a later paper, Mann et al. (2004). The underlying dataset that was subject to extrapolation also fails to meet the data quality standards described by Mann et al. elsewhere in the paper.

In the MBH98 methodology, Mann et al. used a principal component analysis that they reported to be conventional, or centered. However, further disclosures on the UVA FTP site revealed that the principal component analysis was not actually centered: the mean used in the calculations was the 1902-1980 mean, but it was applied over the full 1400-1980 period. The effect of de-centering the mean is a persistent "hockey stick" shaped PC1, even when the method is applied to persistent red noise. From this shape it follows that the climate of the late 20th century appears unprecedented. Because the original code is written in FORTRAN, which requires far more programming effort for statistical processing than modern software such as R, it is quite possible that this is due to a programming error, although Mann et al. have not acknowledged any such error.
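The mechanics of this de-centering effect can be illustrated with a small numerical sketch (toy data and a hypothetical two-series comparison, not the MBH98 code): when the mean of the calibration sub-period differs from the full-period mean, subtracting the sub-period mean leaves a large offset over the rest of the record, inflating the sum of squares that PCA weights.

```python
import numpy as np

years = np.arange(1400, 1981)
calib = years >= 1902                      # the 1902-1980 calibration window

# Two toy series: one flat noise, one with a late-20th-century rise.
rng = np.random.default_rng(1)
flat = rng.standard_normal(years.size)
hockey = flat.copy()
hockey[calib] += 2.0                       # add a "blade" in the calibration era

def sum_sq(x, mean):
    """Sum of squares after subtracting a given mean (what PCA weights)."""
    return np.sum((x - mean) ** 2)

# Conventional (centered) PCA subtracts the full 1400-1980 mean;
# the de-centered variant subtracts only the 1902-1980 mean.
for name, x in [("flat", flat), ("hockey", hockey)]:
    centered = sum_sq(x, x.mean())
    decentered = sum_sq(x, x[calib].mean())
    print(f"{name:6s} centered={centered:8.1f} de-centered={decentered:8.1f}")
```

For the flat series the two sums are nearly equal, but for the hockey-stick-shaped series the de-centered sum is several times larger, so a de-centered PCA gives such a series far more weight.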

In the MBH98 de-centered principal component calculation, a group of twenty primarily bristlecone pine sites governs the first principal component. Fourteen of these chronologies account for over 93% of the variance in the PC1 and 38% of the total variance. The effect is to omit the influence of the other 56 proxies in the network. In a centered version of the data, the influence of the bristlecone pines drops to the fourth principal component, where it accounts for 8% of the total variance. The MM03 results are obtained if the first two NOAMER principal components are used; the MBH98 results can be obtained if the NOAMER network is expanded to five principal components. Consequently, their conclusion about the climate of the late 20th century is contingent upon including low-order principal components that account for only 8% of the variance of one proxy roster. Furthermore, the MM03 results occur even in a de-centered PC calculation, regardless of the presence of PC4, if the bristlecone pine sites are excluded.

For the Gaspe "northern treeline" series, MM05a found that the MBH98 results occur only under three conditions: 1) the series must be used as an individual proxy; 2) it must include the early portion of the series that relies on only one or two trees; and 3) it must contain the ad hoc extrapolation of the first four years of the chronology. Under all other conditions, including using an archived version of the series without the extrapolation, MM03-type results occur.


MM05a also addresses the MBH98 claims of robustness. The sensitivity of the 15th century results to slight variations in the data and methods of two individual series shows a fundamental instability that flatly contradicts the language used in MBH98 and in Mann et al. (2000), which states that "whether we use all data, exclude tree rings, or base a reconstruction only on tree rings, has no significant effect on the form of the reconstruction for the period in question..." Additionally, MM05a notes that much of the specialist literature raises questions about these indicators, and that at the least these questions should be resolved before the two series are used as temperature proxies, much less as uniquely accurate stenographs of the world's temperature history.

In response to MM03, Mann et al. wrote several critiques that appeared in the journal Nature as letters and as separate articles. The Mann et al. (2004) paper argued that the MM03 use of centered principal component calculations amounted to an "effective omission" of the 70 sites of the North American network. However, the methodology used omits only one of the 22 series, and a calculation of this kind should be robust enough to be relatively insensitive to the removal of one series. Moreover, "effective omission" better describes the MBH98 de-centering method, in which 14 bristlecone sites account for over 99% of the explained variance.

In another response, Mann et al. claimed that the PC series are linear combinations of the proxies and as such cannot produce a trend that is not already present in the underlying data. However, the effect of de-centering the mean in PC analysis is that it preferentially selects series with hockey-stick shapes, and it is this overweighting that yields a pattern not representative of the underlying data. Additionally, Mann et al. responded to the MM03 critique of the bristlecone pines, which pointed out that the bristlecone pine has no established linear response to temperature and as such is not a reliable temperature indicator. Mann et al. replied that their indicators were linearly related to one or more instrumental training patterns, not to local temperatures. Thus, the use of the bristlecone pine series as a temperature indicator may not be valid.

The authors of MM05a concluded that the various undisclosed errors and adverse calculations exhibit the limitations of the peer review process. They also note the limited due diligence of paleoclimate journal peer review, and that it would have been prudent to check the MBH98 data and methods against the original data before the findings received the prominent endorsement of the Intergovernmental Panel on Climate Change.

Summary of Hockey sticks, principal components, and spurious significance by
Stephen McIntyre and Ross McKitrick (2005b)

In their critique of Global-scale Temperature Patterns and Climate Forcing Over the Past Six Centuries (MBH98) by Mann et al., McIntyre and McKitrick (M&M) note several errors in the methodology and the subsequent conclusions drawn by Mann et al. First, M&M discuss the incorrect usage of principal component analysis (PCA) in MBH98. A conventional PC algorithm centers the data by subtracting the column means of the underlying series. For the 1400 to 1450 data series, the FORTRAN code contains an unusual data transformation prior to the PC calculation, which was never reported in print: each tree-ring series was transformed by subtracting the 1902-1980 mean, dividing by the 1902-1980 standard deviation, and dividing again by the standard deviation of the residuals from fitting a linear trend over the 1902-1980 period. If the 1902-1980 mean is close to the 1400-1980 mean, this transformation has very little impact on the PCA. However, if the means differ, the explained variance of the series is inflated. Since PCA gives more weight to series with more explained variance, the effect is a preference for the "hockey stick" shape seen in Mann et al. This "hockey stick" shape supports the conclusion that climatic conditions in the late twentieth century are anomalous.
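The transformation described above can be sketched as follows. This is a reconstruction from the text's description, not the original FORTRAN; the function name and toy proxy series are illustrative only.

```python
import numpy as np

def short_segment_standardize(series, years, calib=(1902, 1980)):
    """Sketch of the transformation described in the text: subtract the
    calibration-period mean, divide by the calibration-period standard
    deviation, then divide again by the standard deviation of the
    residuals from a linear trend fitted over the calibration period."""
    mask = (years >= calib[0]) & (years <= calib[1])
    seg = series[mask]
    out = (series - seg.mean()) / seg.std()
    # residuals from a linear trend over the calibration window
    slope, intercept = np.polyfit(years[mask], seg, 1)
    resid = seg - (slope * years[mask] + intercept)
    return out / resid.std()

years = np.arange(1400, 1981)
rng = np.random.default_rng(2)
proxy = np.cumsum(rng.standard_normal(years.size)) * 0.1  # toy proxy series
z = short_segment_standardize(proxy, years)

# The calibration window is exactly mean-zero after the transform, but the
# full 1400-1980 mean generally is not -- the off-center offset that PCA
# then "sees" as extra variance.
mask = (years >= 1902) & (years <= 1980)
print(round(z[mask].mean(), 6), round(z.mean(), 3))
```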

M&M also ran a Monte Carlo simulation on 70 stationary proxy data series. When the transformation described above was applied, nearly every simulation yielded a first principal component (PC1) with a "hockey stick" shape; without it, the "hockey stick" shape appeared in the PC1 only 15.3% of the time. Additionally, the MBH98 method creates a PC1 dominated by bristlecone pine and foxtail pine tree-ring series (two closely related species). Of the 70 sites in the network, only 15 bristlecone and foxtail pine sites, all with data collected by one man, Donald Graybill, account for 93% of the variance in the MBH98 PC1. Without the transformation, these sites have an explained variance of less than 8%. The substantially reduced share of explained variance, coupled with the omission of virtually every species other than bristlecone and foxtail pine, argues strongly against interpreting the PC1 as the dominant component of variance in the North American network. Other articles also call into question the reliability of bristlecone pines as a temperature proxy.
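A scaled-down version of this kind of Monte Carlo experiment can be sketched as follows. The network dimensions and the crude one-sigma shape test are assumptions for illustration, not M&M's exact procedure; the idea is only to compare how often a de-centered versus a centered PC1 of pure red noise looks like a hockey stick.

```python
import numpy as np

rng = np.random.default_rng(3)

def red_noise(n, phi=0.9):
    """Persistent AR(1) 'red' noise pseudo-proxy."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

def pc1(X, rows):
    """First principal component after subtracting the column means
    computed over `rows` only (all rows = conventional centering)."""
    C = X - X[rows].mean(axis=0)
    u, s, _ = np.linalg.svd(C, full_matrices=False)
    return u[:, 0] * s[0]

def is_hockey_stick(pc, rows, thresh=1.0):
    """Crude shape test (an assumption, not M&M's exact criterion): the
    calibration-era mean departs from the rest by > thresh sigmas."""
    rest = np.ones(len(pc), bool)
    rest[rows] = False
    return abs(pc[rows].mean() - pc[rest].mean()) > thresh * pc[rest].std()

n_years, n_proxies, n_trials = 300, 30, 40
calib = np.arange(n_years - 50, n_years)          # last 50 "years"

counts = {"centered": 0, "de-centered": 0}
for _ in range(n_trials):
    X = np.column_stack([red_noise(n_years) for _ in range(n_proxies)])
    counts["centered"] += is_hockey_stick(pc1(X, slice(None)), calib)
    counts["de-centered"] += is_hockey_stick(pc1(X, calib), calib)

print(counts)  # de-centering should yield far more hockey-stick PC1s
```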

M&M also evaluated the MBH98 use of the Reduction of Error (RE) statistic in place of the more reliable and widely used Monte Carlo simulation to establish significance benchmarks. Using a Monte Carlo simulation, M&M found that a more accurate significance level for the MBH98 procedure is 0.59, as opposed to the level of 0.0 reported in the original study. A guard against spurious RE significance is to examine other statistics, such as the R2 and CE statistics; however, MBH98 did not report any additional statistics for the controversial 15th century period. The M&M calculations indicate that these values for the 15th century section of the temperature reconstruction are not significant, thereby refuting the conclusions made by MBH98.
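The RE statistic itself is simple to compute; a minimal sketch follows, with made-up verification numbers rather than any MBH98 data. RE compares the reconstruction's squared error against that of a naive forecast that always predicts the calibration-period mean, which is why its null distribution (and hence the significance benchmark) must come from somewhere, e.g. Monte Carlo simulation.

```python
import numpy as np

def reduction_of_error(obs, pred, calib_mean):
    """Reduction of Error (RE): 1 minus the ratio of the reconstruction's
    squared error to the squared error of a naive forecast that always
    predicts the calibration-period mean. RE = 1 is perfect agreement;
    RE <= 0 means no skill over the naive benchmark."""
    sse = np.sum((obs - pred) ** 2)
    sse_naive = np.sum((obs - calib_mean) ** 2)
    return 1.0 - sse / sse_naive

# Toy verification-period check (illustrative numbers only):
obs = np.array([0.1, -0.2, 0.0, 0.3, -0.1])
pred = np.array([0.05, -0.15, 0.1, 0.25, 0.0])
print(round(reduction_of_error(obs, pred, calib_mean=0.0), 3))  # 0.817
```

Note that a "significant" RE threshold of 0.0 corresponds to merely beating the naive mean forecast; the M&M point is that red-noise inputs run through the same procedure routinely clear that bar.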

Summary of Highly Variable Northern Hemisphere Temperatures Reconstructed from Low- and High-Resolution Proxy Data by Anders Moberg et al. (2005)

In their study, Moberg et al. reconstruct a climate history for the past 2,000 years using low-resolution proxies (proxies that provide climate information at multi-centennial timescales, such as ocean sediment cores) and high-resolution proxies (proxies that provide climate information on a decadal scale, such as tree rings). Due to the high profile of high-resolution proxies in reconstructions, most notably Mann et al. 1998, views have been expressed that only tree-ring and other high-resolution data are useful for quantitative large-scale temperature reconstructions. However, tree-ring data have a well-documented unreliability in reproducing multi-centennial temperature variability. By using low-resolution data for multi-centennial information combined with high-resolution data for decadal information, the most unreliable timescales of each proxy type can be avoided.

The dataset used for this study was limited, since proxies were required that dated back 2,000 years. Seven tree-ring series and eleven low-resolution proxy series were used. To obtain a reconstruction covering the complete range of timescales, Moberg et al. applied a wavelet transform so that tree-ring records contribute only to timescales shorter than 80 years and low-resolution proxies contribute only to longer timescales. To calibrate the reconstruction, its mean value and variance were adjusted to agree with the instrumental record of Northern Hemisphere annual mean temperatures over the overlapping period 1856-1979.
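The split-and-recombine scheme can be sketched as follows, substituting a simple running mean for the paper's wavelet transform (an assumption for illustration only; the toy proxy series and the 124-year instrumental overlap are likewise hypothetical stand-ins):

```python
import numpy as np

def lowpass(x, window=81):
    """Centered moving average as a stand-in for the wavelet low-pass
    used by Moberg et al. (the paper uses a wavelet transform, not a
    running mean; this only illustrates the idea of a ~80-year split)."""
    kernel = np.ones(window) / window
    padded = np.pad(x, window // 2, mode="edge")
    return np.convolve(padded, kernel, mode="valid")

rng = np.random.default_rng(4)
n = 2000
tree_rings = rng.standard_normal(n)            # toy high-resolution proxy
sediments = np.cumsum(rng.standard_normal(n))  # toy low-resolution proxy

# Keep only sub-80-year variability from the tree rings and only the
# longer timescales from the low-resolution record, then combine:
high_band = tree_rings - lowpass(tree_rings)
low_band = lowpass(sediments)
reconstruction = high_band + low_band

# Calibrate by variance scaling: match the mean and variance of a toy
# "instrumental" record over the final 124 "years" of overlap.
instrumental = rng.standard_normal(124)
overlap = reconstruction[-124:]
calibrated = (reconstruction - overlap.mean()) / overlap.std()
calibrated = calibrated * instrumental.std() + instrumental.mean()
```

The design point is that each proxy type contributes only on the timescales where it is considered reliable, and the final series is anchored to the instrumental record by scaling rather than regression.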

The reconstruction indicates two warm peaks around A.D. 1000 and 1100 and pronounced cold periods in the 16th and 17th centuries. The medieval peaks are comparable to those of the 20th century, although the warmth seen after 1990 appears to be unprecedented. Reconstructions of the temporal evolution of external forcings (volcanic aerosols, solar irradiance, and greenhouse gases) have been used to drive simple energy-balance climate models as well as fully coupled atmosphere-ocean general circulation models. Moberg et al. note that the Northern Hemisphere temperature series obtained from such an experiment with the coupled model ECHO-G bears a strong qualitative similarity to their reconstruction. This supports the case for pronounced hemispheric low-frequency temperature variability resulting from the climate's response to natural changes in radiative forcing.

There are notable differences between the Moberg et al. reconstruction and that of Mann et al. 1998. While the two reconstructions share a large amount of data, Mann et al. combined tree-ring data with decadally resolved proxies without any separate treatment of different timescales. Additionally, this study's dataset contains centennially resolved data from the oceans, while Mann et al. used only annually or decadally resolved data from continents or near-coastal locations. Mann et al. also used a different calibration method (regression, versus the variance scaling used in this study).

Further study of the weighting of different timescales and of the spatial representation of the data should be conducted to determine which method most accurately depicts past climate.
