Re: [asa] Fw: Temperature Records

From: Rich Blinne <>
Date: Thu Dec 10 2009 - 00:03:18 EST

On Wed, Dec 9, 2009 at 7:56 AM, Bill Powers <> wrote:

I am in agreement with you. My cursory and perhaps naive impression is that if such creative "homogenization" is indeed required, then the whole long-trend data set needs to be discarded and we should simply confess that we don't know the long-term trend (i.e., over 100 years).


That's what GISS did. They started their data series in 1963 and did very little correction. Interestingly enough, the author complained that they got the same or a similar answer. BTW, what Ian proposed is normally done:
In creating each year’s first difference reference series, we used the five most highly correlated neighboring stations that had enough data to accurately model the candidate station.
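The quoted method can be sketched roughly as follows. This is a hypothetical illustration only, not NOAA's actual code: the station arrays, the use of Pearson correlation, and the k=5 cutoff are assumptions drawn from the one quoted sentence.

```python
import numpy as np

def first_difference_reference(candidate, neighbors, k=5):
    """Build a first-difference reference series from the k neighbor
    stations whose year-to-year differences correlate best with the
    candidate station's (sketch of the quoted method, not NOAA code)."""
    cand_fd = np.diff(candidate)                     # year-to-year changes
    corrs = [np.corrcoef(cand_fd, np.diff(s))[0, 1] for s in neighbors]
    best = np.argsort(corrs)[-k:]                    # k most correlated
    # Average the first differences of the best-matching neighbors
    return np.mean([np.diff(neighbors[i]) for i in best], axis=0)
```

The point of working in first differences is that a station move shifts the level of a series but not (except in one year) its year-to-year changes, so the reference tracks regional climate rather than local discontinuities.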

There's also an obsession with 1941, which was when the station was probably bombed. So it needed to be moved and therefore corrected. In the infamous e-mails, "hiding the decline" referred to the tree-ring data at high latitudes starting to deviate from the instrumental records after either 1960 or 1980, depending on the proxy set. Which means the tree-ring data confirmed the instrumental record BEFORE those dates. We have concordance between all the instrumental records, the computer models, and the proxies up to that date. But the evil GISS only analyzed after 1963 in order to miss the decline. Guys, you cannot have it both ways. When's the decline? Before or after 1960? But maybe there is a conspiracy amongst the data-collecting agencies. Problem is, there are also national meteorological groups. So what does Australia's Bureau of Meteorology have to say?

Here's an explanation of homogenization from the Australian Bureau of Meteorology:

Australia’s high-quality climate change datasets

Monitoring changes in Australia’s climate requires observational datasets that are not only good quality, but also homogeneous through time.

A homogeneous climate record is one in which all observed climate variations are due to the behaviour of the atmosphere, not other influences, such as changes in location, exposure of the observation site, instrumentation type or measuring procedure.

A change in the type of thermometer shelter used at many Australian observation sites in the early 20th century resulted in a sudden drop in recorded temperatures which is entirely spurious. It is for this reason that these early data are currently not used for monitoring climate change. Other common changes at Australian sites over time include location moves, construction of buildings or growth of vegetation around the observation site and, more recently, the introduction of Automatic Weather Stations.

The impacts of these changes on the data are often comparable in size to real climate variations, so they need to be removed before long-term trends are investigated. Procedures to identify and adjust for non-climatic changes in historical climate data generally involve a combination of:
investigating historical information (metadata) about the observation site,
using statistical tests to compare records from nearby locations, and
using comparison data recorded simultaneously at old and new locations, or with old and new instrument types.
"High-quality" Australian climate datasets have been developed in which homogeneity problems have been reduced or even eliminated. These datasets represent only a small fraction of the total Australian climate archive. To date, high-quality datasets have been developed for rainfall and temperature and work is underway to produce datasets for other variables including evaporation and humidity.

As to the claim that there wasn't warming at Darwin Airport, look at the following graph from the Australian Government of the high-quality data from Darwin Airport:

They found a 1.3 degree per century trend at Darwin Airport while GHCN had a 1.2 degree per century trend there. The conspiracy widens. It's not just a few guys sending e-mails or three or four organizations in cahoots; we now have the national meteorological organizations and the data centers and the computer models and the proxies all producing the same results. Sorry, but this defies credulity.
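For reference, a "degree per century" trend is just a least-squares slope rescaled from per-year to per-hundred-years. A minimal sketch, with made-up numbers rather than the actual Darwin series:

```python
import numpy as np

def trend_per_century(years, temps):
    """Least-squares linear trend in degrees C per century.
    Illustrative only; years are calendar years, temps in deg C."""
    slope, _intercept = np.polyfit(years, temps, 1)  # deg C per year
    return slope * 100.0

# Hypothetical series warming at exactly 0.012 deg C/yr -> 1.2 deg C/century
years = np.arange(1963, 2009)
temps = 0.012 * (years - 1963) + 28.0
```

Run on two independently homogenized versions of the same station, trends of 1.3 and 1.2 degrees per century are in close agreement, which is the point being made above.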

NCDC explained the purpose of the homogeneity adjustments in the following passage, which was not quoted by the web page:

A great deal of effort went into the homogeneity adjustments. Yet the effects of the homogeneity adjustments on global average temperature trends are minor (Easterling and Peterson 1995b). However, on scales of half a continent or smaller, the homogeneity adjustments can have an impact. On an individual time series, the effects of the adjustments can be enormous. These adjustments are the best we could do given the paucity of historical station history metadata on a global scale. But using an approach based on a reference series created from surrounding stations means that the adjusted station’s data is more indicative of regional climate change and less representative of local microclimatic change than an individual station not needing adjustments. Therefore, the best use for homogeneity-adjusted data is regional analyses of long-term climate trends (Easterling et al. 1996b). Though the homogeneity-adjusted data are more reliable for long-term trend analysis, the original data are also available in GHCN and may be preferred for most other uses given the higher density of the network.
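NCDC's point that adjustments can be "enormous" on an individual series yet "minor" on large-scale averages is simple arithmetic dilution. A toy illustration with made-up numbers:

```python
# Toy illustration (numbers are invented): a 2.0 deg C homogeneity
# adjustment applied to one station in a 100-station network shifts
# the network-mean temperature by only 0.02 deg C.
adjustments = [0.0] * 99 + [2.0]   # 99 untouched stations, 1 adjusted
network_shift = sum(adjustments) / len(adjustments)
print(network_shift)               # 0.02
```

So a dramatic-looking correction at Darwin moves the regional or global trend hardly at all, which is why the individual-series "yikes" doesn't carry over to the large-scale record.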

As we see above, having a large homogeneity adjustment on a single series is not a "double yikes." Note the difference of scale between the homogenized series and the actual adjustments: the anomaly is only about half a degree C in size. It's not surprising at all to have this large station-level change produce a comparatively smaller change in the anomaly. Again, after looking a little more closely at the data, there's no there there.

Rich Blinne
Member ASA

Received on Thu Dec 10 00:03:41 2009