Climate Experts, Statisticians, Programmers Meet in England on Temperature Records

An international group of 80 climate scientists, statisticians, and computer programmers recently gathered in Exeter, England, to discuss how to expand and improve surface temperature data (also see earlier Forum article).

The conference, titled “Creating Surface Temperature Datasets to Meet 21st Century Challenges,” took place September 7-9 and brought together researchers from NOAA’s National Climatic Data Center (NCDC), the UK Met Office Hadley Centre, and the World Meteorological Organization, as well as independent scientists and other interested parties. It focused on creating a plan to assemble all the world’s temperature records into a new database that would be fully and transparently documented and freely available to the public.

The conference was notable for including a diverse range of researchers from different fields, along with some prominent climate “skeptics.” In addition to the climate scientists from NCDC and the Met Office, for instance, there were a number of statisticians (like Ian Jolliffe), computer programmers (including Nick Barnes and Steve Easterbrook), the Google.org team, and climate scientist and “skeptic” John Christy of the University of Alabama in Huntsville.

In Part a Response to Hacked E-mail Concerns

Several conference participants indicated that the event was in part a response to criticism of the availability of surface temperature datasets following the release of University of East Anglia Climatic Research Unit e-mails in the fall of 2009. They say much of the focus was on consolidating the various temperature datasets available through different organizations and creating a system by which data in all forms (raw and adjusted) are freely available and transparently documented. There was also an emphasis on making the adjustments that correct for discontinuities in the data more transparent, and on developing validation exercises to test the ability of different adjustment algorithms to detect and correct for different types of data problems (e.g., station moves, microsite changes, instrument changes, time of observation changes, etc.).
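By way of illustration only, such a validation exercise might inject a known break into a synthetic series and test whether an algorithm recovers it. The sketch below is a minimal Python example with an arbitrary detector, window, and break size; it is not anything presented at the conference:

```python
import numpy as np

def find_step(series, window=24):
    """Scan for the single largest mean shift in a monthly series.

    Compares the mean of the `window` months before each point with the
    mean of the `window` months after it. A toy detector, far simpler
    than operational homogenization algorithms.
    """
    best_idx, best_shift = None, 0.0
    for i in range(window, len(series) - window):
        shift = series[i:i + window].mean() - series[i - window:i].mean()
        if abs(shift) > abs(best_shift):
            best_idx, best_shift = i, shift
    return best_idx, best_shift

rng = np.random.default_rng(42)
series = rng.normal(0.0, 0.5, 600)   # 50 years of synthetic monthly anomalies
series[300:] -= 1.0                  # inject a known break, e.g. a station move

idx, shift = find_step(series)
print(f"break injected at 300, detected near {idx} (shift {shift:+.2f} C)")
```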

The only global climate data currently readily available is in the form of monthly mean values. Daily and sub-daily (e.g., hourly) temperature data exist, but are generally not assembled in formats useful for researchers. In addition, much historical data has been gathered but not yet digitized.

Millions of Pages Scanned … With No Place to Go

NOAA’s NCDC alone has more than 56 million pages of temperature data scanned but not yet keyed into a database, an effort the agency has not had adequate resources to pursue on its own. The conference included extensive discussion of the possibility of “crowdsourcing” the digitization of historical data: harnessing thousands of volunteers to examine scanned temperature logs and key in the values. Assembling that data in a central, well-documented, and easily accessible database would enable researchers and concerned citizens to readily analyze all available climate data.
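One way such crowdsourced keying is commonly quality-controlled in citizen-science projects (an illustration here, not a plan announced at the conference) is to have several volunteers transcribe the same entry independently and accept a value only when enough of them agree:

```python
from collections import Counter

def consensus(keyed_values, min_agree=2):
    """Accept a transcribed value only when enough volunteers agree.

    keyed_values: independent transcriptions of one logbook entry.
    Returns the agreed value, or None to flag the entry for expert review.
    """
    value, count = Counter(keyed_values).most_common(1)[0]
    return value if count >= min_agree else None

print(consensus(["21.4", "21.4", "27.4"]))  # -> "21.4"
print(consensus(["21.4", "27.4"]))          # -> None (no agreement, needs review)
```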

Conference attendees interviewed for this report say there was also a focus on improving station metadata: information that describes the station location, sensor type, history of moves or other changes, site characteristics/urbanity, and other factors. Even such basic information as a station’s latitude and longitude coordinates is sometimes inaccurate, making it difficult to use remote sensing and other high-resolution spatial databases to classify site characteristics. Collecting station histories, using GPS devices to pinpoint instrument locations, and tying into remote sensing tools like Google Earth could all help improve the quality of station information, conference experts said.
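The kind of metadata at issue might be captured in a record along the lines of the sketch below; the field names are hypothetical and do not correspond to any actual NCDC or GHCN format:

```python
from dataclasses import dataclass, field

@dataclass
class StationMetadata:
    """Hypothetical station metadata record (illustrative fields only)."""
    station_id: str
    name: str
    latitude: float           # ideally GPS-verified, decimal degrees
    longitude: float
    elevation_m: float
    sensor_type: str          # e.g. liquid-in-glass vs. electronic (MMTS)
    site_class: str           # e.g. "urban", "suburban", "rural"
    moves: list = field(default_factory=list)   # (date, note) relocation history
```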

NCDC used the Exeter conference to introduce the newest update to its Global Historical Climatology Network (GHCN). GHCN version 3 includes a new algorithm that automatically detects station moves and other factors creating unusual jumps or drops in temperature by comparing each station to its nearby neighbors. The net result is a record quite similar to that of GHCN version 2 globally, though some regions show larger changes than others.
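The core of the neighbor-comparison idea can be sketched briefly: subtracting a composite of nearby stations removes the regional climate signal they share, so a step that survives in the difference series points to a local artifact at the station itself. The toy example below uses synthetic data and arbitrary magnitudes and is not GHCN’s actual algorithm:

```python
import numpy as np

def difference_series(target, neighbors):
    """Target station minus the mean of its neighbors, month by month.

    The shared regional climate cancels out; a step left in the result
    suggests a local artifact (station move, new instrument, ...).
    """
    return target - neighbors.mean(axis=0)

rng = np.random.default_rng(0)
regional = np.cumsum(rng.normal(0, 0.1, 600))      # shared climate signal
target = regional + rng.normal(0, 0.3, 600)
target[250:] += 0.8                                # artifact at the target station
neighbors = np.array([regional + rng.normal(0, 0.3, 600) for _ in range(5)])

diff = difference_series(target, neighbors)
# The regional signal is gone; the 0.8 C step stands out around index 250.
print(f"before: {diff[:250].mean():+.2f}  after: {diff[250:].mean():+.2f}")
```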


Finally, some attendees pushed to set up a global Climate Reference Network (CRN) of well-sited, well-distributed stations specifically selected to track global climate changes going forward, similar to the U.S. CRN set up over the past decade. Temperature stations were originally set up to measure weather rather than long-term changes over time, and creating climate records from weather stations presents many difficulties. A dedicated network of stations with excellent siting and record-keeping could be a huge benefit to future generations of climate researchers, experts said.

The conference also set up a website and blog to provide updates and seek feedback on the effort going forward.

Zeke Hausfather

Zeke Hausfather, a data scientist with extensive experience in the Silicon Valley clean technology sector, is currently a Senior Researcher with Berkeley Earth. He is a regular contributor to Yale Climate Connections (E-mail: zeke@yaleclimateconnections.org, Twitter: @hausfath).

5 Responses to Climate Experts, Statisticians, Programmers Meet in England on Temperature Records

  1. fred ohr says:

    The inaccuracies in the database are so profound that they render any “science” derived from it unreliable and unusable. Yet, peewee-minded climatologists tell us that the planet is on a path to climatic destruction. Har, Har, Har.

    Furthermore, the databases have been manipulated by unethical scientists with a political agenda; to wit: the southward migration of database thermometers, the drop in average altitude of same, and the mysterious “adjustments” of older temperature data, always downward to enhance the “hockey stick” critical to the alarmist storyline. I could go on, but why bother.
    Sensible adults have already decided: AGW theory is a crap sandwich made by men with pencils.

  2. Zeke Hausfather says:

    Fred,

    I’d suggest this page as a good resource for various questions about the surface temperature record; there are a lot of misconceptions floating around out there:

    http://rhinohide.wordpress.com/faq/

  3. Troy_CA says:

    Zeke,

    You mentioned previously that you were attempting to track down pre-1990 U.S. population data for comparing population trends against temperature trends by station.

    I’ve done some of this analysis for the years 1970-2000, matching census population data to USHCN stations:

    http://troyca.wordpress.com/2010/11/08/a-more-robust-method-of-finding-a-uhi-signal/

    Included in the package is population data for the years 1970, 1980, 1990, and 2000, matched to a good chunk of the USHCN stations, if you’re still interested.

  4. Zeke Hausfather says:

    Troy,

    Interesting work. There is a lot of noise, but definitely some signal in there. This might be a useful resource for you to extend your analysis a bit further back: http://www.ncdc.noaa.gov/oa/climate/research/population/popdata.html

    I’m working on a paper that uses this data (among other sources) to classify stations as urban or rural based on population growth, and to compare the resulting station pairs to try to detect any differences.
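    For what it’s worth, the core of the pairing idea is just comparing trends across matched urban and rural stations. A toy sketch with synthetic data, arbitrary magnitudes, and hypothetical names:

    ```python
    import numpy as np

    def trend(series):
        """Least-squares slope of an annual anomaly series (degrees per year)."""
        years = np.arange(len(series))
        return np.polyfit(years, series, 1)[0]

    # Synthetic annual anomalies for one hypothetical urban/rural pair.
    rng = np.random.default_rng(1)
    shared = np.cumsum(rng.normal(0, 0.05, 50))    # regional climate both see
    rural = shared + rng.normal(0, 0.1, 50)
    urban = shared + np.linspace(0, 0.3, 50) + rng.normal(0, 0.1, 50)  # UHI drift

    # A consistently positive urban-minus-rural trend difference across
    # many pairs would hint at an urban heating signal.
    print(f"urban-rural trend difference: {trend(urban) - trend(rural):+.4f} C/yr")
    ```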

  5. Troy_CA says:

    Thanks for the link to that data, Zeke; it was a good format to work with. I posted some results using this data back to 1930 at http://troyca.wordpress.com/2010/11/12/amrmo-finding-a-uhi-signal-p3-using-noaa-population-data/

    The results are less impressive than in previous tests, though, even for the same period. I’m a bit curious why the NHGIS and NOAA population data yield such different results, especially for the 1970-2000 period.

    Your paper sounds interesting, and definitely a topic I’ve been curious about as well. Any estimate on when you might submit or post a working copy?