An international group of 80 climate scientists, statisticians, and computer programmers recently gathered in Exeter, England, to discuss how to expand and improve surface temperature data (also see earlier Forum article).

The conference, titled “Creating Surface Temperature Datasets to Meet 21st Century Challenges,” took place September 7-9 and brought together researchers from NOAA’s National Climatic Data Center (NCDC), the UK Met Office Hadley Centre, and the World Meteorological Organization, along with independent scientists and other interested parties. It focused on creating a plan to assemble all the world’s temperature records into a new database that would be fully and transparently documented and freely available to the public.

The conference was particularly notable in that it included a diverse range of researchers from different fields, along with some notable climate “skeptics.” In addition to the climate scientists from NCDC and the Met Office, for instance, there were a number of statisticians (like Ian Jolliffe), computer programmers (including Nick Barnes and Steve Easterbrook), the Google.org team, and climate scientist and “skeptic” John Christy of the University of Alabama in Huntsville.

In Part, a Response to Concerns Over Hacked E-Mails

Several conference participants indicated that the meeting was in part a response to criticism of the availability of surface temperature datasets following the release of University of East Anglia Climatic Research Unit e-mails in the fall of 2009. They say much of the focus was on consolidating the various temperature datasets available through different organizations, and on creating a system by which data in all forms (raw and adjusted) are freely available and transparently documented. There was also an emphasis on making the adjustments that correct for discontinuities in the data more transparent, and on providing validation exercises to test how well different adjustment algorithms detect and correct various types of data problems (e.g., station moves, microsite changes, instrument changes, and changes in time of observation).

The only global climate data currently readily available come in the form of monthly mean values. Daily and sub-daily (e.g., hourly) temperature data exist, but are generally not assembled in formats useful for researchers. In addition, much historical data has been gathered but not yet digitized.

Millions of Pages Scanned … With No Place to Go

NOAA’s NCDC alone has more than 56 million pages of temperature data scanned but not yet keyed into a database, an effort the agency has not had adequate resources to pursue on its own. The conference included extensive discussion on the possibility of “crowdsourcing” the digitization of historical data: harnessing thousands of volunteers to examine scanned temperature logs and key in the values. Assembling that data in a central well-documented and easily-accessible database would enable researchers and concerned citizens to readily analyze all available climate data.

Conference attendees interviewed for this report say there was also a focus on improving station metadata: information that describes the station location, sensor type, history of moves or other changes, site characteristics/urbanity, and other factors. Even such basic information as station latitude and longitude coordinates is sometimes inaccurate, making it difficult to use remote sensing and other high-resolution spatial databases to classify site characteristics. Collecting station histories, using GPS devices to pinpoint instrument locations, and tying into remote sensing tools like Google Earth could all help improve the quality of station information, conference experts said.

NCDC used the Exeter conference to introduce its newest update to the Global Historical Climatology Network (GHCN). GHCN version 3 includes a new algorithm that automatically detects station moves and other factors that create unusual jumps or drops in temperature by comparing each station to its nearby neighbors. The net result is a record quite similar to that of GHCN version 2 globally, though some regions show larger changes than others.
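The intuition behind neighbor comparison can be sketched in a few lines of code. This is a toy illustration of the general idea only, not NCDC’s actual pairwise homogenization algorithm; the station series, the breakpoint location, and the simple mean-shift test are all invented for the example. Subtracting a neighbor’s record from a target station’s record cancels the climate signal the two stations share, so an artificial shift (such as a station move) stands out in the difference series:

```python
# Toy sketch of neighbor-based breakpoint detection
# (a simplified illustration, NOT NCDC's actual algorithm).

def difference_series(target, neighbor):
    """Target minus neighbor; the shared regional climate
    signal cancels, leaving station-specific shifts."""
    return [t - n for t, n in zip(target, neighbor)]

def detect_breakpoint(diff, min_seg=12):
    """Return the index that best splits the difference series
    into two segments with different means, plus the shift size."""
    best_idx, best_shift = None, 0.0
    for i in range(min_seg, len(diff) - min_seg):
        left = sum(diff[:i]) / i
        right = sum(diff[i:]) / (len(diff) - i)
        if abs(right - left) > best_shift:
            best_idx, best_shift = i, abs(right - left)
    return best_idx, best_shift

# Synthetic example: both stations share the same warming trend,
# but the target jumps by 1.0 degC at month 120 (e.g. a station move).
neighbor = [0.01 * m for m in range(240)]
target = [v + (1.0 if m >= 120 else 0.0) for m, v in enumerate(neighbor)]

idx, shift = detect_breakpoint(difference_series(target, neighbor))
print(idx, round(shift, 2))  # prints: 120 1.0
```

The shared trend disappears entirely in the difference series, which is why the jump at month 120 is so easy to locate here; real station pairs are noisier, which is why operational algorithms use statistical significance tests and many neighbor pairs per station.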



Finally, there was a push among some attendees to set up a global Climate Reference Network (CRN) of well-sited and distributed stations specifically selected to track global climate changes going forward, similar to the U.S. CRN set up over the last decade. Temperature stations originally had been set up for measuring weather rather than long-term changes over time, and creating climate records from weather stations presents many difficulties. Having a dedicated network of stations with excellent siting and record-keeping could provide a huge benefit to future generations of climate researchers, experts said.

The conference also set up a website and blog to provide updates and seek feedback on the effort going forward.
