Eli has been thinking a bit about the
Muir Russell Inquiry into the stolen Climate Research Unit emails. One of the issues (with McIntyre and his ilk about, when has it not been?) is what sharing of data and methods is required by the act of publication. Now Eli is an OLD bunny. Maybe not quite so old that he wrote his thesis with a quill pen, but old enough that he did the drawings in India ink and lettered them with
Leroy templates. He remembers when the copying machine was a Rexograph, mimeographs being too expensive, and when you paid a couple of hundred bucks for the original and three carbon copies of your thesis. The original went to the university library, one copy to you, one copy to your adviser, and the third to University Microfilms, who, well, microfilmed it.
In those days you actually paid for reprints of your papers, because once the type was set, the offprints were cheaper than copying, and people actually sent you postcards from third world countries like England and Japan, begging for copies, which you mailed off, because you wanted to keep them coming to get the stamps to give to your kid brother, who was grateful for a nanosecond, and postage cost nothing or was paid for by the department.
There were NO data repositories, no hard discs, paper tape took a lot of room, and magnetic tape was something you oohed and aahed about. This, like all things, changed, and as it did, journal requirements changed with it. Today, Eli wandered into the depths of the library to look at some dead trees. Specifically Nature. Turns out that in
1996 Nature's requirements changed from
Nature requests authors to deposit sequence and crystallography data in the databases that exist for this purpose
those fields being the first to establish such data archives, to the current
Materials: As a condition of publication authors are required to make materials and methods used freely available to academic researchers for their own use. Supporting data sets must be made available on the publication date from the authors directly and by posting on Nature's web site, by deposition in the appropriate database or on the internet.
Most other journals have much less stringent policies, and these too have changed over time. Data retention is another area where forever is not the answer. In the
US NIH policy is
Period of retention. Data should be retained for a reasonable period of time to allow other researchers to check results or to use the data for other purposes. There is, however, no common definition of a reasonable period of time. NIH generally requires that data be retained for 3 years following the submission of the final financial report. Some government programs require retention for up to 7 years. A few universities have adopted data-retention policies that set specific time periods in the same range, that is, between 3 and 7 years. Aside from these specific guidelines, however, there is no comprehensive rule for data retention or, when called for, data destruction.
King's College London has
a flow chart for the engineering bunnies where the recommendation is seven years for funded research and four for unfunded.
All this goes to the accusations against Phil Jones and the CRU for "destroying data". It's been clearly established that the CRU was never a repository for the data from the National Meteorological Services, but there has been plenty of noise that they had an obligation to plasticize every piece of paper in the place.
Nonsense. As
Jones wrote:
No one, it seems, cares to read what we put up on the CRU web page. These people just make up motives for what we might or might not have done.
Almost all the data we have in the CRU archive is exactly the same as in the Global Historical Climatology Network (GHCN) archive used by the NOAA National Climatic Data Center [see here and here].
The original raw data are not “lost.” I could reconstruct what we had from U.S. Department of Energy reports we published in the mid-1980s. I would start with the GHCN data. I know that the effort would be a complete waste of time, though. I may get around to it some time. The documentation of what we’ve done is all in the literature.
If we have “lost” any data it is the following:
1. Station series for sites that in the 1980s we deemed then to be affected by either urban biases or by numerous site moves, that were either not correctable or not worth doing as there were other series in the region.
2. The original data for sites for which we made appropriate adjustments in the temperature data in the 1980s. We still have our adjusted data, of course, and these along with all other sites that didn’t need adjusting.
3. Since the 1980s as colleagues and National Meteorological Services (NMSs) have produced adjusted series for regions and or countries, then we replaced the data we had with the better series.
In the papers, I’ve always said that homogeneity adjustments are best produced by NMSs. A good example of this is the work by Lucie Vincent in Canada. Here we just replaced what data we had for the 200+ sites she sorted out.
The CRUTEM3 data for land look much like the GHCN and NASA Goddard Institute for Space Studies data for the same domains.
Apart from a figure in the IPCC Fourth Assessment Report (AR4) showing this, there is also this paper from Geophysical Research Letters in 2005 by Russ Vose et al. Figure 2 is similar to the AR4 plot.
I think if it hadn’t been this issue, the Competitive Enterprise Institute would have dreamt up something else!
Yes indeedy
There is a good chance that Virginia’s Attorney-General Ken Cuccinelli will come up with the “smoking gun” — where other so-called investigations have only produced one whitewash after another.
We know from the leaked e-mails of Climategate that Prof. Michael Mann was involved in the international conspiracy to “hide the decline” [in global temperatures], using what chief conspirator Dr. Phil Jones refers to as “Mike [Mann]’s trick.” Now at last we may find out just how this was done.
A lot is at stake here. If the recent warming is based on faked data, then all attempts to influence the climate by controlling the emissions of the so-called “pollutant” carbon dioxide are useless – and very costly. This includes the UN Climate Treaty, the Kyoto Protocol, the Waxman-Markey Cap & Trade (Tax) bill, the EPA “Endangerment Finding” based on the UN’s IPCC conclusion, and the upcoming Kerry-Lieberman-Graham bill in the US Senate.
There go all the windfarms, both onshore and offshore, the wasteful ethanol projects, and the hydrogen economy. Maybe Al Gore will cough up some of his ill-gotten $500 million, gained from scaring the public, from carbon trading, carbon footprints, and all the other scams.
So – good luck, Ken Cuccinelli. We are with you all the way.
S. Fred Singer, PhD
Professor Emeritus of Environmental Sciences, University of Virginia
Chairman, Virginia Scientists and Engineers for Energy and Environment