Saturday, October 31, 2009

Freak this poll

Tom Fuller has put up a push poll. Here is the you-are-a-pigeon question:

Which, if any, of the following statements comes closest to capturing your attitudes and opinions about global warming? (We'll give you a chance to amplify in your own words later--but I need to pigeonhole--umm, stereotype--umm, put you in a 'box' if at all possible. If necessary, just pick the least objectionable statement, or indicate that you prefer not to say.)

O I believe global warming is the crisis of this generation, and should be the highest priority for policy makers right now.

O I think global warming is undoubtedly real and a serious problem, but I think it has been 'overplayed' by the press, politicians and some organisations.

O It looks to me like global warming probably has a grain of truth in it, but it's almost certainly not as bad as it has been made out to be.

O I believe global warming is true, but not man-made.

O I don't believe global warming is true. I think natural forces account for the changes in climate and there's no need to look at human contributions--which in any event have not been proven.

O This issue is not even at the top of my radar screen. I don't pay much attention to global warming or climate change, it doesn't influence how I live, how I spend my money, who I vote for--I don't really pay too much attention to this.

O I prefer not to say.
Note the airspace between the first and the second choice and the fine gradations between the rest. Tom is also pushing the Superfreaks and the Breakdown Institute pretty hard as a charter member of the Roger Pielke Jr. Climate Blogging Delay and Bad Science Society***. Who could have guessed.

*** Acolytes include the rebuilt mothership, Kloor-a-Mole, the Fuller nuts man and My Way featuring RP Sr. and the emeriti. There is poll pushing going on in blog city, friends; this is featured at the Roger and the older Roger. Freak that poll!!

And in case you wonder why Eli writes for free

For those who don't play in this sandbox: Fuller's first statement
O I believe global warming is the crisis of this generation, and should be the highest priority for policy makers right now.
is the extreme. There are many gradations between this and the next
O I think global warming is undoubtedly real and a serious problem, but I think it has been 'overplayed' by the press, politicians and some organisations.
such as

O I think global warming is undoubtedly real and a serious problem that requires immediate and serious action on the national and international levels. It is imperative to cut back emissions significantly as soon as possible.


O I think global warming is undoubtedly real and a serious problem. We must immediately slow the growth of emissions.

You get the point.

Friday, October 30, 2009

Mike Powell returns

Mike Powell wrote a rather detailed dissection of the manuscript used to shill the second OISM petition project. Mike's analysis was originally housed on a newspaper forum web site, but has disappeared. Mike has graciously agreed to let his work appear on Rabett Run till, in the nature of things, it has to move again. Mike's work is linked and cited in many places on the web, so if you come across any of those links, let the site owners know that the paper has relocated.

Eli has added the links and figures referred to in the analysis. One of the issues in translating the paper to web format is that "1oC" did not mean 10 C but 1 oC. Hopefully all of those were corrected.

UPDATE: Another critical review by Michael McCracken is available


Critical Review of Robinson, Robinson, and Soon’s “Environmental Effects of Increased Atmospheric Carbon Dioxide”

prepared by Mike Powell (December 20)

This document is a critical review of the first four pages of “Environmental Effects of Increased Atmospheric Carbon Dioxide,” by Arthur B. Robinson, Noah E. Robinson, and Willie Soon, which was published in the Journal of American Physicians and Surgeons in 2007 (12:79-90). Mike Powell (Kennewick, Washington) prepared this review in December 2007. The review is organized in a point-by-point format starting with the second paragraph of Robinson et al. and working forward through their paper. Quoted text below is from Robinson et al. unless otherwise noted.


1. first page, 2nd paragraph. “When we reviewed this subject in 1998 (1, 2), existing satellite records were short and were centered on a period of changing intermediate temperature trends.” It’s not entirely clear what they mean by “changing intermediate temperature trends,” but it’s worth noting that their previous paper devotes a considerable amount of attention to the old Spencer and Christy MSU data that did not show a warming trend (e.g., Figures 6 through 8 in that paper). Now that some errors in the Spencer/Christy analysis have been fixed, the satellite data do show warming. It appears as though the Robinson et al. (2007) appeal to “changing intermediate temperature trends” is an attempt to avoid direct mention of the fact that one of the principal arguments they made in 1998 has since proved to be false.

2. 3rd paragraph and Figure 1. The temperature data shown in Figure 1 are for the Sargasso Sea – not the “average temperature of the Earth”. Note that the Sargasso Sea area of 2 million sq. miles amounts to only 1% [Corrected from 0.0011% thanks to Anon in the Comments] of the Earth’s surface area. That’s a rather small fraction and it’s unlikely that temperature variations in such a small region provide a direct measure of the magnitude of global temperature variations.
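That one-percent figure is easy to verify. Here is a minimal back-of-envelope sketch; the value used for Earth's total surface area (roughly 197 million square miles) is a standard round number, not taken from the original post:

```python
# Back-of-envelope check of the Sargasso Sea's share of Earth's surface.
earth_area_sq_mi = 196.9e6     # Earth's total surface area, ~197 million sq. miles
sargasso_area_sq_mi = 2.0e6    # area quoted in Robinson et al.'s Figure 1 caption

fraction = sargasso_area_sq_mi / earth_area_sq_mi
print(f"Sargasso Sea fraction of Earth's surface: {fraction:.1%}")  # prints ~1.0%
```

which confirms the corrected value of about 1%, and incidentally shows how far off the original 0.0011% was.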


Figure 1: Surface temperatures in the Sargasso Sea, a 2 million square mile region of the Atlantic Ocean, with time resolution of 50 to 100 years and ending in 1975, as determined by isotope ratios of marine organism remains in sediment at the bottom of the sea (3). The horizontal line is the average temperature for this 3,000-year period. The Little Ice Age and Medieval Climate Optimum were naturally occurring, extended intervals of climate departures from the mean. A value of 0.25 °C, which is the change in Sargasso Sea temperature between 1975 and 2006, has been added to the 1975 data in order to provide a 2006 temperature value.
The data at the left-most end of the plot (showing a rise from 24 to 25 oC) do not appear in their earlier paper (Robinson et al. 1998), nor do they appear in the Keigwin (1996) paper they reference for this figure. It’s not clear why these additional data were added to the figure.

Keigwin notes that the temperature variability seen in the plot (Figure 4B in his paper) reflects both sea surface temperature (SST) and salinity variations. He estimates that only about 2/3 of the apparent variation is due to SST changes. This means that the magnitude of the variations shown in the graph is probably about one third less than plotted.
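As a quick sanity check on that caveat, the arithmetic is trivial (a minimal sketch using only the numbers quoted from Keigwin):

```python
# Keigwin's caveat: only ~2/3 of the apparent Sargasso Sea proxy variation
# reflects sea surface temperature; the rest is a salinity artifact.
apparent_range_c = 3.0          # full apparent variation in the proxy record, oC
sst_fraction = 2.0 / 3.0        # Keigwin's estimate of the SST share

sst_range_c = apparent_range_c * sst_fraction
print(f"SST-only variation: about {sst_range_c:.0f} oC")  # prints ~2 oC
```

about 2 oC, which is the figure that comes up again in point 9 below.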

Keigwin’s paper is not an attempt to reconstruct global temperatures or even northern hemisphere temperatures. For global temperature reconstructions, we should look at any one of about a dozen temperature reconstructions published since the Keigwin paper. See Figure 6.10 in the IPCC WG1 report and this graph from Wikipedia:

Reconstructions of Northern Hemisphere temperatures for the last 1,000 years according to various older articles (bluish lines), newer articles (reddish lines), and instrumental record (black line)
3. first page, 4th paragraph and Figure 2. There are two significant problems with Figure 2 and the related text. First, the figure shows the yearly consumption rates of fossil fuels rather than the cumulative quantity consumed, which would be more representative of the greenhouse-gas forcing. In addition, a linear trend in glacier shortening may actually imply a non-linear trend in temperatures – as glaciers recede, the terminus usually moves upslope to higher altitudes (which usually have lower temperatures). If Robinson et al. (2007) were being honest about the science, they would be plotting global temperature instead of “glacier shortening.”
Figure 2: Average length of 169 glaciers from 1700 to 2000 (4). The principal source of melt energy is solar radiation. Variations in glacier mass and length are primarily due to temperature and precipitation (5,6). This melting trend lags the temperature increase by about 20 years, so it predates the 6-fold increase in hydrocarbon use (7) even more than shown in the figure. Hydrocarbon use could not have caused this shortening trend.

first page, 4th paragraph. “Shortening lags temperature by about 20 years, so the current warming trend began in about 1800.” No reference is given for this statement and it’s not clear why this should be true.

4. first page, 5th paragraph and Figure 3. There are several problems with this graph. Why is only the Arctic temperature plotted along with total solar irradiance (TSI)? Shouldn’t the global temperature be used instead? And on what basis is the TSI scale adjusted to create the apparent match between Arctic temperatures and TSI? It appears as though the scales were adjusted “by eye” and there is not a physical basis for selection of the axis scales.
Figure 3: Arctic surface air temperature compared with total solar irradiance as measured by sun spot cycle amplitude, sunspot cycle length, solar equatorial rotation rate, fraction of penumbral spots, and decay rate of the 11-year sunspot cycle (8,9). Solar irradiance correlates well with Arctic temperature, while hydrocarbon use (7) does not correlate.

Also, again we have “hydrocarbon use” plotted rather than cumulative carbon emissions. Finally, note that TSI data after the year 2000 are not included. Perhaps this is because the TSI went down while Arctic temperatures continued to increase. Basically there has been no secular change in TSI since 1980, but global temperatures have risen significantly (see, for example, the pmod TSI reconstruction).

Also, the source of the TSI data is apparently one of the authors (Soon) rather than one of the more widely accepted TSI reconstructions. See Figure 2.17 in the IPCC WG1, which provides the TSI reconstructions of Lean (2000) and Wang et al. (2005). These more accepted reconstructions show considerably less 20th century variation than does Soon’s reconstruction.
Figure 2.17. Reconstructions of the total solar irradiance time series starting as early as 1600. The upper envelope of the shaded regions shows irradiance variations arising from the 11-year activity cycle. The lower envelope is the total irradiance reconstructed by Lean (2000), in which the long-term trend was inferred from brightness changes in Sun-like stars. In comparison, the recent reconstruction of Y. Wang et al. (2005) is based on solar considerations alone, using a flux transport model to simulate the long-term evolution of the closed flux that generates bright faculae.

5. second page, Figure 4. Again, why not use global temperatures? Why are the U.S. temperatures of so much interest here?

Figure 4: Annual mean surface temperatures in the contiguous United States between 1880 and 2006 (10). The slope of the least-squares trend line for this 127-year record is 0.5 ºC per century.

6. second page, first paragraph. The claim is made that Figure 1 (see above) is “illustrative of most geographical locations,” but no reference is given. Further, there have been about a dozen groups that have created long-term temperature proxies reconstructing the northern hemisphere temperatures and the variability observed is not as great as shown in Figure 1 which applies only to the Sargasso Sea. See Figure 6.10 in the IPCC WG1 report

Figure 6.10. Records of NH temperature variation during the last 1.3 kyr. (a) Annual mean instrumental temperature records, identified in Table 6.1. (b) Reconstructions using multiple climate proxy records, identified in Table 6.1, including three records (JBB..1998, MBH..1999 and BOS..2001) shown in the TAR, and the HadCRUT2v instrumental temperature record in black. (c) Overlap of the published multi-decadal time scale uncertainty ranges of all temperature reconstructions identified in Table 6.1 (except for RMO..2005 and PS2004), with temperatures within ±1 standard error (SE) of a reconstruction ‘scoring’ 10%, and regions within the 5 to 95% range ‘scoring’ 5% (the maximum 100% is obtained only for temperatures that fall within ±1 SE of all 10 reconstructions). The HadCRUT2v instrumental temperature record is shown in black. All series have been smoothed with a Gaussian-weighted filter to remove fluctuations on time scales less than 30 years; smoothed values are obtained up to both ends of each record by extending the records with the mean of the adjacent existing values. All temperatures represent anomalies (°C) from the 1961 to 1990 mean. (click to enlarge)

7. second page, first paragraph. The last sentence claims that the “Medieval Climate Optimum” was about 1 oC hotter than the current temperature. To support this claim, a paper in Energy and Environment is cited. This is a “science advocacy” journal that, like JPANDS, exists principally to publish “skeptical” research that can’t make it through peer review in the mainstream science journals. The claim that the Medieval Warm Period (MWP) was warmer than present cannot be supported by the science. See the NAS report examining Mann et al.’s “hockey stick” work.

8. second page 2nd paragraph and Figure 5. “Recovery” from the “Little Ice Age” is not a concept that has any scientific basis. It presumes there is some temperature that the Earth “likes” to be at and, therefore, “recovers” from periods that are too hot or too cold. In reality, changes in global temperature are driven by changes in various cooling and heating effects. In this instance, changes in solar activity and volcanism resulted in increased temperatures in the early 20th century. See Figure 2.18 in the IPCC WG1 report, which shows reduced volcanism in the early 20th century compared with the 19th century.
Figure 2.18. Visible (wavelength 0.55 μm) optical depth estimates of stratospheric sulphate aerosols formed in the aftermath of explosive volcanic eruptions that occurred between 1860 and 2000. Results are shown from two different data sets that have been used in recent climate model integrations. Note that the Ammann et al. (2003) data begins in 1890.

And again, we have the U.S. temperatures compared with TSI rather than global temperatures. Why is that? Note that in Figure 5, the data for the more recent reduction in TSI (as part of the solar cycle) are included.
Figure 5: U.S. surface temperature from Figure 4 as compared with total solar irradiance (19) from Figure 3.
Most interesting, though, is the fact that the “scaling factor” applied to the temperature axis on the graph is different from the factor applied in Figure 3. In both cases, it appears the temperature axis scale was adjusted to obtain the best-looking match between the temperature data and the TSI data. Nowhere is there any discussion of how the appropriate scale factor was determined (I suspect it was simply “by eye”), nor is there any admission of the fact they are using a different scaling relationship in Figure 5 than they are in Figure 3. This isn’t a trivial difference in scale factors – the factor used in Figure 5 is more than 40% greater than the factor used in Figure 3.

9. second page, 2nd paragraph and Figure 6. The temperature change in the U.S. over the past century is compared with other temperature variations on Earth to make the observed temperature increase appear inconsequential. First, they should be using the value for increase in global temperature, which is about 1 oC over the past century rather than 0.5oC, which they get from the U.S.-only data. Second, they compare again to the reported 3oC variation in Sargasso Sea temperature from Keigwin (1996) without noting that the Sargasso Sea amounts to only about 1% of Earth’s surface and without noting that Keigwin estimates about a third of the 3oC variation is likely an artifact of variations in salinity, so the true Sargasso SST variation over that period is, according to Keigwin, about 2oC. Third, the appropriate point for comparison here would be the average temperature for a much larger area, such as the Northern Hemisphere. See: Figure 6.10 in the WG1 report above (paragraph 6), which shows about a 1oC variation in NH temperature over the past 1300 years.

Figure 6: Comparison between the current U.S. temperature change per century, the 3,000-year temperature range in Figure 1, seasonal and diurnal range in Oregon, and seasonal and diurnal range throughout the Earth.

The other two bars on the histogram (Oregon Day-Night and Seasonal Temperature Range; and Earth Day-Night & Seasonal) are truly ridiculous. The “Oregon” bar refers to the difference between the warmest daytime (Summer) temperature in Oregon (approximately +40oC) and the coldest nighttime (Winter) temperature in Oregon. Why should that temperature range serve as a valid point of comparison? If global warming were thought likely to turn the coldest part of Winter into the hottest part of Summer, then maybe that’d be a valid comparison. It’s not.

Similarly, the “Earth Day-Night & Seasonal” bar is the difference between the hottest spot on the Earth in Summer and the coldest spot on the Earth in Winter. But why stop there? Why not have a bar showing the difference between the temperature at Earth’s core (6500oC) and the temperature in central Antarctica during winter (-100oC), thereby making the U.S. temperature increase bar look really small? I’m surprised they didn’t think of that.

10. 2nd page, 3rd paragraph. Again, we have a discussion of the 0.5 oC increase for the U.S. instead of the 1.0 oC increase for global temperatures.

Putting aside that error for the moment, Robinson et al. (2007) are claiming that solar activity increased by 0.19% since 1900 and the 0.5oC temperature change (0.21%) is somehow corroborating evidence that the temperature change is due to changes in solar activity.

First of all, the 0.19% solar intensity increase is not consistent with more reputable sources of the change in TSI over the past century (e.g., Lean (2000) and Wang et al. (2005)), which yield an estimated increase of about 0.06% since 1900.

Second, a 0.5oC temperature change amounts to 0.21% only if the “baseline” temperature is equal to 238 Kelvin, which is a rather strange baseline temperature to use since it’s about 50oC too cold.

(x + 0.5)/x = 1.0021 gives x ≈ 238

Since they are computing the percentage increase for 0.5oC on top of a 284K (11oC U.S. Temperature) baseline, the percentage increase should be 0.5/284 = 0.176%. It’s unclear to me how or why they came up with 238K as a baseline temperature. Perhaps they are using the effective radiating temperature as a baseline.
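The implied-baseline arithmetic above can be checked in a couple of lines (a minimal sketch; the 0.21%, 284 K, and 0.5oC values all come from the review text):

```python
# Reverse-engineering the baseline implied by Robinson et al.'s "0.21%" claim.
delta_t_k = 0.5                  # claimed U.S. temperature rise, in kelvin (= 0.5 oC)

# The baseline for which a 0.5 K rise is a 0.21% increase:
implied_baseline_k = delta_t_k / 0.0021
print(f"Implied baseline: {implied_baseline_k:.0f} K")       # prints ~238 K

# Using the actual mean U.S. surface temperature (~11 oC = 284 K) instead:
actual_baseline_k = 284.0
pct_increase = delta_t_k / actual_baseline_k
print(f"Increase on a 284 K baseline: {pct_increase:.3%}")   # prints ~0.176%
```

The mismatch between 238 K and 284 K is exactly the roughly 50oC discrepancy noted above.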

With regards to the statement, “This is in good agreement with estimates that Earth’s temperature would be reduced by 0.6oC through particulate blocking of the sun by 0.2%,” I read through the Teller et al. (1997) reference and I can find nothing that supports that claim.

Further, most estimates for changes to global temperatures in response to changes in various forcings (including solar forcing) amount to about 0.8oC of warming per W/m^2 increase in atmospheric forcing. Thus, a 0.2% decrease in incoming solar irradiance should result in about 0.16oC of cooling.

11. 2nd page, 4th paragraph. This one-sentence-long paragraph is just a restatement of topics that have already been addressed. First, it’s not clear that solar activity is “closely correlated” with U.S. temperatures. If more reputable TSI data are used, the “correlation” is not so clear. Second, the “correlation” is apparently the result of scaling the temperature axis “by eye” rather than by some sort of defensible algorithm based on the physics involved. Third, the temperature data should be global temperature data rather than U.S. data. Fourth, “world hydrocarbon use” is an inappropriate measure of the degree of greenhouse gas forcing (at a minimum, the cumulative CO2 production should be used instead of the yearly consumption rate).

12. 2nd page, 5th paragraph. It is probably true that most people would not notice a 0.5oC temperature increase. However, the global temperature increase (1oC) has been twice the U.S. value (0.5oC). Further, the concern is not really so much the temperature increases we’ve already experienced, but it is the temperature increases to come that are cause for concern. It’s also worth noting that there’s no reason to suppose that Earth’s climate and biospheric systems are as insensitive to temperature changes as is human skin.

13. 2nd page, 6th paragraph. It is noted that over the past century, the U.S. has experienced a slight increase in rainfall, fewer tornadoes, and no increase in “hurricane activity.” It’s not clear what these data are supposed to prove, though. First of all, the concern is global climate change – not U.S. climate change. Second, although increases in rainfall are expected for some areas, increased drought is expected in others. It’s not clear at all what the expected effect of global climate change is for tornadoes. Tornadoes are much too small to be represented in the grid structure of most climate models. There is also considerable debate as to whether hurricane frequency and/or intensity will increase as Earth warms.

Robinson et al. (2007) present the data in Figures 7, 8, 9, and 10 (not plotted here) as if they somehow were in direct contradiction to the predictions made for global warming, but they make two mistakes: (1) the data they show are only for the U.S.; and (2) the weather events they choose to plot cannot be directly compared against predictions for a warmer world. So these graphs amount to nothing more than a school of red herrings.

14. 2nd page, 6th paragraph, sea level rise discussions. Figures 11 (not plotted here) and 12 show the changes in sea level over the past two centuries. It’s not entirely clear what point they’re trying to make by noting that the sea level rise data show “3 intermediate uptrends and 2 periods of no increase.” Actually, the periods of “no increase” are really just periods where the rate of sea level rise is somewhat reduced (but not zero). Interestingly enough, the paper Robinson et al. (2007) references for the sea level data (Jevrejeva et al. 2006) observes that the periods of reduced sea level rise rate correspond temporally to periods when volcanic activity was increased... except, that is, for the period since about 1980, where volcanic activity has been higher but the rate of sea level rise is still high (See Figure 6 in Jevrejeva et al. 2006).

Robinson et al. make the statement that “if this trend continues... sea level would be expected to rise about 1 foot during the next 200 years,” without noting that the principal concern with respect to global warming is that the rate of sea level rise is predicted to increase.
Figure 12: Glacier shortening (4) and sea level rise (24,25). Gray area designates estimated range of error in the sea level record. These measurements lag air temperature increases by about 20 years. So, the trends began more than a century before increases in hydrocarbon use.

15. 3rd page, 1st paragraph. Again, the statement is made that because the “trends” in sea level and glacier shortening appear to have begun before major increases in “hydrocarbon use,” then “hydrocarbon use” must not be the cause. This argument misses the point that there are multiple, competing factors acting on the climate including solar effects, volcanism, manmade aerosols, and greenhouse gases. To determine whether greenhouse gases are affecting the climate, it is essential to include an analysis of the other factors known to affect climate. Refer to Chapter 9 of the IPCC WG1 report for a discussion of how attribution studies are conducted in climate science.

16. 3rd page, 2nd paragraph. “Much of that CO2 increase is attributable to the 6-fold increase in human use of hydrocarbon energy.” That’s true, although it’s more accurate to say that all of the observed CO2 increase can be attributed to human activities including fossil fuel combustion and vegetation burning for land clearing. See this discussion of how we know that recent CO2 increases are due to human activities.

The claim that “Figures 2, 3, 11, 12, and 13 show human use of hydrocarbons has not caused the observed increases in temperature” is not supported by the data provided in the Robinson et al. (2007) paper for reasons already discussed.

17. 3rd page, 3rd paragraph. The claim is made here that increased CO2 concentrations have improved the “extent and diversity of plant and animal life.” No reference is given for this statement and no data are provided to support the statement.

18. 3rd page, 4th paragraph. “There are no experimental data to suggest this.” This is a rather cute bit of deceptiveness. It’s rather difficult to run experiments that involve the whole Earth ... that is, other than the uncontrolled experiment we’re already conducting by pumping greenhouse gases into the atmosphere. Contrary to the impression Robinson et al. (2007) tries to give, though, there is plenty of direct, experimental evidence demonstrating the basic physics underlying the theory that global climate change is caused by manmade greenhouse gases.

19. 3rd page, 5th paragraph. “The empirical evidence – actual measurements of Earth’s temperature and climate – shows no man-made warming trend. Indeed, during four of the seven decades since 1940 when average CO2 levels steadily increased, U.S. average temperatures were actually decreasing.” Here we go again with U.S. temperatures. The concern here is global warming. Further, it’s clear that Robinson et al. are confused about how attribution studies need to be conducted. It takes more than just plotting U.S. temperature along with “hydrocarbon use” on the same graph to determine the fraction of temperature change that can be attributed to human activities.

Also, “...and humans have been responsible for part of this increase...” is misleading. Human activities quite clearly account for all of the observed increase in atmospheric CO2 concentrations over the past century.

20. 3rd page, 6th through 9th paragraphs. This is all opinion and wild speculation with no evidence, references, or science to back it up. No further comment is necessary.

21. 4th page, 1st paragraph. Same comment as 20.


22. 4th page, 2nd and 3rd paragraphs. The Sargasso Sea temperature cannot be said to be representative of global (or even hemispheric) temperatures. The idea that the Earth would “rebound” from a cold period such as the Little Ice Age is unsupportable. Changes in global temperature are principally attributable to changes in forcings (solar, volcanic, greenhouse gases, etc.). The Earth went into the Little Ice Age cold period (to whatever extent it was a global phenomenon) because of changes in solar and volcanic forcing.

23. 4th page, 4th paragraph. It is stated that “temperatures have been higher than they are now during much of the last three millennia.” However, no reference is given to support that statement. Sargasso Sea temperature measurements cannot be used to support the claim that global temperatures have been higher in the past few millennia. Further, the argument that the “historical record does not contain any report of ‘global warming’ catastrophes” is off the mark. First of all, historical records do show that climate changes contributed to the demise of societies throughout history (see Collapse by Jared Diamond). Second, the principal concern with global warming is the increasing global temperature expected to occur in coming decades, so comparing today’s global temperatures to past temperatures isn’t terribly appropriate. Instead, we should compare past temperatures to the approximately 2oC higher temperatures expected in the coming century. If we do that, we find that the last time global temperatures reached that level was approximately 3 million years ago. And at that time, sea level was about 25 to 35 meters higher than present. Undoubtedly there were many other regional climate differences as well.

24. 4th page, 5th paragraph. It is stated that “The 3,000-year range of temperatures in the Sargasso Sea is typical of most places.” but no citation is given for this statement. Further, it’s not clear what this statement is supposed to mean. The remainder of this paragraph attempts to make the argument that “individual records” are more meaningful than hemispheric or global averages. That is a rather strange argument to make given that the concern here is global warming. The Essex et al. (2007) paper that is cited in support of this argument is, in my opinion, just a cute bit of sophistry. Their argument is that it’s inappropriate to try to compute an “average” temperature of a system as complicated as the Earth’s climate. Their argument basically comes down to the point that averages tend to obscure the details of the variations within the system. Sure, that’s true, but that’s exactly the point of taking an average of a complicated system – it helps us to see the forest instead of getting distracted by the multitude of trees.

25. 4th page, 6th paragraph and Table 1. This paragraph discusses the results of an analysis by Soon et al. (2003) that was published in Energy & Environment, which is a well-known “journal” that seems to exist principally to publish “skeptical” articles. In this case, the Soon et al. article was soundly debunked by several groups. Here are two critical reviews, the first in Science, by Raymond Bradley, Malcolm Hughes and Henry Diaz, and the second a statement from the AGU.

26. 4th page, 7th paragraph. It is claimed here that “mean and median world temperatures in 2006 were, on average, approximately 1oC to 2oC cooler than in the Medieval Period.” To back up this statement, they reference a Web page. That’s not exactly peer-reviewed science in a reputable journal. In addition, the authors of this “paper” (Idso and Idso) are well-known global-warming “skeptics” with ties to the fossil fuel industries.

27. 4th page, 8th paragraph. Other than the use of the phrase “cycle of recovery,” which is intended to again imply the misleading notion of a “recovery” from the Little Ice Age, this paragraph is largely accurate.

28. 4th page, 9th paragraph. This paragraph is a restatement of arguments made earlier in the paper.


There are, of course, many more errors and distortions in the Robinson et al. (2007) paper, but there comes a point when enough should be enough. The errors described above should be sufficient evidence for any fair-minded person to conclude Robinson et al. (2007) is not a serious scientific paper. Instead, it appears to be nothing more than a clumsy attempt to distort the evidence for anthropogenic global warming in order to sow confusion in the minds of people not already familiar with the evidence.

Mike Powell
PE, MS Chemical Engineering
Kennewick, Washington

Advanced Head Banger (second semester work)

Eli is assigning this to his class

Via LGM and pharyngula

Thursday, October 29, 2009

Rabett Goes Romm

Ian Plimer has written the most Titanic load of nonscience in the history of make believe, and there are some real contenders out there. And Eli does mean TITANIC with a capital TITANIC. Big, really huge. Not one word you can believe, not one graph not photochopped.

Tim Lambert (aka Deltoid) calls Plimer out as Plimer the Plagiarist; Ian Enting got a good laugh, but after 46 pages of listing Plimerrors was found rolling on the floor and had to be sedated; Eli Rabett, that excellent lagomorph, went out for a few jars of absinthe after a couple of paragraphs and has mercifully forgotten what he read.
Plimer's response, the woe-is-me whining, caused brain damage in readers from its self-pity. The whine was so loud ambulances could not be heard. Substance-free ad hominem innocent victim Plimer thinks his only sin is to have — cue violins — patiently and persistently written excellent, peer-reviewed research on aspects of the climate. Even those with room-temperature IQs disagree.
In the real world Plimer routinely tries to drown the reputation of top scientists — including all of NOAA — with no justification whatsoever. Chicken Plimer bigmouths challenges to debate, and then weasels out.
So let’s set the record straight. Ian Plimer is the most debunked Aussie in the science blogosphere, possibly the entire universe. Heck, computer scientist Tim Lambert (aka Deltoid) has a whole category just for Ian, which I commend to anyone who still takes the man seriously. There are some things that call for a full frontal assault and Mt. Plimer is one of them.
But Eli digresses, what really is needed here is a first class fisking, and there is so much to frisk. Plimer has the most (unintentionally) damning book I’ve ever seen, parts of which I’m going to reproduce here since I’m sure progressives will want to use them in explaining why we must never go back to the deny-delay policies of the fossil fool industry. Take IP's take on Mauna Loa, please, in Plimer Purple Prose.

The measurement of CO2 in the atmosphere is fraught with difficulty. There is a 180 year record of atmospheric CO2 measurement by the same method. It has been measured with an accuracy of 1-3% from 1812 until 1961 by a chemical method. (2090).
Where Ian signs on to the Diplom Beck express. Eli has had a word or two on these Beckies. Beck, the credulous, accepted any measurement without a critical evaluation of whether it made any sense.
Between 1812 and 1961 there have been more than 90,000 measurements of atmospheric CO2 by the Pettenkofer method. These showed peaks in atmospheric CO2 in 1825, 1857 and 1942. In 1942 the atmospheric CO2 content measured by these methods (400 ppmv) was higher than now. 2091 A plot of the CO2 measured by these methods shows that for much of the 19th Century and from 1935 to 1950, the atmospheric CO2 was higher than at present and varied considerably.
As Ferdinand Engelbeen said
Besides the quality of the measurements themselves, the biggest problem is that most of the data which show a peak around 1943 are taken at places which were completely unsuitable for background measurements. In that way these data are worthless for historical (and current) global background estimates. This is confirmed by other methods which indicate no peak values around 1943.
and Ferdinand has the data to show that the middle of Vienna (500 ppm minimum and huge variations) ain't the best place in the world to measure CO2, nor are most of the other Pettenkofer hangouts.

The method itself sucks. An accuracy of 1-3% would be 3-9 ppm CO2, but that is if the operator gets it right, the moon is in Aquarius and you believe six impossible things before breakfast. There is lots of evidence that lots of them couldn't find a CO2 concentration in a paper bag with handles. More on that later.
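The arithmetic above is easy to check; the ~300 ppm background used below is an illustrative round number, close to the mid-20th-century values quoted later in the post:

```python
# Back-of-envelope check of what a 1-3% relative error in the
# Pettenkofer method means in ppm, for a ~300 ppm background.
background_ppm = 300.0
for rel_err in (0.01, 0.03):
    print(f"{rel_err:.0%} relative error -> +/- {background_ppm * rel_err:.0f} ppm")
# 1% -> 3 ppm, 3% -> 9 ppm
```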

Ralph Keeling pointed out that reporting 90,000 measurements of nonsense is simply a lot of nonsense
It should be added that Beck’s analysis also runs afoul of a basic accounting problem. Beck’s 11–year averages show large swings, including an increase from 310 to 420 ppm between 1920 and 1945 (Beck’s Figure 11). To drive an increase of this magnitude globally requires the release of 233 billion metric tons of C to the atmosphere. The amount is equivalent to more than a third of all the carbon contained in land plants globally.
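Keeling's accounting can be reproduced in a couple of lines. The only number here not taken from the quote above is the standard conversion of roughly 2.13 billion metric tons of carbon per ppm of atmospheric CO2:

```python
# Reproducing Ralph Keeling's accounting argument against Beck's curve.
GTC_PER_PPM = 2.13      # ~2.13 GtC raises atmospheric CO2 by 1 ppm
rise_ppm = 420 - 310    # Beck's claimed 1920-1945 swing (his Figure 11)
carbon_gt = rise_ppm * GTC_PER_PPM
print(f"Implied release: {carbon_gt:.0f} billion metric tons of C")
# ~234 GtC, matching Keeling's "233 billion metric tons" to within
# rounding of the conversion factor
```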
But Plimer charges forward with an argument that might impress a dumb frog

There are great variations in CO2. A simple home experiment indoors can show that in a week, CO2 can change by 75 ppmv. A variable CO2 content is exactly as expected, a smooth CO2 curve rings alarm bells.
This is crazy Eddy stuff. Your house is a small closed chamber in which cooking and heating go on and CO2 can collect. The atmosphere is large, open, and has winds which mix stuff together. There can be issues close to the ground in agricultural or urban areas, but once you get into the free troposphere above the inversion layer things get smooth.
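A one-line mass balance shows why a house behaves nothing like the free troposphere. All the numbers below (room size, exhaled CO2, ventilation rate) are illustrative guesses, not measurements:

```python
# Crude steady-state box model of indoor CO2 buildup. Illustrative
# numbers: one occupant in a small, modestly ventilated room.
room_m3 = 50.0                 # room volume
co2_source_m3_per_h = 0.02     # roughly 0.5 m3 of CO2 exhaled per day
air_changes_per_h = 0.5        # modest ventilation

# Steady-state excess concentration = source rate / (ventilation * volume)
excess = co2_source_m3_per_h / (air_changes_per_h * room_m3)
print(f"steady-state excess over outdoor air: ~{excess * 1e6:.0f} ppm")
```

Hundreds of ppm of excess indoors, versus the open atmosphere where the same source is diluted into an effectively unbounded, wind-stirred volume; which is why indoor swings of 75 ppmv say nothing about Mauna Loa.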

Plimer continues
In 1959, the measurement method was changed to infra-red spectroscopy with the establishment of the Mauna Loa (Hawaii) station, and measurements were compared with a reference gas sample.
Well, that is true
Compared to the Pettenkofer method, infra-red spectroscopy is simple, cheap and quick.
Quick, yes. Simple, in the sense of straightforward, yes, but cheap only after someone buys the equipment. OTOH, the Pettenkofer method is a titration which only requires some inexpensive glassware.
The infra-red technique has never been validated against the Pettenkofer method.
Plimer also, somehow, neglects to consider (that is asking a lot, perhaps) that calibration against primary standard mixtures is a more accurate way of establishing accuracy than calibrating two methods against each other. Since the IR method was calibrated against standard mixtures, its accuracy has been confirmed. One could ask whether the Pettenkofer method was calibrated against accurately mixed standards. The literature, at least the papers Eli has read, is quiet on that matter. Still, the Pettenkofer methods, or at least the Pettenkofers, were compared with the calibrated IR method, and found very wanting. There is a small footnote in Charles Keeling's Rewards and Penalties of Monitoring the Earth describing how
At two stations in Finland, samples collected by station personnel had been sent to Scripps. These samples yielded nearly the same concentrations as those measured at Mauna Loa Observatory, proving that the errors in the Scandinavian program were mainly analytical rather than due to variable CO2 in the air being sampled.
Still Ian Plimer persists
The raw data from Mauna Loa is 'edited' by an operator who deletes what is may be considered poor data. Some 82% of the raw CO2 measurement data is "edited" leaving just 18% of the raw data measurements for statistical analysis. With such savage editing of raw data, whatever trend one wants to show can be shown.
For this, Peter Tans of NOAA called Plimer out as a con artist, showing that there are well studied meteorological reasons for excluding the excluded data, and that even if one includes the non-background data the differences are only slight: the seasonal cycle becomes a bit larger due to upslope winds, especially during the summer, and "a bit" here is maybe 1 ppm or so. Plimer is making an argument from ignorance here. Tans didn't see the infamous lie that Plimer next uttered, or he might have gone Rush Limbaugh on Plimer.

In publications, large natural variations in CO2 were removed from the data by editing in order to make an upward-trending curve showing an increasing human contribution of CO2.
It is difficult to imagine a fate nasty enough for someone who would attack others in this way. Plimer should get down on his knees and beg forgiveness, but he continues

The early Mauna Loa and South Pole CO2 measurements were considerably below measurements made at the same time in northwestern Europe from 21 measuring stations using the Pettenkoffer method. During the period these 21 stations were operating (1955-1960), there was no recorded increase in atmospheric CO2. There is poor correlation between temperature and the greatly fluctuating atmospheric CO2 content measured by the Pettenkofer method.
Of course, we now know that these 21 stations were badly measuring strange and local conditions. As Charles Keeling writes
At the IUGG meeting there were also presentations of CO2 data obtained by the chemical methods, and an honorary address by Dr. Kurt Buch, who had championed atmospheric CO2 measurements in Finland as early as 1920. Atmospheric CO2 measurements at an array of stations over Scandinavia, reported routinely since 1955 in a new journal, Tellus, were presented.

This Scandinavian program, started by Rossby in 1954, had been a major factor in triggering interest in measuring CO2 during the IGY. Nevertheless it was quietly abandoned after the meeting, when the reported range in concentrations, 150–450 ppm, was seen to reflect large errors. Rejected along with the Scandinavian sampling program was Rossby’s hypothesis that CO2 concentration data could be useful to tag air masses (14).
Some people can recognize the truth, but not Ian Plimer, he likes the mumbo jumbo

The Pettenkofer method measurements in northwestern Europe showed that CO2 varied between 270 and 380 ppmv, with annual means of 315-331 ppmv. There was no tendency for rising or falling CO2 levels at any one of the measuring stations over the 5 year period. Furthermore, these measurements were taken in industrial areas during post-World War II reconstruction and increasing atmospheric CO2 would have been expected.
It ain't a feature, the large changes are a bug, showing that the Pettenkofer measurements are useless or in error because they were being made in places where the CO2 is not well mixed. The fact that they were taken in industrial areas shows that they are completely untrustworthy, as even Plimer could see from a jaunt through Essen, Germany while measuring CO2. The wild-assed guess about WWII is just that.
While these measurements were being undertaken in northwestern Europe, a measuring station was established on top of Mauna Loa in order to be far away from CO2 emitting industrial areas. The volcano Mauna Loa emits large quantities of CO2, as do other Hawaiian volcanoes. (2006). During a volcanic eruption the observatory was evacuated for a few months and there was a gap in the data record which represented the period of no measurement. There are now no gaps in the Mauna Loa data set. (2097)
Except that the volcanic CO2 at MLO is well characterized. Most days it does not even contribute 1 ppm at the times when it is flowing downslope (the times which are excluded). This is dealt with in a 2001 article by Steve Ryan (Ryan, S. (2001), Estimating volcanic CO2 emission rates from atmospheric measurements on the slope of Mauna Loa, Chem. Geol., 177, 201-211) and in Quiescent Outgassing of Mauna Loa Volcano 1958-1994. The background emission from Mauna Loa has been studied, and even during an eruption the extra amount of CO2 at the Observatory is not large.

Plimer clearly is selling some funny weed,

The annual mean CO2 atmospheric content reported at Mauna Loa for 1959 was 315.93 ppmv. This was 15 ppmv lower than the 1959 measurements for measuring stations in northwestern Europe. Measured CO2 at Mauna Loa increased steadily to 351.45 ppmv in early 1989. (2098) The 1989 value is the same as the European measurements 35 years earlier by the Pettenkofer method, which suggests problems with both the measurement methods and the statistical treatment of data.
Yes, it suggests that the Pettenkofer measurements were bollocks. Plimer has swept down the memory hole the fact that the IR measurements are continually calibrated against standard mixtures of CO2, which means that the Pettenkofer measurements, which were not calibrated against such mixtures, are the ones that are wrong.
In fact, when the historical chemical measurements are compared with the spectroscopic measurements of air trapped in ice and modern air, there is no correlation. Furthermore, measurement at Mauna Loa is by infra-red analysis (2099, 2100) and some of the ice core measurements of CO2 in trapped air were by gas chromatography (2101)
Let's see: two methods of measurement (GC and IR) come up with the same answer, one method is wildly different. Which do you trust if you are Ian Plimer?
The Mauna Loa results change daily and seasonally. Night time decomposition of plants and photosynthesis during sunlight change the data, as does traffic and industry. Downslope winds transport CO2 from distant volcanoes and increase the CO2 content. Upslope winds during afternoon hours record lower CO2 because of photosynthetic depletion in sugar cane fields and forests. The raw data is an average of four samples from hour to hour. In 2004 there were a possible 8784 measurements. Due to instrumental error 1102 samples have no data, 1085 were not used due to up slope winds*, 655 have large variability within 1 hour but were used in the official figures, and 866 had large hour by hour variability but were not used.
The excellent Tim Lambert has shown how this was lifted from Ferdinand Engelbeen's blog, an example of plagiarism by Plimer. Tim points out that
So it seems that the reason why Plimer didn't cite Engelbeen was that Engelbeen conclusively refuted Plimer's claims about data selection at Mauna Loa being used to manufacture a trend.
Plimer copied the data but omitted the refutation, the better to print a false claim.
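As a sanity check, one can tally the 2004 numbers Plimer himself quotes. Even taken at face value they do not support his "82% edited" claim, since hours lost to instrument error were never measured and so cannot have been "edited" out:

```python
# Tallying Plimer's own 2004 Mauna Loa hourly-data numbers, as quoted.
possible        = 8784   # hours in the leap year 2004
no_data         = 1102   # instrument error: nothing to edit
upslope         = 1085   # excluded for documented meteorological reasons
variable_used   = 655    # variable within the hour, but retained
variable_unused = 866    # variable hour-to-hour, excluded

kept = possible - no_data - upslope - variable_unused
print(f"hours retained: {kept} of {possible} ({kept / possible:.0%})")
# about 65% retained, nowhere near the claimed 18%
```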

The Mauna Loa CO2 measurements show variations at sub-annual frequencies associated with variations in carbon sources, carbon sinks and atmospheric transport. (2103) Air that arrives during the April-June period favours a lower CO2 concentration. Seasonal changes derive from Northern Hemisphere deciduous plants that take up CO2 in spring and summer and release it in autumn and winter due to the decay of dead plant material. Every April, the Northern Hemisphere reduction of atmospheric CO2 shows that Nature reacts quickly to CO2 in the atmosphere and can remove large amounts in a very short time. This is not news. For millennia farmers have called this time the growing season.
And for generations farmers measured the variation in CO2. Right Ian? Oh you didn't think of that. How surprising. The measurement of the seasonal variation was the indication that convinced almost everyone that Keeling and the MLO measurements were a huge improvement on the wet chemical ones, which could not resolve this variation. In short, it's a feature.
There may be errors in sampling and analytical procedure. (2104). Measuring stations are now located around the world and in isolated coastal or island areas to measure CO2 in air without contamination from life or industrial activity to establish the background CO2 content of the atmosphere. The problem with these measurements is that land-derived air blowing across the sea loses about 10 ppm of its CO2 as the CO2 dissolves in the oceans
And inland measurement stations get the same values as the isolated ones, for example Schauinsland, located on a ridge above the Rhine, among other places. Plimer continues to emit epizootics from the blowhole

Eli has gotta get more Romm.

Wednesday, October 28, 2009

Elections have consequences

Eli stumbled across a very jumbled story in the last couple of days. It's been blogged on before, but the bunny has a little twist. On one level it is the usual state university president jumping on an inconvenient faculty member. Buried a bit under the leaves is a lesson on how elections have consequences and bureaucracies have momentum. There remains an interesting story here for any reporter who wants to dig a bit. Eli is just a techno-bunny, but this one could be much more amusing than the various state climatologist dust-ups. (Twist 1, context)

The short story is that about a year and a half ago NOAA was gung ho to allow exploratory drilling in Bristol Bay, Alaska, a wild place with major commercial fisheries. Eli is not going to get into the rights and wrongs of this, that is another discussion and you have the links. What is interesting is that Rick Steiner, a marine scientist and a faculty member at the University of Alaska, got sideways about this with Mark Hamilton, the President of the University of Alaska, with NOAA, which was pushing the drilling, and with the NOAA Sea Grant Program.

Shell, NOAA and the university had had public meetings about how fishing and drilling could co-exist in Bristol Bay and there was a planned oil lease sale. Steiner was excluded from these meetings (and, as things go these days, he had the Email to prove it). He held a press conference in March 2008, during which he cast derision and scorn on the fish and drill crowd. A couple of months later, one of the Sea Grant honchos at UAlaska, Denis Wiesenburg, Dean of the School of Fisheries, visited NOAA in downtown Silver Spring. He reports

I spent the morning of May 7 in Silver Spring, MD, meeting with NOAA program managers. In the morning I met with the national Sea Grant Director Leon Cammen, Deputy Director Jim Murray and Associate Director Nicola Garber. . . .We also had a discussion about the appropriate role of Sea Grant advisory agents in providing information to the public.

At that meeting, Jim Murray advised me that they had an "issue with Rick Steiner." They felt he was acting as an advocate and asked if he was being paid with Sea Grant funds. I told him that he received one month of salary from our Sea Grant grant. Jim expressed concern about this and stated that "one agent can cause problems nationally." The suggestion was made that he not be paid from Sea Grant funds. I asked them for some documentation that they did not wish Sea Grant agents (aka MAP in Alaska) to advocate. They gave me a copy of the Sea Grant manual on extension agents and opened it to page 36 on Neutral Brokers of Information. The next day, I sent the e-mail below to Professor Steiner's supervisor, Paula Cullenberg with a copy of the two pages they noted during the discussion.

Our Sea Grant funding is a grant. We have no obligation to pay faculty from the grant unless they are contributing to the goals of the grant. From my discussions with the national Sea Grant office, they do not believe Professor Steiner is contributing to their mission. On the contrary, they worry that his actions in Alaska could have negative implications nationally as stated above by Jim Murray during the meeting.

At the Sea Grant meeting in Silver Spring, I did not bring up this topic. Jim Murray did. During FY09, Professor Steiner is receiving one month salary from our Sea Grant grant. The grant is up for renewal this year. It will be my recommendation that Professor Steiner's salary not be included in the grant and the (sic) he continue to receive his nine month salary from our Fund 1 budget as required by the CBA
Wiesenburg's December 2008 evaluation letter to Steiner is slightly less friendly. In it he quotes the operative paragraph of the Sea Grant Policy
But as neutral providers of science based information to decision makers, we do not suggest what those decisions should be. We help them understand their choices and the implications of those choices. We do not take positions on issues of public debate.
However, Steiner had, at least in a letter sent to the University of Alaska and NOAA, not advocated as a Sea Grant agent, but as a Professor at the University. Long story short, Steiner got moved out of his office into a broom closet and his life was made difficult. This caused him to protest against the usual brick wall. He got the letter removed from his file and a one day AAUP seminar on academic freedom, but little else.

In February 2009, Public Employees for Environmental Responsibility (PEER), took up the cudgels for Steiner and sent a letter to Jane Lubchenco who had been nominated to become Under Secretary of Commerce for Oceans and Atmosphere and the NOAA Administrator. A short time thereafter any oversight of the Alaska Sea Grant program was removed from James Murray. Elections have their consequences. (Twist 2, consequences) FOIA requests for emails to and from Dr. Jim on the matter might be interesting to reporters.

Steiner's Sea Grant funding got chopped in April with a one year extension from university funds. Didn't the locals get the news that there was new management in DC? NOAA has now reversed its position and says there should not be drilling in Bristol Bay. (Twist 3, President McCain would not approve.) Elections have their consequences.

Steiner plans to resign after 30 years.

Anyone interested in the entire 15 year saga, can look here.


Punching bag

Unnoticed in the recent pile-ons, our friends from outer space have been taking it on the chin. Rasmus @ Real Climate wondered how an idea can be knocked down so often and still get off the mat.

Things Break has posts about two (one, two) new papers that falsify two of the most repeated claims of the guys from interstellar space.

Overholt et al. examine the evidence in a paper entitled Testing the link between terrestrial climate change and galactic spiral arm transit and surprise surprise, find it wanting
and another titled

Atmospheric data over a solar cycle: no connection between galactic cosmic rays and new particle formation
which says it all.

And over at delayed oscillator, we have a new participant, transient eddy, who looks at a new claim by Dengel et al., who find a correlation between galactic cosmic rays and tree growth in a Scottish grove. Eddy finds it wanting on statistical and methodological grounds, in particular because Dengel and friends studied a young managed forest, where the trees were pampered and not stressed. Not the sort of place you would expect to find a strong climate signal.

In following the link to Atmospheric Chemistry and Physics Discussions, one of the open review journals, Eli happened to find another:

Results from the CERN pilot CLOUD experiment by J. Duplissy and a long list of et al.s, including the Svensmarks.

If Eli were paying taxes in the EU he would be throwing tea bags at the entrance to CERN. The paper describes some test runs from like 2006, which according to the paper show that

a) The designers of the experiment were not very aware of wall effects and
b) The aims of the experiments were not thought out, as in: at the end of the day, what was going to be proved?

Although very positive in the introduction, the referee's comments make it clear why wall effects are important and why that could be a show stopper.
To paraphrase, the experiment is designed to study how aerosols can form around ions created by cosmic rays, which then grow to cloud condensation nuclei size (larger aerosols) over periods of hours, if not days. To do that, they have to have excellent control of the loss of ions, aerosols and the CCNs on the surface of the chamber.

Wall loss is an obsession in reactive chemical kinetics and is usually dealt with by coating the walls with various substances. For free radical kinetics the coatings have included phosphoric acid (an early favorite), lots of strange kinds of waxes, Teflon and secret sauce (Eli kids you not, he knows one aerosol guy who swears by anodized aluminum, about the roughest surface you could think of on the micro/nano scale). It's not clear that any of these would work for CLOUD, because of the long design hold times for the aerosols in the chamber, which means that wall loss has to be minimized so that most of the aerosols are unaffected and remain long enough to measure slow growth.
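The arithmetic behind the obsession is simple first-order loss. The rate constants below are invented for illustration, not CLOUD's actual numbers:

```python
import math

# First-order wall-loss sketch: fraction of aerosols surviving a run
# of a given length, for illustrative wall-loss rate constants.
def surviving_fraction(k_wall_per_h, hours):
    """exp(-k*t): fraction not yet lost to the chamber walls."""
    return math.exp(-k_wall_per_h * hours)

for k in (0.1, 0.5, 1.0):
    print(f"k = {k}/h: {surviving_fraction(k, 6):.1%} survive a 6-hour run")
# a 0.1/h loss rate leaves ~55% after 6 hours; 1.0/h leaves ~0.2%,
# at which point slow aerosol growth is unmeasurable
```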

Worse, what the CLOUD team found was that they were getting outgassing off the walls and that the walls had a memory of what they were previously exposed to. This showed up when the temperature went up by a little bit. They say that they can clean the walls by flowing clean gas through the system for a long time based on a decrease in the bumping with time exposed to gas flow. In the original proposal it said that the walls would be cleaned by heating in vacuum (a standard procedure), but they only remain clean as long as the vacuum is maintained below 10^-9 atm. Since the cell is going to have significant amounts of air, water and SO2 in it during the experiment, the walls are always going to be saturated with water, SO2, sulfuric acid and similar during the experiment. Eli's first guess is that wall chemistry is going to be, well, complicated and confusing.


Monday, October 26, 2009

Statisticians reject global cooling

On slow news days, reporters go trolling for man bites dog stories. The AP decided to try statisticians bite denialists and scurried up a few statisticians, showed them the data and asked, hey guys, we got global cooling here. Actually, they were smarter, they just showed the unlabeled numbers and asked, is there a trend? John Grego at South Carolina remarked

"If you look at the data and sort of cherry-pick a micro-trend within a bigger trend, that technique is particularly suspect,"

David Peterson, a smarter emeritus from Duke, pointed out that

Saying there's a downward trend since 1998 is not scientifically legitimate,

Identifying a downward trend is a case of "people coming at the data with preconceived notions," said Peterson, author of the book "Why Did They Do That? An Introduction to Forensic Decision Analysis."

They shopped quotes from Don "the denialist" Easterbrook

"Should not the actual temperature be higher now than it was in 1998?" Easterbrook asked. "We can play the numbers games."

and let Prof. Grego flatten him

That's the problem, some of the statisticians said.

Grego produced three charts to show how choosing a starting date can alter perceptions. Using the skeptics' satellite data beginning in 1998, there is a "mild downward trend," he said. But doing that is "deceptive."

The trend disappears if the analysis starts in 1997. And it trends upward if you begin in 1999, he said.
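Grego's three-chart demonstration is easy to mimic with toy numbers. The synthetic series below stands in for the satellite record; the 0.02 deg/yr trend and the 0.6 deg spike in "1998" are invented for illustration:

```python
import numpy as np

# Synthetic illustration of start-date cherry-picking: a steady
# warming trend with one anomalously hot year at 1998.
years = np.arange(1990, 2010, dtype=float)
temps = 0.02 * (years - 1990)        # steady +0.02 deg/yr warming
temps[years == 1998] += 0.6          # one hot outlier year

for start in (1997, 1998, 1999):
    sel = years >= start
    slope = np.polyfit(years[sel], temps[sel], 1)[0]
    print(f"fit starting {start}: {slope:+.4f} deg/yr")
# starting in 1998 the fitted trend is negative; starting in 1999 it
# recovers the true +0.02; starting in 1997 it is positive but damped
```

Same data, three different "trends", exactly as the statisticians said.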

Followed by a short drive by on Dubner and Levitt's Superfreakonomics

A line in the book says: "Then there's this little-discussed fact about global warming: While the drumbeat of doom has grown louder over the past several years, the average global temperature during that time has in fact decreased."

That led to a sharp rebuke from the Union of Concerned Scientists, which said the book mischaracterizes climate science with "distorted statistics."

Levitt is trying to roll it back to rescue some respectability, but, of course, the damage is being done both by the book and its reception. In this, blogs are playing an important role. It is important to a) keep it up and b) realize that this has been effective on at least a personal level with Levitt. He knows that he has a long way to go before people he cares about take him seriously again.

Levitt, a University of Chicago economist, said he does not believe there is a cooling trend. He said the line was just an attempt to note the irony of a cool couple of years at a time of intense discussion of global warming. Levitt said he did not do any statistical analysis of temperatures, but "eyeballed" the numbers and noticed 2005 was hotter than the last couple of years. Levitt said the "cooling" reference in the book title refers more to ideas about trying to cool the Earth artificially.

This should be greeted by a loud chorus.

and Ken Caldeira ain't gonna take it no more
"To talk about global cooling at the end of the hottest decade the planet has experienced in many thousands of years is ridiculous," said Ken Caldeira, a climate scientist at the Carnegie Institution at Stanford.

Sunday, October 25, 2009

Ian Plimer is a con artist

One of the important things happening on the Internet is that people are communicating with each other. While this multiplies nonsense, it lets others use common sense to follow up on the con jobs. Unlikely forums have become important venues for much of this. At the Bigfooty Forum, Bit Pattern, a distinguished senior member, wrote to NOAA about one of the stranger Plimies (weirdly, an Aussie Rules football discussion spawned this), but one that has made many guest appearances on blogs including this one (Eli finds blogosphere sophomoric and undistinguished. He would not want to belong to such an organization). In particular, Plimer wrote:

UPDATE: A bit of internet telephone, some minor corrections from the text and addition of a very significant sentence at the end of the first paragraph. More to come

The raw data from Mauna Loa is 'edited' by an operator who deletes what is may be considered poor data. Some 82% of the raw CO2 measurement data is "edited" leaving just 18% of the raw data measurements for statistical analysis. With such savage editing of raw data, whatever trend one wants to show can be shown. In publications, large natural variations in CO2 were removed from the data by editing in order to make an upward-trending curve showing an increasing human contribution of CO2. . . .

The raw data is an average of four samples from hour to hour. In 2004 there were a possible 8784 measurements. Due to instrumental error 1102 samples have no data, 1085 were not used due to up slope winds*, 655 have large variability within 1 hour but were used in the official figures, and 866 had large hour by hour variability but were not used.
Rabett Run readers know from earlier posts that the meteorology at MLO is intensively studied and that the CO2 measurements that go into the published monthly means are selected based on the meteorology to reflect the free troposphere and eliminate times when there are contributions from downslope (volcano) and upslope (vegetation) winds, but what Eli had not made clear is the extent of those excursions. Bit Pattern got an answer from a NOAA scientist
Anybody, including Plimer, can download the actual measurement records, complete, warts and all, from our web site, or by clicking on the appropriate places. To illustrate how misleading Plimer is I made a plot of 3 years of all hourly data, with 2004 in the middle because Plimer discussed 2004. I have also attached a description of our MLO measurements, which Plimer and anybody else can download from the second web page mentioned above. In the plot, "selected" data means that we have used it in constructing the published monthly mean because those hours satisfy the conditions for "background" measurements. The red stripes are extremely close to the published monthly means. The published data has another step, first from hourly to daily averages, then to monthly, which I did not do here. Also plotted in purple-blue are all non-background data. If one constructs monthly means from ALL data, incl. non-background, one obtains the purple-blue stripes. The differences are only slight, with the seasonal cycle becoming a bit larger due to upslope winds, esp. during the summer.
This confirms the hypothesis. RTFL dear readers. Oh yes, click on the figure to blow it up if desired.

Saturday, October 24, 2009

The arrogance of physicists

Arthur Smith has a long and smart post on how the native arrogance of physicists leads them astray when they wander into the climate thicket. He starts with our old friends Gerlich and what's his name before settling on Robert Austin and the rest of the crew at Princeton.

Some time ago, Eli pointed to a paper by Myanna Lahsen which makes much the same point

This paper identifies cultural and historical dimensions that structure US climate science politics. It explores why a key subset of scientists—the physicist founders and leaders of the influential George C. Marshall Institute—chose to lend their scientific authority to this movement which continues to powerfully shape US climate policy. The paper suggests that these physicists joined the environmental backlash to stem changing tides in science and society, and to defend their preferred understandings of science, modernity, and of themselves as a physicist elite - understandings challenged by on-going transformations encapsulated by the widespread concern about human-induced climate change.
Go read both.


When the ice melts

Ear tip to Alexander Ac for pointing out an important article in Nature News. Eli has been on a tear about the consequences of climate change for India, drying up the great rivers upon which it depends for drinking water. Bhutan, located on the periphery of the Himalayas is reaping the reverse of this, with enlarged glacial lakes threatening to flood the countryside when they burst out of their bounds. Anjali Nayar writes

Glaciers in the Himalayas are retreating faster than in any other part of the world and they could disappear completely by 2035 (ref. 1). This puts the mountainous nation of Bhutan at a special risk. In an area smaller than Switzerland, it has 983 glaciers and 2,794 glacial lakes, some of which have burst to produce deadly glacial lake floods

As a poor nation without even its own helicopter, Bhutan lacks the resources to combat global warming. It is carrying out the work at Thorthormi glacier with the help of money from various international donors, including US$3.5 million from the Least Developed Countries Fund, created under the United Nations Framework Convention on Climate Change. The global cost of adaptation could total hundreds of billions of dollars a year — orders of magnitude more than what is available to poor countries at the moment.
That was only two years ago. Of course, if the glaciers disappear then Bhutan has the other problem, that agriculture will collapse. Here the legacy of the delayers and the denialists becomes clear. We must adapt now to what we have wrought, and it will be at great cost, but we must also start to mitigate, and it ain't gonna be a walk in the park, because what lies ahead if we don't is disaster.

It is vital to recognize that greenhouse gas levels are cumulative, and that ANY action now reduces the danger, even if it does not eliminate it in the future.
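The cumulative point can be shown with toy arithmetic (this is not a climate model; the constant 10 GtC/yr baseline and the 2%/yr cut are made-up numbers): because emissions pile up, cutting now lowers the total at every later year, even though the total keeps rising under both paths.

```python
# Two hypothetical 50-year emission paths (GtC/yr).
years = range(50)
business_as_usual = [10.0 for _ in years]            # no action
cut_now = [10.0 * (0.98 ** y) for y in years]        # 2%/yr reduction starting now

def cumulative(path):
    """Running total of emissions, year by year."""
    total, out = 0.0, []
    for e in path:
        total += e
        out.append(total)
    return out

bau_cum = cumulative(business_as_usual)
cut_cum = cumulative(cut_now)

# The gap between the paths grows every year: action now reduces the danger
# even though it does not eliminate it.
print(f"after 50 yr: BAU {bau_cum[-1]:.0f} GtC vs cuts {cut_cum[-1]:.0f} GtC")
```

Even a modest cut never catches back up to business as usual; the avoided tonnage compounds, which is the whole argument for acting now rather than later.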

The picture is from Tiempo Climate Newswatch


Friday, October 23, 2009

We interrupt this very serious blog

To show you how it has to be done

and in the belly of the beast

More, much more

Going from step 3 to 4

  1. First, they fear you.
  2. Then they match you.
  3. Then they laugh at you.
  4. Then they ignore you.
  5. Then you lose.
in just a matter of a few days. The great Pielke meltdown was featured first at Rabett Run. Then Tim Lambert at Deltoid decided to throw Roger a pity party. Now Thers in Whiskey Fire moves the ball forward starting with Roger's plaint
The big fish then feed on the minnows, for instance, Real Climate and Brad DeLong have cited Tim Lambert as an authority, including on my own work, yet to my knowledge Lambert has never actually engaged anything I've published in the peer reviewed literature much less any substantive arguments that I've made. Of course he doesn't
Thers points out that this is really a piece of splendidiferous writing (Parental discretion advised) equal to the best Tom Friedman ever wrote and expands upon it
Big fish eat tiny minnow fish, except for one of the big fish, who has a fanciful aquatic imagination and also employs powerful farm machinery for the purposes of forward progress. The large fish and the devoured minnows and the farmer fish are all in the same family. . .
There is a truth here, Roger has become the Tom Friedman of climate blogging, a writer of mush, but an awful lot of mush.

UPDATE: Since this is getting linked to and what follows is more of a tweet than a post, Eli will answer the question of what does Roger want: To rob the IPCC and US Global Change Research Program of any legitimacy. Given the rather prominent position in CIRES of several key contributors to the IPCC and the incestuous connections of the university to the local government labs involved in the USGCRP, this is not without dangers for him.


Wednesday, October 21, 2009

Roger vs. Roger, a game any bunny can play

Eli came back from work today to find Ethon playing with his new Wii, a huge grin on his beak. The bunny asked what's up, and the large liver loving eagle replied that he had just gotten a new game, Roger vs. Roger. How's that go?

Well, it gives you statements that Roger has made in the past and you have to parse them so they agree with statements Roger will make in the future. It's quite complex; sometimes you have to twist the words so hard that they scream. When you first play the game, they give you a fine and much-quoted example from RP's Nature interview to start:

Climate change may not be the dominant factor, but it has become clear that a relevant portion of damages can be attributed to global warming.
and, from the same source
Clearly since 1970 climate change (i.e., defined as by the IPCC to include all sources of change) has shaped the disaster loss record.
When this was pointed out Roger invoked the Humpty Dumpty defense, a favorite of his

`When I use a word,' Humpty Dumpty said in rather a scornful tone, `it means just what I choose it to mean -- neither more nor less.'

`The question is,' said Alice, `whether you can make words mean so many different things.'

`The question is,' said Humpty Dumpty, `which is to be master -- that's all.'

At several points the author of the Nature article conflates “climate change” and “global warming” which is something we at the workshop were careful not to do.
and of course, he was misquoted.

Wanna play?

Tuesday, October 20, 2009

Two Questions

First, when was the last time that Lindzen ever published any science that was right? Serious question. Eli gets ~ 1990 but is willing to listen. Polemics need not apply.

Second, if Mike Powell still reads Rabett Run, could he stop by. His opus on OISM has link rotted away. If there is a new location Eli would like the URL. If not, the bunny would be pleased to host the remnants.

SO2 Stinks

Well it really does, and it has other nasty habits you would know about if you ever got a face full. Eli has had that experience too, and it ain't no walk in the park. Having written serious SOPs for students working with the stuff (we are oh so DOE Tiger Team at Rabett Labs when we ain't sniffing gases out of cylinders), he was always a bit cynical about proposals for blasting zillions of tons of SO2 into the upper troposphere, quite apart from the question of how you hope to get it up there.

The bunny was going to post something oh so smart on this, among other things pointing to Ken Caldeira's papers on the subject (another reason why Levitt and Dubner seemed a bit full of it when reporting on Caldeira's opinions), but James Wimberley says it better

(a) this and all such schemes on the relevant scale are seriously dangerous, since we don’t understand the possible side-effects, including changes in regional weather patterns, and anyway leave untouched ocean acidification and other non-greenhouse effects of increased CO2,

(b) the geoengineering options must be studied in depth as an emergency Plan B in case humanity doesn’t cut emissions enough, or the climate turns out to be more sensitive than we thought to those already made. . . .

That's pretty much Caldeira's position.

It’s possible, however, that these climate experts don’t know much about global governance, and I know a little, having spent my working life in intergovernmental cooperation, so here goes on that side of the problem.

1. Because large-scale geoengineering is dangerous, it will only become a live option when emission control efforts have clearly failed and things have reached a crisis: hundreds of thousands dying every year in droughts, hurricanes, coastal floods and so on. The polar bears will already have gone. Whoever does it will need cast-iron political cover against the unforeseen consequences – including the risk of killing millions more.

2. For the same reasons, the measures cannot be national or regional in scale. They will be inherently global in their effects, even if carried out by or in a single country. The political cover accordingly has to be global.

3. There’s only a little room for experiment – primarily to test engineering feasibility and cost (say of Venetian blinds in space.) There’s so much noise in the climate system that the effects of small-scale pilot projects won’t be properly measurable. It will have to be live or nothing.

4. The knowledge required to manage an emergency global geoengineering scheme is very considerable, and very rapid and expensive action will be essential when things go wrong, as they probably will. Accordingly the scheme cannot be run democratically with any hope of success, only technocratically. Thought experiment: you have a project running on ocean fertilisation with iron in the Pacific. Evidence has come up that this is pumping up the El Niño cycle, with droughts and fires in Australia and the collapse of Peruvian fisheries. Do you suspend or not?

That's only the beginning. The bottom line is that geoengineering requires fleets of black helicopters to get done, a requirement that will not amuse the guys at the Breakthrough Institute and their CEI/Heartland type funders. (OK, that's a WAGNER, but Eli is a smart bunny.) Stuff like that on a global scale requires a global Genghis Khan to pull the strings.