Saturday, June 30, 2007

Lubos Runs a Three Card Monte Game

The greenhouse effect is a wicked combination of spectroscopy, radiation and collision dynamics superimposed on fluid flow. Gravity makes an appearance through the lapse rate, which among other things means that it is colder the higher you go, and there are a few other things thrown in for the joy of it. Explaining it to your mom is not recommended for a family fun night. One of the associated issues is that increasing greenhouse gas concentrations in general, and CO2 in particular, does not increase the greenhouse warming in a simple linear way. Spencer Weart (popular version) and Ray Pierrehumbert (more detail) recently essayed an explanation on Real Climate.

Into the swamp wanders one Lubos Motl, a wannabe climate guru who is running for the Bjorn Lomborg of the Czech Republic. Lubos then plays an elegant game of three card Monte with Richard Lindzen as the outside man on the short con. Watch the cards as you listen to the spiel. The trick here is to bury the switch in a morass of polemic and nuttiness. Rabett Labs has filmed the whole thing in slow motion and stripped out the flim to show you the flam. The rubber meets the road about a third of the way down:
focus on the gases and frequencies where the absorption rate significantly differs from 0% as well as 100% - will effectively eliminate oxygen and ozone. You end up with the standard gases - water, carbon dioxide, methane, and a few others.

Moreover, you can use the approximation that the concentration of water in the atmosphere rapidly converges to values dictated by other quantities.

which are determined by the extra forcing from increased CO2 concentrations - warming that raises the vapor pressure of water in the atmosphere. But Lubos goes on:
The conventional quantity that usually measures the strength of the greenhouse effect is the climate sensitivity defined as the temperature increase from a doubling of CO2 from 0.028% of the volume of the atmosphere in the pre-industrial era to 0.056% of the volume expected before 2100. Currently we stand near 0.038% of the volume and the bare theoretical greenhouse effect, including the quantum-mechanical absorption rates for the relevant frequencies and the known concentration, predicts a 0.6 Celsius degrees increase of temperature between 0.028% and 0.038%,
This is a low estimate, direct from Richard Lindzen, of the increase in global temperature due ONLY to the direct effect of adding CO2, without amplifying feedbacks. Low, but in bounds. And here comes the switch, and there goes the Queen into the grifter's pocket. . .
roughly in agreement with the net warming in the 20th century.
The measured net warming INCLUDES the feedback from increased water vapor in the atmosphere due to warming of the ocean surface. Lucky Lubos covers it back up again
This bare effect can be modified by feedback effects - it can either be amplified or reduced (secondary influence on temperature-driven cloud formation etc.) - but it is still rather legitimate to imagine that the original CO2 greenhouse effect is the driving force behind a more complex process
The direct influence of CO2 concentration increases is AMPLIFIED by the feedbacks. Period. So let's see. The direct effect of doubling CO2 would be an increase in global temperature of 0.6 to 1 K. The total effect, the climate sensitivity, will be 2-5 K, with a best estimate of 3 K (see the AR4 WGI and all over). And now, the fellow behind the cardboard box asks you to pick the shell with the pea under it:
In terms of numbers, we have already completed 40% of the task to double the CO2 concentration from 0.028% to 0.056% in the atmosphere. However, these 40% of the task have already realized about 2/3 of the warming effect attributable to the CO2 doubling. So regardless of the sign and magnitude of the feedback effects, you can see that physics predicts that the greenhouse warming between 2007 and 2100 is predicted to be one half (1/3 over 2/3) of the warming that we have seen between the beginning of industrialization and this year. For example, if the greenhouse warming has been 0.6 Celsius degrees, we will see 0.3 Celsius degrees of extra warming before the carbon dioxide concentration doubles around 2100.
In the reality-based community, we have had about 0.7 K of warming in the last century. That is 0.7/3.0, or about 23% of the warming predicted for a doubling, but because the Earth appears not to have re-established radiative equilibrium it is generally considered that there is an additional 0.5 K of warming built in even if greenhouse gas concentrations cease to grow. 1.2/3.0 = 40%.

As Eli said, watch the switch. The observed warming must be compared to the 3K climate sensitivity which INCLUDES feedbacks. By comparing the observed warming to the much lower 0.6-1K estimate without feedbacks it appears that much more of the warming has already occurred, and that adding more CO2 will have little effect.
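For the arithmetically inclined, here is the back-of-the-envelope version of that comparison as a few lines of Python. It is only a sketch using the numbers quoted above (0.028%, 0.038%, 0.056%, the 3 K best estimate, the 0.7 K observed and the ~0.5 K in the pipeline); nothing in it comes from Lubos.

from math import log

# Concentrations quoted above, as fractions of the atmosphere by volume
c_pre, c_now, c_2x = 0.028, 0.038, 0.056

# Since the forcing goes roughly as log(CO2), this is the fraction of a
# doubling's forcing that has already been realized
frac = log(c_now / c_pre) / log(c_2x / c_pre)
print("fraction of doubling forcing realized: %.2f" % frac)   # ~0.44

# Expected equilibrium warming so far for the bare (no feedback, ~1 K) and
# feedback-inclusive (~3 K) sensitivities
for sens in (1.0, 3.0):
    print("sensitivity %.1f K -> expect ~%.1f K at equilibrium" % (sens, frac * sens))

# The ~0.7 K observed plus the ~0.5 K still in the pipeline (~1.2 K) sits next
# to the 3 K feedback-inclusive number, not the bare 0.6-1 K one.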

There is, as there always is, much more. Anyhow, Lubos, as part of his pitch, tells us in detail that he is going to explain it all and starts by getting it wrong in strange ways. After a page or two of polemics, Motl starts out on the science OK, sort of, but quickly falls into two of the usual traps:
The requirement that low-energy transitions must be allowed within the molecule is why the mono-atomic inert gases such as argon or even di-atomic molecules such as nitrogen are not greenhouse gases. Those absorbed infrared rays that are relevant for the greenhouse effect are quickly transformed to kinetic energy of the atmosphere and this energy is either re-emitted in the downward direction or it is not re-emitted at all.
Molecules (and Motl) basically don't know which end is up, so the energy is re-emitted in all directions, not only downward, and although the thermal energy can be moved considerable distances by convection, it eventually has to be re-emitted to space. How the energy gets there in the end is what the greenhouse effect is all about. Lubos continues by showing a figure

The meaning of the ordinate is not clear, but Lubos does provide a link to David Hodell's version at the University of Florida from a course on Global Change. Hodell, btw, does not agree with Motl's foolishness. We learn that Motl's unattributed graph (Eli thinks he has seen it elsewhere, but can't remember) really shows the percentage of incoming solar radiation at a particular wavelength that is absorbed by a molecule before hitting the surface (Hodell's abscissa runs backwards from Motl's; Motl's spectra, wherever they came from, are a bit higher resolution). The Wikipedia article on global warming provides the context that Lubos lacks

UPDATE:7/2/2007 Lubos has now added this figure to his post. It was not originally there.

The sun's spectrum is roughly that of a 5800 K black body with a whole bunch of sharp atomic lines in it. Ozone (195-305 nm) and O2 (below ~0.24 um) absorb UV light. The emission from the surface is a smooth black body.

From the Wikifigure you can see that in the range where the earth emits IR, water vapor is the most important absorber, but CO2 plays an important role and tropospheric O3 also has an effect. (Note to the interested: The total emission from the sun is much higher than shown in the figure, but only a small fraction of the sun's light intersects the earth's orb, and the average amount of energy hitting the surface is further decreased because the sun don't shine anywhere half the time on average, and because most places it hits the surface obliquely. This all works out to a factor of 4 less than the solar intensity hitting a disk held at right angles to the sun in orbit. Thus the relative intensities of sunlight falling on the surface and IR emission from the earth are as shown.) Eli could niggle a bit more - both CO2 and H2O have strong absorption spectra below 200 nm which Lubos does not show (they don't play any significant role in absorbing solar radiation), etc. - but it is more fun to look at the next lubosaloser:
You can see that water is by far the most important greenhouse gas. We will discuss carbon dioxide later but you may also see that we have included oxygen and ozone, for pedagogical reasons. They don't have too many spectral lines but there is a lot of oxygen in the air, a thousand times the concentration of carbon dioxide! So you might think that the precise concentration of oxygen or ozone will be very important for the magnitude of the greenhouse effect, possibly more important than the concentration of carbon dioxide.
Lubos has not quite figured out the difference between a vibrational band and a spectral line. Bands consist of lots of lines. The resolution of these figures is too poor to resolve the lines. Ozone does contribute to the greenhouse effect, at least the ozone that we see in the troposphere, and ozone has lots of lines in its vibrational spectrum. Oxygen (O2) has a lot of spectral lines; it's just that they are mostly below 300 nm (there are some very weak ones in the near IR and red, from forbidden triplet-singlet transitions which, as Lubos says, you can see because there is a lot of O2). Motl continues down the bunny trail:
The reason why it's not true is that there is actually so much oxygen in the air that the radiation at the right frequencies is completely absorbed - 100% - while the radiation at wrong frequencies is of course not absorbed at all - 0%. At least ideally - when you neglect the collisional broadening and the Doppler width of the lines and other effects - it should be so. That's why the greenhouse effect of the oxygen doesn't depend on the concentration of oxygen in any significant way.
Codswallop. The reason why O2 does not contribute to the greenhouse effect is that, just like N2, it is a homonuclear diatomic. While the oxygen and nitrogen molecules can vibrate, because of their symmetry these vibrations do not create any transient charge separation. Without such a transient dipole moment, they can neither absorb nor emit infrared radiation.
***************
On top of this there are language problems which are not Lubos' alone. I/Io is the ratio of the intensity of light after passing through some distance L of anything to the intensity going in. I and Io are functions of wavelength. I/Io x 100 is the percent transmitted. Transmission is almost always given as a percentage. Transmission must be between 100% and 0%. The transmission can be calculated from the Beer-Lambert law
I/Io = exp (-SNL)
where S is the molecular absorption cross section, N the concentration and L the path length. S can in principle take any value (negative values of S would correspond to situations in which there were emitters in the path, not absorbers, in which case the transmission would be > 100%). SNL is called the absorbance (or optical depth) and is unitless.
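For the mice who would rather poke at numbers, here is a minimal Beer-Lambert calculator in Python. The cross section, concentration and path length below are purely illustrative values chosen to give a mid-range absorbance; they are not spectroscopic data for any real gas.

from math import exp

def transmission(S, N, L):
    # Beer-Lambert: I/Io = exp(-S*N*L), with S the absorption cross section
    # (cm^2), N the concentration (molecules/cm^3) and L the path length (cm)
    return exp(-S * N * L)

S, N, L = 1.0e-19, 1.0e18, 10.0   # illustrative numbers only
T = transmission(S, N, L)

print("absorbance SNL       = %.2f" % (S * N * L))        # 1.00
print("percent transmitted  = %.1f%%" % (100 * T))        # ~36.8%
print("percent absorbed     = %.1f%%" % (100 * (1 - T)))  # ~63.2%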

People are throwing around the word saturation. Saturation has a specific meaning for optical transitions. For a simple two-level system, when the populations of the upper and lower states are equal, the probability of absorbing a photon which matches the energy difference between the states is the same as that of emitting a second photon by stimulated emission (there are caveats), and the transition is said to be saturated.
Down the Rabett Hole

Anonymous and Steve Bloom bring word that NOAA has disappeared the Historical Climatology Network site locations down the Rabett hole. This occurred shortly after Roger Pielke Sr., Climate Fraudit and Anthony Watts declared jihad against the surface station network. Big City Lib had it right when he said:
Frankly, this sounds to me to be exactly what the email suggests it is…an attempt to protect volunteers from harassment by the kind of conspiracy nuts that Surface Stations is likely to inspire.
There must have been considerable pushback from those who maintain the stations when McIntyre's crew came around to give lectures.

UPDATE: BCL has some more words on this at his blog

The usual suspects have donned the harrumphing regalia, but Doug Hoyt wrote something interesting
I just got back from trying to photograph a weather site and was told that the weather service had told them that people might come around asking to photograph the site. They were ordered to tell people that photographs are not allowed.
The back story will be interesting.

UPDATE: Anonymuse returns in the comments

When Climate Auditors attempted,
NOAA bureaucrats to cajole,
The station data disappeared,
Down a Rabett hole.

"I swear its a conspiracy.
They want to hide the stations.
They've painted Stevenson screens in camo,
And severed all relations."

"They won't reply to my e-mails,
And refuse to pick up the phone.
I know the station manager's there,
I see him all alone."

"By night, they're removing blacktop
and laying down the sod,
To make it all look A-OK
An A-1 site, by God."

"With guards around the perimeter,
I can't get near enough,
To take a decent photo,
You know, it's really tough."

We gotta show the world,
What NOAA's trying to hide,
These Global-warming believers
Are taking us for a ride.

"If only we could use satellites,
We could get the incriminating pics,
Without risking life and limb,
And dealing with the dicks."

Thursday, June 28, 2007

NOx


Hans Erren was kind enough to point to an interesting site in the Netherlands with a lot of data on trace gas concentrations from various satellites. He said:
If you really want to get an impression of the distribution of antropogenic CO2 emissions you can use the spectral NOx proxy.
Eli took a look at the June 2003 (that's what came up on entry) NO2 map (yes, Eli knows NOx is strictly NO + NO2; PAN belongs to the broader NOy family) from the GOME satellite instrument and replied

NOx is a measure of high temperature combustion without trapping or catalytic conversion. It arises in situations where the temperature is high enough that O and N atoms are formed, which then react with N2 and O2 to form NO.

NOx is the principal precursor of tropospheric ozone. It will be reduced relative to CO2 in any location where there are legislated ozone limitations. Thus California, the US and Europe have relatively low NO2/CO2 ratios, while over the industrial areas of China the ratios are huge.


Rabett Run had just blogged on CO2 distributions using a direct global mapping, so this did not appear all that interesting. However, idle time being the curse of long ears, we googled back to the site that Hans had pointed to and looked at the January distribution (actually a couple of years of monthly distributions). It turns out that the tropospheric NO2 is much higher in the winter than in the summer. Why?

Well, one of the first things you can think of is that there are almost twice as many hours of sunlight in the summer as in the winter (in these parts). That means that the first step in the cycle that creates tropospheric ozone

NO2 + hv --> NO + O(3P)

is much more likely to occur, which depletes NO2. BTW, this requires light at wavelengths shorter than 400 nm, i.e. UV-A and UV-B light, and there is more of that in the summer too. We discussed this with our friend who does air pollution modelling and that seemed like a good place to start, but, of course, there is the problem that the rest of the cycle goes:
  1. NO2 + hv --> NO + O(3P)
  2. O + O2 + M --> O3 + M
  3. OH + CO --> H + CO2
  4. H + O2 + M --> HO2 + M
  5. HO2 + NO --> OH + NO2
----------------------------
CO + 2O2 + hv --> CO2 + O3
where the last line shows the net reaction. The net is why CO is often taken as a proxy for tropospheric ozone: you can only complete the catalytic cycle if you convert CO to CO2 by reaction with OH radicals! Otherwise you only have a single step and very little ozone-generated smog. Step 5 regenerates NO2, so we don't quite have the answer.

There is one other thing we need to add, termination
OH + NO2 --> HNO3 (nitric acid which gets washed out.)
OH is produced by O3 + hv (< 310 nm) --> O(1D) + O2, followed by abstraction of an H atom from water vapor, O(1D) + H2O --> OH + OH. The O atom must be in the first electronically excited state. There is much less OH in the atmosphere in the winter because there are fewer daylight hours and much less UV. That means less H atom generation (reaction 3), thus less HO2 (reaction 4), and a slower recycling of NO back to NO2 (reaction 5), but most importantly, in the winter there will be less conversion of NO2 to HNO3.
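A rough way to put a number on that last point is to estimate the lifetime of NO2 against the OH + NO2 --> HNO3 termination for typical summer and winter OH levels. The little Python sketch below uses order-of-magnitude values Eli is supplying for illustration (an effective rate constant near 1e-11 cm3 molecule-1 s-1 and daytime OH around 2e6 per cm3 in summer, roughly ten times less in winter); they are ballpark figures, not measurements.

# Lifetime of NO2 against conversion to HNO3 by OH: tau = 1 / (k * [OH])
k_oh_no2 = 1.0e-11                 # cm^3 molecule^-1 s^-1 (illustrative effective value)
oh = {"summer": 2.0e6,             # molecules/cm^3, typical daytime OH (illustrative)
      "winter": 2.0e5}             # roughly ten times less sunlight-driven OH

for season, conc in oh.items():
    tau_hours = 1.0 / (k_oh_no2 * conc) / 3600.0
    print("%s: NO2 lifetime against OH roughly %.0f hours" % (season, tau_hours))

# With an order of magnitude less OH in winter, NO2 survives about ten times
# longer before being washed out as nitric acid, so the same emissions pile up
# to a larger column.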

This was such a pleasing discussion and change of subject (they are changing the accounting system at work, and we are all shell shocked) that we celebrated by going out and raising a glass of carrot juice to Hans.

Under the Hood


Thoreau at High Clearing has a very nice article describing what peer review really is (hint: it ain't auditing). He describes how he handled a paper using four questions (the details are at his blog; these are just the questions, not the answers):
1) What the hell are they doing here?
2) Do their results make sense? Are the methods solid?
3) Are these calculations relevant to anything? Do they teach us something new?
4) Would these calculations be of widespread interest?

Wednesday, June 27, 2007

What tenure is for:

John Whitehead writes at the Environmental Economist:

An email sent to my Provost and cc'd to my Chancellor:

In regards to the report titled "Measuring the Impacts of Climate Change on North Carolina Coastal Resources" by the Walker School of Business, I am curious to know if this document was peer reviewed prior to its release?

This report completely ignores potentially positive aspects of global warming from which Coastal North Carolina would benefit significantly. . . . . . .

Since Appalachian State is a publicly-funding institution, I am going to make my extreme displeasure known to my legislative representatives. I expect nothing less than a full retraction, public apology and discipline of the authors.
The son also rises

The NY Times published a survey of Americans between the ages of 17 and 27. Among the interesting findings were:

53. Which comes closer to your view: 1. Global warming is a very serious problem and should be one of the highest priorities for government leaders OR 2. Global warming is serious but does not need to be a high priority OR 3. Global warming is not serious and can be addressed years from now.
  • Very Serious and should be one of the highest priorities for government leaders
    All adults 52% Age 17-27 54%
  • Serious, but does not need to be a high priority now
    All adults 37% Age 17-27 35%
  • Not serious and can be addressed years from now.
    All adults 8% Age 17-27 10%
  • Don't know/No answer
    All adults 3% Age 17-27 1%
31. How important to you are government policies that try to reduce gas and oil use by consumers -- very important, somewhat important, not too important, or not at all important?
63% Very - 27% Somewhat - 5% Not too important - 4% Not at all - 1% DK/NA

This should put paid to many amusing claims. Obviously the word has gotten out.

Monday, June 25, 2007

Retired Reading

For those evenings when you want to settle in with a good mag, the good anonymice point to Rolling Stone and Tim Dickinson's article
The Secret Campaign of President Bush's Administration To Deny Global Warming.
Of course, Eli did not think it was a secret. As the article describes the US's current policy on climate change:
The National Academy of Sciences blasted the policy, saying it lacked a "guiding vision, executable goals, clear timetables and criteria for measuring progress." Even the technology promoted in the president's plan was bogus. "It's as if these people were not cognizant of the existing science," one member of the academy remarked. "Stuff that would have been cutting-edge in 1980 is listed as a priority for the future."
Phil Cooney then chief of staff of the White House Council on Environmental Quality plays a starring role as editor:
In May 2002, the administration released its Climate Action Report, a dispatch to the U.N. that documents progress on climate-treaty obligations. The report was developed by the EPA, but internal documents reveal that Cooney edited it to reflect positions advocated by the API and Ford. On the opening page of the chapter on climate impacts, Cooney inserted a litany of language in bold intended to cast doubt on the science: "the weakest links in our knowledge . . . a lack of understanding . . . uncertainties . . . considerable uncertainty . . . perhaps even greater uncertainty . . . regarded as tentative."

But the clumsy caveats weren't enough to obscure the report's real science. With the help of an EPA source, The New York Times filtered out Cooney's waffling and filed a front-page story that called the report "a stark shift for the Bush administration." The report, the Times observed, detailed "far-reaching effects that global warming will inflict" and "for the first time mostly blames human actions for recent global warming."

Cooney was horrified: An obscure government report he had tried to whitewash now threatened to undermine his former employers in the energy industry. Panicked, he called on an old friend for help. Myron Ebell had been a key member of the coalition that crafted the disinformation "action plan." In fact, casting doubt on global warming is Ebell's full-time job: He heads the climate-denial campaign at the Competitive Enterprise Institute, a think tank that was underwritten in part by ExxonMobil.

Funny how Myron pops up

Also Eric Bates interviews Al Gore,
The world's leading climate scientists - the Intergovernmental Panel on Climate Change - issued a report earlier this year that shows global warming is far more advanced than even the most dire predictions had led us to believe. Is there any one finding from the most recent wave of science that alarms you?
The degree of certainty the scientists are willing to assign to their conclusions has gone up. But what's more interesting to me than the IPCC report is the stream of evidence just in the last five months since that report. Many scientists are now uncharacteristically scared. The typical pattern in a dialogue between scientific experts and the general public, of which I'm a part, is for the scientists to say, "Well, what you've heard is a little oversimplified. It's a lot more textured than that, and you need to calm down a little bit." This situation is exactly the reverse. Those who are most expert in the science are way more concerned than the general public.
Robert Kennedy Jr. describes how industry (well the smart ones) see meeting the challenge of man made climate change as an opportunity, not a cost
"We haven't even touched the low-hanging fruit yet," Kim Saylors-Laster, the vice president of energy for Wal-Mart, told the assembled CEOs. "We're still getting the fruit that has already fallen from the trees."

As the discussions at the summit demonstrated, America's top executives know something that the Bush administration has yet to realize: America doesn't need to wait for futuristic, pie-in-the-sky technologies to cut its reckless consumption of oil and coal. Our last, best hope to stop climate change is the free market itself. There is gold in going green, and the same drive to make a buck that created global warming in the first place can now be harnessed to slow the carbon-based pollution that is overheating the planet.

And Evan Serpic writes about putting together the Live Earth concert.

Sunday, June 24, 2007

Forest fires

(This is a continuing part of the mob blog on the carbon cycle. Tamino at Open Mind is posting on the latest trends in atmospheric CO2 concentrations and Simon Donner at Maribo wants to tell you about where all the carbon goes.)

There is an ongoing discussion in our mob blog about the role of forest fires. Horatio Algeranon lit the incendiary device. Tamino picked it out as particularly interesting and re-published the comment. Eli was wondering if there was a way to get at this.

Because most of the combustion occurs at relatively low temperature (relative here still being high enough to cook bunnies and little deer, plus the occasional house) where the system is very much NOT optimized for emissions, forest fires are huge sources of CO (carbon monoxide). This is also true of agricultural burning, whether to clear ground for farming or to reduce crop stubble after a season. The AIRS instrument can track CO as well as CO2.



The above map from September 2002 shows the major agricultural burning going on in Amazonia, Africa and Indonesia. This burning is quite interesting, also being a major source of tropospheric ozone in the Southern Hemisphere. Mongabay has a spectacular series of false color images showing how the burning in Africa follows the seasons. SAFARI and EXPRESSO are two measurement campaigns that have given us more information on this.

However, back to the point. There are two videos that all mice should see. The first shows CO emissions at the 500 mb level in August and September of 2005. The second shows CO emissions from the huge fires in Alaska in 2004. They occurred from May to September. Eli is not sure what the calibration is on the last one, so it is hard to compare. On the other hand, we can look at the CO2 readings at Barrow, Alaska, and see if there is a jump in the period of the fires:
       Mar    Apr    May    June   July   Aug    Sept  
2003 381.44 381.43 382.21 380.76 371.00 364.74 368.28
2004 382.17 383.80 383.47 380.50 371.75 366.50 367.86
And we can do the same for Alert Station Canada which is downwind of the fires
       Mar    Apr    May    June   July   Aug    Sept
2003 380.90 381.39 382.38 381.02 373.78 367.97 368.55
2004 381.58 383.21 383.58 382.59 374.58 368.69 368.55
The jump, if any, is well within the range of natural variability, which can be gauged from the March and April readings. Seasonal variability is 12-15 ppm at these high latitudes. Anyone interested in more information can go to the linked CDIAC site.
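For anyone who wants to check that by eye, here is a small Python sketch that uses only the monthly values tabulated above and prints the 2004 minus 2003 differences for each station.

months = ["Mar", "Apr", "May", "June", "July", "Aug", "Sept"]
barrow = {2003: [381.44, 381.43, 382.21, 380.76, 371.00, 364.74, 368.28],
          2004: [382.17, 383.80, 383.47, 380.50, 371.75, 366.50, 367.86]}
alert  = {2003: [380.90, 381.39, 382.38, 381.02, 373.78, 367.97, 368.55],
          2004: [381.58, 383.21, 383.58, 382.59, 374.58, 368.69, 368.55]}

for name, data in (("Barrow", barrow), ("Alert", alert)):
    diffs = [b - a for a, b in zip(data[2003], data[2004])]
    print(name + ": " + "  ".join("%s %+.2f" % (m, d) for m, d in zip(months, diffs)))

# The 2004 minus 2003 offsets during the fire months (May-Sept) are about the
# same size as the offsets in Mar-Apr, before the fires started, and tiny
# against the 12-15 ppm seasonal swing.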

On the other paw, the issue is not so simple. Forest fires put CO2 into the air from burning, but they can also decrease CO2 emissions from soils after the fire by, among other things, destroying ground cover.
Coming soon to your comments section

Eli was reading Stoat, when he came across this comment from one T. Allen

Its a big mistake we are making talking to alarmists as if they have evidence of their own. We get caught out in various dead end arguments and it creates the false impression that there are some good points on both sides of the debate.

Whereas in reality whats going on is the refusal of the alarmists to show up with evidence for their paradigm. A paradigm which is already falsified.

This refusal to put forward evidence is a good tactical move considering the pack-instinct and common purpose of the alarmists. Because if they never put forward any evidence the relatively easy process of overturning this evidence cannot proceed.

Now, besides lack of content, there are some exceedingly odd constructions in there, which naturally raises the astroturf flag. The bunnies at Rabett Labs rolled out the Googler for: alarmist paradigm climate, and came up with the excellent Bob Carter at the Daily Telegraph.

Carter, of course, is a geologist, a rock head if you will, who sees no difference between six million years and six years. Geologists tend to ignore exponents evidently:
The first is a temperature curve for the last six million years, which shows a three-million year period when it was several degrees warmer than today, followed by a three-million year cooling trend which was accompanied by an increase in the magnitude of the pervasive, higher frequency, cold and warm climate cycles. During the last three such warm (interglacial) periods, temperatures at high latitudes were as much as 5 degrees warmer than today's. The second graph shows the average global temperature over the last eight years, which has proved to be a period of stasis.
Looking at the change per year (or the rate of warming) over 3 million years (btw, those were great weather stations then, not like the kind that Ross and Roger are going on about, so we can trust them to the dot), and making the change 6 C to be generous, that is about 0.000002 C/year.
Our current global warming is about 0.015 C/year. But what's a few orders of magnitude, as long as the sign is correct? However, this is not the only place where Bobby does not understand rates.
Our devotee will also pass by the curious additional facts that a period of similar warming occurred between 1918 and 1940, well prior to the greatest phase of world industrialisation, and that cooling occurred between 1940 and 1965, at precisely the time that human emissions were increasing at their greatest rate.
In this case a rate is inappropriate. CO2 forcing depends on the absolute amount of CO2 in the atmosphere, roughly as log [CO2] (the forcing goes as the logarithm of the concentration). Writing [CO2] = [CO2]o + d[CO2], where [CO2]o is the concentration at the start and d[CO2] is the difference between the start and the finish, the change in forcing for a small d[CO2] is proportional to d[CO2]/[CO2]o. The change in the forcing depends on the ratio of the change to the concentration in the base period. Rates of change per unit time (d[CO2]/dt) can be very large if you start from a low level, but this has little to do with the effect, which depends on the amount of change relative to the pre-existing concentration.
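A couple of lines of Python make the point concrete. The 5.35 W/m2 coefficient is the commonly used logarithmic approximation for CO2 forcing (Eli's addition, not anything Carter supplies), and the concentrations are just round illustrative values.

from math import log

def forcing_change(c, c0, alpha=5.35):
    # Approximate change in CO2 radiative forcing (W/m^2) going from c0 to c (ppm)
    return alpha * log(c / c0)

# The same 20 ppm added on top of two different base concentrations
for c0 in (280.0, 380.0):
    print("adding 20 ppm to %.0f ppm gives %.2f W/m^2" % (c0, forcing_change(c0 + 20.0, c0)))

# What matters is the ratio d[CO2]/[CO2]o, not the ppm per year: the same
# absolute addition produces less forcing on a higher base.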

Of course, there is the bottom (and we are trolling here for bottom feeders) line
First, most government scientists are gagged from making public comment on contentious issues, their employing organisations instead making use of public relations experts to craft carefully tailored, frisbee-science press releases. Second, scientists are under intense pressure to conform with the prevailing paradigm of climate alarmism if they wish to receive funding for their research. Third, members of the Establishment have spoken declamatory words on the issue, and the kingdom's subjects are expected to listen.
which T. Allen has swallowed. BTW, notice the neat attempt to claim that NASA and NOAA are suppressing Landsea and Christy and Spencer. It's as if George Deutsch never existed and Thomas Knutson was being pushed forward daily by NOAA. As Warren Washington said

Warren Washington, a senior scientist at the National Center for Atmospheric Research in Boulder, said that Bush appointees are suppressing information about climate change, restricting journalists' access to federal scientists and rewriting agency news releases to stress global warming uncertainties.

"The news media is not getting the full story, especially from government scientists," Washington told about 160 people attending the first day of "Climate Change and the Future of the American West," a three-day conference sponsored by the University of Colorado's Natural Resources Law Center.

.....Washington said in an interview that the climate cover-up is occurring at several federal agencies, including NASA, the National Oceanic and Atmospheric Administration, and the U.S. Forest Service. NOAA operates several Boulder laboratories that conduct climate and weather research.

Washington's comments echoed statements made by NASA climate researcher James Hansen in a Jan. 29 article in The New York Times. Hansen said the Bush administration tried to stop him from speaking out after he called for prompt reductions in emissions of greenhouse gases linked to global warming.

......Washington insisted that government officials are "trying to confuse the public" about climate change and the scientific consensus that global warming is a real problem.

Rick Piltz at Climate Science Watch has the details
BBC Panorama: Has the Bush administration covered up the findings of global warming scientists? 6/07/06 - with remarks from NOAA's Thomas Knutson and Jerry Mahlman

NOAA censors speech by science experts on endangered salmon 6/04/06

Rep. David Wu requests GAO investigation of science manipulation and censorship 5/9/06 - including NOAA

House Science Committee Chair Calls for Reform of NOAA Public Affairs Policy 4/12/06

Washington Post reports "Climate Researchers Feeling Heat From White House" 4-9-06

Senators call for National Academy auditing of government reports on climate change 3-29-06 Directed toward actions of Dr. James Mahoney, Assistant Secretary of Commerce for Oceans and Atmosphere

Providence Journal: "NOAA hiding truth about hurricanes, scientists say" 3-29-06.

Former NOAA Lab Director: "Climate scientists within NOAA have been prevented from speaking freely" 3-10-06

Senate committee lets Adm. Lautenbacher off the hook on NOAA media restrictions 2-24-06 (Eli wrote on this one earlier)

Sen. Mikulski's letter requesting GAO report on openness in federal science communication 2-21-06 (specifically mentions NOAA)

Jim Hansen: NOAA "by fiat" put out "biased information" on hurricanes 2-16-06

The NOAA Media Policy: Political pre-approval for public communication by scientists 2-14-06
Claiming that others are doing to you what you are doing to others is an old denialist trick. Do not accept delivery.

Saturday, June 23, 2007

Latest trends in measuring CO2 sources

This is an addition to our mob blogging on carbon in the atmosphere and elsewhere. Tamino at Open Mind has a great article on trends in CO2 mixing ratios, Simon Donner at maribo tells you where the CO2 goes, and Rabett Run had a crude model showing how to model the flow of carbon using a box model. The comments from the anonymice have been everything we hoped for. Thanks.

Today, while looking for something else, Eli came across the answer to another question: where does the CO2 come from? Over the years there has been a fair amount of discussion about why we measure CO2 at Mauna Loa, or the South Pole, or Barrow, Alaska, or the other places where it is measured. Another strand we have seen the denialists use as a thin comb-over is that China should cut its emissions first, and, of course, that evergreen that North America (they always say the US, swallowing Canada) absorbs more CO2 than it emits.

AIRS is the Atmospheric Infrared Sounder on the Aqua satellite. Originally designed to measure water vapor (why do you think it is on Aqua), it has a ~2500 element IR spectrometer. Relatively recently NASA has figured out how to use this to measure CO2 in the atmosphere.

This is a mid-troposphere image at 8 km. Simon Donner has a recent series of posts on CO2 emission intensities, but this figure is worth a thousand posts (even Simon's :). Clearly the two most intense sources are the east and west coasts of the US. While there is a region of higher emissions to the west of China, it is much weaker than that from the US. In the southern hemisphere there is a band that passes through the River Plata area (Buenos Aires), South Africa and the more settled parts of Australia. The data is available, and we can look forward to the movie (starting in 2002).

Thursday, June 21, 2007

Monday, June 18, 2007


Something new

Something new. Three of us, Tamino, Maribo and Eli, have joined together to mob blog. Mob blogging, maybe our invention but Eli doubts we can patent it, brings together a group with similar interests, climate in this case, and creates a set of posts, one or more on each participating blog, about the same general subject, different aspects, linked together in time and purpose. As a first topic, we have chosen the carbon cycle. All of us encourage comments and discussion. We will happily cross-post and feature comments that we think add value to the mob. There will also be links to other resources on the carbon cycle that the mob and the mice think are really, really good.

Sunday, June 17, 2007

Doc Martyn gets the boots (with box model)

This is Eli's humble contribution to our first mob post. Tamino at Open Mind is posting on the latest trends in atmospheric CO2 concentrations (boy, they are going up) and Simon Donner at Maribo wants to tell you about where all the carbon goes.

To get back to the shoe store, for those of you who don't know, Doc Martens are steel-toed boots favored by the British version of football hooligans who want to get up close and personal. It is also the name of a climate hooligan on Real Climate who pretends he knows something about the carbon cycle and flims the flam. Eli ran into this character when he proposed to calculate the atmospheric lifetime of CO2:
To do this I needed an estimate of the amount of CO2 released by Humans per year. . . I also used the Hawaiian data from 1959 to 2003 (averaging May and November).

The steady state equation for atmospheric [CO2] ppm is as follows:-

[CO2] ppm = (NCO2 + ACO2)/K(efflux)

Where:-
NCO2 is the release of CO2 into the atmosphere from non-human sources
ACO2 is man-made CO2 released from all activities
and K(efflux) is the rate the CO2 is removed from the atmosphere by all mechanisms.

You can calculate that NCO2 is 21 GT per year, ACO2 in 2003 was 7.303 GT and K(efflux) is 0.076 per year. This last figure gives a half-life for a molecule of CO2 in the atmosphere of 9.12 years.

The Rabett's reply was
Flow of carbon (in the form of CO2, plant and soils, etc) into and out of the atmosphere is treated in what are called box models. There are three boxes which can rapidly (5-10 years) interchange carbon, the atmosphere, the upper oceans, and the land. The annual cycle seen in the Mauna Loa record (and elsewhere) is a flow of CO2 into the land (plants) in the summer and out of the atmosphere as the Northern Hemisphere blooms (the South is pretty much green all year long) and in reverse in the winter as plants decay. Think of it as tossing the carbon ball back and forth, but not dropping it into the drain. Thus Doc Martyn's model says nothing about how long it would take for an increase in CO2 in the atmosphere to be reduced to its original value.

To find that, we have to have a place to "hide" the carbon for long times, e.g. boxes where there is a much slower interchange of carbon with the first three. The first is the deep ocean. Carbon is carried into the deep ocean by the sinking of dead animals and plants from the upper ocean (the biological pump). This deep ocean reservoir exchanges carbon with the surface on time scales of hundreds of years. Moreover the amount of carbon in the deep ocean is more than ten times greater than that of the three surface reservoirs.

The second is the incorporation of carbonates (from shells and such) into the lithosphere at deep ocean ridges. That carbon is REALLY lost for a long long time.

A good picture of the process can be found at
http://earthobservatory.nasa.gov/Library/CarbonCycle/Images/carbon_cycle_diagram.jpg

A simple discussions of box models can be found at
http://www.nd.edu/~enviro/pdf/Carbon_Cycle.pdf

And David Archer has provided a box model that can be run online at
http://geosci.uchicago.edu/~archer/cgimodels/isam.html

and here is a homework assignment
http://shadow.eas.gatech.edu/~jean/paleo/sets/undergrads1.pdf

There ensued a great deal of posing on Doc's part and confusion about rather simple kinetics and similar things. Thus this post and some which will follow.

Eli has created a spreadsheet, called Box for Doc's boots, and the anonymice are free to play with it and improve thereon. You can download it at Rabett Labs, a new Google Group. The file walks you through a number of simple models. Eli does not have the space, and you the patience, to go through this in detail, but we can start with the simple stuff, which will give you an idea how to roll your own. There are lots of better ways to do this. Symbolic algebra (and more) packages such as Mathematica, Maple and Mathcad spring to mind, as do such things as Origin and Igor on the spreadsheet/graphing side, but lots of people have Excel, and the folks that have the other packages probably don't need this.

On the first sheet, called first order decay, you get to play with a simple, first order decay where the rate of change of [A] at any time t is simply proportional to the amount of A. The rate equation for this is

d[A]/dt = -k[A]


and if you don't speak calculus you can just read that as: the rate of change of the concentration of A with time is equal to minus some number (the rate constant, k) times the concentration of A. You can change the rate constant and the initial concentrations. Eli, being a simple Rabett, used Excel, and a simple differential equation solver called the Euler method. You can Google it, but the idea is that if you measure A at some time (t), and then again a little later at (t + dt), then
[A(t+dt)] = [A(t)] - k[A(t)] dt
This works if dt is small. In this case there is an exact solution A(t) = A(0) exp(-kt). Eli set the spreadsheet up so that you enter negative numbers (-k) for the rate constants.
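For the mice without Excel, the first order decay sheet boils down to a few lines of Python. This sketch just steps the Euler rule forward and compares the result with the exact answer; the values of k, A0 and dt are arbitrary toy choices.

from math import exp

k, A0 = 0.5, 100.0        # rate constant and initial amount, toy values
dt, t_end = 0.01, 10.0    # Euler step size and total time

A, t = A0, 0.0
while t < t_end:
    A = A - k * A * dt    # Euler update: [A(t+dt)] = [A(t)] - k[A(t)] dt
    t = t + dt

print("Euler: %.3f   exact: %.3f" % (A, A0 * exp(-k * t)))   # close, and closer as dt shrinks

Shrinking dt brings the two numbers together, which is the same thing that happens when you shrink the step size in the spreadsheet.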

This first order decay, basically, is the idea that DocMartyn was pushing. The problem, of course, is that you move the carbon from the fossil fuel deposit into the atmosphere, and from there it goes into the land (soils/vegetation/rabetts) and the upper ocean, but it does not get lost; it comes right back into the atmosphere in a few years.

To model this, we have a simple two-box model called "Opposing Reactions". You have box A and box B. And not only can you move carbon from box A to box B, but also from box B to box A. That looks like this

The rate constants are kab and kba respectively. After some time, the system will come to equilibrium. The rate equations are

d[A]/dt = - kab [A] + kba [B] and d[B]/dt = - kba [B] + kab [A]
if you add the two you find that d([A] + [B])/dt = 0; in other words, the total amount of carbon just moves between the two boxes. There is an exact solution of the system of equations, but here we simply use Euler integration. The ratio of the equilibrium values of [B] and [A] is simply given by

kab / kba = [B]eq/[A]eq

You can play with this by changing the initial amounts of [A] and [B] or the rate constants at the top of the spreadsheet. Remember this is a toy you can play with to get some feel for the system. If things start oscillating wildly or diverge in strange ways you probably have to decrease either the step size or the rate constants. There are a couple of other simple spreadsheets. The next, CO2 pulse, shows what happens if you push a pulse of CO2 into the atmosphere. This is followed by "fossil fuel", where a constant amount of CO2 enters the two-box system each time step.
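The opposing reactions sheet translates into Python just as easily. This sketch, with made-up rate constants and starting amounts, steps both boxes forward and checks that the final ratio [B]/[A] settles at kab/kba while the total stays fixed.

kab, kba = 0.3, 0.1      # toy rate constants for A -> B and B -> A
A, B = 100.0, 0.0        # start with all of the carbon in box A
dt = 0.01

for _ in range(5000):                   # 50 time units, long enough to equilibrate
    net = (kab * A - kba * B) * dt      # net transfer from A to B this step
    A = A - net
    B = B + net

print("B/A = %.2f   kab/kba = %.2f" % (B / A, kab / kba))   # both ~3.00
print("total = %.1f (conserved)" % (A + B))                 # still 100.0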

We are now in a position to put a more realistic carbon box model together. We can use the figure up towards the top to estimate how many Gt of carbon there are in each reservoir. We will for now exclude the geological reservoirs. Carbon moves very, very slowly into and out of rocks and sediments and we can treat them as being roughly constant. The first thing one observes is that the deep ocean, with ~38,000 Gt C, is MUCH bigger than the atmosphere (~750), the upper or surface ocean (~1000) and the land (~2000). The land includes both C in soils and in biosystems.

Us bunnies like to keep things simple and Mom Nature has helped us out. To a first approximation there is no flow of carbon between the land and the upper or deep ocean, and none between the atmosphere and the deep ocean for sure. Think about that for a moment: they really don't touch much. But Mom ain't that nice. We have been using first order rates to describe the flows between the various boxes. What that means is the change per unit time is equal to -kxy Mx, where Mx is the mass of carbon in a box and kxy is the rate constant. You can see what happens when everything is linear in the spreadsheet called linear box model. One of the things you have to do in this model and the more realistic one below is adjust the rates so that, with no fossil carbon flowing into the system, they balance each other and the flow into each reservoir equals the flow out. Eli has done this in the spreadsheet called equilibria.

The linear model does not work because two of the flows are actually highly non-linear, the flow from the surface of the (upper) ocean into the atmosphere, and the one from the atmosphere into the land/soils. The former is governed by a series of chemical equilibria between CO2, H2CO3, and the negative ions HCO3(-1) and CO3(2-) as well as other ionic species dissolved in sea water. Roger Revelle's major contribution was giving us an understanding of these complex equilibria. The bottom line is that the flow from the upper ocean to the atmosphere is proportional to the ninth power of the mass of carbon in the upper ocean (Mu^9).

The rate at which CO2 flows from the atmosphere into the land is controlled by photosynthesis. This is much slower than linear, proportional to Ma^0.2. This flow depends more on biological and solar factors than the amount of CO2 in the atmosphere.

Rabett Labs created the final two worksheets, which include the correct functionality for the fluxes between the land, air, and upper and deep oceans. In the first, called "Constant Fossil", the emission of CO2 from fossil fuel is constant over time. In the last one, called "Stop Emitting", you can enter your own scenario and see what the effects are. In the example shown below, Eli let emissions of CO2 from fossil fuel continue for 200 years at a bit more than today's rate and then cut them off. Notice that the decay takes hundreds of years.
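For the mice who would rather read code than Excel formulas, here is a minimal Python sketch of the same four-box setup. The reservoir sizes and the Mu^9 and Ma^0.2 dependences come from the discussion above; the pre-industrial gross fluxes used to balance the boxes (60, 90 and 100 GtC per year) and the 8 GtC/yr emission scenario are round numbers Eli is supplying for illustration, not values lifted from the worksheets, so treat the output as qualitative only.

# Four boxes: atmosphere, land, upper ocean, deep ocean (GtC, from the figure)
Ma, Ml, Mu, Md = 750.0, 2000.0, 1000.0, 38000.0

# Assumed balanced pre-industrial gross fluxes (GtC/yr) - round numbers
F_al, F_au, F_ud = 60.0, 90.0, 100.0

# Rate constants chosen so each flow balances its return flow at the start
k_al = F_al / Ma**0.2   # atmosphere -> land, goes as Ma^0.2 (photosynthesis)
k_la = F_al / Ml        # land -> atmosphere, linear (respiration and decay)
k_au = F_au / Ma        # atmosphere -> upper ocean, linear
k_ua = F_au / Mu**9     # upper ocean -> atmosphere, goes as Mu^9 (Revelle)
k_ud = F_ud / Mu        # upper ocean -> deep ocean, linear
k_du = F_ud / Md        # deep ocean -> upper ocean, linear

dt = 0.1                # years per Euler step
steps_per_year = 10

for year in range(1000):
    fossil = 8.0 if year < 200 else 0.0   # emit for 200 years, then stop
    for _ in range(steps_per_year):
        f_al, f_la = k_al * Ma**0.2, k_la * Ml
        f_au, f_ua = k_au * Ma, k_ua * Mu**9
        f_ud, f_du = k_ud * Mu, k_du * Md
        Ma += (fossil + f_la + f_ua - f_al - f_au) * dt
        Ml += (f_al - f_la) * dt
        Mu += (f_au + f_du - f_ua - f_ud) * dt
        Md += (f_ud - f_du) * dt
    if year % 100 == 0:
        print("year %4d   atmosphere %6.0f GtC" % (year, Ma))

# The atmosphere climbs while the fossil carbon is being added, then drains
# away only over centuries as the deep ocean slowly takes the excess up -
# the Mu^9 back-pressure keeps the upper ocean from swallowing it quickly.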

Anyhow, have fun with the new toy. David Archer has a more realistic model, Shodor has one, the Maryland Virtual High School has one, here is yet another, and there are more out there. The purpose of the Rabett Lab model is to simplify things as much as possible for the mice to play with.

Wikipedia SLAPPed?

Eli, the innocent bunny, appears to have wandered into the mother of all SLAPP suits, between Mark F. "Thor" Hearne and the Wikipedia. The Wikipedia is playing it tight, so this looks serious, folks. Will the Electronic Frontier Foundation ride to the rescue?

UPDATE: The Wikis have their say:

For those who are reading all kinds of wild stuff into this (yes, I've read Rabett's blog), I'll just say a few words about OTRS volunteers, who we are and what we do.

Wikipedia gets quite a lot of email from people who have a problem with an article, and it's our self-selected job to handle it. Usually when we get an email, it's because there really is something wrong with an article. It's not a matter of people saying "do stuff or we'll sue", because if it ever got to that level we'd have screwed up and the Foundation would be handling it (commonly known as Wikipedia:Office). We're ordinary editors and administrators who try to fix problems. If you're dissatisfied with the way we perform our edits or use our administrative powers, we're approachable and will fix a problem if we screw up. If you can't get agreement from us that we did screw up, you can use the normal dispute resolution procedure, up to and including Arbitration. We don't make an edit that we wouldn't do anyway, with due regard to the biographies of living persons policy and under all other relevant policies of the community. We're not special and we don't carry a badge. But we do ask you to please make allowances for the fact that we can't talk about emails we receive, which are confidential. --Tony Sidaway 06:13, 18 June 2007 (UTC)

Frankly, as Eli has said before, if this were the case the REASON for wiping the entry could be posted without posting the name. What we do know is that Hearne is a lawyer who was energetically editing his own biography and anything he could lay his hands on to eliminate all references to the ACVR.

CONTINUED: First some background. Mark Hearne is a partner at a St. Louis Law firm, with strong connections to the Republican Party in the US. He was, until not so long ago, also General Counsel and one of the founders of the American Center for Voting Rights, a 501c(3) (tax free) organization. According to its entry in the Wikipedia

The American Center for Voting Rights or ACVR was a non-profit organization founded by Mark F. "Thor" Hearne that operated from March 2005 to May 2007 and pushed for laws to reduce voter intimidation and voter fraud, including photo ID for voters. Its lobbying arm was called the American Center for Voting Rights Legislative Fund. Critics noted that it was "the only prominent nongovernmental organization claiming that voter fraud is a major problem," and called the Center a Republican Party front group whose support of a photo ID requirement was intended to suppress the minority vote.[1]
The Wiki entry for Hearne, in addition to large sections from his law firm bio, included
In February 2005, with encouragement from Karl Rove and the White House, Hearne founded the American Center for Voting Rights. Although the group posed as a nonpartisan watchdog group looking for voting fraud, critics, such as People for the American Way and various state chapters of the League of Women Voters, have said that the group was a Republican effort and pursued only allegations of voting fraud by Democrats.
As the scandals in the US Department of Justice heated up, and it became increasingly clear that the firings of old US attorneys and the appointment of new ones were closely connected with Republican attempts to suppress voting, the ACVR disappeared and Mark Hearne disappeared all references to it in his bio at the law firm. If you want more detail about Hearne's involvement, go read Murray Waas' article.

And then things began to disappear in the Wikipedia. It started as your usual revert war, but then the editors stepped in and sent Mark F. "Thor" Hearne down the memory hole by redirecting any Wiki search on Mark Hearne to the page for the ACVR. And when the redirect was reverted, they protected the revert against anonymice, and when it was reverted again, they protected the revert against anyone but an editor (listening, o Stoat?). BUT all the redirecting was done without placing ANY notice in the discussions or on the talk pages about WHY this was done, so it looked like your usual Wikivandals at play.

Now, as we said at the top, Eli is just your average carrot munching Rabett, and so he inquired (at 3:06 on June 16) WHY this was done on the page for asking about such things:

Mark F. "Thor" Hearne (edit|talk|history|links|watch|logs)ZScout370 has deleted a substantial article redirected anyone from this page to four times since 13 June an. Tizio has protected the page. These actions have eliminated a substantial article. The last edit by 146.115.58.152 should be restored and protected against ZScout370's vandalism.
and got this reply
Declined - talk to the protecting administrators who seem to have good reasons. Accusing them of vandalism is a bad idea. Kusma (talk) 13:42, 16 June 2007 (UTC)
which seemed a mite touchy. So the bunny first wrote to Kusma

Zscout370 provided no information in the "Mark Hearn" page discussion as to why he had taken action. This by itself violates the spirit of the Wikipedia which asks that major changes be commented on and justified in the discussion. After I inquired ZScout370 wrote that someone (no identification) sent an email asking that the article be removed and ZScout370 redirected it. ZScout370 provided no further information. This has no explicative power whatsoever. I have put a series of questions about this action on ZScout370's talk page. To justify ZScout370 action, to show that it was not vandalism, ZScout370 should provide satisfactory answers, such as, in what way was the article unfair or incorrect. Please note that in asking this I am not claiming that the article was fair or correct, but a short search shows that it was pretty accurate, with much of the information coming from Hearne's own bio at his law firm.

and then got hisself over to Z's talk page and saw this:

There is no reason to have Thor Hearnes page directed else where except for political motivations. Ignisdu 17:55, 14 June 2007 (UTC)

OTRS is the email communication system people outside of Wikipedia use to email us. Someone complained about the article and in my decision as an OTRS user, I decided to redirect the article. To answer Yellowdesk's questions, OTRS emails are private and only those with access can read the emails. No political considerations were made when doing this action, since Democrats also use the system. User:Zscout370 (Return Fire) 18:09, 14 June 2007 (UTC)

Which says very little, but it does establish the fact that there was a letter from someone somewhere somehow that said nasty things about the Wiki article on Mark F. "Thor" Hearne and the editors (whomsoevertheymaybe) assigned old Z to pull the plug. To their credit, they did provide some back up (see Kusma's backhander up above and)

  • I don't know if this helps, but I have reviewed the ticket, and looked at Zscout's actions here. The article on Hearne appeared to be what we are now coming to call a coatrack, an article that was actually about something else. Such articles can be POV forks in disguise, or may simply give undue weight to a small part of someone's life - especially where the rest is not especially visible, as appears to be the case here. The solution - a redirect - is sane and practical, and does not preclude discussing Hearne's role within the group, provided it is done in accordance with our policy on living individuals and is properly supported by sources. Guy (Help!) 19:20, 14 June 2007 (UTC)
  • Do you think there is a method to rectify the topic complained about so that editing may resume on the article? -- Yellowdesk 19:24, 14 June 2007 (UTC)
  • OK, thank you both, JzG / Guy and Zscout370 for the clarifications on this surprising topic. -- Regards, Yellowdesk 23:01, 14 June 2007 (UTC)

Guy and Yellowdesk are good buddies and backup for Z while he plunges the redirect in. Eli tho is a Rabett who reads, so first he asked Z:
*So let me understand this, you get a mysterious Email from somewhere, you will not disclose the contents, but you claim this validates your action??? The fact that you get Emails from Democrats also proves what?? Do you also take action on Emails from Democrats? Like what?? Sorry, that does not even pass the silly test.

It appears that there are several open questions. The first is what was the specific complaint about the content. Specific, e.g. factual errors, omissions not just and "I don't like it". If there were no specific complaints your action is against the spirit of the Wikipedia.
Second, what did you do to check the validity of the complaint? Please point to primary sources which show the specific complaints were valid. Other issues noted below. User:Eli Rabett

and Guy, well Eli asked Guy
*What specific part of Hearne's career is being neglected. This is a person who has been very prominent in the discussion about voting law in the US. Much of the text comes from his bio at his law firm, which was found at the top of a simple Google search http://www.lathropgage.com/people/detail.aspx?attorney=1584
A lot of this stuff in the Wiki bio has a "citation" needed in the last edit before you started to redirect the page. I just gave you one. What is missing in his law firm bio is his connection to the redirect page American Center for Voting Rights, information which he has only recently edited out of his law firm bio. BTW, there is a picture on that page also.

Since Hearne has actively tried to eliminate all references to his involvement with ACVR a further question arises as to whether the letter of complaint came from his law firm or from him and if it did, if any actions were threatened? User:Eli Rabett
Z proceeded to blow the gaff
I am not allowed to say. User:Zscout370 (Return Fire) 03:38, 17 June 2007 (UTC)
If you follow the droppings, even a young bunny could tell you what is going on.
Are you allowed to say why you are not allowed to say? Was it because legal action was threatened? If so, what actions is the Wikipedia taking? User:Eli Rabett
Tim got his Ball handed to him. We may hope for a similar outcome, but judging from some other editing going on at Wikipedia by the editors, there is a foul wind coming out of St. Louis.

====================================================

On the blogging side, Brad Friedman has been the go-to guy to read on Mark Hearne, and Josh Marshall has owned the US Attorney scandal story. Oh yeah, here is a real goodie from Murray Waas:

Hearn was reported to have urged, through a senior Justice Department official, William Mateja, that United States attorneys investigate voter groups registering individuals likely to be democratic voters, or to cease investigations of Republican candidates.[3]

Friday, June 15, 2007

Tim Ball Folds


Or as DeSmogBlog put it Ball Bails.
The self-styled Canadian climate change expert, Dr. Tim Ball, has abandoned his libel suit against University of Lethbridge Professor of Environmental Science Dan Johnson. . . .

Johnson said he is now considering whether to accept basic costs or to seek special costs, adding, “I also deserve an apology. I think the nation deserves an apology.”
This, hopefully, is the beginning of the end for denialist SLAPP suits.
A Strategic Lawsuit Against Public Participation ("SLAPP") is a form of litigation filed by a large organization, or in some cases an individual plaintiff, to intimidate and silence a less powerful critic by so severely burdening them with the cost of a legal defense that they abandon their criticism.
For too long they have used their financial support to suppress those who disagree with them. Ball's suit was just another in a long line of such actions, which started with S. Fred Singer's suit against Justin Lancaster.

Best wishes to Dan Johnson, and here's hoping he makes Ball sweat.

UPDATE: More links at Dan's web site

Wednesday, June 13, 2007

Max Weber and Climate Science

Henry Farrell at Crooked Timber has a useful take on how to handle the various flavors of popular denial. It is couched in the clothes of Why we shouldn’t play nice with David Horowitz: A Response to What’s Liberal about the Liberal Arts, the last being a book by Michael Bérubé about his encounters with the various beasts inhabiting the right wing of Ethon's US eagle cousins.
This is the problem of ‘John’ – a pretty obnoxious sounding conservative student in one of Bérubé’s seminars whose stridency in class was underpinned by an apparent unwillingness to think through the implications of his viewpoints. Second, the hacks – the Michelle Malkins, Dinesh D’Souzas, Abigail Thernstroms and David Horowitzes of the world, who purport to be public intellectuals, but who appear willing to be more or less dishonest in pursuit of political goals, and who in many cases don’t seem to believe in the ideal of independent inquiry that motivates the academy. Third are the serious critics, conservative and otherwise, whom Bérubé is willing to engage with (albeit rather impatiently) while complaining about their sometime unwillingness to dissociate themselves from the hacks.

Bérubé’s ideal academy is one that has a place for conservatives, and for people whom he disagrees with radically. Indeed, this is key to the “pragmatic anti-foundationalism” that underpins his specific form of procedural liberalism. Substantive liberals – those who believe in the importance of equality etc – don’t have a monopoly on the truth. Therefore, we need procedural liberalism too,
There is much here to read, and Eli is most happy to see the argument couched in terms of Max Weber's writings. Eli was a great Weber fan in college, and sees in Weber the kind of hard-headed realism coupled with analytical honesty that he aspires to himself. There is a great discussion of how to handle opinionated but not very knowledgeable students, the parties of the first part. If nothing else it shows the relative immaturity and shallowness of the various framing discussions found in these blogs. Eli has an excuse, he is a Rabett, but others have a powerful lot of reading to do. The parties of the second part, the hacks, are a different fish in the kettle
However, I think that they don’t provide good guidance when dealing with, say, David Horowitz, on, say, the horrible state of the academy. It’s worth examining Horowitz’s modus operandi to see why. His main line of attack is that of the standard political hack, concocting a farrago of innuendoes, half-truths and out-and-out lies in order to beat down those whom he sees as his political opponents. However, when he’s attacked in the same terms as those he himself engages in, he’s perfectly happy to appeal to academic norms of reasoned debate in order to accuse his accusers of themselves being politicized. When academics on the contrary try to engage him in reasoned debate, they’ve lost the battle before they’ve started it. They grant his (often preposterous) claims a credibility that they don’t deserve, and set themselves up to have the bejasus beaten out of them through distortion, selective editing etc.
Bérubé put it pretty clearly
In this context—the Chronicle, as opposed to Hannity & Colmes—this grants Horowitz, and his complaints about academe, a certain legitimacy. My job, therefore, is to contest that legitimacy, and to model a way of dealing with Horowitz that does not give him what he wants: namely, (1) important concessions or (2) outrage. He feeds on (2), of course, and uses it to power the David Horowitz Freedom Center and Massive Persecution Complex he runs out of Los Angeles; and most of the time, we give it to him by the truckload. Liberal and left academics need to try (3), mockery and dismissal, and thereby demonstrate, as I put it on my blog, that when someone tries to blame tuition increases on Cornel West’s speaking fees, that person needs to be ridiculed and given a double minor for unsportsmanlike bullshit.
Which, again, is something that Eli has aspired to and it is a lesson that those who counsel constant comity need to learn. Go read the whole thing.

Sunday, June 10, 2007

Childhood Origins of Adult Resistance to Science

There has been considerable discussion about how to "frame" arguments about evolution, climate science, stem cells, etc. In the May 18th issue of Science, Paul Bloom and Deena Skolnick Weisberg provide a cognitive scientists' view of the problem. They attempt to explain the refusal of large numbers of citizens to acknowledge evolution while accepting the supernatural as everyday. This short review and the underlying cognitive research also have important implications for science education. Psychological studies have shown that from birth children form models of their physical environment that enable them to manipulate the world. Quoting from Carey, they point out that
The problem with teaching science to children is, “not what the student lacks, but what the student has, namely alternative conceptual frameworks for understanding the phenomena covered by the theories we are trying to teach”[2].
Surprising examples abound. For example, Bloom and Weisberg note that a child soon learns that things fall when dropped, but research has shown that this makes it hard for children to understand that the Earth is a sphere. After all, would not the people on the bottom of the sphere fall off the Earth? This is exacerbated by today’s ubiquity of digital media, where things do not always behave as physical laws would require. A correct virtual simulation on a computer or video looks no more valid than an incorrect one, and students are daily exposed to innumerable incorrect simulations in games and videos of all descriptions.

We have all seen cartoons where, as a joke, a character tunnels through the Earth and comes out the other side, only to fall into space. This has an entirely different meaning to a young child than to an adult.

In a world filled with complex constructed visual images, why should the image shown in a science class or an Internet applet be of more value than one in the video game the student plays after class (or sometimes during)?

Bloom and Weisberg offer an instructive example. In the figure at the left, students were asked whether a ball would follow the path shown in A or B after it leaves the curved tube. Although many chose incorrectly, it was found that actual observation could change minds and models. However, one must also realize that a simulation could show either A or B, and it is only the “real thing” that allows a proper differentiation. There is another barrier
The examples so far concern people's common-sense understanding of the physical world, but their intuitive psychology also contributes to their resistance to science. One important bias is that children naturally see the world in terms of design and purpose. For instance, 4-year-olds insist that everything has a purpose, including lions ("to go in the zoo") and clouds ("for raining"), a propensity called "promiscuous teleology". Additionally, when asked about the origin of animals and people, children spontaneously tend to provide and prefer creationist explanations. Just as children's intuitions about the physical world make it difficult for them to accept that Earth is a sphere, their psychological intuitions about agency and design make it difficult for them to accept the processes of evolution.
Often direct observation is not possible, or extremely time consuming. People learn to deal with these situations by evaluating the trustworthiness of the source that offers the information. This explains the central role of the teacher in learning and the difficulties associated with issues which have become entangled in faith and politics (evolution, climate change). It also explains why it is important to directly challenge those offering false information. The bottom line is

These developmental data suggest that resistance to science will arise in children when scientific claims clash with early emerging, intuitive expectations. This resistance will persist through adulthood if the scientific claims are contested within a society, and it will be especially strong if there is a nonscientific alternative that is rooted in common sense and championed by people who are thought of as reliable and trustworthy. This is the current situation in the United States, with regard to the central tenets of neuroscience and evolutionary biology. These concepts clash with intuitive beliefs about the immaterial nature of the soul and the purposeful design of humans and other animals, and (in the United States) these beliefs are particularly likely to be endorsed and transmitted by trusted religious and political authorities. Hence, these fields are among the domains where Americans' resistance to science is the strongest.

These are truths that can cause one to despair, but they also offer insight into what must be done to educate people about science-based issues, and a more sophisticated analysis of the tactics used by those spreading disinformation.

Thursday, June 07, 2007

Twitchy whiskers

Dano has pointed out that the folks over at Climate Audit are undergoing mentalpause, gone all twitchy about the bunny they have. They even hauled out the "anyone who talks to himself in the third person is a nutjob" line. They obviously failed imagination and literature, but keep the Sitemeter spinning.

Now in general Eli approves of such behavior at the beginning of the summer. Eli and Ms. Rabett have hied off to the beach to look at the bunnies, the sea and the shopping, all low brain function activities. We wish you and yours much the same.

However, in the middle of this, Hans Erren asked a thought-provoking question over here
Given the fact that the updated Labrijn series for De Bilt was already available since 1995, don't you think that somewhere down the line GHCN and GISS did a very sloppy job with their homogenisation adjustment QC?
Eli provided a simple answer
Eli is seriously at the beach, but within those constraints, the GISS adjustments are based on a method they apply across the board, so they probably prefer to be uniform. Don't know enough about the GHCN homogenisation adjustments.
Which, after some more investigation, turns out to be both right and incomplete. This being later at night, nothing on the tube, and too late to hop in the car and find a beer, with the sweet lassitude of summer nights upon us, the Rabett hied off to CA to shake up the blood, and ran into the same thing amidst the sea of bile. Ethon says that there must be liver there with so much bile and went off with his straw. Even beach places have T1 these days, so after a bit of poking about, it became clear that these different homogenizations were optimized for, and are best used for, different things.

A good place to start is Hansen, Sato et al. (1999), which explains how they combine the records at any location to obtain a single record.
The single record that we obtain for a given location is used in our analyses of regional and global temperature change. This single record is not necessarily appropriate for local studies, and we recommend that users interested in a local analysis return to the raw GHCN data and examine all of the individual records for that location, if more than one is available. Our rationale for combining the records at a given location is principally that it yields longer records. Long records are particularly effective in our “reference station” analysis of regional and global temperature change, which employs a weighted combination of all stations located within 1200 km as described below.
For urban stations, they apply a homogeneity adjustment
An adjusted urban record is defined only if there are at least three rural neighbors for at least two thirds of the period being adjusted. All rural stations within 1000 km are used to calculate the adjustment, with a weight that decreases linearly to zero at distance 1000 km. The function of the urban adjustment is to allow the local urban measurements to define short-term variations of the adjusted temperature while rural neighbors define the long-term change.
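To make that concrete, here is a minimal sketch of the linear distance weighting described in the two excerpts above (weights falling linearly to zero at 1000 km for the urban adjustment, or 1200 km in the reference station analysis). This is hypothetical Python written for illustration, not the GISS code, and the numbers in the example are made up.

def linear_weight(distance_km, cutoff_km=1000.0):
    # Weight is 1 at the station and decreases linearly to 0 at the cutoff.
    return max(0.0, 1.0 - distance_km / cutoff_km)

def rural_trend_estimate(rural_stations, cutoff_km=1000.0):
    # Distance-weighted mean of the rural neighbors' long-term trends.
    # rural_stations: list of (distance_km, trend in degrees C per century).
    # The quoted rule requires at least three rural neighbors.
    usable = [(d, t) for d, t in rural_stations if d < cutoff_km]
    if len(usable) < 3:
        raise ValueError("need at least three rural neighbors within the cutoff")
    total_weight = sum(linear_weight(d, cutoff_km) for d, _ in usable)
    return sum(linear_weight(d, cutoff_km) * t for d, t in usable) / total_weight

# Made-up example: three rural neighbors at 100, 400 and 800 km with
# long-term trends of 0.7, 0.9 and 0.6 degrees C per century.
print(rural_trend_estimate([(100, 0.7), (400, 0.9), (800, 0.6)]))

The idea, per the quote, is that the urban station keeps its own short-term variations while its long-term change is set by this distance-weighted rural estimate.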
Contrast this with the method currently used at De Bilt (can't find the Engelen and Nellestijn article) in Brandsma, T., G.P. Können and H.R.A. Wessels, Empirical estimation of the effect of urban heat advection on the temperature series of De Bilt (The Netherlands), Int. J. Climatology, 2003, 23, 829-845
The influence of urban heat advection on the temperature time series of the Dutch GCOS station De Bilt has been studied empirically by comparing the hourly meteorological observations (1993-2000) with those of the nearby (7.5 km) rural station at Soesterberg. Station De Bilt is in the transition zone (TZ) between the urban and rural area, being surrounded by three towns, Utrecht, De Bilt and Zeist. The dependence of the hourly temperature differences between De Bilt and Soesterberg on wind direction has been examined as a function of season, day- and night-time hours and cloud amount. Strong dependence on wind direction was apparent for clear nights, with the greatest effects (up to 1 °C on average) for wind coming from the towns. The magnitude of the effect decreased with increasing cloudiness. The analysis suggests that most of the structure in the wind direction dependence is caused by urban heat advection to the measuring site in De Bilt. The urban heat advection is studied in more detail with an additive statistical model. Because the urban areas around the site expanded in the past century, urban heat advection trends contaminate the long-term trends in the temperature series (1897-present) of De Bilt. Based on the present work, we estimate that this effect may have raised the annual mean temperatures of De Bilt by 0.10 ± 0.06 °C during the 20th century, being almost the full value of the present-day urban heat advection. The 0.10 ± 0.06 °C rise due to urban heat advection corresponds to about 10% of the observed temperature rise of about 1.0 °C in the last century.
Where they carefully concentrate upon a single station and a paired rural site. This study attempts to optimize the correction (and thus the record) for that one station, and the correction is based on one very local comparison. Which is best? Well, what are you trying to do? Obtain the optimal reconstruction for the De Bilt site, or the best reconstruction on a global scale?
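For the single-station route, a sketch of the kind of paired-station, wind-direction-binned comparison described in the abstract above might look like the following. Again this is hypothetical Python with made-up field names, not the authors' code.

from collections import defaultdict

def mean_diff_by_wind_sector(records, sector_width=30):
    # records: iterable of dicts with hourly 't_debilt' and 't_soesterberg'
    # temperatures (degrees C) and 'wind_dir' (degrees). Returns the mean
    # temperature difference (De Bilt minus Soesterberg) per wind sector.
    sums = defaultdict(float)
    counts = defaultdict(int)
    for r in records:
        sector = int(r["wind_dir"] // sector_width) * sector_width
        sums[sector] += r["t_debilt"] - r["t_soesterberg"]
        counts[sector] += 1
    return {s: sums[s] / counts[s] for s in sorted(sums)}

# As in the paper, the sample would first be restricted, for example to
# clear nights, before looking for the urban heat advection signature.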

Even in the latter case, the global reconstruction, there are different methods, each of which arguably can be useful. We see that with the USHCN data set. RTFR