Sunday, September 14, 2014

Nevada gambles on Tesla gigafactory

Electric vehicles, if they are charged with green electricity, can reduce carbon emissions. Battery technology is a key factor holding back electric cars. Physics Nobel laureate Burton Richter, in his admirable 2010 book Beyond Smoke and Mirrors: Climate Change and Energy in the 21st Century, recommends more research into battery technology.

Accordingly, there were national as well as local issues at stake this week when the Nevada Legislature met in special session. Legislators voted unanimously to give the Tesla company $1.3B in tax breaks as an incentive to build a $5B Gigafactory (battery factory) near Reno. Tesla claims the factory will create 6,500 new jobs, which works out to $200,000 per job.

I hope they got some of that in writing, because verbal promises are worthless. If Tesla ends up creating only half the number of jobs that it touted, are the tax breaks cut in half also?

Nevada's governor, Brian Sandoval, kept the legislature in the dark until the special session met, then presented the package as a take-it-or-leave-it deal.

Tesla was negotiating with other states besides Nevada, and was in a position to drive a hard bargain. It took a stupendous amount of bribery ($1.3B works out to $471 per Nevada resident) to get the factory in Nevada.

Skeptics think that Tesla stock is the latest bubble stock. Insiders at Panasonic, VW, and Daimler have expressed skepticism. Analysts say that the factory will only be profitable if it can reduce battery manufacturing costs by 30% from present levels and if Tesla can sell 500,000 cars per year. Last year, Tesla sold under 25,000 cars. The company is not currently profitable.

Critics of the deal came from the right and left. The right was represented by NPRI, the Nevada Policy Research Institute, which claims it supports government "transparency". However, NPRI refuses to disclose its funders, so its funding sources remain officially secret (though it is widely viewed as a front group for at least one big casino). NPRI thought the governor's calculations of the benefits of the factory were too optimistic. The left was represented by the NPLA, Nevada Progressive Leadership Alliance, which was concerned with funding vital government services. In the short run at least, the result will be a demand for government services, schools, police, fire, roads, etc., but without any additional tax revenues.

Governor Sandoval claims that Nevada will benefit by $100 Billion over the next 20 years, even after the tax breaks. Even if it doesn't happen, he'll still be OK. The factory won't be running before 2017 at the earliest, and Sandoval is widely touted as a potential vice-presidential Republican candidate for the 2016 election cycle.

I wrote to my state legislators, pointing out the price tag of $200,000 per job. I asked politely: if I form a company and create five jobs, do I get a million dollars in tax breaks? I will let my faithful readers at Rabett Run know when and if I hear anything. Don't hold your breath.

Saturday, September 13, 2014

A Note About Roger Revelle, Justin Lancaster and Fred Singer

As somebunnies may remember, S. Fred Singer is quite proud of an article in the magazine of the Cosmos Club (a club for movers and shakers in DC) written by Fred Singer and Chauncey Starr that had Roger Revelle's name on it. How Revelle's name got there, beyond the fact that Fred Singer put it there, is a matter of interest that is explained by Justin Lancaster, Revelle's student and last assistant. Lancaster, to put it softly, was quite skeptical that Revelle was in any meaningful sense an author of that paper and said so. Singer SLAPPED and got a statement from Lancaster as a settlement.

Eli has written about this where you can get up to speed on the whole thing. Of course since Al Gore learned about climate change from Roger Revelle, this has become a stick to beat Gore with. 

Today, Justin Lancaster left a note at Rabett Run that needs to be repeated:
I would have skipped weighing in further on this topic, except (1) it seems to never recede into history (it's surfacing in Climate Change discussions on Facebook in September 2014), and (2) my dear cousin Walter, for whom I hold sincere respect, clearly needs an update (I wish he'd contacted me directly before adding to this slog).

So let's be clear:

1. Fred Singer is the most unethical scientist, in my opinion, that I have ever met. I said so in the early 1990s, publicly, and I am still confident in the truth of this statement.

2. The worst decision I ever made in my life was to provide a retraction of my statements in the early 1990s about Singer's nastiness. The retraction was coerced. It was required to stop the SLAPP suit brought against me by a conservative think-tank in Washington that wanted to keep Fred Singer in action.

3. I was 95% certain that I would win my case in court. But my wife was terrified. In fact, she was terrorized by this lawsuit. We had three young children. I was a Harvard postdoc now needing to find a next academic posting. She was a graduate student at Harvard. My wife was worried about that 5% risk. She was scared we could lose our house and all our assets. We knew it would be a 2-3 year ordeal that would drain our resources and attention. The folks at NRDC and EDF chose to not step in; we couldn't afford the $100k+ that the lawsuit would cost. Defending for a year took an enormous amount of my time. That is the meanness and force of a SLAPP suit.

4. Singer distorted my words in his legal complaint and then even more so in his publication in the Hoover Institution volume. Singer flat out lied in that text about my role (and his wife, Candace Crandall contributed to this smear campaign). This chapter is not a sworn statement.

5. My testimony about what happened is sworn under oath, under penalty of perjury. I am an officer of the courts of VT, MA, CA and CO.

6. Everything I said was true. In my negotiations with the 8 lawyers from two national law firms, in which we scripted the retraction, I refused to state that anything I said was untrue. I never admitted to lying, because I never lied.

7. In the coerced retraction, I allowed that my remarks were "unwarranted," because my mother had commonly used that word when conveying to us that we need not have behaved the way we did. I realized that I could have proceeded more carefully and privately with Singer (which I initially had tried to do) and that I need not have made the issue so public. I also realized that because I was not in Revelle's office during the key session between Singer and Revelle, that I could have let Christa's affidavit and the galley proofs themselves speak the story. (Of course that was already hindsight, as Singer would not provide the galley proofs; I only got them from the Scripps archivist the night before my deposition of Singer).

8. I regret allowing the word "unwarranted" in the coerced retraction, because in fact my charges were fully justified when made. It was a three-hour negotiation, because Singer's lawyers wanted me to admit that I made false statements, but I refused. When my lawyer and I stood to quit the negotiation, saying "We'll be happy to see you in court in MA," there was a flurry of "Wait, wait," across the table. Eventually we settled on the word "unwarranted."

9. I never worked for Al Gore, I was not in any way involved in his political campaign and I had nothing to do with Gore's office other than getting a clip from him for a film on Roger's career that was shown in a film at the Rio Earth Summit. My entire focus was on a wrong being done to Roger Revelle's career and Roger's concern for the Earth environment and for humanity.

10. I had formed, in 1987, a non-profit named the "Environmental Science & Policy Institute (ESPI)." ESPI was the only non-governmental organization presenting scientific results at the 2nd World Climate Conference in Geneva, where I served on the Synthesis Committee. ESPI was an NGO registered at the Earth Summit. I was speaking widely at Dartmouth, Harvard, UC and other fora on the science and policy related to the carbon dioxide problem. I served on the NOAA Citizen Advisory Panel and was the first Chair of the Global Change Working Group within the Society for Risk Assessment.

11. Fred Singer started his "Science and Environmental Policy Project (SEPP)" in the early 1990s, practically in direct opposition to ESPI.

12. Singer was associated with an energy-industry-backed cabal, comprising at least Patrick Michaels and Robert Balling, and loosely coupling Hugh Ellsaesser, Richard Lindzen and some others. I was known to most or all of these folks through face-to-face encounters academically and in governmental meetings.

13. I had hoped that, after having been found with his hand in the cookie jar, Singer would have the good grace to leave this sordid issue in the historical dust bin. Giving him the retraction and apology I hoped would be sufficient. But it was not and he did not put it down. Instead he raised this issue prominently in the public eye, publishing my retraction in newspapers and blatantly misrepresenting the history in the Hoover chapter. And his cabal echoed it all widely to their key blogging network. And that has continued to cascade through many blogging layers, for now more than twenty years!

14. In 2006, when Gore published his "An Inconvenient Truth," this all erupted again, and I determined that enough is enough. I publicly and unequivocally repudiated and retracted the earlier "Retraction" that had been coerced, and I published the court documents and supporting affidavits and documentation so that people could read it for themselves.

15. The documentation is available online at Cosmos Myth

16. Singer and his supporters did not respond to my 2006 publication because they have no case. AGW is an issue of public concern. Singer is a celebrity in this field, perhaps the leading contrarian, skeptic, denier at the head of the pack for almost two decades. There are no objective canons of ethics in science (unlike for lawyers), so my charge of unethical can only mean "in my opinion" and "based on my standards." Not only do I believe my statements to be true, I have substantial evidence backing them up. And, we now have anti-SLAPP legislation in Massachusetts.

17. This entire episode has been investigated by journalists, described in chapters in two books, become the subject of a play and other media. Despite the bloggers who seem to continue to enjoy piling on the smear while ignoring the factual evidence, I'm comfortable with the outcome of the former more careful and thorough inspections.

Eli is grateful to Justin Lancaster for his courage in setting the record straight.

Willard Tony Plays Dr. Who

One of the advantages of the Tardis is that it allows going back in time and ignoring the present or even the more recent past.  This really is an advantage when you are pretending that the last word on proxy reconstructions of climate is Mann, Bradley and Hughes 1998, like Aunt Judy in the Climate Etc attic or Steve McIntyre reliving past cherry picks.  Willard Tony does a nice jig on ozone, republishing on Watts Up what he bleated over at PJMedia (are they still alive?).  There is much to giggle about in WT's attempt to appear profound, and Sou is on the case, but allow Eli to start from near the end.

Eli, being a fair bunny, can quote WT

Or does it? Adding to the madness, now there is scientific uncertainty about the actual extent of the ozone problem as it relates to CFCs. More recent science has shown that the sensitivity of the Earth’s ozone layer might very well be 10 times less than was originally believed back in the 1980s when the alarm was first sounded. As reported in the prestigious science journal Nature, Markus Rex, an atmospheric scientist at the Alfred Wegener Institute for Polar and Marine Research in Potsdam, Germany, found that the breakdown rate of a crucial CFC-related molecule, dichlorine peroxide (Cl2O2), is almost an order of magnitude lower than the currently accepted rate:
“This must have far-reaching consequences,” Rex says. “If the measurements are correct we can basically no longer say we understand how ozone holes come into being.” What effect the results have on projections of the speed or extent of ozone depletion remains unclear.
before, well, before.  That link goes back to 2007, almost a pause ago, and is not to a scientific paper but a news report, one which quotes Rex.  What this is all about is a claim by Pope et al from JPL that the absorption cross-section of ClO-OCl (aka the ClO dimer, or Cl2O2) was much smaller than had previously been measured.  This would mean that the rate at which it broke apart (photolyzed, or photodissociated) after absorbing a UV photon was much slower.  Thus there would be much less ClO and Cl available to participate in the catalytic destruction of ozone.
ClOOCl + hv --> Cl + ClOO
ClOO + M --> Cl + O2 + M
and the two Cl atoms react with two ozone molecules
2 (Cl + O3 --> ClO + O2)
after which the ClO self-reaction (ClO + ClO + M --> ClOOCl + M) closes the catalytic cycle, for a net destruction of two ozone molecules per photon absorbed.
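How a smaller absorption cross-section translates into slower photolysis is just the standard photolysis-rate integral (textbook background, not taken from either paper):

$$ J_{\mathrm{ClOOCl}} = \int \sigma(\lambda)\,\phi(\lambda)\,F(\lambda)\,\mathrm{d}\lambda $$

where $\sigma(\lambda)$ is the absorption cross-section, $\phi(\lambda)$ the photodissociation quantum yield, and $F(\lambda)$ the actinic flux. With $\phi$ and $F$ unchanged, a cross-section nearly an order of magnitude smaller implies a photolysis rate $J$ nearly an order of magnitude slower, which is why the Pope et al measurement mattered so much.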
To understand how blogscience works just read some of the comments in that Nature news item
Other groups have yet to confirm the new photolysis rate, but the conundrum is already causing much debate and uncertainty in the ozone research community. “Our understanding of chloride chemistry has really been blown apart,” says John Crowley, an ozone researcher at the Max Planck Institute of Chemistry in Mainz, Germany.
and of course, some of the debate and uncertainty led to new experimental measurements, but that takes a while, and a short while longer to get published.  In 2009, Eli commented on a paper from the Academia Sinica in Taiwan that conclusively showed Pope et al to be wrong.  Of course there was more, most importantly a paper by Burkholder's group (Papanastasiou, et al) at the NOAA Boulder lab (open version sort of).
The UV photolysis of Cl2O2 (dichlorine peroxide) is a key step in the catalytic destruction of polar stratospheric ozone. In this study, the gas-phase UV absorption spectrum of Cl2O2 was measured using diode array spectroscopy and absolute cross sections, σ, are reported for the wavelength range 200-420 nm. Pulsed laser photolysis of Cl2O at 248 nm or Cl2/Cl2O mixtures at 351 nm at low temperature (200-228 K) and high pressure (∼700 Torr, He) was used to produce ClO radicals and subsequently Cl2O2 via the termolecular ClO self-reaction. The Cl2O2 spectrum was obtained from spectra recorded following the completion of the gas phase ClO radical chemistry. The spectral analysis used observed isosbestic points at 271, 312.9, and 408.5 nm combined with reaction stoichiometry and chlorine mass balance to determine the Cl2O2 spectrum. The quoted error limits are 2σ and include estimated systematic errors. The Cl2O2 absorption cross sections obtained for wavelengths in the range 300-420 nm are in good agreement with the Cl2O2 spectrum reported previously by Burkholder et al. (J. Phys. Chem. 1990, 94, 687) and significantly higher than the values reported by Pope et al. (J. Phys. Chem. A 2007, 111, 4322). A possible explanation for the discrepancy in the Cl2O2 cross section values with the Pope et al. study is discussed. Representative atmospheric photolysis rate coefficients are calculated and a range of uncertainty estimated based on the determination of σCl2O2(λ) in this work. Although improvements in our fundamental understanding of the photochemistry of Cl2O2 are still desired, this work indicates that major revisions in current atmospheric chemical mechanisms are not required to simulate observed polar ozone depletion.
And what is the possible problem with the Pope study? Well, it turns out that they were measuring the absorption of the ClO dimer in the same region where Cl2 absorbs (between 300 and 400 nm, kind of bell shaped).  As the abstract discusses, the method of producing the ClO dimer also produces Cl2 as a by-product, and thus you have to know how much Cl2 there is in the mixture you are measuring. Papanastasiou, et al think that Pope et al got this slightly wrong (4.5%).

Also in 2009, there was a nice paper by the Anderson group at Harvard showing that the amount of Cl produced in the photolysis of ClOOCl was exactly what had been expected pre-Pope.

And finally, Pope, now at the University of Birmingham, has a new paper (2013) extending the spectral measurements into the visible using cavity ring-down.  Perhaps another post.

Eli tried to be nice, pointing this out to WT's fans, but you know. . .

Tuesday, September 09, 2014

A Kaya Festival

Somebunny has pointed Eli to a presentation on the Kaya identity from the Pacific Institute for Climate Solutions.  Not that Eli agrees with everything said, but it provides a basis for discussion, and as several (Dikran, palindrome and Marion) have pointed out, this draws out the usefulness of such things.  Anyhow, you can go to the Pacific Institute site and see the presentation full screen.  You can move through the sections using the menu on the left, or of course, as Eli recommends, you can alternate listening to Wynton Marsalis with the Kaya identity.

Oh yeah, the thing needs a minute or so to load and is set to start at the discussion of the identity, so just click yes. Take your time and beverage of pleasure. Eli will have a Carrot Cola.

Not that Eli Was Ever A Wynton Marsalis Fan

but this is good stuff to listen to before or during the serious in life . . . .

Sunday, September 07, 2014

On the Kaya Identity

Now that two of the three bunnies Eli considers most annoying on the INTERNET have spit the dummy on the Kaya identity, Eli thinks he might have a word.

$$\text{Global CO}_2\text{ Emissions} =(\text{Global Population})\left ( \frac{\text{Gross World Product}}{\text{Global Population}}\right )\left ( \frac{\text{Gross Energy Consumption}}{\text{Gross World Product}}\right )\left ( \frac{\text{Global CO}_2\text{ Emissions}}{\text{Gross Energy Consumption}}\right )$$

Dr Roy sees nothing wrong with it, but really does not see why it is useful. Willis E says it's an identity, CO2 emissions equal CO2 emissions, who cares?  Roger Jr. says the mathematics are simple, therefore all is good.  Eli will take his word on that.  Oh yes, Roger doesn't like Paul Krugman's take.  Perhaps the math was too complicated?

But, dear bunnies, Eli is here to defend the Kaya identity.  Measuring current, past or future CO2 emissions is not trivial.  The Kaya identity allows one to look at four different factors which may be more easily, and perhaps more exactly, estimated and/or measured.  Three of the factors are ratios.  While we may not be able to measure or estimate the numerators or denominators exactly, we can perhaps get a handle on the ratios. For the last two terms, estimates can be obtained by looking at a range of known component systems and trends.

The Kaya identity is useful in that it provides a handle on something we cannot necessarily measure directly, future CO2 emissions (and to an extent past ones).  In this it is very much like engineering thermodynamics which allows us to quantify things we cannot directly measure by providing relationships with things that we can.  Maxwell's equations are useful for other things than bedeviling junior chemistry majors.
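The point about the factors can be made concrete with a toy calculation. Everything below is a sketch with invented round numbers, not real statistics, and the function name is ours:

```python
# A toy numerical illustration of the Kaya identity. All numbers are round,
# made-up figures chosen only to show how the four factors multiply out;
# they are NOT real-world statistics.

def kaya_emissions(population, gdp_per_capita, energy_intensity, carbon_intensity):
    """CO2 emissions as the product of the four Kaya factors.

    population       : people
    gdp_per_capita   : $ per person per year
    energy_intensity : joules of primary energy per $ of product
    carbon_intensity : tonnes of CO2 emitted per joule
    """
    return population * gdp_per_capita * energy_intensity * carbon_intensity

# Hypothetical round numbers:
emissions = kaya_emissions(
    population=7e9,         # people
    gdp_per_capita=1e4,     # $/person/yr
    energy_intensity=2e6,   # J/$
    carbon_intensity=5e-11, # tonnes CO2/J
)
# The intermediate units cancel, leaving tonnes of CO2 per year.
```

The design point is exactly the one made above: each factor can be estimated (or targeted by policy) separately, and an improvement in any one of them scales the product directly.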

Eli was a most peculiar child

Wednesday, September 03, 2014

The Kind of Press Release Eli Likes

September 1, 2014 
Scripps Institution of Oceanography at UC San Diego today announced that Wendy and Eric Schmidt have provided a grant that will support continued operation of the renowned Keeling Curve measurement of atmospheric carbon dioxide levels. The grant provides $500,000 over five years to support the operations of the Scripps CO2 Group, which maintains the Keeling Curve. 
CO2 Group Director Ralph Keeling said the grant will make it possible for his team to restore atmospheric measurements that had been discontinued because of a lack of funding, address a three-year backlog of samples that have been collected but not analyzed, and enhance outreach efforts that educate the public about the role carbon dioxide plays in climate. 
"I'm very grateful to be able to return to doing science and being attentive to these records. When it comes to tracking the rise in carbon dioxide, every year is a new milestone.  We are still learning what the rise really means for humanity and the rest of the planet,” said Keeling.

Wendy Schmidt, co-founder with her husband of The Schmidt Family Foundation and The Schmidt Ocean Institute, said “The Scripps CO2 Project is critical to documenting the atmospheric changes on our planet and the Keeling Curve is an essential part of that tracking process. As government funding for science in general is decreasing, Eric and I are delighted to work with Scripps to help it continue its benchmark CO2 Project.” 
The Schmidt Family Foundation advances the development of renewable energy and the wiser use of natural resources and houses its grant-making operation in The 11th Hour Project, which supports more than 150 nonprofit organizations in program areas including climate and energy, ecological agriculture, human rights, and our maritime connection.
In 2009, the Schmidts created the Schmidt Ocean Institute (SOI), and in 2012 launched the research vessel Falkor as a mobile platform to advance ocean exploration, discovery, and knowledge, and catalyze sharing of information about the oceans. 
In keeping with the couple’s commitment to ocean health issues, Wendy Schmidt has partnered with XPRIZE to sponsor the $1.4 million Wendy Schmidt Oil Cleanup XCHALLENGE, awarded in 2011, and the Wendy Schmidt Ocean Health XPRIZE, a prize that will respond to the global need for better information about the process of ocean acidification. It will be awarded in 2015. 
The Keeling Curve has made measurements of carbon dioxide in the atmosphere at a flagship station on Hawaii’s Mauna Loa since 1958. In addition, the Scripps CO2 Group measures carbon dioxide levels at several other locations around the world from Antarctica to Alaska. The measurement series established that global levels of CO2, a heat-trapping gas that raises atmospheric and ocean temperatures as it accumulates, have risen substantially in the past century. From a concentration that had never risen above 280 parts per million (ppm) before the Industrial Revolution, CO2 concentrations had risen to 315 ppm when the first Keeling Curve measurements were made. In 2013, concentrations at Mauna Loa rose above 400 ppm for the first time in human history and likely for the first time in 3-5 million years. Multiple lines of scientific research have attributed the rise to the use of fossil fuels in everyday activities. 
The measurement series has become an icon of science with its steadily rising seasonal sawtooth representation of CO2 levels now a familiar image alongside Watson and Crick’s double helix representation of DNA and Charles Darwin’s finch sketches. Keeling Curve creator Charles David Keeling, Ralph Keeling’s father, received several honors for his work before his death in 2005, including the National Medal of Science from then-President George W. Bush. 
The value of the Keeling Curve has increased over time, making possible discoveries about Earth processes that would have been extremely difficult to observe over short time periods or with only sporadic measurements. For instance, in 2013, researchers discovered that the annual range of CO2 levels is increasing. This finding may point to an increase in photosynthetic activity in response to a greater availability of a key nutrient for plant life. 
Nuances in Keeling Curve measurements have similarly identified the global effects of events like volcanic eruptions, influences that would have been difficult to discern if measurements were made infrequently or periodically suspended. In addition, the Keeling Curve helps researchers understand the proportion of carbon dioxide being absorbed by the oceans, which in turn helps them estimate the pace of phenomena such as ocean acidification. In the past decade, scientists have come to widely study the ecological effects of acidification, which happens as carbon dioxide reacts chemically with seawater. 
The Keeling Curve could eventually serve as a bellwether revealing the progress of efforts to diminish fossil fuel use. Save for seasonal variations, the measurement has not trended downward at any point in its history.
- Robert Monroe
This is indeed good news and praise is due the Schmidts, but $500K over 5 years is about one NSF grant, and Scripps and Ralph Keeling still deserve the bunnies' support.

Higher CCl4 emissions? Eli is shocked, shocked

About a week ago, maybe a bit more, NASA, nay Goddard Space Flight Center, released a breathless press release which spread near and far, even unto our buddies on the other side of reality and some on our side.

Eli is here to tell you that just about everyone missed the real discoveries in the paper and underlying work, and the paper itself, let alone the press release, did not tell the whole story.  According to everybunny else, the take home was

However, the new research shows worldwide emissions of CCl4 average 39 kilotons per year, approximately 30 percent of peak emissions prior to the international treaty going into effect.

It has long been known that there are significant fugitive emissions of just about all CFCs and their ilk, and that bottom-up inventories of atmospheric emissions always fall short of reality.  This has consequences for our understanding.  It should be no real surprise that the emissions are higher than officially reported.  Of course, the issue is quantifying the amount of emissions and tracing them back to their source, and here the Liang paper makes an important contribution.  Yes, IF the Montreal Protocol (MP) were being rigidly enforced globally we would not be seeing such emissions, and we would be seeing a faster decline, but the best of times has not yet arrived.  OTOH, it is vitally important to know if the fugitive emissions themselves are decreasing.

Buried down at the bottom of the press release is this afterthought
In addition to unexplained sources of CCl4, the model results showed the chemical stays in the atmosphere 40 percent longer than previously thought. The research was published online in the Aug. 18 issue of Geophysical Research Letters.
Geez, even tho it is a press release they could have provided a link.  To Eli this is the most important of the results in the paper.  If one simplistically looks at the problem as a one-box model, then to explain the slower than expected fall in CCl4 there are two possibilities: either the destruction rate is slower than expected, or the emission rate is higher, or some combination of the two.  The problem with deciding which is which is that the two interact.  If the assumed lifetime is too short, the inferred emissions will look too high, and if the lifetime is too long, the inferred emissions will look too low.  Liang et al have a nice way of showing this

Figure 2. CCl4 global mean trend (ppt/yr) as a function of total lifetime and emissions from the two-box model (gray contours). Purple contours indicate the emissions and τCCl4 ranges that yield IHGs within the observed 1.1–2.0 ppt range (2-σ) between 2000 and 2012, using the current best estimate EFn of 0.94. Red (Advanced Global Atmospheric Gases Experiment (AGAGE)-based) and blue (GMD-based) numbers show emissions and lifetimes derived using the observed IHG and trend for individual years (2000–2012). The dark (light) gray shading outlines the range of emissions and τCCl4 that can be reconciled with the observations for EFn of 0.94 (0.88–1.00). The black diamond symbol shows our current best estimate for τ (thick and thin red bars indicate 1-σ and 2-σ uncertainties, respectively) and the upper limit bottom-up potential emissions for 2007–2012 (thick blue bar shows 1-σ variance) with 1-σ uncertainty shown in black-hatched shading.
IHG – interhemispheric gradient; EFn – fraction of emissions in the Northern Hemisphere.

The black diamond indicates the results one would get using a bottom-up (reported emissions) estimate of emissions and the previous best estimate of the total removal lifetime of 25 years.  Eli has added the purple dot showing the estimates of Liang, et al for 2007-2012 emissions with a 35 year CCl4 total lifetime. The green dot is the result of an earlier study by Xiao, et al on emissions between 1996 and 2004 that used a lifetime which is too short.  Xiao's estimate of emission rates was 74 Gg/yr on average between 1996 and 2004 using a total lifetime of 25 ± ~5 years, which is shown by the green dot far to the right.  OTOH, moving up the 0.9 ppt/yr contour to the new inferred lifetime of Liang et al., 35 years, brings that estimate of emissions down to ~40 Gg/yr.
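The emissions/lifetime trade-off behind those contours can be sketched with a minimal one-box model. The numbers here are illustrative stand-ins, not values from Liang et al: the burden is in ppt and the "emissions" in ppt/yr (a conversion factor would turn these into Gg/yr).

```python
# Minimal one-box sketch of the CCl4 emissions/lifetime trade-off:
#   dC/dt = E - C/tau
# so for an observed burden C and trend dC/dt, the inferred source is
#   E = dC/dt + C/tau.
# Illustrative numbers only, not values from Liang et al.

def inferred_emissions(burden, trend, lifetime):
    """Emission rate implied by an observed burden and trend, given a lifetime."""
    return trend + burden / lifetime

burden = 85.0   # ppt, a rough stand-in for the CCl4 burden
trend = -1.0    # ppt/yr, a slow observed decline

e_short = inferred_emissions(burden, trend, lifetime=25.0)  # old 25-yr lifetime
e_long = inferred_emissions(burden, trend, lifetime=35.0)   # revised 35-yr lifetime

# A shorter assumed lifetime forces a larger inferred source to explain the
# same slow decline -- the degeneracy the Figure 2 contours display.
assert e_short > e_long
```

The ratio e_short/e_long that falls out of these toy numbers (roughly 1.7) is in the same ballpark as the revision from Xiao et al's ~74 Gg/yr down to the ~40 Gg/yr that the longer lifetime implies.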

Liang et al divide the world into two boxes, the Northern and Southern Hemispheres.  Since most of the emissions are in the Northern Hemisphere, there is an interhemispheric gradient which can be used to calculate the total emissions and lifetime.

Cx is the concentration in each hemisphere, τns the time for inter-hemispheric mixing, the αs are deposition losses to the oceans and soils, and the STE term represents dilution of tropospheric air with downward-moving stratospheric air in which CCl4 has been diminished by photolysis.  f is a factor to convert emissions to concentrations.  The prevalence of land in the Northern Hemisphere and oceans in the Southern has to be taken into account, as well as various issues about global circulation.  Liang, et al, then make a number of simplifications to yield
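For bunnies who want the general shape, a standard two-box budget in the notation just described (a sketch of the generic form, not Liang et al's exact equations) is:

$$ \frac{dC_N}{dt} = f\,E_N - \frac{C_N - C_S}{\tau_{NS}} - \alpha_{\mathrm{ocean}}^{N} C_N - \alpha_{\mathrm{soil}}^{N} C_N - \mathrm{STE}_N $$

$$ \frac{dC_S}{dt} = f\,E_S + \frac{C_N - C_S}{\tau_{NS}} - \alpha_{\mathrm{ocean}}^{S} C_S - \alpha_{\mathrm{soil}}^{S} C_S - \mathrm{STE}_S $$

with each hemisphere losing CCl4 to deposition and stratospheric exchange, and the two boxes mixing on the timescale $\tau_{NS}$.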

From which, with various hocus pocus, they produce the graph above.  The latest evaluation of stratospheric ozone-depleting molecules has some surprising reappraisals, including raising the photochemical lifetime of CCl4 from 35 to 44 years; perhaps another post.  In a straightforward way this would explain part of the unexpectedly slow decrease in atmospheric carbon tet. Liang and other authors on Liang et al were part of that re-evaluation.  With reasonable values for ocean and soil deposition, the 35 year total lifetime that Liang, et al find is, well, reasonable.  FWIW soil deposition looks really slow, like about 1000 years.  Eli and maybe the Weasel remember arguments about that.

Prominent in the abstract and the press release is that the average emissions over the 2000-2012 time period were 39 (34-45) Gg/yr.  Somewhat less prominent (well, a lot less; you have to read the paper) is that emissions have been decreasing.  Between 2007 and 2012 they decreased to between 31 and 40 Gg/yr.  Simply taking the average of these gives 35.5 Gg/yr.  Simple math tells us that the average emissions between 2000 and 2007 would then be 41.5 Gg/yr, which is consistent with the numbers shown in Liang's Figure 2 above, with the earlier years clustering to the right of the graph in the 40 Gg/yr area and the later ones to the left.  Emissions are decreasing.  Not as fast as we would like, but they are decreasing.

Where, of course, is the question all bunnies want answered.  Several jumped to the conclusion that all the fugitive emissions are from China and India.

Inverse 3D modeling is IEHO the best choice for quantifying total emissions from sources, reported and unreported, and in 2010, X. Xiao and about 20 friends took a shot in Atmos. Chem. Phys. 10, 10421, for the period 1996-2004.  As with all such things, that paper was not perfect, and with the passage of time some of the problems with it have become clearer, but taken together with the new Liang paper there are a number of take homes.  By looking at the time history of CCl4 at stations around the globe, Xiao et al were able to infer the location and average carbon tet emissions from various locations during the study period.  By comparison, if you simply want global emissions, the advantage of the Liang, et al method is that the two-box model is robust, at the price of resolution.

For convenience, Xiao et al divided the world up into eight boxes and tried to trace emissions geographically on a finer 64 x 128 point grid.

Group I:  Europe, NW Asia (Russia and the stans), and N. America

In this group emissions fell rapidly (a factor of 5) from the pre-Montreal Protocol estimates.

Group II:  Australia/NZ (very small), S. America and Africa

Substantial percentage increases, but still minor contributors in the period.  Interestingly, African emissions at 6.7 Gg/yr were three times those of S. America.  The regions of major emissions in Africa were South Africa and the Mediterranean coast (think oil production and maybe petrochemicals) but not Nigeria and Angola.  In S. America, the Atlantic coast of an industrializing Brazil was the major source.

Group III:  S. Asia (the Mid-East oil patch, India and Pakistan) and SE Asia (China and Indo-China, Japan and S. Korea)

Major increases above the pre-MP baseline.  Not very surprising, but note that Japan and S. Korea were bright red, as were the areas associated with oil production.

Bottom line: Xiao et al find that yearly global emissions fell from ~130 Gg/yr pre-MP to about 65 Gg/yr in 2004, with a slight downward trend.  In addition to estimating the source term, Xiao et al estimated the ocean sink loss.  Using Liang's Figure 2, and tracing along the contour for a global lifetime of 35 years, this is not out of line with Liang et al's estimate for a similar period, ~42 Gg/yr.

So yes, CCl4 in the atmosphere is decreasing at ~1% per year, slower than bunnies expected, due both to nature (a longer atmospheric lifetime) and to fugitive emissions (which are themselves decreasing).
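That ~1%/yr follows from simple bookkeeping.  A back-of-envelope sketch, using round numbers that are not from the paper: a mixing ratio of ~85 ppt, molar masses of 153.8 g/mol for CCl4 and 29 g/mol for air, and an atmosphere of ~5.1 x 10^18 kg:

```python
# Back of envelope: do ~39 Gg/yr of emissions and a 35 year
# lifetime give roughly a 1%/yr decline?  Round numbers throughout.

M_AIR   = 5.1e18      # kg, mass of the atmosphere
MW_AIR  = 29.0e-3     # kg/mol
MW_CCL4 = 153.8e-3    # kg/mol

ppt = 85.0                                             # assumed mixing ratio
moles_air = M_AIR / MW_AIR                             # ~1.8e20 mol of air
burden_gg = ppt * 1e-12 * moles_air * MW_CCL4 * 1e-6   # kg -> Gg

tau       = 35.0   # years, total lifetime
emissions = 39.0   # Gg/yr, average source

loss = burden_gg / tau        # Gg/yr removed by all sinks
net  = emissions - loss       # Gg/yr change in the burden
print(f"burden ~{burden_gg:.0f} Gg, trend ~{100 * net / burden_gg:.1f} %/yr")
```

With these inputs the burden comes out a bit over 2000 Gg, the loss around 65 Gg/yr, and the trend around -1%/yr: sinks outrun the remaining fugitive sources, just not by much.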

Friday, August 29, 2014

What Part of Hot Air Rises Do You Not Understand

Dr. Roy is having a midlife crisis.  He writes:

"But what if (I’m NOT necessarily advocating this) most of the CO2 humans produce, which is near the land surface, is absorbed by vegetation, and the observed global increase is partly or mostly due to outgassing of the oceans?"

Most of the CO2 humans produce is created by combustion of fossil fuels.

What part of hot air rises doesn't Dr. Roy understand?

There are more sophisticated answers to this from a whole bunch of atmospheric transport studies.

Wednesday, August 27, 2014

Water district election update from a guileless heavyweight

The title above refers to a San Jose newspaper article. I mentioned a while back that I'm running for re-election, and it turns out it's going to be interesting.

People seem to have differing reactions to the column, so read it for yourself and I won't try to bias your view.

What I've been thinking about for a while in this runup to the election, though, is actually acknowledging the mistakes I've made. I've got one general category, and then one specific issue. The general category is dividing my attention among too many things. An incomplete list of the memos I've written covers too many subjects. Obviously we do a lot more work than can be measured by memos, but those memos should usually be priority items. If I had chosen fewer subjects and put more time into them, we could have progressed further. (I'll still defend our progress overall, though.)

The other, specific issue is pretty technical but involves the best approach to resolving legal and environmental issues under both state and federal law for the protection of endangered steelhead. We'd been attempting to get simultaneous state and federal permits but had been held up at the federal level. Some local environmental groups wanted to split the process to speed up actions to help steelhead under state law; our staff resisted this approach. A few months ago, our staff changed their minds. Maybe that could have happened two years earlier instead; the split approach was the right one. Now we have to make up for that lost time.

Anyway, I think you can acknowledge mistakes while still doing a good job, and that's what I hope to do in this election. 

And to be a good politician (or a less guileless one), I'll also add that help is greatly appreciated.