I just went to the first of a new lecture series at Caltech, NRG 0.1, during which various experts are going to be discussing various aspects of the energy problem (for which read “challenge”) that the world is facing.
This week was Steve Koonin, former Caltech provost and physics professor, and currently chief scientist for BP. I thought it was an excellent talk, covering many different aspects of the energy question, and some important principles that need to be kept in mind when looking for solutions in the near and medium term. I particularly enjoyed (and, yes, this probably says something about me too) how the talk distilled a large collection of numbers into a few key “back-of-the-envelope” facts, and then analysed the various options against those constraints. While I’m not going to summarise the whole talk (which will hopefully be available here soon), here are some of the things that stood out:
2050 / twice pre-industrial
By BP’s Business as Usual (BAU) analysis, sometime before 2050 atmospheric CO2 will hit twice its pre-industrial level. This is a tipping point in many models, and so serves as a useful “safe” upper limit. Anything we do has to have a big effect well before 2050.
Running out of oil vs. global warming
A few years ago I was more concerned about the former; now I’m more concerned about the latter. The global economy is handling high oil prices very well, so non-conventional sources, like Canada’s oil sands, really start to look accessible. Oil prices may stay high, and national concerns about supply security may discourage oil use, but I think oil is here for a few more decades. My take-home message: global warming will be solved, or not, before oil runs out.
CO2 has to drop hugely
CO2 has a lifetime of many centuries once it’s in the atmosphere. Thus to reach CO2 stability at twice pre-industrial levels by 2050, we actually need to cut emissions to about half of today’s level. (A useful figure: because of CO2’s longevity, a 10% drop in CO2 emissions growth delays the crossing of any given atmospheric concentration by only about 7 years.) But by business-as-usual estimates, economic growth, even including historically extrapolated improvements in efficiency, will have raised emissions by a factor of 4 by 2050. So we have to improve somehow by a factor of 8. As Koonin points out, efficiency gains are generally overwhelmed by increased consumption.
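The factor of 8 falls straight out of the two figures quoted above; a one-line sanity check (emissions normalised to today’s level):

```python
# Back-of-the-envelope check of the factor-of-8 claim.
# Both input figures are the ones quoted in the talk.

today = 1.0                  # today's annual CO2 emissions (normalised)
target_2050 = 0.5 * today    # stability at 2x pre-industrial needs ~half today's rate
bau_2050 = 4.0 * today       # business-as-usual emissions by 2050

improvement_factor = bau_2050 / target_2050
print(improvement_factor)    # 8.0
```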
CO2 drops have to start now
Since CO2 stays in the atmosphere, delaying change by even a few years makes the required drops much larger later on. Furthermore, the main sources of emissions (power plants, houses, cars, etc.) all have lifetimes of decades — so the power plants being built now will still be emitting in 2050. Basically, if nothing dramatic changes in the next 5 to 10 years, stability by 2050 becomes nearly impossible.
Many “solutions” just don’t scale
There’s huge enthusiasm for corn-based biofuels in the US at the moment. Koonin’s figures were that about 20% of the corn crop now goes to fuels, contributing about 2% of the US’s transport fuel needs. This doesn’t scale to solve the problem. Another example: solar. At current prices it’s a lot more expensive than conventional power, and so won’t be accepted commercially. But even if it were, we would need to cover (if I recall the figure) a million rooftops with solar panels every year, starting right now, to reach stability by 2050. I’m not sure if that figure was global or just for the US.
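The corn numbers make the scaling failure easy to quantify: if a fifth of the crop supplies 2% of transport fuel, then supplying all of it at the same ratio would take ten entire corn crops. A toy calculation (the 20% and 2% inputs are the talk’s figures):

```python
crop_fraction_used = 0.20    # share of the US corn crop going to fuels (talk's figure)
fuel_share_supplied = 0.02   # share of US transport fuel it provides (talk's figure)

# Whole corn crops needed to supply 100% of transport fuel at the same ratio:
crops_needed = crop_fraction_used / fuel_share_supplied
print(f"{crops_needed:.0f} entire corn crops")   # 10 entire corn crops
```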
Pricing CO2

Currently, emitting CO2 is free in most places (Europe is a partial exception). That makes coal the cheapest power source. Most emissions-reduction schemes assign a cost, one way or another, to CO2. Koonin had an interesting comparison graph: below about $20/ton CO2, coal remains cheapest; above about $40/ton, there are no further major changes in the ordering of energy sources. So the magic number, high enough to change behaviour yet low enough to limit the economic cost, is around $30/ton. This would add only about 15% to the cost of petrol in the US or SA, and a little less in Europe, say. So the biggest changes will be in fixed electrical generation plants (which are anyway the biggest emitters).
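You can roughly check the petrol figure yourself. The conversion below uses two inputs of my own, not from the talk: a commonly cited ~8.9 kg of CO2 from burning one US gallon of petrol, and an assumed $2/gallon pump price (roughly the US price at the time):

```python
# Rough conversion from a CO2 price to a petrol surcharge.
# The 8.9 kg/gallon and $2/gallon figures are my illustrative
# assumptions, not numbers from the talk.

co2_price = 30.0                 # $/ton CO2 ("magic number" above)
kg_co2_per_gallon = 8.9          # CO2 from burning one US gallon of petrol
pump_price = 2.0                 # assumed $/gallon pump price

surcharge = co2_price / 1000.0 * kg_co2_per_gallon   # $/gallon
print(f"${surcharge:.2f}/gallon, about {100 * surcharge / pump_price:.0f}% of the pump price")
```

That lands at roughly $0.27/gallon, in the same ballpark as the ~15% quoted; the exact percentage obviously depends on the pump price you assume.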
Koonin’s take, and I think I agree, is that given the size and cost of the changes needed, as well as their urgency, market forces have to be used to make them. That is, we can’t pick an “ideal solution” and decree that that is what will be done; the political will isn’t there over the time scale required. Rather, the correct policy incentives need to be put in place right now, such as a fixed, predictable cost for CO2 over the next 50 years (which, interestingly, argues against a cap-and-trade approach). Without that certainty, it becomes very hard for power companies to spend, say, an extra billion dollars now on a power plant that does CO2 sequestration.
Koonin’s roadmap would seem to be: policy incentives right now, leading to CO2 sequestering power plants still running predominantly off fossil fuels; a growing but still far from dominant contribution from sustainable power sources; and revolutionary improvements in next generation biofuels (using plant material that we do not, in fact, want to eat). He justifies hope in a biofuel revolution by pointing out that biotechnology is a very young and rapidly developing field — unlike, say, fusion. He also thinks there’s a chance for a solar revolution, but not with current technology.
As I overheard a participant say on the way out, though, “He could have given a much more pessimistic talk with the exact same slides”. We have to make immediate, dramatic changes to an area of human endeavour with vast pre-existing infrastructure, very long time-lines and huge costs, all for a problem that is hard to demonstrate convincingly now and that plays out over a time scale far longer than political cycles. I think there’s a fair chance that, come 2050, we’ll have to be involved in some sort of huge active geoengineering (i.e. modifications designed to “cancel out” our CO2 emissions) in order to stabilise the climate.