A new report, Post-Partisan Power, puts forth several interesting ideas for how the United States can accelerate technological progress to advance U.S. energy security and global climate protection. The authors are Steven F. Hayward of the American Enterprise Institute (AEI), Mark Muro of the Brookings Institution, and Ted Nordhaus and Michael Shellenberger of the Breakthrough Institute. The report has created a buzz, in part because of the “man bites dog” nature of the story – Brookings and AEI agree on something! And they are saying “post-partisan” out loud in these hyper-partisan times.
The authors recommend a number of initiatives that ought to be no-brainers:

- invest more in energy science and education;
- overhaul the energy innovation system by increasing funding for the new Advanced Research Projects Agency-Energy (ARPA-E) and developing regional energy innovation centers;
- reform energy subsidies;
- use military procurement and competitive deployment to drive innovation and price declines; and
- pay for all of this through a very small carbon tax or electricity fee.

The major critique of the report, best articulated by Harvard economist Robert Stavins, is that these recommended steps are necessary but not sufficient: it is all very well for the government to invest in these technologies, but we also need to create a market for them through a strong carbon price or serious greenhouse gas reduction requirements.
The authors have responded that they didn’t mean to imply that their recommendations include all we need to solve our energy and climate problems. However, their subtitle, “How a limited and direct approach to energy innovation can deliver clean, cheap energy, economic productivity and national prosperity,” makes it sound an awful lot like they did. And their opening critique of “both sides of the debate” on climate and energy is dismissive of pricing in general and cap and trade in particular – noting, for example, cap and trade’s defeat in the Senate but not its victory in the House, and saying pricing has not succeeded in reducing emissions in Europe, when in fact it has. But let’s set that aside for the moment.
The more intriguing question this report raises for me is why the energy and climate debate is so stuck and why even the modest proposals described in “Post-Partisan Power” face an uphill battle.
The report’s authors lament our irrational energy subsidies and dysfunctional federal support system for energy innovation, and I agree substantively with their critique and their proposed fixes. But this irrationality and dysfunctionality have persisted for a long time. Each energy source has a powerful constituency for federal subsidies and tax breaks. And for each DOE lab in the current national network that does most of our federal energy research, powerful regional interests protect the status quo.
Similarly, policy analysts have made an airtight case for decades that pricing policies are both effective and cost-effective at reducing emissions, but for the most part politicians and the public have resisted such policies. We seem to prefer our regulatory costs to be high and hidden rather than low and transparent.
What is going on? I’m not sure, but I can think of at least two partial answers. The first is our political system’s focus on the size of government rather than its efficacy. The “great debate” in this election is whether the government should be bigger or smaller, not whether government is effectively doing whatever tasks it is assigned. The key critique of cap and trade was that it was a tax and that it looked like too much government, when the debate should have been about its efficacy in reducing emissions and minimizing costs. We measure success or failure of federal action with respect to a particular energy source by the size of the budget or tax breaks devoted to it, not whether such action is effectively driving innovation or bringing down technology costs. Hayward et al. suggest we fix this through better program design, but that will not be easy. It requires a transformation of our nation’s political thinking at a very deep level.
The second answer is specific to the energy system. It is an inconvenient truth that fossil fuels have some really attractive characteristics as energy sources. They are abundant, seemingly cheap (if one doesn’t take into account their environmental or energy security impacts, and of course the market price does not do so), and “energy-dense” (meaning they can produce a lot of energy per unit of volume and mass). They have also been used for a long time, and their use has co-evolved with extensive fuel distribution infrastructure and fuel-using equipment. Thus, shifting away from these fuels requires displacing a suite of interdependent incumbent technologies.
This problem is really different, in kind and in scale, from any the U.S. government or the U.S. economy has wrestled with before. It is not like computer innovation, where a new set of technologies created new markets for new services; or airplanes, in which military procurement dominated an emerging market. To move away from the energy system we have, which meets our private needs very nicely, to one that may have lower social costs but higher private ones (at least for some transitional period), is going to be very difficult. Hayward et al. hope that we can eat our cake and have it, too, by finding new technologies that have both lower social costs and lower private costs. But substantially increasing government investment won’t guarantee this outcome – certainly not by itself. Rather the United States must make climate protection and national security a priority, and develop and implement a conscious, ambitious, and comprehensive national strategy with full public support. This is a daunting challenge.
Judi Greenwald is Vice President for Innovative Solutions
We’ve been tracking federal government efforts towards reducing our vulnerability and increasing our resiliency in the face of the potential impacts and risks from climate change. I continue to be impressed by the steps that many federal agencies are taking in this regard—a lot of work is going on to mainstream climate change adaptation.
Yesterday the Interagency Climate Change Adaptation Task Force released its report to the President. During the past year this task force—which includes about 20 different Federal agencies—worked on developing recommendations and guiding principles on a strategic approach to climate change adaptation. The Task Force’s recommendations include: making sure that adaptation is a standard part of Agency planning (mainstreaming!), ensuring information about the impacts of climate change is accessible, and aligning federal efforts that cut across agency jurisdictions and missions.
A number of agencies have already gotten started on this. Two agencies within the Department of the Interior (DOI) released climate change strategies last month—the Fish and Wildlife Service and the National Park Service. These efforts build on DOI’s overarching strategic response to climate change.
The Fish and Wildlife Service manages more than 150 million acres of wildlife refuges across the United States and has additional responsibilities related to the protection of fish populations, endangered species, and migratory birds. (Interesting side note: according to the Service, about 41 million people visit national wildlife refuges each year and their spending generates almost $1.7 billion in sales for regional economies.) The Service defines adaptation as “minimizing the impact of climate change on fish and wildlife through the application of cutting-edge science in managing species and habitats” and has made adaptation the centerpiece of its Strategic Plan.
Charged with preserving the natural and cultural heritage of our nation, the National Park Service faces many challenges in adapting to climate change. What should it do about the melting glaciers at Glacier National Park? Or the threats of flooding to historic Jamestown, VA (part of the Colonial National Historical Park)? The National Park Service’s Climate Change Response Strategy details long- and short-term actions in three major areas: mitigation, adaptation, and public communication. Measures to tackle the adaptation piece include planning, promoting ecosystem resilience, preserving the nation’s heritage, and protecting facilities and infrastructure.
Earlier this month, the EPA released its 2011-2015 Strategic Plan containing five strategic goals for advancing its environmental and human health missions, the first of which is “Taking Action on Climate Change and Improving Air Quality.” As part of its Strategy, the EPA recognizes that it “must adapt and plan for future changes in climate” and “incorporate the anticipated, unprecedented changes in climate into its programs and rules.”
And just last week, at the first White House Council on Environmental Quality (CEQ) GreenGov Symposium, there were three separate panels devoted to climate change adaptation. We heard presentations from the Army Corps of Engineers, CDC, CEQ, DOT, the Forest Service, HUD, OSTP, and USDA, as well as from a number of stakeholders, including the state of Maryland, the Nature Conservancy, and the National Association of Clean Water Agencies (NACWA), all of which are very much engaged on the adaptation issue.
Finding it hard to keep track of all of these agencies and what they are up to? Don’t worry – we’ll be posting our newest adaptation report, Climate Change Adaptation: What Federal Agencies are Doing, to this site very soon.
Heather Holsinger is a Senior Fellow for Domestic Policy
There has been a lot of important climate news in recent weeks and months. In addition to record warmth, an unusually active Atlantic hurricane season, and a devastating string of extreme weather events in the U.S. and around the world, Arctic sea ice has reached a new low in its total volume.
The ice covering the Arctic Ocean goes through a seasonal cycle: it expands during the winter, reaching its maximum extent in March, and shrinks during the summer, reaching its minimum extent in September. Satellites have observed the daily coverage of sea ice since 1979, and over that period the summer minimum has declined rapidly. In 2007, the summer minimum dropped by a startling amount compared to previous summers, generating an iconic graph that was splashed across blogs and newspapers around the world (Figure 1). That record still holds, although every year since 2007 has seen a below-average summer minimum.
According to the National Snow and Ice Data Center (NSIDC), Arctic sea ice reached its minimum extent for 2010 on September 19 at 1.78 million square miles. Although this was the third-lowest extent behind 2007 and 2008, the sea ice set a new and probably more important record by reaching the lowest estimated volume – or total amount of sea ice – since satellite observations began in 1979.
A good analogy is an ice cube floating in a glass of water. The ice cube has three dimensions, but looking straight down at the glass, you see only the two dimensions that cover part of the water’s surface. Viewed from the side, you can also see that the cube has depth, and that most of the ice sits below the surface. The same holds for sea ice: satellite measurements of extent capture only the two-dimensional coverage, so if the ice melts from below, it becomes thinner and its total volume decreases even when its extent barely changes.
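To make the analogy concrete, a quick back-of-the-envelope calculation (a sketch using approximate textbook densities, not figures from this post) shows why most of a floating ice cube is hidden below the waterline. By Archimedes’ principle, the submerged fraction of floating ice equals the ratio of ice density to water density:

```python
# Archimedes' principle: a floating object displaces its own weight in water,
# so the submerged fraction of its volume equals rho_ice / rho_water.
RHO_ICE = 917.0     # approximate density of pure ice, kg/m^3
RHO_WATER = 1000.0  # approximate density of fresh water, kg/m^3

submerged_fraction = RHO_ICE / RHO_WATER  # fraction of volume below the surface

print(f"About {submerged_fraction:.0%} of a floating ice cube is underwater.")
```

In other words, roughly nine-tenths of the ice is out of sight below the surface, which is exactly why looking only at the two-dimensional ice cover misses most of the story.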
This year, even though the area of the ocean’s surface covered by ice was a little larger than in 2007, the ice was much thinner, making its total volume much less than in 2007 or any previous year since estimates began in 1979 (Figure 2).
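The arithmetic behind that point is simple: volume is roughly extent times mean thickness, so a slightly larger extent can still hold less total ice if the ice is thinner. The sketch below illustrates this with invented round numbers (hypothetical values chosen for illustration only, not NSIDC or satellite data):

```python
# Hypothetical illustration: sea-ice volume is approximately extent times
# mean thickness. The numbers below are invented for illustration only.
def ice_volume(extent_km2, mean_thickness_m):
    """Approximate sea-ice volume in cubic kilometers."""
    return extent_km2 * (mean_thickness_m / 1000.0)  # convert m to km

vol_a = ice_volume(4.3e6, 1.5)  # smaller extent, thicker ice
vol_b = ice_volume(4.6e6, 1.2)  # larger extent, thinner ice

# Larger extent, yet smaller total volume:
print(f"A: {vol_a:.0f} km^3, B: {vol_b:.0f} km^3, B < A: {vol_b < vol_a}")
```

Thickness, the dimension satellites cannot see directly, is what dominates the volume budget.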
The rapid decline in total ice volume is significant because it takes less heat to melt a small volume of ice than a larger one. The area of ice cover can recover in a single season, as it did in 2009, but thickness builds up over several years. Consequently, the low volume of ice currently in the Arctic is more susceptible to melting next summer and the summer after than the 2007 ice was, and scientists are asking whether the Arctic could become ice-free during the summer much sooner than previously projected.
The opening of the Arctic has enormous implications, ranging from global climate disruption to national security issues to dramatic ecological changes. The Arctic may seem far removed from our daily lives, but changes there are likely to have serious global implications.
- An ice-free Arctic Ocean will absorb more sunlight and convert it to heat, thus amplifying warming.
- The Arctic currently removes CO2 from the atmosphere, but physical and biological changes in the Arctic could cause it to switch to releasing CO2 and CH4 (a very potent greenhouse gas) to the atmosphere, thus amplifying global warming.
- Atmospheric circulation and therefore precipitation and storm patterns may be altered by a warming Arctic and changes in how the ocean interacts with the atmosphere in the region.
- A warmer, ice-free Arctic Ocean with more freshwater from snow and ice melt could change global ocean circulation patterns, thus altering marine ecosystems (e.g., fisheries) around the world and changing patterns of precipitation and storms on a very broad scale.
- More rapid melting of ice on land will accelerate sea level rise and could destabilize the Greenland Ice Sheet, leading to abrupt and massive sea level rise.
- Countries have begun to compete for access to untapped natural resources in the Arctic. Unlike other international arenas, such as Antarctica, coastal waterways, and space, there are no agreed international rules to govern how different countries will access and utilize the Arctic.
Jay Gulledge is Senior Scientist and Director of the Science and Impacts Program
Several of my previous posts have examined the remarkable weather of the past year, including the unusual U.S. East Coast snowstorms this winter, the wide array of floods and heat waves this summer, and how these can help us understand our vulnerabilities to climate change. The average land surface temperature this summer (June-August) was the warmest on record globally and the fourth warmest on record in the United States.
Now that northern summer has come to a close, we can take stock of just how warm it was. Christopher C. Burt—weather historian, extreme-weather guru, and author—takes a look at temperature records set in the U.S. and around the world this summer in his blog at Weather Underground. Some of his findings include:
- Fifteen U.S. cities recorded their warmest summer (June-August) ever.
- Only one U.S. city (Santa Barbara, CA) recorded its coldest summer.
- Seventeen countries set new records for high temperatures, breaking the previous record of fifteen countries set in 2008.
- No countries recorded a record low temperature.
- The Arctic country of Finland recorded a high temperature of 99°F at the Joensuu airport.
- A town in Pakistan recorded a record high temperature of 128.3°F.
- Los Angeles recorded its highest ever temperature of 113°F this Monday, in spite of an otherwise cool summer.
It’s important to put this single year into a broader perspective; if this warmth is just an aberration, then we might be wasting time talking about it. But it is clearly part of a much longer warming trend that has been going on for decades. A recent report from the National Oceanic and Atmospheric Administration announced that 2009 was one of the ten warmest years on record (since 1880) and that the 2000s was the warmest decade on record, followed by the 1990s and then the 1980s. If the first nine months of this year are an indication, the 2010s appear poised to continue this upward march in temperatures.
(Figure Source: NOAA’s State of the Climate in 2009, Chapter 2)
Jay Gulledge is Senior Scientist and Director of the Science and Impacts Program
The rough weather of 2010 teaches us that climate change is risky business.
Recently, I posted a blog discussing the possible link between global climate change and two related extreme weather events: the heat wave in Russia and historic flooding in Pakistan. Although there is no method to definitively attribute any single event to climate change, based on documented trends in extreme weather events and research showing that specific types of meteorological phenomena are more common in a greenhouse-warmed world, I said:
“It is reasonable to conclude that, in aggregate, the documented increase in extreme events is partially a climate response to global warming, and that global warming has increased the risk of extreme events like those in Russia and Pakistan. On the other hand, there is no scientific basis for arguing that these events have nothing to do with global warming.”
That’s as far as the science permits me to go with this question. We simply cannot know whether any particular weather event was “caused” by climate change. In recent weeks, however, the media have done their all-too-common “he said-she said” routine of finding one source who says the extreme weather of 2010 is because of climate change and another who says it’s not. This is a meaningless argument that distracts us from what we should be thinking about, which is what these events can teach us about our vulnerabilities to climate change.
You might recall earlier this year that a few mistakes were discovered in the Intergovernmental Panel on Climate Change’s (IPCC) 3,000-page assessment report published in 2007. The mistakes did nothing to undermine the report’s major findings: It is unequivocal that the climate is changing, and there is greater than 90 percent certainty that most of the observed warming of the past half-century is due to human influences. Earlier this year, I discussed the errors on E&ETV’s On Point program.
Update: Dr. Jay Gulledge is featured on National Journal's Energy & Environment Expert Blogs. Click here to read Dr. Gulledge's take on Climate Risks Here and Now
Last fall I posted a blog about the unusual number and severity of extreme weather events that have been striking around the globe for the past several years. That entry focused on the alternating severe drought and heavy flooding in Atlanta in 2007-2009 as an example of the roller coaster ride that climate change is likely to be. As every dutiful scientist does, I stopped short of blaming those individual weather events on global warming, but I am also careful to point out that it is scientifically unsound to claim that the confluence of extreme weather events in recent years is not associated with global warming; I’ll return to this question later.
The weather of 2010 continues the chaos of recent years. In the past six months, the American Red Cross says it “has responded to nearly 30 larger disasters in 21 [U.S.] states and territories. Floods, tornadoes and severe weather have destroyed homes and uprooted lives …” Severe flooding struck New England in March, Nashville in May, and Arkansas and Oklahoma in June.
Last week we held a workshop at the Newseum in Washington, DC, entitled Federal Government Leadership: Mainstreaming Adaptation to Climate Change. The workshop was intended to build on two recent reports: our own report highlighting the important role of the federal government in climate change adaptation, and the National Academies’ report, Adapting to the Impacts of Climate Change. The latter emphasized that the federal government should not only serve as a “role model,” but also play a significant role as a “catalyst and coordinator” in identifying vulnerabilities to climate change impacts and the adaptation options that could increase our resilience to these changes.
In his defense of soldiers in the Boston Massacre trials, John Adams went on to say “… and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence.”
No matter what we may wish were happening, no matter what spin some may try to sell, the clear evidence of climate change continues to mount.
The U.S. National Oceanic and Atmospheric Administration (NOAA) has just released its annual report on the state of the climate, and the facts speak volumes about the pervasiveness and speed of actual climate change, not model projections.
I posted previously on the controversy surrounding emails that were hacked from a computer server at the University of East Anglia’s (UEA) Climatic Research Unit (CRU) in the U.K. The emails revealed the private exchanges of several prominent climate scientists dealing with their science and their reactions to climate change deniers who requested access to their private computer files and intellectual property. The contents of the emails suggested to the untrained eye that the scientists had manipulated data and tried to undermine the scientific peer-review process. From my reading of the emails, I judged that nothing of the sort had happened. Since my last writing on the topic, five independent investigations of the matter (three in the United Kingdom and two in the United States) have concluded that there was no mishandling of data or other wrongdoing beyond some foot-dragging in response to Freedom of Information requests by climate change deniers. The clear message from these investigations is that proper scientific methods were followed and the integrity of climate science remains solid as a rock.