Wednesday, November 3rd, 2010
Lots of interesting analysis in the wake of yesterday’s elections, including some surveying the dynamic political landscape and what this means for the country’s ability to deal with crises including jobs, energy, and climate.
Some have suggested that the large Republican shift in 2010 is an (over)correction of the large Democratic (over)correction in 2008. Like a water hose flailing wildly back and forth out of control, does this portend increasingly volatile election cycles going forward as people become more and more frustrated with the inability of the federal government to confront problems?
John Judis at The New Republic offers this:
Does losing over 60 House seats and as many as eight Senate seats simply make this a below average outcome, or did something much more serious and significant happen in yesterday’s election?
Republicans might say it’s the re-emergence of a conservative Republican majority, but that’s not really what happened. What this election suggests to me is that the United States may have finally lost its ability to adapt politically to the systemic crises that it has periodically faced. America emerged from the Civil War, the depression of the 1890s, World War I, and the Great Depression and World War II stronger than ever—with a more buoyant economy and greater international standing. A large part of the reason was the political system’s ability to provide the leadership the country needed. But what this election suggests to me is that this may no longer be the case.
…Like the depressions of the 1890s and 1930s, this slowdown was also precipitated by the exhaustion of opportunities for economic growth. America’s challenge over the next decade will be to develop new industries that can produce goods and services that can be sold on the world market. The United States has a head start in biotechnology and computer technology, but as the Obama administration recognized, much of the new demand will focus on the development of renewable energy and green technology. As the Chinese, Japanese, and Europeans understand, these kinds of industries require government coordination and subsidies. But the new generation of Republicans rejects this kind of industrial policy. They even oppose Obama’s obviously successful auto bailout.
Instead, when America finally recovers, it is likely to re-create the older economic structure that got the country in trouble in the first place: dependence on foreign oil to run cars; a bloated and unstable financial sector that primarily feeds upon itself and upon a credit-hungry public; boarded-up factories; and huge and growing trade deficits with Asia. These continuing trade deficits, combined with budget deficits, will finally reduce confidence in the dollar to the point where it ceases to be a viable international currency.
The election results will also put an end to the Obama administration’s attempt to reach an international climate accord. They will cripple the administration’s ability to adopt domestic limits on carbon emissions. The election could also doom Obama’s one substantial foreign policy achievement—the arms treaty his administration signed with Russia that still awaits Senate confirmation.
…[I]f I am right about the fundamental problems that this nation suffers from at home and overseas, then any politician’s or political party’s victory is likely to prove short-lived. If you want to imagine what American politics will be like, think about Japan.
Japan had a remarkably stable leadership from the end of World War II until their bubble burst in the 1990s. As the country has stumbled over the last two decades, unable finally to extricate itself from its slump, it has suffered through a rapid succession of leaders, several of whom, like Obama, have stirred hopes of renewal and reform, only to create disillusionment and despair within the electorate. From 1950 to 1970, Japan had six prime ministers. It has had 14 from 1990 to the present, and six from 2005 to the present. That kind of political instability is both cause and effect of Japan’s inability to transform its economy and international relations to meet the challenges of a new century.
[L]ike Japan, we’ve had a succession of false dawns, or what Walter Dean Burnham once called an “unstable equilibrium.” That’s not good for party loyalists, but it’s also not good for the country. America needs bold and consistent leadership to get us out of the impasse we are in, but if this election says anything, it’s that we’re not going to get it over the next two or maybe even ten years.
Monday, November 1st, 2010
An amazing night-time photo from the International Space Station showing how human settlement along the Nile River and Delta stands out against the Sahara Desert (shot by astronaut Doug Wheelock). Click here for a larger image.
Wednesday, October 20th, 2010
In his latest blog post at Time Magazine, Bryan Walsh laments the fact that—six months after the Gulf oil spill—it appears no lessons have been learned:
…We all wanted to find the “lessons of the spill”—even while the oil was still flowing. (Look back at that first story I did—it was written during the first week of May, more than 2 months before BP’s blown well was capped.) But we haven’t gotten smarter since the spill. We’ve gotten stupider.
…It’s now six months to the day after the Deepwater Horizon exploded, and it’s safe to say that the BP spill will not be remembered as the modern green movement’s march on Washington. Climate legislation is dead in the Senate, and if the midterm polls are accurate, next year’s Congress will be even less inclined to act on global warming—or even believe it. President Obama—under constant pressure from the same Gulf Coast states that were drenched in oil—lifted his moratorium on deepwater drilling earlier this month, before the initial deadline of Nov. 30 and before investigations into the true cause of the accident were complete. The government response to the disaster, while heroic at times, was deeply problematic, with evidence that Washington kept the public in the dark for weeks about the true size of the spill. The response on the ground was marred by obstructionism on the part of BP, to the point where off-duty cops in Louisiana seemed to be acting as hired muscle for the oil company that—let’s not forget—was chiefly responsible for the spill in the first place. The legacy is a climate of distrust and paranoia in the Gulf—academic researchers and government scientists quarreling over underwater oil, conspiracy theories about BP burning sea animals, and anger along the Gulf coast among those who feel they’ve been left behind, as the rest of the country has moved on.
Forget energy reform—the biggest change in the Gulf seems to be the flood of money from BP, as part of its $20 billion promise to “make this right,” as former CEO Tony Hayward put it.
…It’s not exactly a clean energy revolution.
None of this is surprising. It’s what I predicted at the beginning:
Wednesday, October 13th, 2010
In the wake of Ryan Lizza’s provocative piece, As the World Burns, published in the New Yorker last week comes a renewed call by folks like Shellenberger and Nordhaus emphasizing the need to make clean energy cheap rather than dirty energy expensive.
See the latest in today’s NY Times.
Tuesday, October 12th, 2010
In 40 years, there will be about 3 billion additional people living on the Earth (~9.5 billion total). With all of these new folks, it’s easy to think about the added demands of energy, food, and water required to sustain their lifestyles. And in terms of climate warming, it’s hard to escape the fact that significantly greater energy consumption will lead to rising rates of carbon emissions, unless there’s a shift to decarbonize the economy.
In this week’s early Edition of the Proceedings of the National Academy of Sciences (open access), Brian O’Neill and colleagues note that emissions are not just controlled by the sheer size of the human population but also by important demographic changes.
For example, how might an aging or more urban population affect emissions? How about changes in household size? Modelers of carbon emissions don’t usually ask these kinds of questions, so the conventionally projected emissions might be off if these additional demographic details matter.
The researchers developed a global economic model (Population-Environment-Technology, or PET) in which they specified relationships between demographic factors like household size, age, and urban/rural residency and economic factors like the demand for consumer goods, wealth, and the supply of labor. Here’s a bit more on how this works:
In the PET model, households can affect emissions either directly through their consumption patterns or indirectly through their effects on economic growth in ways that up until now have not been explicitly accounted for in emissions models. The direct effect on emissions is represented by disaggregating household consumption for each household type into four categories of goods (energy, food, transport, and other) so that shifts in the composition of the population by household type produce shifts in the aggregate mix of goods demanded. Because different goods have different energy intensities of production, these shifts can lead to changes in emissions rates. To represent indirect effects on emissions through economic growth, the PET model explicitly accounts for the effect of (i) population growth rates on economic growth rates, (ii) age structure changes on labor supply, (iii) urbanization on labor productivity, and (iv) anticipated demographic change (and its economic effects) on savings and consumption behavior.
Although there are some exceptions, households that are older, larger, or more rural tend to have lower per capita labor supply than those that are younger, smaller, or more urban. Lower-income households (e.g., rural households in developing countries) spend a larger share of income on food and a smaller share on transportation than higher-income households. Although labor supply and preferences can be influenced by a range of nondemographic factors, our scenarios focus on capturing the effects of shifts in population across types of households.
To project these demographic trends, we use the high, medium, and low scenarios of the United Nations (UN) 2003 Long-Range World Population Projections combined with the UN 2007 Urbanization Prospects extended by the International Institute for Applied Systems Analysis (IIASA) and derive population by age, sex, and rural/urban residence for the period of 2000–2100.
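The "direct effect" described in the excerpt, where shifting population across household types changes the aggregate mix of goods demanded and hence emissions, can be illustrated with a toy calculation. All of the shares and intensities below are invented for illustration; they are not values from the PET model:

```python
# Toy sketch of the direct household effect: each household type demands a
# different mix of goods, and goods differ in energy (emissions) intensity,
# so shifting population across types shifts aggregate emissions.
# All numbers are illustrative, not from the paper.

# Emissions intensity of each goods category (arbitrary units per unit spent)
intensity = {"energy": 2.0, "food": 0.5, "transport": 1.5, "other": 0.3}

# Consumption mix (share of spending) by household type
mix = {
    "young_urban": {"energy": 0.15, "food": 0.20, "transport": 0.35, "other": 0.30},
    "older_rural": {"energy": 0.25, "food": 0.40, "transport": 0.15, "other": 0.20},
}

def emissions(pop_by_type, spend_per_hh=30.0):
    """Aggregate emissions given a count of households by type."""
    total = 0.0
    for htype, n_households in pop_by_type.items():
        for good, share in mix[htype].items():
            total += n_households * spend_per_hh * share * intensity[good]
    return total

# Same total number of households, different composition -> different emissions
print(emissions({"young_urban": 60, "older_rural": 40}))
print(emissions({"young_urban": 40, "older_rural": 60}))
```

Even with identical totals, the two compositions yield different aggregate emissions, which is exactly why the authors argue demographic detail belongs in emissions models.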
What did they find?
Tuesday, October 5th, 2010
When we think of human population change and resource use, it’s easy to assume that more people will consume more resources, such as water, energy, and food. An important corollary is that resource limitations will limit population growth. Thomas Malthus was perhaps the most influential proponent of this idea.
However, several factors complicate this story:
(1) Affluence is a multiplier such that more people in a wealthy, high-consumption society lead to a disproportionate use of resources compared to people in poor countries. As my recent article on global change in Nature Knowledge shows,
the populations of China and India are roughly 1.32 and 1.14 billion people, respectively — about four times that of the US. However, the energy consumption per person in the US is six times larger than that of a person in China, and 15 times that of a person in India. Because the demand for resources like energy is often greater in wealthy, developed nations like the US, this means that countries with smaller populations can actually have a greater overall environmental impact. Over much of the past century, the US was the largest greenhouse gas emitter because of high levels of affluence and energy consumption. In 2007, China overtook the US in terms of overall CO2 emissions as a result of economic development, increasing personal wealth, and the demand for consumer goods, including automobiles.
(2) Interestingly, resource limitations may actually inhibit our ability to slow population growth. Yes, you read that right. A new paper by John DeLong and colleagues in this week’s PLOS One (open access) argues exactly this. Here’s why:
Influential demographic projections suggest that the global human population will stabilize at about 9–10 billion people by mid-century. These projections rest on two fundamental assumptions. The first is that the energy needed to fuel development and the associated decline in fertility will keep pace with energy demand far into the future. The second is that the demographic transition is irreversible such that once countries start down the path to lower fertility they cannot reverse to higher fertility. Both of these assumptions are problematic and may have an effect on population projections. Here we examine these assumptions explicitly. Specifically, given the theoretical and empirical relation between energy-use and population growth rates, we ask how the availability of energy is likely to affect population growth through 2050. Using a cross-country data set, we show that human population growth rates are negatively related to per-capita energy consumption, with zero growth occurring at ~13 kW, suggesting that the global human population will stop growing only if individuals have access to this amount of power. Further, we find that current projected future energy supply rates are far below the supply needed to fuel a global demographic transition to zero growth, suggesting that the predicted leveling-off of the global population by mid-century is unlikely to occur, in the absence of a transition to an alternative energy source. Direct consideration of the energetic constraints underlying the demographic transition results in a qualitatively different population projection than produced when the energetic constraints are ignored. We suggest that energetic constraints be incorporated into future population projections.
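A stylized version of the relationship in the abstract, in which the population growth rate declines linearly with per-capita power use and reaches zero at about 13 kW, might look like this. The 13 kW zero-crossing comes from the abstract; the 3%/yr intercept is an invented illustration, not the paper's fitted value:

```python
# Stylized DeLong et al. relationship: annual population growth rate
# declines linearly with per-capita power use, hitting zero at ~13 kW.
E_ZERO_GROWTH = 13.0     # kW per person at zero growth (from the abstract)
R_AT_ZERO_ENERGY = 0.03  # illustrative growth rate (3%/yr) at negligible energy use

def growth_rate(energy_kw):
    """Annual population growth rate as a linear function of per-capita power (kW)."""
    return R_AT_ZERO_ENERGY * (1.0 - energy_kw / E_ZERO_GROWTH)

# Illustrative points: low-income, middle-income, high-income, and the zero-growth level
for e in (0.3, 2.5, 10.0, 13.0):
    print(f"{e:5.1f} kW -> {growth_rate(e):+.3%} per year")
```

The paper's point follows directly from this shape: if projected energy supply per person stays well below 13 kW, the modeled growth rate never reaches zero, and the population keeps growing past mid-century.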
I love these kinds of unexpected outcomes that make us think more critically about simplified assumptions when it comes to the drivers and impacts of global change.
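The affluence-multiplier arithmetic in point (1) can be checked back-of-the-envelope. The populations and per-capita ratios below come from the excerpt; expressing per-capita energy use in relative units (China = 1) is my own framing:

```python
# Total impact = population x per-capita consumption.
# Populations in billions (approximate figures from the text); per-capita
# energy use in relative units, with the US at 6x China and 15x India.
population = {"US": 0.31, "China": 1.32, "India": 1.14}
per_capita = {"US": 6.0, "China": 1.0, "India": 6.0 / 15.0}

total = {c: population[c] * per_capita[c] for c in population}
for country, t in sorted(total.items(), key=lambda kv: -kv[1]):
    print(f"{country:6s} total ~ {t:.2f} (relative units)")
```

With these round energy-use numbers the US total comes out above China's despite having roughly a quarter of the population; as the excerpt notes, China's overall CO2 emissions nonetheless overtook the US in 2007 as its economy grew.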
DeLong, J., Burger, O., & Hamilton, M. (2010). Current Demographics Suggest Future Energy Supplies Will Be Inadequate to Slow Human Population Growth. PLoS ONE, 5(10). DOI: 10.1371/journal.pone.0013206
Photo credit: wili_hybrid
Monday, October 4th, 2010
The NY Times is running a story, Military Orders Less Dependence on Fossil Fuels, that evaluates the American military’s role in green energy innovation:
Even as Congress has struggled unsuccessfully to pass an energy bill and many states have put renewable energy on hold because of the recession, the military this year has pushed rapidly forward. After a decade of waging wars in remote corners of the globe where fuel is not readily available, senior commanders have come to see overdependence on fossil fuel as a big liability, and renewable technologies — which have become more reliable and less expensive over the past few years — as providing a potential answer. These new types of renewable energy now account for only a small percentage of the power used by the armed forces, but military leaders plan to rapidly expand their use over the next decade.
“There are a lot of profound reasons for doing this, but for us at the core it’s practical,” said Ray Mabus, the Navy secretary and a former ambassador to Saudi Arabia, who has said he wants 50 percent of the power for the Navy and Marines to come from renewable energy sources by 2020. That figure includes energy for bases as well as fuel for cars and ships.
While setting national energy policy requires Congressional debates, military leaders can simply order the adoption of renewable energy. And the military has the buying power to create products and markets. That, in turn, may make renewable energy more practical and affordable for everyday uses, experts say.
Last year, the Navy introduced its first hybrid vessel, a Wasp class amphibious assault ship called the U.S.S. Makin Island, which at speeds under 10 knots runs on electricity rather than on fossil fuel, a shift resulting in greater efficiency that saved 900,000 gallons of fuel on its maiden voyage from Mississippi to San Diego, compared with a conventional ship its size, the Navy said.
The Air Force will have its entire fleet certified to fly on biofuels by 2011 and has already flown test flights using a 50-50 mix of plant-based biofuel and jet fuel; the Navy took its first delivery of fuel made from algae this summer. Biofuels can in theory be produced wherever the raw materials, like plants, are available, and could ultimately be made near battlefields.
Photo credit: Shortbread1015DT
Friday, September 24th, 2010
Mike Berners-Lee and Duncan Clark at The Guardian have a recent post in their series examining the carbon footprints of daily-life activities. Their post asks how much carbon is emitted, directly and indirectly, in building a car.
The carbon footprint of making a car is immensely complex. Ores have to be dug out of the ground and the metals extracted. These have to be turned into parts. Other components have to be brought together: rubber tyres, plastic dashboards, paint, and so on. All of this involves transporting things around the world. The whole lot then has to be assembled, and every stage in the process requires energy. The companies that make cars have offices and other infrastructure with their own carbon footprints, which we need to somehow allocate proportionately to the cars that are made.
…The best we can do is use so-called input-output analysis to break up the known total emissions of the world or a country into different industries and sectors, in the process taking account of how each industry consumes the goods and services of all the others. If we do this, and then divide the total emissions of the auto industry by the total amount of money spent on new cars, we reach a footprint of 720kg CO2e per £1000 spent.
This is only a guideline figure, of course, as some cars may be more efficiently produced than others of the same price. But it’s a reasonable ballpark estimate, and it suggests that cars have much bigger footprints than is traditionally believed. Producing a medium-sized new car costing £24,000 may generate more than 17 tonnes of CO2e – almost as much as three years’ worth of gas and electricity in the typical UK home.
17 (metric) tons is 17,000 kg or about 37,400 pounds. The U.S. EPA estimates that the average passenger vehicle in the U.S. emits 5-5.5 metric tons CO2e per year, assuming 12,000 miles driven.
If you do the math, this means the embodied CO2e emissions to make a car are about 3-3.5 years’ worth of tailpipe emissions from driving. Assuming that most people own their cars for longer than three years, this figure doesn’t jibe with what the authors claim:
The upshot is that – despite common claims to contrary – the embodied emissions of a car typically rival the exhaust pipe emissions over its entire lifetime. Indeed, for each mile driven, the emissions from the manufacture of a top-of-the-range Land Rover Discovery that ends up being scrapped after 100,000 miles may be as much as four times higher than the tailpipe emissions of a Citroen C1.
If people held onto their cars for 10 years (assuming 120,000 miles), tailpipe emissions would equal 50 metric tons of CO2e, and embodied emissions would be about 34% of tailpipe emissions. If people drove their cars for 20 years (assuming 240,000 miles), the exhaust emissions would rise to 100 metric tons CO2e, with embodied emissions dropping to 17% of tailpipe emissions.
Most folks generally agree with the notion of driving their vehicle into the ground (as my recently dead 16-yr-old truck illustrates), but you’d have to be driving a Toyota Prius to get a lifetime tailpipe emission that equals the embodied emissions of building it (assuming that a Prius achieves three times the mpg of a typical car, which would drop CO2e tailpipe emissions from 5 to 1.7 metric tons per year, making a 10-year total tailpipe emission of 17 metric tons reasonable).
Thus, if you drive an average car for 10 years, your lifetime tailpipe emissions (50 metric tons) will be a lot larger than the embodied emissions to build the car (17 metric tons) (for a total emission of 67 metric tons). If you drive a hyper-efficient vehicle for 10 years, tailpipe and embodied emissions may be comparable (17 metric tons each, 34 metric tons total). This means you could buy a new Prius every three years, and the embodied emissions from all of these purchases plus tailpipe emissions would roughly equal a normal car driven for 10 years.
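The arithmetic above can be collected into a few lines. The 720 kg CO2e per £1000 and £24,000 car price come from the Guardian piece, and the 5 tons/yr of tailpipe emissions from the EPA figure quoted earlier; the `mpg_multiplier` knob is my shorthand for a more efficient vehicle:

```python
# Embodied vs. tailpipe emissions for a car, using the post's numbers.
EMBODIED = 0.72 * 24       # tons CO2e: 720 kg per 1000 GBP x 24,000 GBP car (~17.3)
TAILPIPE_PER_YEAR = 5.0    # tons CO2e/yr, EPA average at ~12,000 miles/yr

def lifetime_split(years, mpg_multiplier=1.0):
    """Return (tailpipe, embodied, embodied as a share of tailpipe) over `years`.

    mpg_multiplier scales fuel economy: 3.0 means a car three times as
    efficient as the average, cutting tailpipe emissions to a third.
    """
    tailpipe = TAILPIPE_PER_YEAR * years / mpg_multiplier
    return tailpipe, EMBODIED, EMBODIED / tailpipe

print(lifetime_split(10))                      # average car, 10 years: ~(50, 17.3, 0.35)
print(lifetime_split(20))                      # average car, 20 years: ~(100, 17.3, 0.17)
print(lifetime_split(10, mpg_multiplier=3.0))  # Prius-like car, 10 years: tailpipe ~17
```

The three cases reproduce the post's numbers: embodied emissions are ~34% of tailpipe over 10 years, ~17% over 20, and roughly equal to tailpipe for a Prius-like car over 10 years.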
This raises an important question: What matters here? If the goal is to reduce total emissions, the best thing is to buy a car with a very high fuel efficiency and drive it for its full life, as the above examples illustrate.
Photo credit: atomicshark
Sunday, September 19th, 2010
Laura Miller at Salon reviews a new book out this week by Judy Pasternak titled, “Yellow Dirt: An American Story of a Poisoned Land and a People Betrayed.” A few excerpts from Miller’s analysis:
In the summer of 1979, an earthen dam above the town of Church Rock, New Mexico, broke, flooding the arroyo below and then the bed of the Rio Puerco (an intermittent stream) on the southern border of the Navajo Nation. It was a small flood, but a dangerous one. It burned the feet of a boy who stepped into it, and caused sheep and crops along the banks to drop dead. That’s because the pond it came from had been used by a nearby uranium mine to store the tailings (residue) of its excavations — the water kept the radioactive dust from blowing away. The 93 million gallons of contaminated water that poured into the Rio Puerco remains the largest accidental release of radioactive material in U.S. history, bigger than the notorious Three Mile Island reactor meltdown that had occurred earlier that same year.
The Church Rock flood is only one incident among many in the “slow-motion disaster” investigative journalist Judy Pasternak comprehensively recounts in her chilling new book, “Yellow Dirt: An American Story of a Poisoned Land and a People Betrayed.” Based on a prize-winning four-part series she wrote for the Los Angeles Times, “Yellow Dirt” begins during World War II, when secretive government surveyors first appeared on the remote reservation, supposedly looking for deposits of an ore called vanadium, used to strengthen steel needed for the war effort. Uranium was the real prize, and after the bombings of Hiroshima and Nagasaki and the ramping up of the Cold War, the American demand for the radioactive substance boomed.
The Navajo Nation and the area around it contained some of the richest deposits of uranium ore in the world, and certainly the most conveniently located. For about a decade, various corporations and government agencies reaped 1.4 million tons of uranium ore from the Monument Valley region alone; Pasternak makes a single mine there, known as Monument No. 2, her primary focus. The mining operations were relatively rudimentary, and by ordination of the tribal government, worked almost entirely by Navajo men. Even the cheapest and most elementary safety practices, such as wetting down blast areas to keep the miners from breathing toxic dust, were neglected in the rush to satisfy the Atomic Energy Commission’s insatiable appetite for uranium.
By the 1960s, the need tapered off, and the mining companies blithely abandoned the sites, leaving piles of radioactive tailings lying around for Navajo kids to play on and their parents to scavenge for conveniently sized rocks with which to build houses, ovens and cisterns. The dust and gravel made seemingly excellent concrete for floors. Monument No. 2, once a mesa, had been nearly leveled, its uranium-laced innards exposed to the open air, reduced to what Pasternak characterizes as a “radioactive pit.” Old quarries filled up with rain- and groundwater, new “lakes” from which local residents watered their herds and gratefully drank.
The next boom, unsurprisingly, was in cancer rates (previously so low among the Navajo that they were thought to be miraculously immune to the disease), and in a birth defect, christened “Navajo neuropathy,” that caused children’s fingers to fuse together and curl into claws. Still, it took decades for the cause to be fully recognized and even longer for it to be addressed; it wasn’t until 2008, under the lashing of Rep. Henry Waxman, that the federal government made serious efforts to clean up the mine sites, purify water supplies and relocate families living in houses built from radioactive materials.
Read the rest of the review here.
Photo Credit: Christopher Isherwood
Saturday, September 11th, 2010
Here’s an interesting thought question: How much would global temperature warm if we used only the existing energy infrastructure (i.e., power plants, furnaces, motor vehicles) until these machines reached the end of their useful lives? Once they died, they would be replaced by devices that did not emit CO2.
Steven Davis and colleagues addressed this question in the current issue of Science:
We calculated cumulative future emissions of 496 (282 to 701 in lower- and upper-bounding scenarios) gigatonnes of CO2 from combustion of fossil fuels by existing infrastructure between 2010 and 2060, forcing mean warming of 1.3°C (1.1° to 1.4°C) above the pre-industrial era and atmospheric concentrations of CO2 less than 430 parts per million. Because these conditions would likely avoid many key impacts of climate change, we conclude that sources of the most threatening emissions have yet to be built. However, CO2-emitting infrastructure will expand unless extraordinary efforts are undertaken to develop alternatives.
Their analysis suggests that CO2 emissions would decline linearly from 35 gigatons/year in 2010 to less than 5 gigatons/year in 2050, with the majority of the remainder being non-energy emissions from things like cement manufacture and land use changes.
On a personal level, this would mean replacing your current furnace, car, and electricity sources with ones that emitted no CO2, so we’re talking upwards of 15-20 years for a personal vehicle, 20-30 years for a furnace, and 50+ years for power stations, depending on the age of these items. The average power plant age in the U.S. is 32 years compared to 12 years in China and 21 and 27 years in Japan and Europe.
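The "commit then retire" logic behind these numbers can be sketched as a toy calculation: each class of existing devices keeps emitting at its current rate until it reaches the end of its life, then stops. The breakdown below is entirely made up, with the rates and lifetimes picked only so the total lands near the paper's central ~500 Gt estimate:

```python
# Toy committed-emissions calculation in the spirit of Davis et al.:
# if nothing new is built, each existing device class emits at its
# current rate until it retires. Numbers are rough illustrations.
fleet = [
    # (class, annual emissions in Gt CO2/yr, average remaining lifetime in years)
    ("power plants",       10.0, 25),
    ("road vehicles",       6.0, 12),
    ("buildings/furnaces",  3.0, 20),
    ("industry/other",      8.0, 15),
]

# Committed emissions: each class contributes rate x remaining lifetime
committed = sum(rate * life for _, rate, life in fleet)
print(f"Committed emissions from existing stock: {committed:.0f} Gt CO2")

# Annual path: emissions in year t come only from devices still alive
def annual_emissions(t):
    return sum(rate for _, rate, life in fleet if t < life)

print([annual_emissions(t) for t in (0, 10, 20, 30)])
```

Because shorter-lived stock like vehicles retires first, the emissions path steps downward over time and reaches zero once the longest-lived class (power plants) is gone, which is the basic shape behind the paper's declining trajectory.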
It’s encouraging to know that it may be possible to avert serious climate change without having to shut down existing infrastructure right away (especially long-lived fossil fuel power plants) but only if we plow significant funding into developing and implementing carbon-free technologies to replace them. However, Davis et al. acknowledge that this is a tall order:
[T]here is little doubt that more CO2-emitting devices will be built. Our analysis considers only devices that emit CO2 directly. Substantial infrastructure also exists to produce and facilitate use of these devices. For example, factories that produce internal combustion engines, highway networks dotted with gasoline refueling stations, and oil refineries all promote the continuation of oil-based road transport emissions. Moreover, satisfying growing demand for energy without producing CO2 emissions will require truly extraordinary development and deployment of carbon-free sources of energy, perhaps 30 TW by 2050. Yet avoiding key impacts of climate change depends on the success of efforts to overcome infrastructural inertia and commission a new generation of devices that can provide energy and transport services without releasing CO2 to the atmosphere.
Davis, S., Caldeira, K., & Matthews, H. (2010). Future CO2 Emissions and Climate Change from Existing Energy Infrastructure. Science, 329(5997), 1330-1333. DOI: 10.1126/science.1188566
Photo credit: Stuck in Customs