Guilford Harbor

Archive for the ‘technology’ Category


Another challenge to confront with geoengineering: Ocean plankton toxins

Wednesday, November 10th, 2010

There have been several critiques of geoengineering as a climate mitigation tool.  Two of the most incisive, in my opinion, come from science and ethics.

The first is a 2007 paper in PNAS by Matthews and Caldeira showing that if we deploy aerosol clouds or space reflectors while doing nothing to reduce carbon emissions, a failure of those systems would risk catastrophic rates of warming (2-4 degrees C per decade).

The second is a recent piece in Slate by my colleague, Dale Jamieson, who argued that no one has the moral or legal authority to decide how, when, or by how much geoengineering should be deployed.

One proposed geoengineering tool is fertilizing the world’s oceans with iron.  The premise behind this idea was developed by John Martin in 1990, who is often quoted as saying something like, “Give me a tanker of iron, and I’ll give you an ice age.” Micronutrients like iron and zinc are extremely limiting to phytoplankton growth in the open ocean—orders of magnitude more so than the nutrients we typically think of in common fertilizers, like nitrogen and phosphorus.  Dumping iron into the oceans has been shown to stimulate algal blooms, and the creation of this biomass draws CO2 out of the surface waters and atmosphere, thereby helping to mitigate rising CO2 from fossil fuels.  In theory, some of this biomass should sink to the deep ocean, where it would be sequestered for centuries, but this has yet to be shown definitively at a large scale.

In a forthcoming paper in the Proceedings of the National Academy of Sciences, Mary Silver and colleagues show that there is another potential risk of geoengineering resulting from ocean iron fertilization…


Posted in climate adaptation, climate change science, geoengineering, solutions, technology, toxics | 1 Comment »

Disconnect: The latest warning on cell phone radiation and health

Sunday, October 10th, 2010


Thomas Rogers at Salon.com has a review of Devra Davis’ new book, “Disconnect: The Truth About Cell Phone Radiation, What the Industry Has Done to Hide It, and How to Protect Your Family.”

The apparent bottom line for cell phone safety:

  • Use texting instead of voice calling.
  • Use an earpiece if you must voice call.
  • Keep your cell phone at least an inch away from your body at all times while it’s on.

The full article is worth reading.  Below are a few excerpts of the review and Rogers’ interview with Davis:

In “Disconnect,” Devra Davis, a scientist and National Book Award finalist for “When Smoke Ran Like Water,” looks at the connection between cellphones and health problems, with some disturbing results. Recent studies have tied cellphone use to rises in brain damage, cheek cancer and malfunctioning sperm. She reveals the unsettling fact that many new cellphones now come with the small-print warning that they are to be kept at least one-inch from the ear (presumably for safety reasons) and many insurance companies refuse to insure cellphone companies against health-related claims. Most troubling of all, science has shown that children and teenagers are particularly susceptible to cellphone radiation, raising questions about its effects on coming generations.

What to you is the most compelling evidence that links cellphones to brain cancer?

The brain cancer connection is in fact a very complicated one. Cancer can take a long time to develop. After the Hiroshima bomb fell, there was no increase in brain cancer for 10 years, even 20 years afterward. Forty years later, there was a significant increase in brain cancer in people who survived the bombing. Now, for studies of people who have been heavy cellphone users (defined as someone who has made a half-hour call a day for 10 years), there is a 50 percent increase in brain cancer overall. And among the heaviest users there’s a two- to fourfold increased risk.

We’ve only really been using cellphones for 10 years. Isn’t it a bit early to be drawing these kinds of conclusions?

Well, that’s actually not true. Heavy use of cellphones in the United States is a very recent phenomenon for the general population. In the year 2000, fewer than half of us regularly used cellphones. Now almost all of us do. If there’s a 10-year latency, we still have to wait another five years in the United States to see any general population impacts.

You have to look at all of the evidence and not simply wait for proof of human harm or sick people or dead people. If the debate becomes, “Do we have sufficient proof of human harm?” that means we’re waiting another 20 years. That means we will potentially have an epidemic before we act to prevent harm. Now, some people could be very cynical and say, look, brain cancer is relatively rare so even if it doubles or quadruples it’s still rare. But it’s also, at this point, mostly incurable.

Why are young people so much more at risk?

Their brains are not fully protected with myelin. Myelin is a kind of fatty sheath that goes around neurons [brain cells] and helps to enhance judgment and a whole bunch of other things, like impulse control. Their skulls are also thinner, and a thinner skull admits more radiation. We now know that the young brain doesn’t mature until the mid-20s, later in boys than girls. We need to be much more vigilant about protecting the young brain because it is more vulnerable. We know that from work that’s been done on lead and a number of other agents.

If this research is really as convincing as it seems to be, then why hasn’t it created a widespread uproar?

Well, it has in France. Bills passed both houses of the French national government this spring that ban the marketing and creation of phones uniquely for children. It’s also had an impact in Israel, a country that is very sophisticated in its use of radar and microwaves, and Finland, both of which have issued warnings.

But think about the fine print warning that comes with BlackBerry Torch. It says, If you keep the phone in your pocket, it can exceed the FCC exposure guidelines. What’s that supposed to tell you? It sounds like that phone cannot safely be put in your pocket — well, where do they expect people to keep them?

….The book also describes the aggressive push-back by people affiliated with the cellphone industry against scientists whose findings point to safety concerns — including, in one case, a campaign to discredit someone’s findings by accusing them of manufacturing evidence. It’s pretty explosive stuff.

I think it might have started out as nothing more than companies wanting to make profits, and wanting to keep their products in a positive light. Companies are allowed to make profits; I’m not opposed to that. And I imagine people genuinely thought these kinds of dangers from radiation weren’t possible, because the physics paradigm [at the time] said it wasn’t. But it has since been morphed into something worse. Now even the insurance industry is listening to scientists. Many companies are no longer providing coverage for health damage from cellphones.

We need to be more sophisticated as a society in using experimental data where we have it. We have experimental data on sperm counts. We have experimental data on brain cell damage. We have experimental data on biological markers that we know increase the risk of cancer. These are the same debates that went out over passive smoking, over active smoking, over asbestos, over benzene, over vinyl chloride. They said we don’t have enough sick or dead people. The consequence was to continue exposing people. Is there anybody in the world who believes we should have waited as long as we did?

Read more.

___

Photo credit: liber

Posted in health, risk analysis, technology | No Comments »

The drive to green energy, courtesy of…the military?

Monday, October 4th, 2010

The NY Times is running a story, Military Orders Less Dependence on Fossil Fuels, that evaluates the American military’s role in green energy innovation:

Even as Congress has struggled unsuccessfully to pass an energy bill and many states have put renewable energy on hold because of the recession, the military this year has pushed rapidly forward. After a decade of waging wars in remote corners of the globe where fuel is not readily available, senior commanders have come to see overdependence on fossil fuel as a big liability, and renewable technologies — which have become more reliable and less expensive over the past few years — as providing a potential answer. These new types of renewable energy now account for only a small percentage of the power used by the armed forces, but military leaders plan to rapidly expand their use over the next decade.

“There are a lot of profound reasons for doing this, but for us at the core it’s practical,” said Ray Mabus, the Navy secretary and a former ambassador to Saudi Arabia, who has said he wants 50 percent of the power for the Navy and Marines to come from renewable energy sources by 2020. That figure includes energy for bases as well as fuel for cars and ships.

While setting national energy policy requires Congressional debates, military leaders can simply order the adoption of renewable energy. And the military has the buying power to create products and markets. That, in turn, may make renewable energy more practical and affordable for everyday uses, experts say.

Last year, the Navy introduced its first hybrid vessel, a Wasp class amphibious assault ship called the U.S.S. Makin Island, which at speeds under 10 knots runs on electricity rather than on fossil fuel, a shift resulting in greater efficiency that saved 900,000 gallons of fuel on its maiden voyage from Mississippi to San Diego, compared with a conventional ship its size, the Navy said.

The Air Force will have its entire fleet certified to fly on biofuels by 2011 and has already flown test flights using a 50-50 mix of plant-based biofuel and jet fuel; the Navy took its first delivery of fuel made from algae this summer. Biofuels can in theory be produced wherever the raw materials, like plants, are available, and could ultimately be made near battlefields.

Read more.

___

Photo credit: Shortbread1015DT

Posted in biofuels, conflict, energy, solutions, technology, transportation | No Comments »

What’s the carbon footprint of building your car, and how does that compare to tailpipe emissions?

Friday, September 24th, 2010

Mike Berners-Lee and Duncan Clark at The Guardian have a recent post in their series examining the carbon footprints of everyday activities.  This one asks how much CO2 is emitted, directly and indirectly, in building a car.

The carbon footprint of making a car is immensely complex. Ores have to be dug out of the ground and the metals extracted. These have to be turned into parts. Other components have to be brought together: rubber tyres, plastic dashboards, paint, and so on. All of this involves transporting things around the world. The whole lot then has to be assembled, and every stage in the process requires energy. The companies that make cars have offices and other infrastructure with their own carbon footprints, which we need to somehow allocate proportionately to the cars that are made.

….The best we can do is use so-called input-output analysis to break up the known total emissions of the world or a country into different industries and sectors, in the process taking account of how each industry consumes the goods and services of all the others. If we do this, and then divide the total emissions of the auto industry by the total amount of money spent on new cars, we reach a footprint of 720kg CO2e per £1000 spent.

This is only a guideline figure, of course, as some cars may be more efficiently produced than others of the same price. But it’s a reasonable ballpark estimate, and it suggests that cars have much bigger footprints than is traditionally believed. Producing a medium-sized new car costing £24,000 may generate more than 17 tonnes of CO2e – almost as much as three years’ worth of gas and electricity in the typical UK home.

17 (metric) tons is 17,000 kg or about 37,400 pounds.   The U.S. EPA estimates that the average passenger vehicle in the U.S. emits 5-5.5 metric tons CO2e per year, assuming 12,000 miles driven.
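
As a quick check of that arithmetic, here is a minimal sketch in Python using only the figures quoted above (the £24,000 price, the 720 kg CO2e per £1,000 intensity, and the EPA’s 5-5.5 metric tons per year); none of these numbers are new data.

```python
# Back-of-the-envelope check using only the figures quoted above.
embodied_kg_per_gbp = 720 / 1000       # 720 kg CO2e per 1,000 pounds spent (Guardian)
car_price_gbp = 24_000                 # medium-sized new car in the example
tailpipe_t_per_year = (5.0, 5.5)       # EPA: average U.S. vehicle, ~12,000 miles/yr

embodied_t = embodied_kg_per_gbp * car_price_gbp / 1000   # kg -> metric tons
print(f"Embodied emissions: {embodied_t:.1f} t CO2e")      # ~17.3 t

for t in tailpipe_t_per_year:
    print(f"Equivalent to {embodied_t / t:.1f} years of tailpipe emissions at {t} t/yr")
    # ~3.1 to 3.5 years
```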

Doing the math, the embodied CO2e emissions of making a car come to about 3-3.5 years’ worth of tailpipe emissions from driving.  Assuming that most people own their cars for longer than three years, this figure doesn’t jibe with what the authors claim:

The upshot is that – despite common claims to contrary – the embodied emissions of a car typically rival the exhaust pipe emissions over its entire lifetime. Indeed, for each mile driven, the emissions from the manufacture of a top-of-the-range Land Rover Discovery that ends up being scrapped after 100,000 miles may be as much as four times higher than the tailpipe emissions of a Citroen C1.

If people held onto their cars for 10 years (assuming 120,000 miles), tailpipe emissions would equal 50 metric tons of CO2e, and embodied emissions would be about 34% of tailpipe emissions.  If people drove their cars for 20 years (assuming 240,000 miles), the exhaust emissions would rise to 100 metric tons CO2e, with embodied emissions dropping to 17% of tailpipe emissions.

While most folks generally agree with the notion of driving their vehicle into the ground (as my recently dead 16-year-old truck illustrates), you’d have to be driving a Toyota Prius for your lifetime tailpipe emissions to equal the embodied emissions of building the car.  (Assuming a Prius achieves three times the mpg of a typical car, tailpipe emissions drop from 5 to about 1.7 metric tons CO2e per year, making a 10-year tailpipe total of about 17 metric tons.)

Thus, if you drive an average car for 10 years, your lifetime tailpipe emissions (50 metric tons) will be a lot larger than the embodied emissions to build the car (17 metric tons) (for a total emission of 67 metric tons).  If you drive a hyper-efficient vehicle for 10 years, tailpipe and embodied emissions may be comparable (17 metric tons each, 34 metric tons total).  This means you could buy a new Prius every three years, and the embodied emissions from all of these purchases plus tailpipe emissions would roughly equal a normal car driven for 10 years.
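
To put those scenarios side by side, here is a minimal sketch using the same rounded figures (17 tons embodied, 5 tons per year for an average car, roughly a third of that for a Prius); this is illustrative arithmetic, not lifecycle data.

```python
# Rough ownership scenarios using the rounded figures from the text.
EMBODIED_T = 17                      # t CO2e to build one car (Guardian ballpark)
AVG_TAILPIPE = 5.0                   # t CO2e/yr, average car (EPA, ~12,000 miles/yr)
PRIUS_TAILPIPE = AVG_TAILPIPE / 3    # assumes ~3x the fuel economy of an average car

def lifetime_emissions(cars_bought, years, tailpipe_per_year):
    """Embodied emissions for every car purchased plus tailpipe emissions over the period."""
    return cars_bought * EMBODIED_T + years * tailpipe_per_year

print(lifetime_emissions(1, 10, AVG_TAILPIPE))     # ~67 t: one average car kept 10 years
print(lifetime_emissions(1, 10, PRIUS_TAILPIPE))   # ~34 t: one Prius kept 10 years
print(lifetime_emissions(3, 10, PRIUS_TAILPIPE))   # ~68 t: a new Prius roughly every 3 years
```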

This raises an important question:  What matters here?  If the goal is to reduce total emissions, the best thing is to buy a car with a very high fuel efficiency and drive it for its full life, as the above examples illustrate.

___

Photo credit:  atomicshark

Posted in behavior, carbon footprint, climate change science, energy, solutions, sustainability, technology, transportation | 3 Comments »

How much would climate change if we used existing infrastructure to the end of its life?

Saturday, September 11th, 2010

Here’s an interesting thought question:  How much would global temperature warm if we used only the existing energy infrastructure (i.e., power plants, furnaces, motor vehicles) until these machines reached the end of their useful lives?  Once they died, they would be replaced by devices that did not emit CO2.

Steven Davis and colleagues addressed this question in the current issue of Science:

We calculated cumulative future emissions of 496 (282 to 701 in lower- and upper-bounding scenarios) gigatonnes of CO2 from combustion of fossil fuels by existing infrastructure between 2010 and 2060, forcing mean warming of 1.3°C (1.1° to 1.4°C) above the pre-industrial era and atmospheric concentrations of CO2 less than 430 parts per million. Because these conditions would likely avoid many key impacts of climate change, we conclude that sources of the most threatening emissions have yet to be built. However, CO2-emitting infrastructure will expand unless extraordinary efforts are undertaken to develop alternatives.

Their analysis suggests that CO2 emissions would decline linearly from 35 gigatons/year in 2010 to less than 5 gigatons/year in 2050, with the majority of the remainder being non-energy emissions from things like cement manufacture and land use changes.
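
The basic bookkeeping behind a “committed emissions” estimate like this is straightforward: tally the current annual emissions of each class of existing device and multiply by its expected remaining lifetime.  Here is a minimal illustration of that idea; the device classes, emission rates, and lifetimes below are placeholder assumptions, not the values used by Davis et al.

```python
# Illustrative committed-emissions tally: annual emissions x remaining lifetime,
# summed over existing device classes. All numbers are placeholders, not data
# from Davis et al. (2010).

existing_devices = {
    # device class: (annual CO2 emissions in Gt, assumed remaining lifetime in years)
    "coal power plants":   (9.0, 30),
    "gas power plants":    (3.0, 25),
    "passenger vehicles":  (3.5, 12),
    "industrial furnaces": (2.0, 20),
}

committed_gt = sum(annual * years for annual, years in existing_devices.values())
print(f"Committed CO2 from existing devices: {committed_gt:.0f} Gt")
# A more careful estimate would retire capacity gradually (a declining emissions
# curve) rather than assume a flat rate until a hard cutoff, which is one reason
# the paper reports a range of scenarios.
```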

On a personal level, this would mean replacing your current furnace, car, and electricity sources with ones that emit no CO2, so we’re talking upwards of 15-20 years for a personal vehicle, 20-30 years for a furnace, and 50+ years for power stations, depending on the age of these items.  The average power plant in the U.S. is 32 years old, compared with 12 years in China, 21 in Japan, and 27 in Europe.

It’s encouraging to know that it may be possible to avert serious climate change without shutting down existing infrastructure right away (especially long-lived fossil fuel power plants), but only if we plow significant funding into developing and deploying carbon-free replacements.  However, Davis et al. acknowledge that this is a tall order:

[T]here is little doubt that more CO2-emitting devices will be built. Our analysis considers only devices that emit CO2 directly. Substantial infrastructure also exists to produce and facilitate use of these devices. For example, factories that produce internal combustion engines, highway networks dotted with gasoline refueling stations, and oil refineries all promote the continuation of oil-based road transport emissions. Moreover, satisfying growing demand for energy without producing CO2 emissions will require truly extraordinary development and deployment of carbon-free sources of energy, perhaps 30 TW by 2050. Yet avoiding key impacts of climate change depends on the success of efforts to overcome infrastructural inertia and commission a new generation of devices that can provide energy and transport services without releasing CO2 to the atmosphere.

Davis, S., Caldeira, K., & Matthews, H. (2010). Future CO2 Emissions and Climate Change from Existing Energy Infrastructure. Science, 329(5997), 1330-1333. DOI: 10.1126/science.1188566
___

Photo credit: Stuck in Customs

Posted in climate change science, energy, solutions, sustainable development, technology | 1 Comment »

Homes of the future?

Saturday, September 11th, 2010

The NYT is running an op-ed by Bob Dunay and Joseph Wheeler (Virginia Tech) about a new, award-winning home design that challenges people to re-think their conception of the built environment:

Will our children’s homes be anything as comfortable and expansive as our own?

The answer is yes—though it depends on how you frame the question. Our children probably won’t be able to afford to run conventional air conditioners all day long. Nor will they likely have access to unlimited water supplies, particularly in the parched Southwest. But that doesn’t mean they have to live without the same quality of life that their parents and grandparents have grown accustomed to. The key is to use smart planning and technological advances to not merely adapt the home, but rethink its most basic design and function.

To demonstrate what such a house might look like, our team of professors and students at Virginia Tech designed and built Lumenhaus. With functional spaces and a modest size that allows for efficient energy use, Lumenhaus won the 2010 Solar Decathlon Europe, a competition that brought together 17 college teams from around the world in Madrid.

Check out the film about this house and the interesting interactive feature.

Posted in nature and culture, solutions, sustainability, sustainable architecture, technology | No Comments »

City dwellers of the future: Urban heat island warming may be as large as doubling CO2

Monday, April 19th, 2010

I remember driving on a freeway in Phoenix after midnight in 1990.  The temperature was a cool 102 degrees F, hours after the city had broken its all-time heat record of 122 F that day.  Deserts are good at cooling off at night.  But with all of the built environment in Phoenix storing heat from the day, the sidewalks, roads, and even swimming pools felt like they were being heated.

We all have probably experienced urban heat islands—masses of dark asphalt and concrete that absorb solar radiation during the day and re-radiate it as heat well into the night.  The lack of water exacerbates the situation because there is little to no evaporative cooling.  Waste heat from cars, machines, air conditioners, and even human bodies also warms the air.  And the warmer it gets, the stronger the tendency to crank up the air conditioners, generating even more waste heat.

The problem is potentially large in areas like the Middle East, India, parts of Africa, and the American Southwest, where rapid urbanization in warm, dry environments has the potential to make some urban areas much warmer at night than surrounding rural areas.

In a forthcoming article in Geophysical Research Letters1, Mark McCarthy and colleagues at the UK Met Office Hadley Centre used a climate model to examine what climate might look like in a doubled-CO2 world and to calculate the added warming caused by urbanization and waste heat.

Their results were eye-opening:

  • Urban regions in places like the Arabian Peninsula, Iran, and India may experience night time warming by as much as 3-5 degrees C above and beyond that caused by doubled CO2 alone.
  • The number of hot nights per year (defined as nights warmer than the 99th percentile of nonurban areas; see the sketch after this list for how such a threshold is computed) increases in the following cities:
    • London: 1-2 hot nights now vs. up to 10 hot nights in 2050
    • Sydney: 1-2 hot nights now vs. up to 15 hot nights in 2050
    • Delhi: 5-10 hot nights now vs. up to 30 hot nights in 2050
    • Beijing: 3-6 hot nights now vs. up to 50 hot nights in 2050
    • Los Angeles: 8-12 hot nights now vs. up to 40 hot nights in 2050
    • Tehran: 20 hot nights now vs. up to 60 hot nights in 2050
    • Sao Paulo: <5 hot nights now vs. up to 80 hot nights in 2050
    • Lagos (Nigeria): <5 hot nights now vs. up to 150 hot nights in 2050
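
As referenced above, here is a minimal sketch of how a percentile-based “hot night” count can be computed from daily minimum temperatures.  The data here are synthetic and the 2.5 C urban offset is an arbitrary assumption; the paper’s counts come from model output, not station records.

```python
import numpy as np

# Synthetic daily minimum temperatures (deg C): a long rural record to define the
# threshold, and one urban year with a crude, assumed heat-island offset.
rng = np.random.default_rng(0)
rural_tmin = rng.normal(loc=18.0, scale=4.0, size=365 * 30)   # 30 years of rural nights
urban_tmin = rng.normal(loc=18.0, scale=4.0, size=365) + 2.5  # one urban year, +2.5 C offset

# "Hot night" threshold: 99th percentile of the nonurban distribution
threshold = np.percentile(rural_tmin, 99)
hot_nights = int(np.sum(urban_tmin > threshold))

print(f"Hot-night threshold: {threshold:.1f} C; urban hot nights that year: {hot_nights}")
```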

As mentioned in an earlier post, we only need to remember Chicago in 1995 to recall the deadly impact that heat waves can have on urban people.  And as we saw in that unfortunate example, the victims were disproportionately the elderly and African American.

Although we may not be able to mitigate this warming, basic adaptation steps should be set into motion, including re-thinking urban design, making cities more resilient to hot environments, developing better energy and technology solutions (including cooling), installing green roofs, and putting into place emergency disaster plans and social safety nets for vulnerable populations.

1Mark McCarthy, Martin Best, and Richard Betts (2010). Climate change in cities due to global warming and urban effects. Geophysical Research Letters. DOI: 10.1029/2010GL042845

_____
Photo credit: http://www.flickr.com/photos/dustinphillips/ / CC BY-NC-ND 2.0

Posted in climate adaptation, climate change science, energy, environmental justice, health, land use, population, race and class, sustainability, technology, urban | 3 Comments »

The hidden global CO2 emissions of consumerism

Monday, March 8th, 2010

It’s been easy for citizens of the developed, industrialized world to criticize China and India over their rapidly growing greenhouse gas emissions.  This was one of the major reasons why the Kyoto Protocol was never ratified in the United States.

As many have pointed out, however, there are several flaws in this argument:

  • The per-capita carbon emissions of China and India remain much lower than those of the U.S. (roughly 1/4 and 1/16, respectively).
  • Perhaps more importantly, some of the carbon emissions in these countries come from producing export goods to meet consumer demand in wealthy nations.  Thus, we are responsible for “shadow carbon emissions” that get attributed to developing nations.

Until today, there haven’t been very good estimates of these kinds of shadow emissions.

In the Early Edition of the Proceedings of the National Academy of Sciences, Steven Davis and Ken Caldeira examine how much CO2 is embodied in the import and export of goods.1

Their results are interesting (excerpts below—if you can get a copy of the article, check out figures 1 and 2; they are terrific visuals for this information.  Alas, copyright doesn’t allow me to post them):

  • Approximately 6.2 gigatonnes (Gt) of CO2, 23% of all CO2 emissions from fossil-fuel burning, were emitted during the production of goods that were ultimately consumed in a different country.
  • Emissions imported to the United States exceed those of any other country or region, primarily embodied in machinery (91 Mt), electronics (77 Mt), motor vehicles and parts (75 Mt), chemical, rubber, and plastic products (52 Mt), unclassified manufactured products (52 Mt), wearing apparel (42 Mt), and intermediate goods (654 Mt).
  • These imports are offset by considerable US exports of transport services (49 Mt CO2), machinery (42 Mt), electronics (26 Mt), chemical, rubber, and plastics products (25 Mt), motor vehicles (22 Mt), and intermediate goods (263 Mt).
  • [G]oods imported to Western Europe and Japan embody much more CO2 per US$ than do their exports, reflecting the import of energy-intensive products from elsewhere.
  • The carbon intensity of imports to China, Russia, India, and the Middle East is consistently far less than that of their exports.
  • China is by far the largest net exporter of emissions, followed by Russia, the Middle East, South Africa, Ukraine, and India and, to a lesser extent, Southeast Asia, Eastern Europe, and areas of South America.
  • The primary net importers of emissions are the United States, Japan, the United Kingdom, Germany, France, and Italy. Although the overall mass of emissions is much less, the other countries of Western Europe are all net importers, as are New Zealand, Mexico, Singapore, and many areas of Africa and South America. Similarly, Canada, Australia, Indonesia, the Czech Republic, and Egypt are among the countries whose net exports of emissions are small.
  • On a per-capita basis, net imports of emissions to the United States, Japan, and countries in Western Europe are disproportionately large, with each individual consumer associated with 2.4–10.3 tons of CO2 emitted elsewhere.

Their conclusion:

Consumption-based accounting reveals that substantial CO2 emissions are traded internationally and therefore not included in traditional production-based national emissions inventories. The net effect of trade is the export of emissions from China and other emerging markets to consumers in the United States, Japan, and Western Europe. In the large economies of Western Europe, net imported emissions are 20–50% of consumption emissions; the net imported emissions fall to 17.8% and 10.8% in Japan and the United States, respectively. In contrast, net exports represent 22.5% of emissions produced in China. Thus, to the extent that constraints on emissions in developing countries are the major impediment to effective international climate policy, allocating responsibility for some portion of these emissions to final consumers elsewhere may represent an opportunity for compromise.
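
The accounting identity underneath these results is simple: a country’s consumption-based emissions equal its production-based emissions minus the emissions embodied in its exports plus those embodied in its imports.  Here is a minimal sketch with made-up numbers (not figures from the paper):

```python
# Consumption-based emissions = production - embodied exports + embodied imports.
# All values are illustrative placeholders in Gt CO2, not data from Davis & Caldeira.

countries = {
    # name: (production, embodied_exports, embodied_imports)
    "net exporter": (6.0, 1.5, 0.4),
    "net importer": (5.5, 0.5, 1.2),
}

for name, (production, exports, imports) in countries.items():
    consumption = production - exports + imports
    net_share = (imports - exports) / consumption * 100
    print(f"{name}: production {production} Gt, consumption {consumption:.1f} Gt "
          f"({net_share:+.0f}% of consumption emissions net-imported)")
```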

1Steven J. Davis and Ken Caldeira (2010). Consumption-based accounting of CO2 emissions. PNAS. DOI: 10.1073/pnas.0906974107

_____
Photo Credit: http://www.flickr.com/photos/deks/ / CC BY-NC 2.0

Posted in behavior, climate change science, climate economics, energy, nature and culture, technology, transportation | 1 Comment »

Cell phones and your health

Saturday, February 27th, 2010


Environmental Working Group (EWG) has updated their information on cell phone radiation and potential health risks.

As I alluded to in a previous post, conducting human health risk analyses for things like cell phone radiation exposure is difficult because it’s hard to determine how much exposure is too much, and it takes years to see what health effects might show up.

The research below suggests that links between cell phone radiation and health are now becoming evident.

And with more than 4 billion cell phone users worldwide (2/3 of the human population), we are unintentionally conducting one of the largest epidemiological studies of all time.

Learn more from EWG.

____

Photo credit:  http://www.flickr.com/photos/gibbons/ / CC BY-ND 2.0

Posted in health, risk analysis, technology | 2 Comments »

Energy breakthrough? Have fuel cells for the masses finally arrived?

Monday, February 22nd, 2010

Huff Post is running a story on a recent 60 Minutes piece about a new kind of fuel cell—the “Bloom Box”—that is already powering companies like Google, FedEx, and eBay (click on the link for video of this story).

It runs on natural gas, and two of these little boxes (together about the size of a shoebox) could conceivably power your entire home.

Estimated cost: $3,000 for off-the-grid electricity.

It will be interesting to see if these are commercially viable and what else Silicon Valley has in store over the next five years.  Along with electric cars, which roll into showrooms in a matter of months, we are on the cusp of some pretty big technology transformations.

Update:  An educated guess from one of my colleagues, Andy Price, in the energy business:

I hope I am wrong, but the Bloom Box looks like it suffers from the same problem that all fuel cell companies are suffering from: their systems are really expensive per KW.

If Ebay paid $700,000 to $800,000 per unit for 5 units, as was suggested in the story, this would be $3.5 to $4 million. If they saved the stated $100,000 in 9 months this would be a 26 to 30 year payback – and with a fuel cell using natural gas you still need a natural gas pipe and have associated carbon emissions.

If Bloom can somehow deliver the dramatic cost reductions that they claim, this could start to look more attractive, but until Bloom provides additional details, it looks like more hype than substance. Many other well-funded companies, including UTC, Honda, and GE, are working on similar technology and none have been able to deliver the big breakthrough. Yet.
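
For what it’s worth, the payback arithmetic in Andy’s note checks out against the figures quoted in the 60 Minutes story; a quick sketch:

```python
# Simple payback estimate using only the figures quoted above.
unit_costs = (700_000, 800_000)    # reported cost per Bloom Box, USD
units = 5                          # eBay's reported installation
reported_savings = 100_000         # USD saved over 9 months

annual_savings = reported_savings * 12 / 9   # ~133,000 USD per year
for cost in unit_costs:
    total = cost * units
    print(f"${total/1e6:.1f}M installed -> ~{total / annual_savings:.0f}-year simple payback")
    # ~26 and ~30 years, in line with Andy's estimate
```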

Update 2: Wired comes to a similar conclusion: too pricey.

Posted in energy, solutions, sustainability, technology | No Comments »
