This blog will cover some news items related to Sustainability: Corporate Social Responsibility, Stewardship, Environmental management, etc.


Global Warming - Who Might Get Sued: Lawyers, sharpen your pencils

Thanks to Peter

Global Warming - Who Might Get Sued

A number of companies are leaving themselves vulnerable to shareholder lawsuits over their carbon emissions, but a CNNMoney article claims the companies most vulnerable are some less obvious firms. The Corporate Library, a corporate governance research group, took the top fifty emitters of CO2 and the top fifty companies that emit more CO2 than their industry peers, then ranked the companies using a number of criteria. Hasbro, Corning, and Burlington Northern were among the companies that scored lower than most utilities and other large emitters of CO2 because their disclosure and reduction strategies were not detailed enough.

No such suits have been filed yet, but should a cap on CO2 emissions be passed, lawsuits would most likely come not from environmentalists, says Beth Young, author of the Corporate Library report, but from pension or hedge funds that take a loss or index funds that can't dump the company's stock.

Environmental lawyer Jeffrey Smith believes that winning a shareholder suit related to CO2 emissions would be very difficult. He also calls the Corporate Library's study "a good first step, but these issues are developing so rapidly, and the nuances of any individual company's decisions about what and how to disclose this information make them largely incompatible with such a scoring device."

Forty-three climate-related shareholder resolutions were filed with US companies in 2007.





Who could get sued for global warming
It's not who you think. One report identifies a toymaker and cruise operator among firms most at risk for not telling shareholders enough.
By Steve Hargreaves, staff writer
January 9 2008: 10:01 AM EST

NEW YORK -- A host of well-known companies are leaving themselves open to shareholder lawsuits because they're not telling investors enough about how much they contribute to global warming or what it might cost them to clean up, according to a recent report.

But the companies most exposed to lawsuits aren't the big utilities but a cadre of less obvious firms.

The corporate governance research group the Corporate Library ranked companies that both emit lots of carbon - either directly or through their major parts suppliers - and those that produce more emissions than their peers.

"The purpose of the study was to put up some red flags and say there are some things that look disturbing in our data," said Beth Young, a Corporate Library researcher and author of the report.

Toy company Hasbro (HAS), fiber-optic maker Corning (GLW, Fortune 500), railroad Burlington Northern (BNI, Fortune 500), cruise line Royal Caribbean (RCL) and lawn and garden company Scotts Miracle-Gro (SMG) all scored below average in the report, while most utilities and other big emitters of carbon dioxide - which have long detailed their emissions and potential cleanup costs in financial filings - scored better.

The firms that ranked below average are at greater risk of shareholder lawsuits if and when a cap on carbon dioxide emissions is passed, the report said.

Since there are currently no federal laws restricting these emissions, there have been no suits of this type filed to date.

But lawyers say it's certainly a possibility, especially if a company's stock stumbles if expensive upgrades are mandated.

"A company that has a lot of carbon emissions and doesn't disclose it could be exposed," said Michael Gerrard, a New York-based lawyer specializing in environmental issues at the law firm Arnold and Porter.

Young from the Corporate Library said lawsuits would most likely come not from environmentalists, but from pension funds or hedge funds that take a loss, or index funds that don't have the option of dumping the company's stock.

"If someone sees an opportunity to recoup value, you don't have to be an activist shareholder" to take a company to court, she said.

While there's "unprecedented political heat and exposure on this issue," Jeffrey Smith, an environmental lawyer at Cravath, Swaine & Moore, believes winning a shareholder suit related to exposure from carbon emissions would be difficult.

First, management can do all sorts of things that cause a company to lose value, and it's hard to prove they were acting irresponsibly. Second, no legislation capping carbon levels has yet been passed, so it would be hard for a company to put a dollar amount on any cleanup costs required, Smith said.

The companies that scored below average in the report were contacted for comment.

"I'm pretty surprised we're on this list," Hasbro spokesman Wayne Charness said. "I'm not sure where it came from, but I think it's very misleading."

Like other company spokespeople, Charness wondered how the Corporate Library came up with its total carbon emission numbers, and said Hasbro had already exceeded, in 2006, its target of reducing emissions 30 percent by 2007.

The Corporate Library relied on Trucost, an environmental research organization, for the carbon data.

A spokesman for Burlington Northern said the company does voluntarily disclose its carbon emissions, adding that rail transport is more efficient than other forms of land shipping.

Corning said it voluntarily reports its carbon emissions and has a reduction strategy in place.

But the disclosure and reduction strategies employed by Hasbro, Burlington Northern and Corning were not detailed enough to score high on the Corporate Library's list.

The other companies cited in the report either declined comment or could not be reached.

As investors try to determine which companies might benefit and which might suffer under carbon regulations, it's becoming more important for businesses to gather and release relevant information.

"The market is craving data and statistics," Cravath's Smith said. "Timely analysis of data by management will be vital in communicating to the marketplace that a company has its response to climate change well in hand."

But attempting to rank companies, as the Corporate Library and other studies have done, may be premature.

"It's a good first step, but these issues are developing rapidly, and the nuances of any individual company's decisions about what and how to disclose this information make them largely incompatible with such a scoring device," said Smith.

The Corporate Library devised the list by taking a combination of the 50 top emitters of carbon dioxide in the country and the top 50 companies that emit a greater percentage of carbon dioxide than other industries in their sector.

It then ranked the companies using a number of criteria, including whether they disclosed actual historical and current greenhouse gas emissions, had devised an emissions reduction strategy or had created business models around the future cost of carbon. The research group also looked at whether company boards included members with climate change experience and whether their executive compensation packages included rewards for reducing carbon emissions.


© 2007 Cable News Network LP, LLP.



[BBC] Report predicts wave of green tech: Worldwatch estimates there was $100bn (£51bn) of new investment in industries and technologies tackling climate change

Thanks to Pete!

Report predicts wave of green tech
Entrepreneurs and investors are 'inventing' a new sustainable global economy, according to an environmental research group.

The Worldwatch Institute says there is a new wave of innovation based around environmental technologies.

The report also says large firms are changing their ways and are investing in greener production methods.

But it says a worldwide agreement on climate change is still needed to secure investment in the future.

New opportunities


In its report 'State of the World 2008', Worldwatch estimates there was $100bn (£51bn) of new investment in industries and technologies tackling climate change in the last year, which is creating new opportunities for business.

It identifies examples like renewable energy, which saw $52bn of investment in 2006, up by 33%.

It says that carbon trading reached an estimated $30bn in 2006, triple that of the previous year.

It also points to individual companies, like US chemical firm DuPont, which has committed to sharp reductions in greenhouse gas emissions.

Really exciting

Worldwatch President Chris Flavin likens the situation to the 1980s, when money poured into start-up firms in the computer and software industries, which helped create giants like Microsoft and Dell.

He says: "The most significant thing happening is that environmental technology is the hottest area for venture capital - it's really exciting."

Global failure

Aubrey Meyer is a climate campaigner and Director of the Global Commons Institute.

He says there's a danger the good work could be lost amid political failures.

He says: "In order to prevent local successes being swallowed in larger global failure you must have a way of measuring emissions and controlling them."

Worldwatch accepts that, at the moment, these efforts represent only a fraction of the global economy, and that more effort needs to be made to find a global agreement on combating climate change.

But it argues that there has been a sea change in the thinking of businesses.

In the UK, progress may be slower.

A report released last month also disappointed environment campaigners.

In a survey of business leaders the Carbon Trust found that just 1% of all UK firms knew their "carbon footprint" - or an account of how much carbon dioxide they produce.


Monitoring Tools Help Cut Home Energy Use, Study Finds: The homeowners could go to a Web site to set their ideal home temperature and how many degrees they were willing to have that temperature move above or below the target. They also indicated their level of tolerance for fluctuating electricity prices

thanks to Lloyd for forwarding on

From tomorrow's NY Times, with a quote from Ron Ambrosio...
January 10, 2008
Monitoring Tools Help Cut Home Energy Use, Study Finds

Giving people the means to closely monitor and adjust their electricity use lowers their monthly bills and could significantly reduce the need to build new power plants, according to a yearlong government study.

The results of the research project by the Pacific Northwest National Laboratory of the Energy Department, released Wednesday, suggest that if households have digital tools to set temperature and price preferences, the peak loads on utility grids could be trimmed by up to 15 percent a year.

Over a 20-year period, this could save $70 billion on spending for power plants and infrastructure, and avoid the need to build the equivalent of 30 large coal-fired plants, say scientists at the federal laboratory.

The demonstration project was as much a test of consumer behavior as it was of new technology. Scientists wanted to find out if the ability to monitor consumption constantly would cause people to save energy — just as studies have shown that people walk more if they wear pedometers to count their steps.

In the Olympic Peninsula, west of Seattle, 112 homes were equipped with digital thermostats, and computer controllers were attached to water heaters and clothes dryers. These controls were connected to the Internet.

The homeowners could go to a Web site to set their ideal home temperature and how many degrees they were willing to have that temperature move above or below the target. They also indicated their level of tolerance for fluctuating electricity prices. In effect, the homeowners were asked to decide the trade-off they wanted to make between cost savings and comfort.

The households, it turned out, soon became active participants in managing the load on the utility grid and their own bills.

"I was astounded at times at the response we got from customers," said Robert Pratt, a staff scientist at the Pacific Northwest National Laboratory and the program director for the demonstration project. "It shows that if you give people simple tools and an incentive, they will do this."

"And each household," Mr. Pratt added, "doesn't have to do a lot, but if something like this can be scaled up, the savings in investments you don't have to make will be huge, and consumers and the environment will benefit."

After some testing with households, the scientists decided not to put a lot of numbers and constant pricing information in front of consumers. On the Web site, the consumers were presented with graphic icons to set and adjust.

"We gave them a knob," Mr. Pratt said. "If you don't like it, change the knob."

Behind the fairly simple consumer settings was a sophisticated live marketplace, whose software and analytics were designed by I.B.M. Research. Every five minutes, the households and local utilities were buying and selling electricity, with prices constantly fluctuating by tiny amounts as supply and demand on the grid changed.
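As a rough sketch of what such price-responsive control might look like, here is illustrative code, not the project's actual software; the function, the comfort band, and all prices are assumptions for the sake of the example:

```python
# Illustrative sketch of a price-responsive thermostat in the spirit of the
# demonstration project: the homeowner picks a target temperature and a
# tolerance band, and the setpoint drifts within that band as the five-minute
# price moves around its average. All names and numbers here are invented.

def responsive_setpoint(target_f, band_f, price, avg_price, sensitivity=1.0):
    """Shift the heating setpoint down as price rises above average.

    target_f: ideal temperature chosen on the web site (degrees F)
    band_f:   degrees the homeowner will tolerate above/below the target
    price, avg_price: current and trailing-average electricity price ($/kWh)
    sensitivity: 0 = ignore price entirely, 1 = use the full comfort band
    """
    if avg_price <= 0:
        return target_f
    # Fractional price deviation from average, clipped to [-1, 1] so the
    # setpoint never leaves the homeowner's chosen band.
    deviation = max(-1.0, min(1.0, (price - avg_price) / avg_price))
    return target_f - sensitivity * band_f * deviation

# At the average price, hold the target; when price doubles, back off
# by the full band (like Mr. Brous widening or narrowing his range).
print(responsive_setpoint(70, 5, price=0.10, avg_price=0.10))  # 70.0
print(responsive_setpoint(70, 5, price=0.20, avg_price=0.10))  # 65.0
```

Narrowing `band_f` from 10 to 5 degrees, as the Brous household did, trades some savings for comfort without changing anything else.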

"Your thermostat and your water heater are day-trading for you," said Ron Ambrosio, a senior researcher at the Watson Research Center of I.B.M.

The households in the demonstration project on average saved 10 percent on their monthly utility bills. Jerry Brous, a retiree who owns a three-bedroom house in Sequim, Wash., did a bit better, saving about 15 percent, which added up to $135 over a year.

Mr. Brous, 67, said that at first he was a real price hawk, allowing the household temperature to go 10 degrees above or below the target as the outside temperature changed. In the winter, he and his wife, Pat, decided the house was too cold at times, so they changed the range to five degrees.

The monetary savings were nice, but Mr. Brous said his main motivation for joining the project was to participate in research that might accelerate the spread of energy efficiency programs.

Shortly after the demonstration project ended last March, the digital thermostat and other equipment supplied by Invensys Controls were removed from Mr. Brous's home. "I miss it a lot," he said. "It was cool."

The research project was done with an eye toward guiding policy on energy-saving programs. Efficiency programs promise to curb the nation's fuel bill and reduce damage to the environment, if consumers can be persuaded to use energy more intelligently. Still, a big question among economists and energy experts is how to tailor incentives to prompt changes in energy consumption.

The market signals from household utility bills are not clear to people, some experts say. Conservation steps, they note, may bring savings of only a few percentage points, and even those may be obscured by seasonal swings in electricity use and pricing. Thus, they say, the only way to make real progress in household energy efficiency is with sizable subsidies and mandated product standards.

The federal laboratory's project was instead a test of market incentives and up-to-the-minute information. But how quickly the kind of technology used in the project might be deployed across the country is uncertain. Many utilities are experimenting with this so-called smart-grid technology, but most are using it to upgrade their own networks, not to let households manage consumption.

One big hurdle is that in most states, utilities are still granted rates of return that depend mainly on the power plants and equipment they own and operate instead of how much energy they save.

"What they did in Washington is a great proof of concept, but you're not likely to see this kind of technology widely used anytime soon," said Rick Nicholson, an energy technology analyst at IDC, a research firm.


50% Of Consumers Consider Sustainability When Picking Brands: (Surprisingly,) "consumers aged 55 and older are the real driving force behind this expansion"

thanks to Peter

50% Of Consumers Consider Sustainability When Picking Brands

Approximately 50 percent of U.S. consumers consider at least one sustainability factor in selecting consumer packaged goods and choosing where to shop for those products, according to a survey conducted by Information Resources, Inc.

The 22,000 U.S. consumers surveyed were asked to rate the impact of four key sustainability features on their product and store selection: organic products, eco-friendly products, eco-friendly packaging, and fair treatment of employees and suppliers. One-fifth of those surveyed were determined to be "sustainability driven," taking at least two sustainability factors into account when making their selections.

Among the results:

  • Approximately 30 percent look for eco-friendly products and packaging in their brand selection,
  • Up to one-quarter of those surveyed consider fair trade practices along with eco-friendly or organic designations in selecting a shopping destination,
  • Nearly 40 percent of consumers search specifically for organic products.

Benefiting from the winning combination of a 'better for you' association and a 'better for the environment' attribute, the organic designation has moved to the front of consumer consciousness. Once dominated by niche manufacturers and specialty retailers, the organics market now includes CPG industry leaders as well as leading retailers such as Safeway and Kroger, with their highly successful organic private label lines. Several leading manufacturers are also beginning to offer organic versions of favorite products, such as Kraft's organic Wheat Thins and Chips Ahoy.

Among non-food items, the IRI study highlights replacement of chemical-based items with eco-friendly products as an emerging sustainability category. One example is green laundry detergent. Though currently just two percent of the total detergent market, the growing demand for biodegradable, non-toxic and plant-based products is reflected in a 66 percent increase in green product sales during the past year within a category that has overall flat sales.

Whether consumers are motivated by healthier ingredients or by heightened environmental consciousness, the survey underscores that calls for sustainability cut across every age group. Contrary to assumptions that the focus on sustainability is a youth-oriented phenomenon, IRI data shows that older consumers are actually the more likely audience to weigh multiple sustainability factors in their purchases.

"Consumers aged 55 and older are the real driving force behind this expansion," says Salzman. "Generally, with the time to seek out specialty items and the resources to afford premium priced products, aging consumers are a critical target market today. As sustainable products and packaging become more widely available, we anticipate that the market will expand across consumer segments."

Forty million boomers use their purchasing power to buy environmentally safe brands, according to a recent survey.


Does your company buy from polluters? Company presidents and marketing chiefs may be demanding more environmentally responsible operations, but analysts say those in charge of supply chains are having trouble figuring out how to make their operations greener.

Does your company buy from polluters?

By: Thomas Wailgum  (07 Jan 2008)

Company presidents and marketing chiefs may be demanding more environmentally responsible operations, but analysts say those in charge of supply chains are having trouble figuring out how to make their operations greener.
In a recent Forrester Research report, analyst Patrick Connaughton notes that "tightening regulatory pressure and pervasive media attention are moving supply chain sustainability issues up the corporate agenda." But, he continues, "surprisingly, very few companies are measuring the environmental impact of their supply chains today."

Then there's the nagging question of just how much consumers care. "Consumers have expressed in surveys that they are interested in this," says Edgar Blanco, a research associate at MIT's Center for Transportation and Logistics. "But in the key moment of purchase, it's unclear how much more they are willing to pay. And that's what companies are struggling with."

The problem, say researchers, lies in the intricate nature of supply chains. These webs of suppliers interact with multiple partners (and their partners, and their partners). How does a company figure out which emissions are its responsibility? And how much will it cost to reduce them? Blanco says companies have a long way to go before they fully comprehend the size and scope of their carbon footprints.

Complex carbon calculations

When it comes to how much green initiatives actually cost companies, estimates can vary widely. For example, an August 2007 survey of 1,400 "key players" in the real estate and construction industry found that they misjudged the costs and benefits of green buildings.

The survey, by the World Business Council for Sustainable Development (WBCSD), found that respondents had estimated the additional cost of building green at 17 percent above conventional construction, which the WBCSD contends is more than triple the true cost difference of about 5 percent. Survey respondents also estimated greenhouse gas emissions by buildings at 19 percent of the world's total, though the WBCSD claims that the actual number is 40 percent.

For the supply chain, Blanco says, there are no reliable benchmarks that can offer companies a ballpark figure of just how much they'll have to spend to green their supply chains.

Here's why. According to Blanco, a representative from an electronics manufacturer described to him the complexity behind the carbon footprint of its printer products: 10,000 parts from 5,000 suppliers, who in turn have 3,000 partners. "Companies have information on their immediate suppliers, but beyond that they don't really know," Blanco says.

Supply chains are also dynamic. Changing routes and modes of transportation affect the overall carbon emitted to deliver a product to the shelf. A banana picked in Costa Rica will take many different routes (by ship, truck or automobile) and use various sources of energy (diesel for transportation, electricity for refrigeration and warehousing) before it winds up in the supermarket, corner convenience store or Starbucks. Blanco says it isn't clear how companies should account for such variation in the life cycle of a product.

In his research on the design of energy- and carbon-efficient supply chains, Blanco has two goals: to figure out an independent and verifiable way to measure carbon output across a complex industrial network, and to identify a commercially viable and consumer-friendly carbon labeling system. Right now, unlike, say, Energy Star labels on appliances or food nutrition labels, there is no standard for what data should be on a carbon label or how the information should be conveyed to consumers.

But as Blanco envisions the project, labels would help consumers differentiate the carbon output from producing that banana and purchase the most environmentally efficient fruit. "Each banana will have a different label, depending on if it goes to a supermarket or a Wal-Mart," he proposes. "The last person in the supply chain would be the only one who could put the label on."

Just recently, many public companies have begun releasing data about their carbon emissions. Blanco says these "carbon inventory" reports typically cover the number of employees at the company, along with buildings and vehicles and the carbon-related impact of all three. "That's a good step," he says.
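The banana example gives a feel for why two routes to the same shelf can carry very different footprints. The sketch below illustrates the idea only; the emission factors, routes, and distances are invented for the example, not Blanco's data:

```python
# Toy illustration of route-dependent carbon footprints: the same product can
# reach the shelf by different transport legs, each with its own emissions.
# Emission factors below are assumed placeholder values, not measured data.

EMISSION_FACTORS = {"ship": 0.015, "truck": 0.10, "rail": 0.03}  # kg CO2 per tonne-km (assumed)

def footprint_kg(legs, mass_tonnes):
    """Sum emissions over a list of (mode, distance_km) transport legs."""
    return sum(EMISSION_FACTORS[mode] * km * mass_tonnes for mode, km in legs)

# Two hypothetical routes for one tonne of bananas:
route_a = [("ship", 3500), ("truck", 400)]                  # mostly by sea
route_b = [("truck", 200), ("rail", 3000), ("truck", 500)]  # overland

print(round(footprint_kg(route_a, 1.0), 1))  # 92.5
print(round(footprint_kg(route_b, 1.0), 1))  # 160.0
```

Even in this toy version, the footprint differs by route, which is why a per-shipment label would have to be applied by "the last person in the supply chain."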

From ITWorld Canada


A Solar Grand Plan: By 2050 solar power could end U.S. dependence on foreign oil and slash greenhouse gas emissions. The greatest obstacle to implementing a renewable U.S. energy system is not technology or money. It is the lack of public awareness that solar power is a practical alternative - and one that can fuel transportation as well

Scientific American Magazine -  December 16, 2007

A Solar Grand Plan
By 2050 solar power could end U.S. dependence on foreign oil and slash greenhouse gas emissions

By Ken Zweibel, James Mason and Vasilis Fthenakis

High prices for gasoline and home heating oil are here to stay. The U.S. is at war in the Middle East at least in part to protect its foreign oil interests. And as China, India and other nations rapidly increase their demand for fossil fuels, future fighting over energy looms large. In the meantime, power plants that burn coal, oil and natural gas, as well as vehicles everywhere, continue to pour millions of tons of pollutants and greenhouse gases into the atmosphere annually, threatening the planet.

Well-meaning scientists, engineers, economists and politicians have proposed various steps that could slightly reduce fossil-fuel use and emissions. These steps are not enough. The U.S. needs a bold plan to free itself from fossil fuels. Our analysis convinces us that a massive switch to solar power is the logical answer.

Solar energy's potential is off the chart. The energy in sunlight striking the earth for 40 minutes is equivalent to global energy consumption for a year. The U.S. is lucky to be endowed with a vast resource; at least 250,000 square miles of land in the Southwest alone are suitable for constructing solar power plants, and that land receives more than 4,500 quadrillion British thermal units (Btu) of solar radiation a year. Converting only 2.5 percent of that radiation into electricity would match the nation's total energy consumption in 2006.
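The claim in the paragraph above is easy to check with back-of-the-envelope arithmetic; the figure of roughly 100 quadrillion Btu for 2006 U.S. consumption is an assumption based on commonly cited EIA totals, not a number from this article:

```python
# Sanity check on the article's solar resource claim: converting 2.5% of the
# 4,500 quadrillion Btu of annual solar radiation on suitable Southwest land
# versus total U.S. energy consumption in 2006 (~100 quadrillion Btu, assumed).

solar_btu = 4500e15            # Btu/year of solar radiation on suitable land
converted = 0.025 * solar_btu  # the article's 2.5 percent conversion
us_2006_consumption = 100e15   # Btu/year, approximate assumed figure

print(converted / 1e15)                  # 112.5 (quadrillion Btu)
print(converted >= us_2006_consumption)  # True
```

So 2.5 percent conversion yields about 112 quadrillion Btu, comfortably above the assumed 2006 consumption figure, consistent with the article's claim.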

To convert the country to solar power, huge tracts of land would have to be covered with photovoltaic panels and solar heating troughs. A direct-current (DC) transmission backbone would also have to be erected to send that energy efficiently across the nation.

The technology is ready. On the following pages we present a grand plan that could provide 69 percent of the U.S.'s electricity and 35 percent of its total energy (which includes transportation) with solar power by 2050. We project that this energy could be sold to consumers at rates equivalent to today's rates for conventional power sources, about five cents per kilowatt-hour (kWh). If wind, biomass and geothermal sources were also developed, renewable energy could provide 100 percent of the nation's electricity and 90 percent of its energy by 2100.

The federal government would have to invest more than $400 billion over the next 40 years to complete the 2050 plan. That investment is substantial, but the payoff is greater. Solar plants consume little or no fuel, saving billions of dollars year after year. The infrastructure would displace 300 large coal-fired power plants and 300 more large natural gas plants and all the fuels they consume. The plan would effectively eliminate all imported oil, fundamentally cutting U.S. trade deficits and easing political tension in the Middle East and elsewhere. Because solar technologies are almost pollution-free, the plan would also reduce greenhouse gas emissions from power plants by 1.7 billion tons a year, and another 1.9 billion tons from gasoline vehicles would be displaced by plug-in hybrids refueled by the solar power grid. In 2050 U.S. carbon dioxide emissions would be 62 percent below 2005 levels, putting a major brake on global warming.

Photovoltaic Farms
In the past few years the cost to produce photovoltaic cells and modules has dropped significantly, opening the way for large-scale deployment. Various cell types exist, but the least expensive modules today are thin films made of cadmium telluride. To provide electricity at six cents per kWh by 2020, cadmium telluride modules would have to convert sunlight to electricity with 14 percent efficiency, and systems would have to be installed at $1.20 per watt of capacity. Current modules have 10 percent efficiency and an installed system cost of about $4 per watt. Progress is clearly needed, but the technology is advancing quickly; commercial efficiencies have risen from 9 to 10 percent in the past 12 months. It is worth noting, too, that as modules improve, rooftop photovoltaics will become more cost-competitive for homeowners, reducing daytime electricity demand.

In our plan, by 2050 photovoltaic technology would provide almost 3,000 gigawatts (GW), or billions of watts, of power. Some 30,000 square miles of photovoltaic arrays would have to be erected. Although this area may sound enormous, installations already in place indicate that the land required for each gigawatt-hour of solar energy produced in the Southwest is less than that needed for a coal-powered plant when factoring in land for coal mining. Studies by the National Renewable Energy Laboratory in Golden, Colo., show that more than enough land in the Southwest is available without requiring use of environmentally sensitive areas, population centers or difficult terrain. Jack Lavelle, a spokesperson for Arizona's Department of Water Conservation, has noted that more than 80 percent of his state's land is not privately owned and that Arizona is very interested in developing its solar potential. The benign nature of photovoltaic plants (including no water consumption) should keep environmental concerns to a minimum.

The main progress required, then, is to raise module efficiency to 14 percent. Although the efficiencies of commercial modules will never reach those of solar cells in the laboratory, cadmium telluride cells at the National Renewable Energy Laboratory are now up to 16.5 percent and rising. At least one manufacturer, First Solar in Perrysburg, Ohio, increased module efficiency from 6 to 10 percent from 2005 to 2007 and is reaching for 11.5 percent by 2010.

Pressurized Caverns
The great limiting factor of solar power, of course, is that it generates little electricity when skies are cloudy and none at night. Excess power must therefore be produced during sunny hours and stored for use during dark hours. Most energy storage systems such as batteries are expensive or inefficient.

Compressed-air energy storage has emerged as a successful alternative. Electricity from photovoltaic plants compresses air and pumps it into vacant underground caverns, abandoned mines, aquifers and depleted natural gas wells. The pressurized air is released on demand to turn a turbine that generates electricity, aided by burning small amounts of natural gas. Compressed-air energy storage plants have been operating reliably in Huntorf, Germany, since 1978 and in McIntosh, Ala., since 1991. The turbines burn only 40 percent of the natural gas they would if they were fueled by natural gas alone, and better heat recovery technology would lower that figure to 30 percent.

Studies by the Electric Power Research Institute in Palo Alto, Calif., indicate that the cost of compressed-air energy storage today is about half that of lead-acid batteries. The research indicates that these facilities would add three or four cents per kWh to photovoltaic generation, bringing the total 2020 cost to eight or nine cents per kWh.

Electricity from photovoltaic farms in the Southwest would be sent over high-voltage DC transmission lines to compressed-air storage facilities throughout the country, where turbines would generate electricity year-round. The key is to find adequate sites. Mapping by the natural gas industry and the Electric Power Research Institute shows that suitable geologic formations exist in 75 percent of the country, often close to metropolitan areas. Indeed, a compressed-air energy storage system would look similar to the U.S. natural gas storage system. The industry stores eight trillion cubic feet of gas in 400 underground reservoirs. By 2050 our plan would require 535 billion cubic feet of storage, with air pressurized at 1,100 pounds per square inch. Although development will be a challenge, plenty of reservoirs are available, and it would be reasonable for the natural gas industry to invest in such a network.
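The storage requirement above can be put in perspective with a quick scale check against the gas industry's existing capacity, using only the two figures quoted in the paragraph:

```python
# Scale check: the plan's 535 billion cubic feet of compressed-air storage
# as a share of the ~8 trillion cubic feet of natural gas already stored
# in underground reservoirs (both figures from the article).

gas_storage_ft3 = 8e12   # existing natural gas storage capacity
plan_air_ft3 = 535e9     # compressed-air storage needed by 2050

share_percent = 100 * plan_air_ft3 / gas_storage_ft3
print(round(share_percent, 1))  # 6.7
```

In other words, the 2050 plan calls for storage volume equal to roughly 7 percent of what the natural gas industry already operates, which supports the article's view that the buildout is a challenge rather than a leap.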

Hot Salt
Another technology that would supply perhaps one fifth of the solar energy in our vision is known as concentrated solar power. In this design, long, metallic mirrors focus sunlight onto a pipe filled with fluid, heating the fluid like a huge magnifying glass might. The hot fluid runs through a heat exchanger, producing steam that turns a turbine.

For energy storage, the pipes run into a large, insulated tank filled with molten salt, which retains heat efficiently. Heat is extracted at night, creating steam. The molten salt does slowly cool, however, so the energy stored must be tapped within a day.

Nine concentrated solar power plants with a total capacity of 354 megawatts (MW) have been generating electricity reliably for years in the U.S. A new 64-MW plant in Nevada came online in March 2007. These plants, however, do not have heat storage. The first commercial installation to incorporate it—a 50-MW plant with seven hours of molten salt storage—is being constructed in Spain, and others are being designed around the world. For our plan, 16 hours of storage would be needed so that electricity could be generated 24 hours a day.
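
The storage sizes involved follow directly from plant capacity times hours of storage; a minimal sketch using the plants mentioned above:

```python
def storage_mwh(capacity_mw, hours):
    """Thermal storage needed to sustain full output for `hours`."""
    return capacity_mw * hours

# The 50-MW Spanish plant with seven hours of molten salt storage:
print(storage_mwh(50, 7))   # → 350 (MWh)
# The same plant sized to the plan's 16-hour requirement:
print(storage_mwh(50, 16))  # → 800 (MWh)
```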

Existing plants prove that concentrated solar power is practical, but costs must decrease. Economies of scale and continued research would help. In 2006 a report by the Solar Task Force of the Western Governors' Association concluded that concentrated solar power could provide electricity at 10 cents per kWh or less by 2015 if 4 GW of plants were constructed. Finding ways to boost the temperature of heat exchanger fluids would raise operating efficiency, too. Engineers are also investigating how to use molten salt itself as the heat-transfer fluid, reducing heat losses as well as capital costs. Salt is corrosive, however, so more resilient piping systems are needed.

Concentrated solar power and photovoltaics represent two different technology paths. Neither is fully developed, so our plan brings them both to large-scale deployment by 2020, giving them time to mature. Various combinations of solar technologies might also evolve to meet demand economically. As installations expand, engineers and accountants can evaluate the pros and cons, and investors may decide to support one technology more than another.

Direct Current, Too
The geography of solar power is obviously different from the nation's current supply scheme. Today coal, oil, natural gas and nuclear power plants dot the landscape, built relatively close to where power is needed. Most of the country's solar generation would stand in the Southwest. The existing system of alternating-current (AC) power lines is not robust enough to carry power from these centers to consumers everywhere and would lose too much energy over long hauls. A new high-voltage, direct-current (HVDC) power transmission backbone would have to be built.

Studies by Oak Ridge National Laboratory indicate that long-distance HVDC lines lose far less energy than AC lines do over equivalent spans. The backbone would radiate from the Southwest toward the nation's borders. The lines would terminate at converter stations where the power would be switched to AC and sent along existing regional transmission lines that supply customers.

The AC system is also simply out of capacity, leading to noted shortages in California and other regions; DC lines are cheaper to build and require less land area than equivalent AC lines. About 500 miles of HVDC lines operate in the U.S. today and have proved reliable and efficient. No major technical advances seem to be needed, but more experience would help refine operations. The Southwest Power Pool of Texas is designing an integrated system of DC and AC transmission to enable development of 10 GW of wind power in western Texas. And TransCanada, Inc., is proposing 2,200 miles of HVDC lines to carry wind energy from Montana and Wyoming south to Las Vegas and beyond.

Stage One: Present to 2020
We have given considerable thought to how the solar grand plan can be deployed. We foresee two distinct stages. The first, from now until 2020, must make solar competitive at the mass-production level. This stage will require the government to guarantee 30-year loans, agree to purchase power and provide price-support subsidies. The annual aid package would rise steadily from 2011 to 2020. At that time, the solar technologies would compete on their own merits. The cumulative subsidy would total $420 billion (we will explain later how to pay this bill).

About 84 GW of photovoltaics and concentrated solar power plants would be built by 2020. In parallel, the DC transmission system would be laid. It would expand via existing rights-of-way along interstate highway corridors, minimizing land-acquisition and regulatory hurdles. This backbone would reach major markets in Phoenix, Las Vegas, Los Angeles and San Diego to the west and San Antonio, Dallas, Houston, New Orleans, Birmingham, Ala., Tampa, Fla., and Atlanta to the east.

Building 1.5 GW of photovoltaics and 1.5 GW of concentrated solar power annually in the first five years would stimulate many manufacturers to scale up. In the next five years, annual construction would rise to 5 GW apiece, helping firms optimize production lines. As a result, solar electricity would fall toward six cents per kWh. This implementation schedule is realistic; more than 5 GW of nuclear power plants were built in the U.S. each year from 1972 to 1987. What is more, solar systems can be manufactured and installed at much faster rates than conventional power plants because of their straightforward design and relative lack of environmental and safety complications.

Stage Two: 2020 to 2050
It is paramount that major market incentives remain in effect through 2020, to set the stage for self-sustained growth thereafter. In extending our model to 2050, we have been conservative. We do not include any technological or cost improvements beyond 2020. We also assume that energy demand will grow nationally by 1 percent a year. In this scenario, by 2050 solar power plants will supply 69 percent of U.S. electricity and 35 percent of total U.S. energy. This quantity includes enough to supply all the electricity consumed by 344 million plug-in hybrid vehicles, which would displace their gasoline counterparts, key to reducing dependence on foreign oil and to mitigating greenhouse gas emissions. Some three million new domestic jobs—notably in manufacturing solar components—would be created, which is several times the number of U.S. jobs that would be lost in the then dwindling fossil-fuel industries.

The huge reduction in imported oil would lower trade balance payments by $300 billion a year, assuming a crude oil price of $60 a barrel (average prices were higher in 2007). Once solar power plants are installed, they must be maintained and repaired, but the price of sunlight is forever free, duplicating those fuel savings year after year. Moreover, the solar investment would enhance national energy security, reduce financial burdens on the military, and greatly decrease the societal costs of pollution and global warming, from human health problems to the ruining of coastlines and farmlands.

Ironically, the solar grand plan would lower energy consumption. Even with 1 percent annual growth in demand, the 100 quadrillion Btu consumed in 2006 would fall to 93 quadrillion Btu by 2050. This unusual offset arises because a good deal of energy is consumed to extract and process fossil fuels, and more is wasted in burning them and controlling their emissions.
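
The size of that offset becomes clearer against a simple compound-growth baseline. The roughly 155-quad no-plan figure below is our own extrapolation from the article's 1 percent growth assumption, not a number the authors give:

```python
quads_2006 = 100
years = 2050 - 2006
baseline_2050 = quads_2006 * 1.01 ** years   # 1% annual demand growth
print(round(baseline_2050))                  # → 155 quads without the plan
# The article projects 93 quads in 2050, roughly 40 percent below this
# naive baseline, thanks to avoided extraction and conversion losses.
```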

To meet the 2050 projection, 46,000 square miles of land would be needed for photovoltaic and concentrated solar power installations. That area is large, and yet it covers just 19 percent of the suitable Southwest land. Most of that land is barren; there is no competing use value. And the land will not be polluted. We have assumed that only 10 percent of the solar capacity in 2050 will come from distributed photovoltaic installations—those on rooftops or commercial lots throughout the country. But as prices drop, these applications could play a bigger role.
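
Those two figures together imply the total pool of suitable Southwest land, a number the article leaves implicit:

```python
needed_sq_mi = 46_000       # land required for the 2050 projection
share_of_suitable = 0.19    # fraction of suitable Southwest land it uses
suitable_sq_mi = needed_sq_mi / share_of_suitable
print(round(suitable_sq_mi, -3))  # → 242000.0 square miles available
```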

2050 and Beyond
Although it is not possible to project with any exactitude 50 or more years into the future, as an exercise to demonstrate the full potential of solar energy we constructed a scenario for 2100. By that time, based on our plan, total energy demand (including transportation) is projected to be 140 quadrillion Btu, with seven times today's electric generating capacity.

To be conservative, again, we estimated how much solar plant capacity would be needed under the historical worst-case solar radiation conditions for the Southwest, which occurred during the winter of 1982–1983 and in 1992 and 1993 following the Mount Pinatubo eruption, according to National Solar Radiation Data Base records from 1961 to 2005. And again, we did not assume any further technological and cost improvements beyond 2020, even though it is nearly certain that in 80 years ongoing research would improve solar efficiency, cost and storage.

Under these assumptions, U.S. energy demand could be fulfilled with the following capacities: 2.9 terawatts (TW) of photovoltaic power going directly to the grid and another 7.5 TW dedicated to compressed-air storage; 2.3 TW of concentrated solar power plants; and 1.3 TW of distributed photovoltaic installations. Supply would be rounded out with 1 TW of wind farms, 0.2 TW of geothermal power plants and 0.25 TW of biomass-based production for fuels. The model includes 0.5 TW of geothermal heat pumps for direct building heating and cooling. The solar systems would require 165,000 square miles of land, still less than the suitable available area in the Southwest.
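
The article never totals that capacity list; tallying it ourselves gives a sense of scale (the sums below are our arithmetic, not figures from the text):

```python
# The 2100 capacity mix, in terawatts, as listed above.
solar = {"pv_to_grid": 2.9, "pv_to_caes": 7.5, "csp": 2.3, "pv_distributed": 1.3}
other = {"wind": 1.0, "geothermal": 0.2, "biomass": 0.25, "geo_heat_pumps": 0.5}
print(round(sum(solar.values()), 2))                        # → 14.0 TW of solar
print(round(sum(solar.values()) + sum(other.values()), 2))  # → 15.95 TW overall
```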

In 2100 this renewable portfolio could generate 100 percent of all U.S. electricity and more than 90 percent of total U.S. energy. In the spring and summer, the solar infrastructure would produce enough hydrogen to meet more than 90 percent of all transportation fuel demand and would replace the small natural gas supply used to aid compressed-air turbines. Adding 48 billion gallons of biofuel would cover the rest of transportation energy. Energy-related carbon dioxide emissions would be reduced 92 percent below 2005 levels.

Who Pays?
Our model is not an austerity plan, because it includes a 1 percent annual increase in demand, which would sustain lifestyles similar to those today with expected efficiency improvements in energy generation and use. Perhaps the biggest question is how to pay for a $420-billion overhaul of the nation's energy infrastructure. One of the most common ideas is a carbon tax. The International Energy Agency suggests that a carbon tax of $40 to $90 per ton of coal will be required to induce electricity generators to adopt carbon capture and storage systems to reduce carbon dioxide emissions. This tax is equivalent to raising the price of electricity by one to two cents per kWh. But our plan is less expensive. The $420 billion could be generated with a carbon tax of 0.5 cent per kWh. Given that electricity today generally sells for six to 10 cents per kWh, adding 0.5 cent per kWh seems reasonable.
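
How quickly a half-cent tax raises $420 billion depends on total electricity sales. The roughly four trillion kWh of annual U.S. sales in this sketch is our assumption; the article gives only the tax rate and the total to be raised:

```python
# Revenue from a 0.5-cent/kWh tax, assuming ~4 trillion kWh of annual
# U.S. electricity sales (our assumption, not the article's figure).
annual_kwh = 4e12
tax_per_kwh = 0.005                  # $0.005 = 0.5 cent
annual_revenue = annual_kwh * tax_per_kwh
print(annual_revenue / 1e9)          # → 20.0 (billion dollars per year)
print(420e9 / annual_revenue)        # → 21.0 (years to raise $420 billion)
```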

Congress could establish the financial incentives by adopting a national renewable energy plan. Consider the U.S. Farm Price Support program, which has been justified in terms of national security. A solar price support program would secure the nation's energy future, vital to the country's long-term health. Subsidies would be gradually deployed from 2011 to 2020. With a standard 30-year payoff interval, the subsidies would end from 2041 to 2050. The HVDC transmission companies would not have to be subsidized, because they would finance construction of lines and converter stations just as they now finance AC lines, earning revenues by delivering electricity.

Although $420 billion is substantial, the annual expense would be less than the current U.S. Farm Price Support program. It is also less than the tax subsidies that have been granted to build the country's high-speed telecommunications infrastructure over the past 35 years. And it frees the U.S. from policy and budget issues driven by international energy conflicts.

Without subsidies, the solar grand plan is impossible. Other countries have reached similar conclusions: Japan is already building a large, subsidized solar infrastructure, and Germany has embarked on a nationwide program. Although the investment is high, it is important to remember that the energy source, sunlight, is free. There are no annual fuel or pollution-control costs like those for coal, oil or nuclear power, and only a slight cost for natural gas in compressed-air systems, although hydrogen or biofuels could displace that, too. When fuel savings are factored in, the cost of solar would be a bargain in coming decades. But we cannot wait until then to begin scaling up.

Critics have raised other concerns, such as whether material constraints could stifle large-scale installation. With rapid deployment, temporary shortages are possible. But several types of cells exist that use different material combinations. Better processing and recycling are also reducing the amount of materials that cells require. And in the long term, old solar cells can largely be recycled into new solar cells, changing our energy supply picture from depletable fuels to recyclable materials.

The greatest obstacle to implementing a renewable U.S. energy system is not technology or money, however. It is the lack of public awareness that solar power is a practical alternative—and one that can fuel transportation as well. Forward-looking thinkers should try to inspire U.S. citizens, and their political and scientific leaders, about solar power's incredible potential. Once Americans realize that potential, we believe the desire for energy self-sufficiency and the need to reduce carbon dioxide emissions will prompt them to adopt a national solar plan. 

The Price of Biofuels
Making ethanol from corn is expensive. Better biofuels are years away from the gas tank. Farmers are reluctant to change their practices. But do we really have any alternative to biofuels?
By David Rotman

The irrational exuberance over ethanol that swept through the American corn belt over the last few years has given way to a dreary hangover, especially among those who invested heavily in the sprawling production facilities now dotting the rural landscape. It's the Midwest's version of the tech bubble, and in some ways, it is remarkably familiar: overeager investors enamored of a technology's seemingly unlimited potential ignore what, at least in retrospect, are obvious economic realities.

More than a hundred biofuel factories, clustered largely in the corn-growing states of Iowa, Minnesota, Illinois, Indiana, South Dakota, and Nebraska, will produce 6.4 billion gallons of ethanol this year, and another 74 facilities are under construction. Just 18 months ago, they were cash cows, churning out high-priced ethanol from low-priced corn, raising hopes of "energy independence" among politicians, and capturing the attention--and money--of venture capitalists from both the East and West Coasts.

Now ethanol producers are struggling, and many are losing money. The price of a bushel of corn rose to record highs during the year, exceeding $4.00 last winter before falling back to around $3.50 in the summer, then rebounding this fall to near $4.00 again. At the same time, ethanol prices plummeted as the market for the alternative fuel, which is still used mainly as an additive to gasoline, became saturated. In the face of these two trends, profit margins vanished.

The doldrums of the ethanol market reflect the predictable boom-and-bust cycle of any commodity: high prices drive increased production, and soon the market is oversupplied, causing prices to crash. But the large-scale use of corn-derived ethanol as a transportation fuel has economic problems all its own. Even though crude oil is at near record prices, and companies that use ethanol in their gasoline receive a federal tax credit of 51 cents per gallon, ethanol struggles to compete economically. And with limited infrastructure in place to distribute and sell the biofuel, demand will remain uncertain for the foreseeable future.

More alarming, the boom in ethanol production is driving up the price of food. Of the record 93 million acres of corn planted in the United States in 2007, about 20 percent went to ethanol. Since most of the rest is used to feed animals, the prices of beef, milk, poultry, and pork are all affected by increases in the cost of corn. The international Organization for Economic Coöperation and Development (OECD) recently warned that the "rapid growth of the biofuels industry" could bring about fundamental shifts in agricultural markets worldwide and could even "cause food shortages."

All this comes at a time when the need for alternatives to petroleum-based transportation fuels is becoming urgent. At press time, the price of crude oil was near $90 a barrel. And worries about the impact of greenhouse-gas emissions from the roughly 142 billion gallons of gasoline used every year in the United States are deepening. Expanded use of biofuels is central to the federal government's long-term energy strategy. In his State of the Union speech on January 23, 2007, President Bush set the goal of producing 35 billion gallons of renewable and alternative fuels by 2017, citing the need for independence from foreign oil. The U.S. Department of Energy has set the similar goal of replacing 30 percent of gasoline use with biofuel use by 2030.

Hitting both targets, however, will require significant technological breakthroughs. In the United States, for now, ethanol means the corn-derived version. (Brazilian producers were expected to make 4.97 billion gallons of ethanol in 2007, mostly from sugarcane; but that semitropical crop is agriculturally viable in only a few parts of the United States.) Even proponents of corn ethanol say that its production levels cannot go much higher than around 15 billion gallons a year, which falls far short of Bush's goal.

While President Bush and other advocates of biofuels have often called for ethanol to be made from alternative feedstocks such as switchgrass--a plant native to the U.S. prairie states, where it grows widely--the required technology is, according to most estimates, at least four to five years from commercial viability. Meanwhile, advanced biological techniques for creating novel organisms that produce other biofuels, such as hydrocarbons, are still in the lab. So far, researchers are making quantities that wouldn't even fill the tank of a large SUV.

The economic woes and market limitations of corn ethanol are a painful reminder of the immense difficulties facing developers of new biofuels. "The bottom line is that you're going to have to make fuel cheap," says Frances Arnold, a professor of chemical engineering and biochemistry at Caltech. "We can all make a little bit of something. But you have got to make a lot of it, and you have got to make it cheaply. The problem is so huge that your technology has to scale up and do it at a price that is competitive. Everyone is going to be competing on price alone."

Corn Blight
There may be no better place to get a realistic appraisal of biofuels than the Department of Applied Economics at the University of Minnesota. The large campus housing the department and the rest of the university's school of agriculture lies on a low hill in a quiet St. Paul neighborhood. Acres of fields where experiments are conducted spread out from the edge of the university. Nearby are the grounds of the Minnesota State Fair, a 12-day event that draws more than a million and a half visitors at the end of the summer.

The state is the fourth-largest producer of corn in the U.S., and much of its economy, even its culture, is intimately tied to the crop. The run-up of corn prices has been a boon for Minnesota's rural agricultural communities. And the governor and other state politicians have strongly pushed the use of ethanol as a transportation fuel. Still, you won't find much cheerleading for corn ethanol in the plain brick building that houses the department.

In his orderly office with its neat stacks of technical papers and farm reports, Vernon Eidman, an emeritus professor of agricultural economics, combines the authority of a scholar with the sternness of a Midwestern banker. "We could see this coming," he says, describing the current market plight of the ethanol producers. "It's not like [producers] didn't know it was coming. At least, they should have known it." In 2006 they "made profits like they never had before," Eidman says. "And that's a major factor that led to this tremendous buildup."

The numbers speak for themselves. Eidman's calculations show what it costs, given varying prices of corn, for a new, moderate-size facility to produce ethanol. With corn at $4.00 a bushel, ethanol production costs $1.70 a gallon; to earn a 12 percent return on equity, producers need to sell ethanol at $1.83 a gallon. Then Eidman shows his figures for the prices that petroleum companies are paying when they buy ethanol to blend with their gasoline: this December, prices were about $1.90 a gallon, and bids for 2008 range between $1.75 and $1.80 a gallon. In other words, the profit margins for ethanol producers are extremely tight. To make matters worse, Eidman says, production capacity, which was around 5.4 billion gallons at the beginning of 2007, is expected to reach 12.5 billion gallons by 2010.
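
Eidman's margin squeeze is easy to verify from the numbers he gives:

```python
production_cost = 1.70    # $/gal with corn at $4.00 a bushel
breakeven_roe = 1.83      # $/gal needed for a 12 percent return on equity
bids_2008 = (1.75, 1.80)  # $/gal that blenders are bidding for 2008

for bid in bids_2008:
    cash_margin = bid - production_cost
    roe_margin = bid - breakeven_roe
    print(f"bid ${bid:.2f}: {cash_margin:+.2f} cash, {roe_margin:+.2f} vs. 12% ROE")
# Every 2008 bid covers cash costs but falls 3 to 8 cents per gallon
# short of the return-on-equity target.
```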

While swelling ethanol production has led to worries about oversupply, the other side of the market equation is actually a cause for greater concern: the future demand for ethanol fuel is by no means certain. In a few parts of the country, particularly in the corn-belt states, drivers can buy fuel that's 85 percent ethanol. But for the most part, petroleum companies use ethanol at a concentration of 10 percent, to increase the oxygen content of their gasoline. Not only is such a market limited, but the 10-percent-ethanol blend delivers slightly reduced gas mileage, potentially damping consumer appetite for the fuel.

It is not just the short-term economics of ethanol that concern agricultural experts. They also warn that corn-derived ethanol is not the "green fuel" its advocates have described. That's because making ethanol takes a lot of energy, both to grow the corn and, even more important, to run the fermentation facilities that turn the sugar gleaned from the corn kernels into the alcohol that's used as fuel. Exactly how much energy it takes has been the subject of intense academic debate in various journals during the last few years.

According to calculations done by Minnesota researchers, 54 percent of the total energy represented by a gallon of ethanol is offset by the energy required to process the fuel; another 24 percent is offset by the energy required to grow the corn. While about 25 percent more energy is squeezed out of the biofuel than is used to produce it, other fuels yield much bigger gains, says Stephen Polasky, a professor of ecological and environmental economics at Minnesota. Making ethanol is "not a cheap process," he says. "From my perspective, the biggest problem [with corn ethanol] is just the straight-out economics and the costs. The energy input/output is not very good."
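
Those percentages combine to recover the roughly 25 percent net energy gain:

```python
processing_offset = 0.54   # share of ethanol's energy spent on processing
farming_offset = 0.24      # share spent growing the corn
input_share = processing_offset + farming_offset   # 0.78 of output energy
net_gain = 1 / input_share - 1
print(f"{net_gain:.0%}")   # → 28%, consistent with the rough "about
                           # 25 percent" more energy out than in
```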

The high energy requirements of ethanol production mean that using ethanol as fuel isn't all that much better for the environment than using gasoline. One might think that burning the biofuel would release only the carbon dioxide that corn captures as it grows. But that simplified picture, which has often been conjured up to support the use of ethanol fuel, doesn't withstand closer scrutiny.

In fact, Polasky says, the fossil fuels needed to raise and harvest corn and produce ethanol are responsible for significant carbon emissions. Not only that, but the cultivation of corn also produces two other potent greenhouse gases: nitrous oxide and methane. Polasky calculates that corn-derived ethanol is responsible for greenhouse-gas emissions about 15 to 20 percent below those associated with gasoline: "The bottom line is that you're getting a slight saving in terms of greenhouse-gas emissions, but not much."

If corn-derived ethanol has had little impact on energy markets and greenhouse-gas emissions, however, its production could have repercussions throughout the agricultural markets. Not only are corn prices up, but so are soybean prices, because farmers planted fewer soybeans to make room for corn.

In the May/June 2007 issue of Foreign Affairs, C. Ford Runge, a professor of applied economics and law at Minnesota, cowrote an article titled "How Biofuels Could Starve the Poor," which argued that "the enormous volume of corn required by the ethanol industry is sending shock waves through the food system." Six months later, sitting in a large office from which he directs the university's Center for International Food and Agricultural Policy, Runge seems bemused by the criticism that his article received from local politicians and those in the ethanol business. But he is steadfast in his argument: "It is clearly the case that milk prices, bread prices, are all rising at three times the average rate of increase of the last 10 years. It's appreciable, and it is beginning to be appreciated."

The recent OECD report, released in early September, is just the latest confirmation of his warnings, says Runge. And because a larger percentage of their income goes to food, he says, "this is really going to hit poor people." Since the United States exports about 20 percent of its corn, the poor in the rest of the world are at particular risk. Runge cites the doubling in the price of tortillas in Mexico a year ago.

All these factors argue against the promise of corn ethanol as a solution to the energy problem. "My take," says Polasky, "is that [ethanol] is only going to be a bit player in terms of energy supplies." He calculates that even if all the corn planted in the United States were used for ethanol, the biofuel would still displace only 12 percent of gasoline consumption. "If I'm doing this for energy policy, I don't see the payback," he says. "If we're doing this as farm support policy, there may be more merit there. But we're going to have to go to the next generation of technology to have a significant impact on the energy markets."
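
Polasky's estimate can be roughly reproduced from figures earlier in the article, given one added assumption of our own, that a gallon of ethanol carries about two thirds the energy of a gallon of gasoline:

```python
# Back-of-envelope check of Polasky's displacement estimate. The 2/3
# energy-content ratio is our assumption, not a figure from the article.
ethanol_now = 6.4e9                    # gallons/yr from ~20% of the corn crop
all_corn_ethanol = ethanol_now / 0.20  # if the entire crop went to ethanol
gasoline_equiv = all_corn_ethanol * 2 / 3
share = gasoline_equiv / 142e9         # vs. annual U.S. gasoline use
print(f"{share:.0%}")                  # → 15%, the same order as Polasky's 12%
```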

Since the oil crisis of the 1970s, when the price of a barrel of petroleum peaked, chemical and biological engineers have chased after ways to turn the nation's vast reserves of "cellulosic" material such as wood, agricultural residues, and perennial grasses into ethanol and other biofuels. Last year, citing another of President Bush's goals--reducing U.S. gasoline consumption by 20 percent in 10 years--the U.S. Department of Energy (DOE) announced up to $385 million in funding for six "biorefinery" projects that will use various technologies to produce ethanol from biomass ranging from wood chips to switchgrass.

According to a 2005 report by the DOE and the U.S. Department of Agriculture, the country has enough available forest and agricultural land to produce 1.3 billion tons of biomass that could go toward biofuels. Beyond providing a vast supply of cheap feedstock, cellulosic biomass could greatly increase the energy and environmental benefits of biofuels. It takes far less energy to grow cellulosic materials than to grow corn, and portions of the biomass can be used to help power the production process. (The sugarcane-based ethanol produced in Brazil also offers improvements over corn-based ethanol, thanks to the crop's large yields and high sugar content.)

But despite years of research and recent investment in scaling up production processes, no commercial facility yet makes cellulosic ethanol. The economic explanation is simple: it costs far too much to build such a facility. Cellulose, a long-chain polysaccharide that makes up much of the mass of woody plants and crop residues such as cornstalks, is difficult--and thus expensive--to break down.

Several technologies for producing cellulosic ethanol do exist. The cellulose can be heated at high pressure in the presence of oxygen to form synthesis gas, a mixture of carbon monoxide and hydrogen that is readily turned into ethanol and other fuels. Alternatively, industrial enzymes can break the cellulose down into sugars. The sugars then feed fermentation reactors in which microörganisms produce ethanol. But all these processes are still far too expensive to use commercially.

Even advocates of cellulosic ethanol put the capital costs of constructing a manufacturing plant at more than twice those for a corn-based facility, and other estimates range from three times the cost to five. "You can make cellulosic ethanol today, but at a price that is far from perfect," says Christopher Somerville, a plant biologist at the University of California, Berkeley, who studies how cellulose is formed and used in the cell walls of plants.

"Cellulose has physical and chemical properties that make it difficult to access and difficult to break down," explains Caltech's Arnold, who has worked on and off on the biological approach to producing cellulosic ethanol since the 1970s. For one thing, cellulose fibers are held together by a substance called lignin, which is "a bit like asphalt," Arnold says. Once the lignin is removed, the cellulose can be broken down by enzymes, but they are expensive, and existing enzymes are not ideal for the task.

Many researchers believe that the most promising way to make cellulosic biofuels economically competitive involves the creation--or the discovery--of "superbugs," microörganisms that can break down cellulose to sugars and then ferment those sugars into ethanol. The idea is to take what is now a multistep process requiring the addition of costly enzymes and turn it into a simple, one-step process, referred to in the industry as consolidated bioprocessing. According to Lee Lynd, a professor of engineering at Dartmouth College and cofounder of Mascoma, a company based in Cambridge, MA, that is commercializing a version of the technology, the consolidated approach could eventually produce ethanol at 70 cents a gallon. "It would be a transformational breakthrough," he says. "There's no doubt it would be attractive."

But finding superbugs has proved difficult. For decades, scientists have known of bacteria that can degrade cellulose and also produce some ethanol. Yet none can do the job quickly and efficiently enough to be useful for large-scale manufacturing.

Nature, Arnold explains, offers little help. "There are some organisms that break down cellulose," she says, "but the problem is that they don't make fuels, so that doesn't do you much good." An alternative, she says, is to genetically modify E. coli and yeast so that they secrete enzymes that degrade cellulose. But while many different kinds of enzymes could do the job, "most of them don't like to be inserted into E. coli and yeast."

Arnold, however, is optimistic that the right organism will be discovered. "You never know what will happen tomorrow," she says, "whether it's done using synthetic biology or someone just scrapes one off the bottom of their shoe."

She didn't quite scrape it off her shoe, but Susan Leschine, a microbiologist at the University of Massachusetts, Amherst, believes she just might have stumbled on a bug that will do the job. She found it in a soil sample collected more than a decade ago from the woods surrounding the Quabbin Reservoir, about 15 miles from her lab. The Quabbin sample was just one of many from around the world that Leschine was studying, so it was several years before she finished analyzing it. But when she did, she realized that one of its bacteria, Clostridium phytofermentans, had extraordinary properties. "It decomposes nearly all the components of the plant, and it forms ethanol as the main product," she says. "It produces prodigious amounts of ethanol."

Leschine founded a company in Amherst, SunEthanol, that will attempt to scale up ethanol production using the bacterium. There's "a long way to go," she acknowledges, but she adds that "what we have is very different, and that gives us a leg up. We already have a microbe and have demonstrated it on real feedstocks." Leschine says that other useful microbes are probably waiting to be discovered: a single soil sample, after all, contains hundreds of thousands of varieties. "In this zoo of microbes," she says, "we can think that there are others with similar properties out there."

Blooming Prairies
Whether ethanol made from cellulosic biomass is good or bad for the environment, however, depends on what kind of biomass it is and how it is grown.

In his office in St. Paul, David Tilman, a professor of ecology at the University of Minnesota, pulls out a large aerial photo of a field sectioned into a neat grid. Even from the camera's vantage point far above the ground, the land looks poor. In one plot are thin rows of grasses, the sandy soil visible beneath. Tilman says the land was so infertile that agricultural use of it had been abandoned. Then he and his colleagues scraped off any remaining topsoil. "No farmer has land this bad," he says.

In a series of tests, Tilman grew a mixture of native prairie grasses (including switchgrass) in some of the field's plots and single species in others. The results show that a diverse mix of grasses, even grown in extremely infertile soil, "could be a valuable source of biofuels," he says. "You could make more ethanol from an acre [of the mixed grasses] than you could from an acre of corn." Better yet, in a paper published in Science, Tilman showed that the prairie grasses could be used to make ethanol that is "carbon negative": the grasses might consume and store more carbon dioxide than is released by producing and burning the fuel made from them.

The findings are striking because they suggest an environmentally beneficial way to produce massive amounts of biofuels without competing with food crops. By 2050, according to Tilman, the world will need a billion hectares more land for food. "That's the land mass of the entire United States just to feed the world," he says. "If you did a lot of biofuels on [arable] land--it is very easy to envision a billion hectares for biofuels--you will have no nature left and no reserve of land after 50 years." Instead, Tilman argues, it makes sense to grow biomass for fuels on relatively infertile land no longer used for agriculture.

But down the hill from Tilman's office, his colleagues in the applied-economics department worry about the practical issues involved in using large amounts of biomass to make fuel. For one thing, they point out, the technology and infrastructure that could efficiently handle and transport the bulky biomass still need to be developed. And since the plant material will be expensive to move around, biofuel production facilities will have to be built close to the sources of feedstock--probably within 50 miles.

The amount of biomass needed to feed even one medium-size ethanol facility is daunting. One of them, Vernon Eidman, calculates that a facility producing 50 million gallons per year would require a truck loaded with biomass to arrive every six minutes around the clock. What's more, he says, the feedstock is "not free": it will cost around $60 to $70 a ton, or about 75 cents per gallon of ethanol. "That's where a lot of people get fooled," he adds.
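Eidman's figures can be checked with a back-of-envelope sketch. The article does not state the ethanol yield per ton of biomass or the truck payload; the yield below is backed out from the quoted $60-70/ton and 75¢/gallon figures, and the 25-ton payload is an assumption (the article's six-minute interval implies either a lower yield or smaller loads than assumed here).

```python
# Back-of-envelope check of the feedstock numbers. The per-ton ethanol
# yield is implied by the article's figures, not stated directly.
TON_COST = 65.0          # $/ton of biomass, midpoint of the quoted $60-70
COST_PER_GALLON = 0.75   # $/gallon of ethanol attributed to feedstock

# Implied ethanol yield per ton of biomass
implied_yield = TON_COST / COST_PER_GALLON   # gallons per ton
print(f"Implied yield: {implied_yield:.0f} gallons of ethanol per ton")

# Biomass throughput for a 50-million-gallon-per-year plant
ANNUAL_GALLONS = 50_000_000
tons_per_year = ANNUAL_GALLONS / implied_yield
tons_per_day = tons_per_year / 365
print(f"Feedstock required: ~{tons_per_year:,.0f} tons/year "
      f"(~{tons_per_day:,.0f} tons/day)")

# Truck traffic depends on payload; 25 tons per truck is assumed here
TRUCK_PAYLOAD = 25.0
trucks_per_day = tons_per_day / TRUCK_PAYLOAD
minutes_between_trucks = 24 * 60 / trucks_per_day
print(f"~{trucks_per_day:.0f} trucks/day, one every "
      f"~{minutes_between_trucks:.0f} minutes")
```

Even under these generous assumptions, the plant consumes well over half a million tons of biomass a year, which is why siting facilities close to feedstock matters so much.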

Since no commercial cellulosic facility has been built, says Eidman, it is difficult to analyze the specific costs of various technologies. Overall, he suggests, the economics look "interesting"--but cellulosic ethanol will have to compete with corn-derived biofuels and get down to something like $1.50 a gallon. Eidman believes it will be at least 2015 before biofuels made from cellulose "are much of a factor" in the market.

While chemical engineers, microbiologists, agronomists, and others struggle to find ways of making cellulosic ethanol commercially competitive, a few synthetic biologists and metabolic engineers are focusing on an entirely different strategy. More than fifteen hundred miles away from the Midwest's corn belt, several California-based, venture-backed startups founded by pioneers in the fledgling field of synthetic biology are creating new microörganisms designed to make biofuels other than ethanol.

Ethanol, after all, is hardly an ideal fuel. A two-carbon molecule, it has only two-thirds the energy content of gasoline, which is a mix of long-chain hydrocarbons. Put another way, it would take about a gallon and a half of ethanol to yield the same mileage as a gallon of gasoline. And because ethanol mixes with water, a costly distillation step is required at the end of the fermentation process. What's more, because ethanol is more easily contaminated with water than hydrocarbons are, it can't be shipped in the petroleum pipelines used to cheaply distribute gasoline throughout the United States. Ethanol must be shipped in specialized rail cars (trucks, with their relatively small payloads, are usually far too expensive), adding to the cost of the fuel.
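The "gallon and a half" figure follows directly from the energy densities involved. The Btu values below are commonly cited round numbers, not figures from the article; the same arithmetic also shows what Eidman's $1.50/gallon target means in gasoline-equivalent terms.

```python
# Energy-content arithmetic behind the "gallon and a half" figure.
# Assumed round numbers: ethanol ~76,000 Btu/gal, gasoline ~114,000 Btu/gal.
ETHANOL_BTU = 76_000
GASOLINE_BTU = 114_000

ratio = ETHANOL_BTU / GASOLINE_BTU
print(f"Ethanol holds ~{ratio:.0%} of gasoline's energy per gallon")

gallons_needed = 1 / ratio
print(f"~{gallons_needed:.1f} gallons of ethanol match one gallon of gasoline")

# At a $1.50/gallon production target, the gasoline-equivalent price is higher:
target = 1.50
print(f"$1.50/gal ethanol is about "
      f"${target * gallons_needed:.2f}/gal gasoline-equivalent")
```

This energy gap, on top of the distribution problems, is what pushes the California startups toward hydrocarbons instead.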

So instead of ethanol, the California startups are planning to produce novel hydrocarbons. Like ethanol, the new compounds are fermented from sugars, but they are designed to more closely resemble gasoline, diesel, and even jet fuel. "We took a look at ethanol," says Neil Renninger, senior vice president of development and cofounder of Amyris Biotechnologies in Emeryville, CA, "and realized the limitations and the desire to make something that looked more like conventional fuels. Essentially, we wanted to make hydrocarbons. Hydrocarbons are what are currently in fuels, and hydrocarbons make the best fuels because we have designed our engines to work with them." If the researchers can genetically engineer microbes that produce such compounds, it will completely change the economics of biofuels.

The problem is that nature offers no known examples of microörganisms that can ferment sugars into the types of hydrocarbons useful for fuel. So synthetic biologists have to start from scratch. They identify promising metabolic reactions in other organisms and insert the corresponding genes into E. coli or yeast, recombining metabolic pathways until they yield the desired products.

At LS9 in San Carlos, CA, researchers are turning E. coli into a hydrocarbon producer by reëngineering its fatty-acid metabolism (see "Better Biofuels," Forward, July/August 2007). Stephen del Cardayré, LS9's vice president of research and development, says the company decided to focus on fatty acids because organisms naturally produce them in abundance, as a way of storing energy. "We wanted to take advantage of a pathway that [naturally] makes a lot of stuff," del Cardayré says. "Just grab your middle." Del Cardayré and his coworkers use many of the existing pathways in E. coli's fatty-acid metabolism but divert them near the end of the metabolic cycle. Since fatty acids consist of a hydrocarbon chain with a carboxyl group, it is relatively straightforward to make the hydrocarbon fuels. "Think of it as a highway," says del Cardayré. "Near the end of the highway, we add a detour, a pathway we designed and stuck there, so the fatty acids have a better place to go. We pull them off and chemically change them, using this new synthetic pathway that takes them to products that we want."

Amyris, too, is taking the synthetic-biology approach, but instead of tweaking fatty-acid metabolism, it is working on pathways that produce isoprenoids, a large class of natural compounds. So far, however, both LS9 and Amyris are making their biofuels a few liters at a time. And while the companies have ambitious schedules for commercializing their technologies--both claim that their processes will be ready by 2010--improving the yield and the speed of their reactions remains a critical challenge. "It's where most of the biological work is going on," says Renninger. "We still have a little way to go, and that little way is very important."

If eventually commercialized, the hydrocarbon biofuels made by LS9 and Amyris could overcome many of the economic disadvantages of ethanol. Unlike ethanol, hydrocarbons separate from water during the production process, so no energy-intensive distillation step is necessary. And hydrocarbon biofuels could be shipped in existing petroleum pipelines. "It's all about cost," says Robert Walsh, president of LS9. But a critical factor will be the price of feedstock, he says. "We want dirt-cheap sugars."

Indeed, the synthetic-biology startups face the same problem that established ethanol producers do: corn is not an inexpensive source of biofuels. "The next generation [of feedstock] will be cellulosic," says John Melo, CEO of Amyris. "But we are not sure which cellulosic technology will emerge as the winner." Whichever technology prevails, Melo says, Amyris expects to be able to "bolt it" onto its fermentation process, giving the company the advantages of both cheap cellulosic feedstocks and practical hydrocarbon fuels.

For now, though, the lack of an alternative to corn is driving Amyris right out of the country. The company, which plans to retrofit existing ethanol plants so that they can make hydrocarbons, will initially work with Brazilian biofuel facilities that are using sugarcane as a feedstock. Given the price of corn and the amount of energy needed to produce it, Melo says, Brazilian cane offers the most "viable, sustainable" way to make biofuels today.

No Choice
Even in a Silicon Valley culture that reveres successful venture capitalists, Vinod Khosla has a special place of honor. A cofounder of Sun Microsystems in the early 1980s, Khosla later joined the venture capital firm Kleiner Perkins Caufield and Byers, where in the late 1990s and early 2000s he gained a reputation for ignoring the dot-com excitement in favor of a series of esoteric startups in the far less glamorous field of optical networking. When several of the startups sold for billions of dollars to large companies gearing up their infrastructure for the Internet boom, Khosla became, in the words of one overheated headline of the time, "The No. 1 VC on the Planet."

These days Khosla, who is now among the world's richest people (the Forbes 400 lists him at 317, with a net worth of $1.5 billion), is putting most of his investments in alternative energies. He counts among his portfolio companies more than a dozen biofuel startups--synthetic-biology companies LS9 and Amyris, cellulosic companies like Mascoma, and corn ethanol companies like Cilion, based in Goshen, CA. But to call Khosla simply an investor in biofuels would greatly understate his involvement. In the last several years, he has emerged as one of the world's leading advocates of the technology, promoting its virtues and freely debating any detractors (see Q&A, March/April 2007).

Khosla seems exasperated by the biofuels naysayers. Climate change, he says, is "by far the biggest issue" driving his interest in biofuels. If we want to head off climate change and decrease consumption of gasoline, "there are no alternatives" to using cellulosic biofuels for transportation. "Biomass is the only feedstock in sufficient quantities to cost-effectively replace oil," he says. "Nothing else exists." Hybrid and electric vehicles, he adds, are "just toys."

In particular, argues Khosla, any transportation technology needs to compete in China and India, the fastest-growing automotive markets in the world. "It's no big deal to sell a million plug-in electrics in a place like California," he says. The difficulty is selling a $20,000 hybrid vehicle in India. "No friggin' chance. And any technology not adoptable by China and India is irrelevant to climate change," he says. "Environmentalists don't focus on scalability. If you can't scale it up, it is just a toy. Hence the need for biofuels. Hence biofuels from biomass."

In a number of opinion papers posted on the website of Khosla Ventures, a firm he started in 2004 that has invested heavily in biofuels and other environmental technologies, Khosla envisions biofuel production rapidly increasing over the next 20 years. According to his numbers, production of corn ethanol will level off at 15 billion gallons a year by 2014, but cellulosic ethanol will increase steadily, reaching 140 billion gallons by 2030. At that point, he predicts, biofuels will be cheap and abundant enough to replace gasoline for almost all purposes.

While Khosla readily acknowledges the limitations of corn-derived ethanol, he says it has been an important "stepping-stone": the market for corn ethanol has created an infrastructure and market for biofuels in general, removing many of the business risks of investing in cellulosic ethanol. "The reason that I like [corn ethanol] is that its trajectory leads to cellulosic ethanol," he says. "Without corn ethanol, no one would be investing in cellulosics."

But back in the Midwest, there is a "show me" attitude toward such blue-sky projections, and there are lingering questions about just how the nation's vast agricultural infrastructure will switch over to biomass. If Khosla's projections prove out, "then wonderful," says the University of Minnesota's Runge. "Meanwhile, we're stuck in reality." Perhaps the main point of contention, Runge suggests, is whether corn ethanol will in fact lead to new technologies--or stand in their way. "It is my opinion that corn ethanol is a barrier to converting to cellulosics," he says, pointing to the inertia caused by political and business interests heavily invested in corn ethanol and its infrastructure.

Runge is not alone in his skepticism. "Unless the cost is reduced significantly, cellulosic ethanol is going nowhere," says Wally Tyner, a professor of agricultural economics at Purdue University. Making cellulosic ethanol viable will require either a "policy mechanism" to encourage investment in new technologies or a "phenomenal breakthrough"--and "the likelihood of that is not too high," Tyner says. Farmers and ethanol producers currently have no incentive to take on the risks of changing technologies, he adds. There is "no policy bridge" to help make the transition. "The status quo won't do it."

Despite the sharp differences of opinion, there's still some common ground between people like Khosla, whose unbridled faith in innovation has been nurtured by the successes of Silicon Valley, and the Midwesterners whose pragmatism was forged by the competitive economics of agriculture. In particular, most observers agree that annual production of corn-derived ethanol will level off within a few years. After that, any growth in biofuel production will need to come from new technologies.

But if cellulosic biofuels are to begin replacing gasoline within five to ten years, facilities will need to start construction soon. This fall, Range Fuels, a company based in Broomfield, CO, announced that it had begun work in Georgia on what it claims is the country's first commercial-scale cellulosic-ethanol plant. The Range facility, which will use thermochemical technology to make ethanol from wood chips, is scheduled to reach a capacity of 20 million gallons in 2008 and eventually increase to 100 million gallons a year. Meanwhile, Mascoma has announced several demonstration units, including a facility in Tennessee that will be the first cellulosic-ethanol plant built to use switchgrass. But these production plants are federally subsidized or are a result of partnerships with state development organizations; attracting private investment for commercial-scale production will be another matter.

Indeed, ramping up the capacity of cellulosic-ethanol production will be a huge and risky challenge, says Colin South, president of Mascoma. "When people talk about cellulosic ethanol as if it is an industry, it is an unfair portrayal," he says. "There are a number of pilot plants, but none of them have gotten out of the pilot scale. We still need to show we can actually run these in the form of an operating chemical plant." South says that Mascoma hopes to begin construction of a commercial plant in 2009 and have it up and running by early 2011. But he adds that the company will only proceed when "the numbers are good enough."

Perhaps the most crucial number, however, will be the price of crude oil. If it stays high, cellulosic-ethanol production could become economically competitive much sooner. But few people, least of all the investors who would risk hundreds of millions of dollars on new plants, are willing to take that bet. Many remember the late 1970s, when the federal government earmarked roughly a billion dollars to fund biomass-related research, only to abandon it when crude-oil prices fell in the early 1980s. And while the price of a barrel of crude hovered in the mid-$90s this fall, and wholesale gas prices reached $2.50 a gallon, biofuel experts say they cannot count on such high prices. Many producers of next-generation biofuels say they want to be competitive with crude oil at around $45 a barrel to ensure long-term viability in the market.
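The gap between the producers' $45-a-barrel target and this fall's mid-$90s prices is easier to see on a per-gallon basis. A rough sketch, using the standard 42-gallon barrel and ignoring refining, distribution, and taxes:

```python
# Rough per-gallon cost of crude at the prices mentioned in the text
# (42 gallons per barrel; refining, distribution, and taxes excluded).
GALLONS_PER_BARREL = 42

per_gallon = {}
for barrel_price in (45, 95):   # producers' target vs. this fall's price
    per_gallon[barrel_price] = barrel_price / GALLONS_PER_BARREL
    print(f"Crude at ${barrel_price}/barrel is about "
          f"${per_gallon[barrel_price]:.2f} per gallon")
```

Betting a plant on $95 crude means betting on more than double the raw-material cost that producers say they need to beat for long-term viability.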

Indeed, announcements about new cellulosic-ethanol plants tend to obscure the fact that the technology is still not economically viable. Gregory Stephanopoulos, a professor of chemical engineering at MIT, describes himself as "very optimistic" about the future of biofuels. But even he is quick to add that it will take another 10 years to optimize production processes for cellulosic biofuels. Among myriad other problems, he says, is the need for more robust and versatile microbes to make them.

In a small conference room outside his office, Stephanopoulos takes out a pencil and paper and begins to draw a series of circles. You can imagine, he says, a biorefinery surrounded by sources of different types of biomass. He connects the circles at a central point, making lines like spokes on a wheel. You could, he goes on, imagine pipelines from these sources. What if the biomass were treated and piped to the biorefinery as a slurry? Stephanopoulos would be the first to acknowledge that such an ambitious infrastructure would take years to put in place, and that the idea raises numerous technical and engineering questions. But for the rest of the interview, the drawing sits patiently on the table--a simple target.

Copyright Technology Review 2007.

Cellulosic Biofuels: challenges in converting biomass to biofuels... The first is to optimize the yield and quality of the biomass; the second is to improve the way biomass is broken down, so as to yield a stream of abundant, inexpensive sugars; the last is to construct new pathways that convert sugars into the various target biofuels in organisms such as yeast

Thanks to Lloyd and Peter

Technology Review - Published by MIT

January/February 2008
Cellulosic Biofuels
Gregory Stephanopoulos explains challenges in converting biomass to biofuels.
By Gregory Stephanopoulos

It is now well accepted that for several reasons, corn ethanol will have a rather limited role as a renewable substitute for petroleum-derived liquid transportation fuels. The shortcomings of corn ethanol have sparked interest in the production of other types of fuels--such as higher alcohols, oils, and hydrocarbons--from renewable biomass feedstocks (see "The Price of Biofuels"). While the potential economic, environmental, and security benefits of such cellulosic biofuels are clear, many hurdles need to be cleared before they can begin to make a difference in the overall supply of liquid fuels for transportation.

There are three major challenges in the economical conversion of biomass to biofuels. The first is to optimize the yield and quality of the biomass, as well as to work out the logistics of securing, transporting, and processing the large volumes that will be required to support the operation of future biorefineries. New ways of harvesting, preprocessing, and transporting biomass will be necessary before it's cost-effective for biorefineries to import biomass from more than 15 or 20 miles away. One scheme is to establish satellite collection and pretreatment facilities from which slurry biomass is transported by pipelines to the main biorefinery. One can envision pipelines where cellulose hydrolysis, the slow process by which cellulose is broken down into usable sugars, takes place while a slurry is transported from the satellites to the main biorefinery.

The second challenge is to improve the way biomass is broken down, so as to yield a stream of abundant, inexpensive sugars for fermentation. This may be accomplished by modulating the plant's composition or by genetically engineering self-destruction mechanisms into it, to be initiated after harvest and at the right processing time. Another solution might be to pursue more-active and less expensive conventional cellulolytic enzymes and perhaps new physicochemical methods, such as solubilization of cellulose by ionic liquids.

The last step is to construct new pathways that convert sugars into the various target biofuels in organisms such as yeast and E. coli. Here lies the third challenge: to engineer optimal pathways. There is an important difference between stitching reactions together by importing genes from other species and constructing an optimal pathway that converts all sugars at maximum yields and efficiencies, producing biofuels at high concentrations. Making biofuels cost-competitive will require the latter, but to achieve that goal we must engineer strains of yeast, E. coli, or other organisms with high tolerance for the toxicity of both the initial biomass hydrolysate and the final biofuel product.

No single breakthrough is likely to bring us to the point of efficient biofuel production--superbugs, consolidated bioprocessing, or blooming deserts notwithstanding. Rather, it will probably take many advances on several scientific and technological fronts, underlining the importance of a systems approach. A number of promising technologies, both biological and chemical, are in development. Economics will determine the winners, no matter what kinds of plants get built in the short term. It is also important to bear in mind that specific techniques may interfere with each other to obstruct a modular approach--by, say, undermining a well-engineered strain's ability to work with a different feedstock hydrolysate. So far, solutions to this complex, multidimensional problem have been sought within the confines of biology or chemistry, but the real answer may very well lie in a hybrid process that combines the best each field can offer.

Copyright Technology Review 2007.

EU clean vehicle procurement law proposed: All public authorities in the EU would be forced to consider the lifetime cost of pollution emissions and fuel consumption when procuring road vehicles under draft legislation tabled by the European commission

Thanks again to Erwin!

EU clean vehicle procurement law proposed

ENDS Europe Daily, 4 January 2008 - All public authorities in the EU would be forced to consider the lifetime cost of pollution emissions and fuel consumption when procuring road vehicles under draft legislation tabled by the European commission in December.

The proposed directive would establish a harmonised EU methodology for calculating the lifecycle costs of fuel consumption and emissions of carbon dioxide (CO2), nitrogen oxides (NOx), hydrocarbons and particulate matter (PM).

Contracting authorities and public transport operators would then be required to internalise these external costs when calculating the overall price of a vehicle for procurement decisions. This requirement would be legally binding from 2012, but member states could apply it earlier if they chose to do so.

As an example, the commission says that for a bus costing €150,000, lifecycle fuel consumption costs of over €300,000 and NOx-related costs of nearly €90,000 would have to be taken into account. Existing EU procurement rules state that procurement decisions should be taken based either on the lowest price or the "economically most advantageous" offer.
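The commission's bus example boils down to simple addition: the comparable cost of a bid is the purchase price plus monetized lifetime fuel and emission costs. A minimal sketch, using the article's rounded figures and omitting the other pollutants (CO2, hydrocarbons, PM) the methodology would also cover:

```python
# Sketch of the proposed lifecycle-cost comparison for vehicle procurement.
# Figures are the commission's bus example; other monetized pollutant
# costs covered by the methodology are omitted here for simplicity.
def lifetime_cost(purchase, fuel, emissions):
    """Total cost a contracting authority would compare under the proposal."""
    return purchase + fuel + sum(emissions.values())

bus = lifetime_cost(
    purchase=150_000,           # euro purchase price
    fuel=300_000,               # euro lifetime fuel-consumption cost
    emissions={"NOx": 90_000},  # euro lifetime NOx-related cost
)
print(f"Comparable lifetime cost: EUR {bus:,}")
```

The point of the exercise: the €150,000 sticker price is barely a quarter of the cost a contracting authority would actually be comparing, which is what makes cleaner, more efficient vehicles competitive in tenders.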

Quoting a study by consultants PricewaterhouseCoopers, the commission estimates that the proposed law could save up to 1.9m tonnes of CO2 emissions annually by 2017, equivalent to 0.5 per cent of total EU transport emissions. By the same year, vehicle purchase costs would increase by some €11.5bn, but would be more than offset by fuel savings of €21.3bn and avoided emissions worth nearly €12bn.

The commission's new draft legislation follows the withdrawal in 2006 of a proposal requiring public authorities to ensure that at least 25 per cent of heavy vehicles purchased meet an EU "enhanced environmentally friendly vehicle" standard (EEV). That plan was criticised by MEPs and the road haulage industry as being too weak.

By extending the proposal to cover the purchase of all types of vehicle by public authorities, the new draft is likely to get a better reception from MEPs. But the text will also have to be endorsed by EU governments before it can become EU law.

*Meanwhile the commission's separate proposals to reduce CO2 emissions from new cars have continued to prompt reaction. The international automobile federation (FIA) welcomed the plans as "realistic and achievable". But a group of centre-right MEPs said the legislation would "negatively affect manufacturers in Germany, France, Britain and beyond".


Consortium aims to save energy in commercial buildings - Separate provisions in the energy law require all federal buildings, if feasible, to zero-out their fossil fuel consumption by 2030... Note: WBCSD is part of consortium, IBM member of WBCSD

Thanks to Erwin


Consortium aims to save energy in commercial buildings

Greenwire, 4 January 2008 - An alliance of energy and building industry organizations has formed the Commercial Buildings Initiative to help the federal government zero-out the net energy use of commercial buildings.

The energy bill that President Bush signed last month establishes an Office of Commercial High-Performance Green Buildings. With the help of an industry consortium, the Energy Department office would be charged with developing and disseminating technologies that would enable all new commercial buildings to be zero-net-energy users by 2030. The law ramps up the goal so that all commercial buildings will be zero-net-energy users by 2050.

"We'd certainly be interested in helping DOE run this program, assuming there's funding," said Lowell Ungar, director of policy for the Alliance to Save Energy, a founding member of the consortium. Other members of the group include the American Institute of Architects, U.S. Green Building Council, World Business Council for Sustainable Development, Lawrence Berkeley National Laboratory, and the American Society of Heating, Refrigeration and Air-Conditioning Engineers.

The energy law authorizes $20 million for the effort in fiscal 2008, which began Oct. 1. The law authorizes $50 million for fiscal 2009-2010, $100 million for fiscal 2011-2012 and $200 million for fiscal 2013-2018.

Ungar said the Alliance to Save Energy plans to lobby Congress to fund the program fully, as a way to mitigate climate change, improve air quality and reduce energy consumption and costs.

"This will require significant research and development and a lot of deployment activities," Ungar added. "This is a decades-long program."

Separate provisions in the energy law require all federal buildings, if feasible, to zero-out their fossil fuel consumption by 2030. A new Office of Federal High Performance Green Buildings within the General Services Administration is charged with conducting building energy analysis.

The energy law authorizes $4 million annually for fiscal 2008-2012 for the management of federal building efficiency and a technology acceleration program.

Members of the Commercial Buildings Initiative plan to convene at ASHRAE's winter meeting in New York on Jan. 22.

This article is reproduced with kind permission of E&E Publishing, LLC. For more daily news and articles, please visit the Greenwire website.

Solar Cell Production Jumps 50 Percent in 2007

Thanks to Peter

December 27, 2007

Solar Cell Production Jumps 50 Percent in 2007

Jonathan G. Dorn

Production of photovoltaics (PV) jumped to 3,800 megawatts worldwide in 2007, up an estimated 50 percent over 2006. At the end of the year, according to preliminary data, cumulative global production stood at 12,400 megawatts, enough to power 2.4 million U.S. homes. Growing by an impressive average of 48 percent each year since 2002, PV production has been doubling every two years, making it the world's fastest-growing energy source.
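The growth claims in this paragraph are internally consistent, as a quick check shows: 48 percent annual growth compounds to roughly a doubling every two years, and the 50 percent jump to 3,800 megawatts implies 2006 production of about 2,500 megawatts.

```python
# Checking the growth claims: 48%/year average since 2002 vs. "doubling
# every two years", and the 2007 jump of ~50% over 2006.
import math

growth = 0.48
two_year_factor = (1 + growth) ** 2
doubling_time = math.log(2) / math.log(1 + growth)
print(f"Two years at 48%/yr multiplies output by {two_year_factor:.2f}x")
print(f"Exact doubling time: {doubling_time:.1f} years")

production_2007 = 3_800                  # megawatts
implied_2006 = production_2007 / 1.5     # back out 2006 from the 50% jump
print(f"Implied 2006 production: ~{implied_2006:,.0f} MW")
```

Sustained compounding at these rates, rather than any single year's jump, is what makes PV the fastest-growing energy source.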

Photovoltaics, which directly convert sunlight into electricity, include both traditional, polysilicon-based solar cell technologies and new thin-film technologies. Thin-film manufacturing involves depositing extremely thin layers of photosensitive materials on glass, metal, or plastics. While the most common material currently used is amorphous silicon, the newest technologies use non-silicon-based materials such as cadmium telluride.

A key force driving the advancement of thin-film technologies is a polysilicon shortage that began in April 2004. In 2006, for the first time, more than half of polysilicon production went into PVs instead of computer chips. While thin films are not as efficient at converting sunlight to electricity, they currently cost less and their physical flexibility makes them more versatile than traditional solar cells. Led by the United States, thin film grew from 4 percent of the market in 2003 to 7 percent in 2006. Polysilicon supply is expected to match demand by 2010, but not before thin film grabs 20 percent of the market.

The top five PV-producing countries are Japan, China, Germany, Taiwan, and the United States. Recent growth in China is most astonishing: after almost tripling its PV production in 2006, it is believed to have more than doubled output in 2007. With more than 400 PV companies, China's market share has exploded from 1 percent in 2003 to over 18 percent today. Having eclipsed Germany in 2007 to take the number two spot, China is now on track to become the number one PV producer in 2008. The United States, which gave the world the solar cell, has dropped from third to fifth place as a solar cell manufacturer since 2005, overtaken by China in 2006 and Taiwan in 2007.

Strong domestic production is not always a good indicator of domestic installations, however. For example, despite China's impressive production, PV prices are still too high for the average Chinese consumer. China only installed 25 megawatts of PV in 2006, exporting more than 90 percent of its PV production, mainly to Germany and Spain. But large PV projects are expected to increase domestic installations. China is planning a 100-megawatt solar PV farm in Dunhuang City in the northwestern province of Gansu, which would have five times the capacity of the largest PV power plant in the world today.

Despite its skies being cloudy two thirds of the time, Germany has been the leading market for PV installations since it overtook Japan in 2004. In 2006, Germany, adding 1,050 megawatts, became the first country to install more than one gigawatt in a single year. Driven by a feed-in tariff that guarantees the price a utility must pay homeowners or private firms for PV-generated electricity, annual installations in Germany alone have exceeded those in all other countries combined since 2004. There are now more than 300,000 buildings with PV systems in Germany, over triple the initial goal of the 100,000 Roofs Program launched in 1998. Growth is set to remain strong, as a feed-in tariff of 49¢ per kilowatt-hour will remain in place through 2009.

Japan, the United States, and Spain round out the top four markets with 350, 141, and 70 megawatts installed in 2006, respectively.
(See data.) Thanks to a residential PV incentive program, Japan now has over 250,000 homes with PV systems. But the country is currently experiencing a decrease in the growth rate of PV installations resulting from the phase-out of the incentive program in 2005 and a limited domestic PV supply due to the polysilicon shortage.

In contrast, the growth in installations in the United States increased from 20 percent in 2005 to 31 percent in 2006, primarily driven by California and New Jersey. The California Solar Initiative was launched in January 2006 as part of the state's Million Solar Roofs program to provide more than $3 billion in incentives for solar power. The goal is to generate 3,000 megawatts of new solar power statewide by 2017. New Jersey's Clean Energy Rebate Program, which began in 2001, offers a rebate of up to $3.50 per watt for residential PV systems, contributing to a more than tripling of installations between 2005 and 2006. Other states, such as Maryland, have passed renewable portfolio standards that mandate a certain percent of electricity generation from solar PV. For Maryland, the goal of producing 2 percent of electricity from the sun by 2022 is expected to lead to 1,500 megawatts of PV installations in the state.
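A per-watt rebate like New Jersey's reduces to a single multiplication. In this sketch the $3.50-per-watt rate is the figure cited above, while the 5-kilowatt system size and the absence of any rebate cap are hypothetical.

```python
# Hedged sketch of a per-watt PV rebate. Only the $3.50/W rate is from
# the article; the system size is assumed and no program cap is modeled.

def pv_rebate(system_watts, rebate_per_watt=3.50):
    """Up-front rebate for a residential PV system of the given size."""
    return system_watts * rebate_per_watt

print(f"Rebate on a 5 kW system: ${pv_rebate(5_000):,.2f}")  # $17,500.00
```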

Initial estimates for the United States as a whole indicate that PV incentives, including a tax credit of up to $2,000 available under the U.S. Energy Policy Act of 2005 to offset PV system costs, helped to achieve an incredible 83-percent growth in installations in 2007.

Spain tripled its PV installations in 2006 to 70 megawatts. A building code that went into force in March 2007 requires all new nonresidential buildings to generate a portion of their electricity with PV. Spain also initiated a feed-in tariff in 2004 that guarantees that renewable energy will be bought by utilities at three times the market value for 25 years. In September 2007, a 20-megawatt PV power plant, currently the largest in the world, came online in the Spanish town of Beneixama and is producing enough electricity to supply 12,000 homes. By the end of 2008, cumulative PV installations in Spain are expected to exceed 800 megawatts, twice its original 2010 goal.

Of the world's PV manufacturers in 2007, Sharp (Japan), Q-Cells (Germany), and Suntech (China) claimed the top three positions.
(See data.) But after holding the top spot for more than six years, Sharp, hampered by limited access to polysilicon, is likely to post only a 4-percent growth in production in 2007, well below the 50 percent industry average. However, Sharp's annual thin-film production capacity is on track to increase from 15 megawatts today to 1,000 megawatts per year in 2010.

Suntech, a relatively new firm started in 2001, was the fourth-largest PV manufacturer in 2006, and eclipsed Kyocera in 2007 to take third place. In the first half of 2007, Suntech produced almost as much PV as it did in all of 2006.

Capitalizing on the polysilicon supply crunch, First Solar in the United States moved into the top 15 global manufacturers in 2006 by producing 60 megawatts of cadmium telluride thin-film PV, triple its production in 2005. In the first half of 2007, First Solar leapt onto the top 10 list, moving up five spots to number eight and continuing its reign as the fastest-growing PV manufacturing company in the world.

The average price for a PV module, excluding installation and other system costs, has dropped from almost $100 per watt in 1975 to less than $4 per watt at the end of 2006.
(See data.) With expanding polysilicon supplies, average PV prices are projected to drop to $2 per watt in 2010. For thin-film PV alone, production costs are expected to reach $1 per watt in 2010, at which point solar PV will become competitive with coal-fired electricity. With concerns about rising oil prices and climate change spawning political momentum for renewable energy, solar electricity is poised to take a prominent position in the global energy economy.
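The quoted price points imply a fairly steady long-run decline. A short calculation recovers the average annual rate; the endpoint prices are from the article, while treating the decline as smooth compounding is a simplification the real price series did not follow exactly.

```python
# Compound annual decline implied by ~$100/W in 1975 falling to ~$4/W
# by the end of 2006 (31 years). The smooth-decline assumption is ours.

def annual_decline_rate(start_price, end_price, years):
    """Average compound rate at which the price fell each year."""
    return 1 - (end_price / start_price) ** (1 / years)

rate = annual_decline_rate(100.0, 4.0, 31)
print(f"Average module price decline: {rate:.1%} per year")  # ~9.9%/yr
```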


World Annual Photovoltaic Production, 1975-2007 (figure and table)

World Cumulative Photovoltaic Production, 1975-2007 (figure and table)

Annual Photovoltaic Production, Select Countries and Europe, 1995-2006 (figure and table)

Annual Thin Film Photovoltaic Production, Select Countries and Regions, 2003-2006 (figure and table)

Annual Photovoltaic Installations, Select Countries and Regions, 2000-2007 (figure and table)

Photovoltaic Production by Top Ten Producing Companies, 2006 and First Half of 2007 (table)

World Average Photovoltaic Module Cost per Watt, 1975-2006 (figure and table)


For full text and tables go to:

Winner: Restoring Coal's Sheen -- You can add up all the electricity produced in the world from renewable sources plus nuclear reactors, and it doesn’t amount to what coal generates just in the United States and China

Winner: Restoring Coal's Sheen
By William Sweet
First Published January 2008
Swedish energy company takes a novel approach to carbon capture

PHOTO: Plamen Petkov

The industrial age, wrote historian Barbara Freese, "emerged literally in a haze of coal smoke, and in that smoke we can read much of the history of the modern world." In boom economies like India's and China's, where coal meets about three-quarters of the electrical demand, that haze still hangs heavily. Globally, according to a recent influential study done at MIT and data from the International Energy Agency, in Paris, coal accounts for a quarter of energy consumed and more than two-fifths of the electricity generated. That makes it the second leading fuel after oil and the world's main source of greenhouse-gas emissions.

You can add up all the electricity produced in the world from renewable sources plus nuclear reactors, and it doesn't amount to what coal generates just in the United States and China. It's impossible to imagine our getting along without coal anytime soon. And yet, with concerns rising sharply about climate change, the general expectation is that governments will increasingly be penalizing carbon emissions by taxing them, regulating them, or forcing companies to trade in them. So burning coal could become radically more expensive unless efficient means are found to capture and permanently store carbon dioxide, which right now is pumped into the atmosphere in astonishing quantities.

In the United States alone, according to MIT, coal-burning power plants produce about 1.5 billion metric tons of CO2 a year—roughly a quarter of the world's total—which is about three times the weight of the total amount of natural gas the country uses each year and nearly twice the volume of oil it consumes annually.

Just capturing the carbon, not to mention finding sound ways of sequestering it, is a job of staggering dimensions and one that the world has just barely begun to address, as the MIT report emphasized. There's been a lot of talk about it, but hardly anybody is doing anything about it. "We need large-scale demonstration projects," a summary of the MIT report said, bluntly.

One company that is doing that kind of demonstration right now is Vattenfall, Sweden's national energy company, in Stockholm. It's building a novel clean-coal plant in southeastern Germany, in a town called Schwarze Pumpe. The approach Vattenfall will test and evaluate at the 30-megawatt facility—a technology called oxyfuel, or sometimes oxyfiring—is not the one most favored by students of carbon capture. But it appealed to Vattenfall partly because of its disarming simplicity.

In the oxyfuel process, instead of burning coal in air, standard industrial equipment first extracts the nitrogen from the air so that the coal can be combusted in an atmosphere of oxygen and recycled flue gases. The result is a flue-gas stream containing almost none of the nitrogen that otherwise complicates the separation of carbon dioxide. Once the sulfur has been scrubbed using standard procedures, the flue gases consist essentially of just water vapor and carbon dioxide. The water is separated by condensation, and presto, the carbon dioxide is ready to be compressed and liquefied for transport to a final storage site. In this particular case, Vattenfall will have the CO2 trucked to a region called Altmark, where it will be injected into a natural gas reservoir, initially to enhance gas recovery, and ultimately for final disposal.

Why did Vattenfall settle on this somewhat eccentric approach to carbon capture? Back home, as Sweden's state-owned national utility, it traditionally has produced the country's electricity in hydropower stations and nuclear reactors, which for all practical purposes emit no carbon dioxide. But with the opening of Europe's electricity system to competition in the 1990s, Vattenfall began to expand outside Sweden and is now, by revenues, roughly Europe's fourth-largest electricity producer.

At the end of the 1990s, Vattenfall acquired much of what had been East Germany's electricity system from West German energy companies, which had to sell those holdings to meet competition rules. Those West German companies had already begun to improve and clean up the East German power system—which is based almost entirely on lignite—building several giant coal-burning plants, including a 1600-MW pulverized coal plant at Schwarze Pumpe.

The acquisition of the lignite plants in eastern Germany, together with the establishment of a European carbon trading system that will make burning coal increasingly expensive, got Vattenfall's executives thinking about how to secure a future for its coal holdings and help meet commitments under the Kyoto Protocol. "The position we take is that there is a threat to the society and to the whole globe, actually. And so we need to do something," says Lennart Billfalk, an advisor to Vattenfall's CEO and the former manager of its R&D program.

Vattenfall is building the oxyfuel pilot plant at Schwarze Pumpe in close cooperation with the French firm Alstom Power, which is supplying almost all the major components except for the oxygen-nitrogen separator, the desulfurization system, and the condenser that will remove the water, leaving CO2.

Best known for its supersleek and very fast TGV trains, Alstom, based in Levallois-Perret, is the world's No. 2 transportation company and No. 3 in power generation, behind GE and Siemens. The company sees oxyfuel as a growth opportunity and the Schwarze Pumpe project as a learning experience, says John Marion, vice president for global technology at Alstom's U.S. power subsidiary in Windsor, Conn. Marion says that Alstom has been looking closely at oxyfuel and that the Schwarze Pumpe project is the "most significant and advanced step globally" in the field of coal power with carbon capture. He adds that the company has been looking closely at oxyfuel prospects since 1997, because of Kyoto.

A quirky but important aspect of the Schwarze Pumpe plant [see diagram, "Just Take Out the Nitrogen"] is that flue gas is recirculated back into the combustion chamber in order to keep burning temperatures close to their levels in a regular coal-fired plant, near 1000 °C. Research engineers originally devised this procedure when oxyfuel combustion—which, by the way, is common in other industries such as steel, aluminum, and glass—was first visualized mainly as a retrofit technology for existing coal plants. If coal were burned in pure oxygen without recirculation, temperatures would get high enough to melt boiler walls. Recirculating the flue gases simulates, in effect, atmospheric burning conditions, with carbon dioxide substituting for nitrogen.

When a plant like the one at Schwarze Pumpe is custom designed, recirculation is theoretically not necessary; the boiler could be designed to withstand higher operating temperatures, and higher-temperature combustion could produce efficiency gains. But the Vattenfall and Alstom designers wanted the boiler to be as similar as possible to standard boilers so that they could make close comparisons and scale up with greater confidence, says Marion. Also, coal typically contains between 5 and 30 percent ash, and if the ash melts at excessively high temperatures, it gets sticky, glasslike, and hard to handle.

Alstom would like to be able to sell utility-scale oxyfuel plants—not just major components—on a turnkey basis with the usual full guarantees by the middle of the next decade. And Vattenfall, too, would like to move aggressively with oxyfuel and have a precommercial plant in the 250- to 300-MW range running by 2014 or 2015. Right now Vattenfall is evaluating seven larger carbon-capture projects in Denmark, Germany, and Poland and expects soon to select two, one of which is likely to be an oxyfuel plant. The company's economic target is to develop plants that will pay for themselves if carbon prices in the European cap-and-trade system stabilize at €20 per metric ton or higher.
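The €20-per-tonne target can be read as a simple breakeven condition: credits earned for avoided emissions must cover the extra cost of capture. The sketch below uses illustrative assumptions (a lignite plant emitting about 1.1 tonnes of CO2 per megawatt-hour, 90 percent capture, and an added cost of €18 per megawatt-hour); only the €20 carbon price comes from the article.

```python
# Hedged breakeven check for carbon-capture economics. All plant figures
# are illustrative assumptions; only the EUR 20/t price is cited above.

def capture_pays_off(carbon_price_eur_per_t, emissions_t_per_mwh,
                     capture_fraction, added_cost_eur_per_mwh):
    """True if avoided-emission credits cover the added capture cost."""
    avoided_t_per_mwh = emissions_t_per_mwh * capture_fraction
    return avoided_t_per_mwh * carbon_price_eur_per_t >= added_cost_eur_per_mwh

print(capture_pays_off(20, 1.1, 0.9, 18))  # True:  0.99 t x EUR 20 = EUR 19.80
print(capture_pays_off(12, 1.1, 0.9, 18))  # False: 0.99 t x EUR 12 = EUR 11.88
```

Under these assumed numbers the plant clears the bar at €20 but not at €12, which is why a stable, sufficiently high carbon price matters as much as the engineering.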

The oxyfuel concept for coal-fired power generation originated in the late 1970s at Argonne National Laboratory, near Chicago, according to Alan Wolsky, the leader of the team that pioneered the idea there. Wolsky, now a visiting fellow at the University of Cambridge, in England, recalls that the U.S. Department of Energy supported the team's research mainly on the grounds that more CO2 was needed to inject into oil wells for enhanced recovery. Members of the group and their government sponsors were well aware, even then, that climate change was going to be a growing issue, says Wolsky, but neither they nor the Energy Department promoted the research on that basis.

The Argonne-led group did a series of small-scale demonstrations, controlling for factors such as the coal and gas mixture, temperature, and turbulence, and did computer simulations and analysis. The work attracted attention worldwide, and other experiments followed in Canada, Japan, the Netherlands, and the United Kingdom. It was a time when most work done at U.S. national laboratories was considered public property, and there was not much incentive to secure intellectual property. Wolsky remembers giving oxyfuel talks in Canada, only to be told a year later that Shell Oil had patented the content of his speech.

The initial oxyfuel demonstrations confirmed the technology's promise but also demonstrated the importance of implementing it carefully. For example, when a stoker-fed furnace was used in one demonstration, it was hard to keep air from leaking into the recirculation system; CO2 concentrations in the flue gas were correspondingly low. Handling pure oxygen is always a dicey business, of course, and so there were concerns about safety. Nevertheless, nothing suggested that oxyfuel firing couldn't work or wouldn't work in a pulverized coal system.

Although Vattenfall itself believes that custom oxyfuel design is the way to go, the retrofit option continues to be assessed by a number of companies, including notably Babcock & Wilcox in Barberton, Ohio. B&W owns a relevant patent portfolio, and its executives have testified to the U.S. Congress on the promise of oxyfiring.


B&W was participating in a plan by SaskPower in Regina, Sask., Canada, to build a 300-MW lignite-burning oxyfuel plant, but that project was put on hold earlier this year and will be reassessed in 2009. Meanwhile, however, B&W has converted a test reactor in Alliance, Ohio, to do oxyfuel combustion. The program of oxyfiring tests began last October and will cost B&W US $14 million to $16 million. It concluded a run with bituminous coal in November and early this year will burn Saskatchewan lignite. B&W is partnering in this demonstration with the French company Air Liquide, a leading provider of liquid oxygen.

The Alliance test reactor, like Schwarze Pumpe, produces 30 MW of thermal energy. But it does not have an oxygen-nitrogen separation facility, and carbon dioxide is not being captured in the tests. B&W is planning a commercial-scale demonstration soon, with both custom-designed new units and retrofit in mind, and it considers itself, with Vattenfall and Alstom, a world leader in oxyfuel.

In terms of retrofit, the most important oxyfuel project on the books is in Australia, where the technology got a government go-ahead in November 2006. (Though Australia, until a new government was elected last fall, had declined to ratify the Kyoto Protocol, it authorized spending 400 million Australian dollars on the development of greenhouse gas–reduction technologies.) CS Energy, of Brisbane, Australia, working with partners in Australia's coal industry and Japanese manufacturers, wants to backfit a decommissioned 30-MW boiler, Callide A, in Queensland. To that end, CS Energy is doing front-end design work and specifying costs for a project that would involve installing a nitrogen separation plant, flue-gas recycling equipment, a facility to compress and liquefy the carbon dioxide, and the means to transport the CO2 to a storage site. There are at least a half dozen possible sequestration sites within several hundred kilometers of the plant, both depleted gas fields and saline aquifers, according to Chris Spero, who is in charge of oxyfuel research at CS Energy.

The retrofitted Callide A plant will burn bituminous coal, not lignite. Spero notes that Australia's soft coals are especially advantageous for oxyfuel retrofit because they are low in sulfur: the flue-gas recirculation system tends to concentrate the sulfur, making its removal more of a problem.

What The Experts Say

"Vattenfall's expensive carbon-capture experiment is one of the many costs of the global-warming fad." —Nick Tredennick

"Carbon capture and sequestration is clearly central to the future of coal in a carbon-constrained world. A retrofittable technology would have a big positive impact on our huge inventory of existing coal plants." —Kurt Yeager, Galvin Electricity Initiative

If oxyfuel retrofit could be made to work at low enough costs, the implications would be enormous. In principle, all the existing coal plants in the world could be refitted to run carbon free. But Vattenfall is quite skeptical about that scenario. Particularly because so much energy has to be used to separate oxygen from nitrogen at the front end, the whole process will probably be made economically attractive only when plants are scaled up and customized specially for oxyfiring, says Lars Strömberg, until recently chief engineer and project manager at Schwarze Pumpe and now Vattenfall's head of R&D.

Right now the standard oxygen-nitrogen separation equipment runs on electricity, which has to be obtained from the plant itself, reducing the plant's efficiency of energy conversion by several percentage points. With the development of membrane separation systems, however, the electrical cost of oxygen might come down. And if heat or steam were recovered from an oxyfuel plant to drive air separation, says Strömberg, and the whole plant were customized for oxyfuel at whatever scale turns out to be optimal, then the plant might register an efficiency gain of several points rather than a loss.
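The parasitic-load argument can be made concrete with assumed figures: if the air separation unit draws, say, 0.2 MWh of electricity per tonne of oxygen and the plant needs about 0.75 tonnes of oxygen per megawatt-hour generated, a 40-percent-efficient plant takes roughly the hit shown below. All three numbers are illustrative, not Vattenfall's.

```python
# Hedged sketch of the air-separation efficiency penalty. The ASU power
# draw, oxygen demand, and base efficiency are all assumed values.

def efficiency_after_asu(base_efficiency, asu_mwh_per_t_o2, t_o2_per_mwh):
    """Net efficiency once the air separation unit's power is deducted."""
    parasitic_fraction = asu_mwh_per_t_o2 * t_o2_per_mwh
    return base_efficiency * (1 - parasitic_fraction)

net = efficiency_after_asu(0.40, 0.2, 0.75)
print(f"Net efficiency: {net:.1%}")  # 40% falls to 34.0%, a 6-point penalty
```

Under these assumptions the penalty is on the order of the "several percentage points" described above, which is why cheaper oxygen (membranes, heat-driven separation) changes the economics so much.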

Oxyfuel is but one of three basic approaches to carbon capture and storage. In general terms, carbon can be separated from postcombustion flue gases by chemical means, as sulfur and nitrogen oxides are scrubbed, or the bigger part of the job can be done precombustion, either by gasifying the coal or by oxyfiring. In the United States, discussion of carbon sequestration has been dominated by the coal gasification scenario, which generally goes by the acronym IGCC, for integrated gasification, combined cycle.

IGCC involves converting coal into a synthetic gas that can be burned in a combined-cycle turbine plant, just as if it were natural gas; the waste stream consists mainly of hydrogen, carbon dioxide, and water vapor. Four commercial-scale demonstration plants have been built and are operating, two in the United States and two in Europe. Studies comparing IGCC with oxyfuel and postcombustion carbon capture generally find costs in the same ballpark: the total cost of doing carbon capture and storage using any of the three approaches is likely to be between 25 and 75 percent higher, by comparison with standard pulverized coal. IGCC is generally considered slightly cheaper than oxyfuel, but with large uncertainties.

"There's a perception that IGCC is the only game in town, but our calculations indicate it's not the optimal choice, either for hard coal or lignite," says Alstom's John Marion.

IGCC plants are complicated structures that resemble small refineries. They tended to have problems in their early years of operation and by nature require a great deal of maintenance. Their relative economic attractiveness won't really be known until all three carbon-capture approaches have been tested at much larger scales.

And although there are several IGCC plants that are considered adaptable to capture carbon, none have actually done so. So if carbon is captured at Schwarze Pumpe and disposed of permanently in a geologic repository, it will be a first—not just for oxyfuel, but for coal. Although carbon sequestration is not seen as an essential aspect of the project, Vattenfall wants to do a fully integrated demonstration to win public confidence. Stabilizing liquefied carbon dioxide at depths of a kilometer or more has been demonstrated in the North Sea, Canada, and northern Africa.

Vattenfall's Schwarze Pumpe plant builds on a well-developed approach that seems sure to be a part of the solution to the coal-carbon problem. Even if other approaches turn out to be superior for some types of coal, oxyfuel is uniquely suited to lignite, a low-grade and dirty coal found in superabundance in eastern Germany and in some other parts of the world, including Poland and regions of the United States and China. It's likely to be suitable as well for low-sulfur bituminous coals and anthracite.

But even if—contrary to expert expectations—oxyfuel proves to be a technical or economic failure, Vattenfall will still have achieved a moral victory of sorts. This is because Vattenfall will have been the first to initiate and complete a project of significant scale to demonstrate carbon capture and storage with a coal plant.

About the Author

PLAMEN PETKOV Bulgarian-born Plamen Petkov was excited to get his hands dirty shooting for this month's winner "Restoring Coal's Sheen". Given several samples of bituminous and anthracite coal, he chose one that surprised and fascinated him with the "mesmerizing shine and tonalities of black."
Oxyfuel Pilot Plant

WINNER: Clean Coal

GOAL: To show that burning coal in an atmosphere of pure oxygen can facilitate carbon capture; to evaluate technical features and economics for lignite and bituminous coal.

WHY IT'S A WINNER: Because of its simplicity and its suitability for lower-grade coals, oxyfuel technology will help guarantee a future for coal in a world increasingly preoccupied with climate change. As influential voices call for larger-scale tests of promising carbon-capture technologies, this is the first such full-system integrated demonstration.

PLAYERS: Vattenfall and Alstom

WHERE: Schwarze Pumpe, Germany

STAFF: 150 to 200 at the two companies, part-time

BUDGET: €50 million (about US $73 million)


It Isn't Easy Being Green: Not everyone shares the same concerns; Explain how your product makes a difference; Preempt skepticism with transparency; Position your product as a high-quality alternative; Consider the impact of premium pricing

Thanks to Penelope
Get To The Point from Marketing Profs
It Isn't Easy Being Green

Now is a wonderful time for companies that offer an environmentally friendly product or service. Political, environmental and economic concerns have created a marketplace that's extremely receptive to the idea of going green. But, before you proclaim your credentials from the treetops, consider Jacquelyn Ottman's five rules of green marketing:

  • Not everyone shares the same concerns. Be sure your audience is aware of the issue your product addresses, and wants to do something about it. "Whirlpool learned the hard way that consumers wouldn't pay a premium for a CFC-free refrigerator," writes Ottman, "because consumers didn't know what CFCs were!"
  • Explain how your product makes a difference. No one wants to feel like their contribution is a drop in the ocean, so provide a compelling demonstration of its environmental benefits—whether on an individual or collective basis.
  • Preempt skepticism with transparency. Make it easy for customers to see your commitment is genuine. "Consumers must believe in the legitimacy of your product and the specific claims you are making," Ottman says.
  • Position your product as a high-quality alternative. Reassure customers that it performs at least as well as trusted products from your not-as-green competitors.
  • Consider the impact of premium pricing. Customers might understand why your product costs more, but that doesn't mean they can afford the extra outlay—or that they think it's worth it.
The Po!nt: According to Ottman, "A strong commitment to environmental sustainability in product design and manufacture can yield significant opportunities to grow your business, to innovate and to build brand equity." But do your homework before diving in.

Source: MarketingProfs. Read the entire article here.

Vol. 2, No. 2    January 4, 2008


[CAnet - news] High Speed Internet Helps Cool the Planet: The Internet and ICT industry has the tools today to reduce its own global carbon emissions to absolute zero by collocating routers and servers with renewable energy sites and using advanced data replication and re-routing techniques across optical networks

Thanks to Currie

For more information on this item please visit my blog at or

[Lightreading has been carrying a very useful blog on the Future of the
Internet. Your faithful correspondent has been making some contributions in
regards on how the Internet and ICT in general can contribute in reducing
CO2 emissions.  This can be done in 3 ways:

(a) The Internet and ICT industry has the tools today to reduce its own global carbon emissions to absolute zero by collocating routers and servers with renewable energy sites and using advanced data replication and re-routing techniques across optical networks. If the ICT industry alone produces 10% of global carbon emissions, this step alone can have a significant impact.

(b) Developing societal applications that promote use of the Internet as an alternative to carbon-generating activities such as tele-commuting, distance learning, etc., as outlined below

(c) Deploying "bits and bandwidth for carbon" trading programs as an alternative strategy to carbon taxes, cap and trade, and/or carbon offsets, as for example in the green broadband initiative -

Thanks to Mr Roques, posting on Lightreading, for this pointer--BSA]

Lightreading:  The future of the Internet and Global Warming

Study: High-speed Internet helps cool the planet

Tempted to obsess over how another personal habit helps or hurts the Earth? Keep surfing with cable or DSL and you might save carbon in the process, according to the American Consumer Institute.

The world would be spared 1 billion tons of greenhouse gases within a decade
if broadband Internet access were pervasive, the group's report (PDF)
concluded in October.

Broadband is available to 95 percent of U.S. households but active in only
half of them, the study said, noting that near-universal adoption of
high-speed Internet would cut the equivalent of 11 percent of oil imports to
the United States each year.

How would faster downloads and Web page loads curb the annual flow of globe-warming gases, and by how much? According to the report:

Telecommuting, a "zero emission" practice, eliminates office space and car commutes: 588 million tons.
E-commerce cuts the need for warehouses and long-distance shipping: 206 million tons.
Widespread teleconferencing could bring one-tenth of all flights to a halt: 200 million tons.
Downloading music, movies, newspapers, and books saves packaging, paper, and shipping: 67 million tons.

The Department of Energy estimates that the nation's emissions of carbon
dioxide alone total 8 billion tons each year.
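Summing the report's four categories is a quick sanity check on the headline "1 billion tons" figure, and setting the total against the DOE estimate puts it in scale. The only assumption here is treating the decade-long savings as one lump sum for comparison.

```python
# Sanity check of the report's "1 billion tons" headline using the four
# category figures quoted above (million metric tons over a decade).

savings_mt = {
    "telecommuting": 588,
    "e-commerce": 206,
    "teleconferencing": 200,
    "digital downloads": 67,
}

total_mt = sum(savings_mt.values())
print(f"Total savings: {total_mt} million tons")  # 1061, i.e. ~1 billion
# Versus the DOE's 8 billion tons of U.S. CO2 per year (8,000 million t):
print(f"Equivalent to {total_mt / 8_000:.0%} of one year's U.S. CO2")
```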

A study released in October and funded by a major Australian telecom company also suggested that broader use of broadband could cut that country's carbon emissions by 5 percent by 2015.

All it would take is for more people to use software to monitor shipping schedules, cut the flow of power to dormant gadgets, and so forth, the study said.


These news items and comments are mine alone and do not necessarily reflect
those  of the CANARIE board or management.
