Friday, February 26, 2010

Nuclear Power Renaissance Faces Serious Obstacles

The renaissance of nuclear power in the U.S. appears inevitable. It just may not happen as smoothly as the Obama administration and others hope.

The Vermont Senate's vote Wednesday to block a license renewal for an Entergy plant shows that supporters of nuclear power still have big obstacles to overcome. Those include the growing costs for new plants, environmental worries and the age of the country's existing nuclear fleet.
"I think if you said 'ready, go' today, any kind of meaningful addition would be 10 years down the road," said Eric Melvin of Mobius Risk.

The U.S. gets about a fifth of its electricity from nuclear power, but no new nuclear plant has been built in nearly three decades.

Momentum for new nuclear plants has been building, and backers say the production of electricity without any emissions of greenhouse gases outweighs potential problems. President Obama announced $8 billion in loan guarantees last week for two reactors to be added in Georgia, an investment he says is necessary to provide electricity from cleaner sources of energy than traditional fossil fuels.
Giant costs for adding nuclear power have proven to be a drag on utilities, which say the loan guarantees are the only way to get projects off the ground. Reactors can run $6 billion to $8 billion apiece. That tops the market cap of some of the utilities that have wanted to build them.
And policies in some states have further strained efforts to develop new plants.

St. Louis-based Ameren, for example, has suspended plans for a second nuclear reactor in Missouri after lawmakers failed to repeal a 1976 law barring utilities from charging customers for certain costs of a new power plant before it starts producing electricity. Florida Power & Light suspended much of its work on two new reactors after Florida regulators rejected nearly all of the company's request for a rate increase.

Safety also becomes an issue for state officials.

Just the idea of a new plant gets people thinking about partial meltdown of a reactor at the Three Mile Island plant in Pennsylvania in 1979 and the Chernobyl disaster 23 years ago in then-Soviet Ukraine that spread radiation over much of Europe, Melvin said. "People can't get past the initial fear," he said.

The aging fleet of existing nuclear plants has led to worries about leaks from buried pipes and related systems, including at the Vermont Yankee plant. "The Vermont Yankee situation certainly threatens to undermine the industry's campaign to spin nuclear power as clean and 'pollution-free' energy," said Edwin Lyman of the Union of Concerned Scientists. Steve Kerekes of the Nuclear Energy Institute, the industry trade group, said Entergy, the second biggest operator of nuclear plants in the U.S., found the leak through voluntary safety programs put in place in 2006. The leak posed no health or safety threat, he said.

Since 2000, the NRC has granted license extensions at 59 of the nation's 104 reactors, Kerekes said. Another 19 requests are pending and the rest have not needed a renewal since then.
Nuclear plants have shown they can be safely operated for more than 60 years, Kerekes noted.
Nuclear Regulatory Commission spokeswoman Diane Screnci said the license renewal for the Yankee plant is pending before the commission. NRC staff has not found any reason not to renew the license.
Entergy said it is committed to winning the 20-year renewal of the plant's license.

Spokesman Jim Steets said failure to keep the plant operating would mean relying on more costly replacement power that is not as clean as nuclear power. "It makes a lot more sense to continue to operate that plant than shut it down," he said.


Wednesday, February 24, 2010

Australia carbon neutral by 2020?

New report from Beyond Zero Emissions says Australia could permanently pull the plug on fossil fuels in 10 years.

You have to give it to the advocacy organization Beyond Zero Emissions: they are undeterred by the Australian government's failure to pass even a 5 percent reduction in CO2 emissions by 2020. A new report they produced in conjunction with two universities, Melbourne University and the Australian National University, claims the country, which is blessed with both abundant solar and wind resources, could go entirely carbon-neutral in just 10 years. According to Executive Director Matthew Wright, "We have concluded that there are no technological impediments to transforming Australia’s stationary energy sector to zero emissions over the next 10 years."

According to the plan, 60 percent of the nation's power could come from CSP (concentrating solar thermal) and 40 percent from wind. The group sees no need for nuclear power in the mix, which after all still demands a continued supply of a dwindling resource — uranium. The solar and wind would complement each other and combined with molten salt storage and combustion of renewable biomass, they could create an even flow of electricity to the nation's 21 million residents.

It sounds like a pipe dream, and as much as I love it, I have to agree it is ... quite literally. The super-grid transmission "pipes" that would enable such a noble and utopian vision are still in R&D, and though theoretically possible, they would probably be prohibitively expensive, costing significantly more than the $360 billion proposed in the report.

But this is just the kind of practical dreaming we need to do. If the big coal and gas companies redirected their efforts and their lobbying money, such a vision could in fact become a reality.


Tuesday, February 23, 2010

Asia-Produced Ozone Making Way to U.S.

WASHINGTON — A new study further bolsters concerns that pollution blowing across the Pacific Ocean from China and other rapidly developing Asian nations may swamp efforts to clean up the air in the Western United States and make it difficult for states and cities to meet federal standards.
The study, based on 100,000 measurements over 25 years and a computer model tracking air-flow patterns, found that during the spring, ozone from Asia reaches Washington, Oregon, California and other states west of the Rocky Mountains.
For the first time, the study links ozone in the air above the United States with Asian pollution, said Dan Jaffe, a professor of atmospheric and environmental chemistry at the University of Washington-Bothell and one of the study's authors.
"It is possible that emissions from emerging economies like China, with relatively limited emissions controls, are outpacing reductions in the developing countries," the report concludes. It says that the Asian emissions may "hinder the USA's compliance with its own ozone air-quality standard."
Previous studies have detected such pollutants from Asia as mercury, soot and PCBs reaching the United States. A National Academy of Sciences study last year pointed to increasing unease among regulators about a growing problem.
"Any air pollutant with an atmospheric lifetime of at least three to four days may be transported across most of a continent, a week or two may get it across an ocean, a month or two can send it around the hemisphere and a year or two may deliver it anywhere on Earth," the National Academy of Sciences said last year.
The academy's new report, prepared by the National Research Council, says the problem involves not only trans-Pacific pollution but also trans-Atlantic pollution, with emissions from the United States reaching Europe. The study zeroed in on ozone, particulate matter, mercury and persistent organic pollutants, which have been tracked by ground-based monitors, airborne monitors and satellite-borne sensors.
Among the federal agencies that are interested in the issue are the Environmental Protection Agency, the National Oceanic and Atmospheric Administration, NASA and the National Science Foundation.
The academy called for increased "fingerprinting" of pollutants, so that it's easier to locate their sources, and more detailed studies of emission totals and the atmospheric conditions that spread the pollution.
"The relative importance of long-range pollutant contributions from foreign sources is likely to increase as nations institute stricter air-quality standards that result in tougher emission controls on domestic sources," the academy's report says.
The study on ozone from Asia, authored by an international group of scientists, appears in the Jan. 21 issue of the journal Nature. It comes as the EPA is considering tightening ozone standards.
Ozone is the main ingredient in smog, which can cause health problems that range from burning in the eyes and throat to pulmonary inflammation and increased risk of heart attack. It's created when sunlight reacts with nitrogen oxides and other combustion byproducts from vehicle tailpipes and other sources.
The study focused on an area two to six miles above the Earth known as the mid-troposphere. Pollution at that altitude can mix downward and affect air quality at ground level.
While U.S. emissions of nitrogen oxides, precursor compounds of ozone, have declined by about one-third since 1985, the study found that ozone levels had increased by 29 percent over the same period.
The study notes that from 2001 to 2006, ozone precursor emissions in east Asia were up 44 percent, and 55 percent in China.
"The changes we have seen over the past 25 years coincide with when China was transforming itself into an economic powerhouse," Jaffe said.
The study didn't pinpoint which Asian countries the ozone might be coming from, Jaffe said. Among the possibilities are China, India and Vietnam.
"What we can say is there has been a strong and significant increase in ozone in the mid-troposphere in the West and it doesn't seem the U.S. is contributing to the increase," said Owen Cooper, a research scientist at the University of Colorado attached to NOAA's Earth System Research Laboratory in Boulder, Colo.
Cooper said the study found that Asian emissions were adding to the ozone pollution in the Western United States, but scientists couldn't say by how much and which countries were involved.
"This is only the first step," Cooper said of the study.
The EPA, which regulates emissions through the Clean Air Act, is looking at international ozone transport, the agency said in an e-mailed statement.
"However, as the National Academy of Sciences concluded in its recent report, our current ability to fully characterize the impact of foreign sources on air quality in the United States is somewhat limited," the statement said.
The EPA said it was too soon to tell how the latest study might affect ozone standards.

Huffington Post

Monday, February 22, 2010

The Fighting Patriots, Troops Fighting for Climate Change

Fighting climate change is not only good for the Earth, but apparently it's also patriotic. National security and climate change may not be the likeliest pair, but the two are more interconnected than you would think. "Climate Patriots," a video presentation released by the Pew Project on National Security, Energy and Climate, explores how solving global climate change can make for a safer America, drawing on science and military experts to find new strategies.
According to The Pew Project's Web site, "if left unchecked, global warming could lead to civil strife, economic stress, conflicts over water and other resources, mass migration, and increased terrorism."
It goes a little something like this: As tensions mount because of the adverse effects of climate change, so does political instability, leading to more conflict, fundamentalism, and increased terrorism.
"It's like taking every hornet's nest we already have around the world and shaking it up," James Morin, a former captain of the U.S. Army, said.
Our lack of energy independence (dependence on foreign oil), combined with climate change, will pose an "unacceptably high threat level," according to a recent study by retired military leaders.
"We can be a better country, we can better support our troops by becoming more energy efficient, more sustainable, by using other forms of energy-- it has to be something that all Americans feel a part of," retired Vice Admiral Dennis McGinn said.

Huffington Post

America's Wind Energy Potential Triples In New Estimate

The amount of wind power that theoretically could be generated in the United States tripled in the newest assessment of the nation’s wind resources.
Current wind technology deployed outside environmentally protected areas could generate 37,000,000 gigawatt-hours of electricity per year, according to the new analysis conducted by the National Renewable Energy Laboratory and consulting firm AWS Truewind. The last comprehensive estimate came out in 1993, when Pacific Northwest National Laboratory pegged the wind energy potential of the United States at 10,777,000 gigawatt-hours.
Both numbers are greater than the 3,000,000 gigawatt-hours of electricity currently consumed by Americans each year. Wind turbines generated just 52,000 gigawatt-hours in 2008, the last year for which annual statistics are available.
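A quick back-of-the-envelope calculation, using only the figures quoted above, puts those numbers in perspective:

```python
# Back-of-the-envelope comparison of the figures quoted above (all in GWh/year).
new_estimate = 37_000_000    # 2010 NREL / AWS Truewind onshore potential
old_estimate = 10_777_000    # 1993 Pacific Northwest National Laboratory estimate
consumption = 3_000_000      # annual U.S. electricity consumption cited above
generated_2008 = 52_000      # actual U.S. wind generation in 2008

print(round(new_estimate / old_estimate, 1))          # ~3.4x: roughly a tripling
print(round(new_estimate / consumption))              # ~12x current consumption
print(round(100 * generated_2008 / consumption, 1))   # wind supplied under 2% in 2008
```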
Though new and better data was used to create the assessment, the big jump in potential generation reflects technological change in wind machines more than fundamental new knowledge about our nation’s windscape.
Wind speed generally increases with height, and most wind turbines are taller than they used to be, standing at about 250 feet (80 meters) instead of 165 feet (50 meters). Turbines are now larger, more powerful and more efficient than the old designs used to calculate previous estimates.
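The physics behind taller towers is standard wind-resource math: the power available in the wind grows with the cube of wind speed, and speed itself rises with hub height. Here is a minimal sketch; the 1/7 power-law wind profile and the 6 m/s reference speed are illustrative assumptions, not figures from the article:

```python
# Why taller towers matter: wind speed rises with height (modeled here with the
# common 1/7 power law), and available power scales with the CUBE of wind speed.
def speed_at_height(v_ref, h_ref, h, alpha=1/7):
    """Extrapolate wind speed from a reference height using a power-law profile."""
    return v_ref * (h / h_ref) ** alpha

v50 = 6.0                                # hypothetical wind speed at 50 m (m/s)
v80 = speed_at_height(v50, 50.0, 80.0)   # speed at the newer 80 m hub height

# Power density (W/m^2) = 0.5 * air density * v^3
rho = 1.225                              # sea-level air density, kg/m^3
p50 = 0.5 * rho * v50**3
p80 = 0.5 * rho * v80**3
print(round(v80, 2))         # ~6.42 m/s: about 7% faster
print(round(p80 / p50, 2))   # ~1.22: roughly 22% more power from the same site
```

The cubic relationship is why a modest gain in hub-height wind speed translates into a disproportionate gain in generation, and why previously marginal sites become developable.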
“Now we can develop areas that in [previous decades] wouldn’t have been deemed developable,” said Michael Brower, chief technology officer with AWS Truewind, which carried out the assessment. “It’s like oil reserves. They tend to go up not because there is more oil in the ground but because the technology for accessing the oil gets better.”
The new maps, above, are useful for would-be wind-farm developers who need to find promising sites on which to place their turbines. They want locations with high wind speeds, access to transmission lines, cheap land and a host of other smaller logistical concerns. If you purchase the best versions, the Truewind maps have a resolution of 650 feet (200 meters), which is less than the spacing between modern machines. That means they can be used to provisionally site individual machines on the ground.
Many estimates have been made of the wind energy potential of the United States and the Earth. John Etzler made one of the first way back in the 1830s. He used loose numerical analogies to sailing ships to calculate that “the whole extent of the wind’s power over the globe amounts to about … 40,000,000,000,000 men’s power.”
The water-pumping windmill industry flourished in the latter half of the 19th century, but wind energy potential calculations did not advance past the back-of-the-envelope until after World War II. When Palmer Putnam attempted to find the best site in Vermont for the first megawatt-sized wind turbine in the early 1940s, his first line of analysis was to look at how bent the trees were.
The 1980s saw a boom in wind energy in the state of California, driven by a number of federal and state incentives as well as an active environmental culture. Back then, the only way to really know how hard and often the wind blew was to put up a tower covered in sensors and measure. So, wind-farm developers concentrated their efforts on three areas — Tehachapi, Altamont Pass and San Gorgonio — and covered the places with towers to measure the wind.
“I still have some databases from back then and you look at them and say, ‘Oh my, they had 120 towers up,’ or something crazy,” Brower said. “That’s not how it’s done anymore.”
Even low-resolution regional maps did not exist until the early 1980s and the first national map was only published by the National Renewable Energy Laboratory (née Solar Energy Research Institute) in 1986. As you can see from the map above, it was more of a general guide than a series of detailed local estimates.
The real boom in wind data came with the availability of cheap computational power in the late 1990s. It was then that Brower’s company began being able to marry large-scale weather models with small-scale topographic models. They created a parallel process for crunching wind data and ran it on small, fast PCs to get supercomputer-level power at low cost. Then, they refined their estimates with data from 1,600 wind measurement towers.
The result is a much more accurate forecast. Truewind’s estimates of wind speed at a location have an uncertainty margin of 0.35 meters a second. Good wind sites have average wind speeds of between 6.5 and 10 m/s, though most onshore areas don’t get above 9. Perhaps more importantly, their estimates for how many kilowatt-hours a turbine in a location will produce are accurate to within 10 percent, Brower stated.
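To see what a ±10 percent production estimate means in practice, here is a hedged sketch of the standard annual-energy calculation; the 2 MW rating and 0.35 capacity factor are illustrative assumptions, not figures from the article:

```python
# What a +/-10% production estimate means for a single turbine
# (illustrative numbers only).
rated_kw = 2000          # hypothetical 2 MW turbine
capacity_factor = 0.35   # hypothetical site-average fraction of rated output
hours_per_year = 8760

annual_kwh = rated_kw * capacity_factor * hours_per_year
low, high = annual_kwh * 0.9, annual_kwh * 1.1   # the +/-10% band quoted above

print(round(annual_kwh / 1e6, 2))                 # ~6.13 GWh/year expected
print(round(low / 1e6, 2), round(high / 1e6, 2))  # ~5.52 to ~6.75 GWh/year
```

For a project financed against projected output, that band is the difference between a bankable site and a marginal one, which is why tighter model accuracy matters as much as raw wind speed.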
The newest models are now sufficiently good that developers don’t need as much on-site data. They do still use towers to check the maps and models produced by companies like Truewind, but not nearly as many, which reduces the expense and time that it takes to execute a project.
“You might see 10 or 15 towers over an area that would have had 50 or 100 towers before,” he said.
The new data, including these maps and forecasting models, may not directly make wind farms cheaper, but the advances certainly make them easier to plan for, develop and operate.
“I think of it more as greasing the wheels of the process more than producing a really big cost savings,” Brower said. “You reduce the friction, the transaction costs, and that enables you to get where you’re going faster.”
The better processes, along with state renewable-energy mandates, seem to be helping. In 2009, 10 gigawatts of wind capacity was installed in the United States to bring the nation’s total to 35 gigawatts.
The data plays a more subtle role, too. In helping make the case that wind energy can play a very substantial role in supplying electricity, the new maps and estimates could help convince industrial and political leaders to support renewable energy, particularly in windy heartland states like Kansas, Montana and Nebraska.