Current wind technology deployed in areas outside environmental protection could generate 37,000,000 gigawatt-hours of electricity per year, according to a new analysis conducted by the National Renewable Energy Laboratory and the consulting firm AWS Truewind. The last comprehensive estimate came out in 1993, when Pacific Northwest National Laboratory pegged the wind energy potential of the United States at 10,777,000 gigawatt-hours.
Both numbers are greater than the 3,000,000 gigawatt-hours of electricity currently consumed by Americans each year. Wind turbines generated just 52,000 gigawatt-hours in 2008, the last year for which annual statistics are available.
Though new and better data went into the assessment, the big jump in potential generation reflects technological change in wind machines more than fundamentally new knowledge about the nation's windscape.
Wind speed generally increases with height, and most wind turbines are taller than they used to be, standing at about 250 feet (80 meters) instead of 165 feet (50 meters). Turbines are also larger, more powerful and more efficient than the old designs used to calculate previous estimates.
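The payoff from taller towers can be sketched with the power-law wind profile commonly used in resource assessment. The 1/7 exponent (typical of open terrain) and the 6 m/s reference speed below are illustrative assumptions, not figures from the article; the key point is that available power scales with the cube of wind speed, so even a modest speed gain at 80 meters compounds.

```python
def wind_speed_at_height(v_ref, h_ref, h, alpha=1/7):
    """Estimate wind speed at height h from a reference measurement
    using the power-law wind profile (alpha ~ 1/7 over open terrain)."""
    return v_ref * (h / h_ref) ** alpha

# Hypothetical 6 m/s measured at the old 50 m hub height,
# extrapolated to a modern 80 m hub.
v50 = 6.0
v80 = wind_speed_at_height(v50, 50, 80)

# Power in the wind scales with speed cubed, so the relative
# gain in available power is (v80 / v50) ** 3.
power_gain = (v80 / v50) ** 3
print(round(v80, 2), round(power_gain, 2))  # → 6.42 1.22
```

Under these assumptions, raising the hub from 50 to 80 meters adds roughly 7 percent to wind speed but over 20 percent to available power, which is why sites once dismissed as marginal can now pencil out.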
“Now we can develop areas that in [previous decades] wouldn’t have been deemed developable,” said Michael Brower, chief technology officer with AWS Truewind, which carried out the assessment. “It’s like oil reserves. They tend to go up not because there is more oil in the ground but because the technology for accessing the oil gets better.”
The new maps, above, are useful for would-be wind-farm developers who need to find promising sites on which to place their turbines. They want locations with high wind speeds, access to transmission lines and cheap land, and must weigh a host of smaller logistical concerns. The best (paid) versions of the Truewind maps have a resolution of 650 feet (200 meters), which is less than the spacing between modern machines, so they can be used to provisionally site individual turbines on the ground.
Many estimates have been made of the wind energy potential of the United States and the Earth. John Etzler made one of the first way back in the 1830s. He used loose numerical analogies to sailing ships to calculate that “the whole extent of the wind’s power over the globe amounts to about … 40,000,000,000,000 men’s power.”
The water-pumping windmill industry flourished in the latter half of the 19th century, but wind energy potential calculations did not advance past the back-of-the-envelope until after World War II. When Palmer Putnam attempted to find the best site in Vermont for the first megawatt-sized wind turbine in the early 1940s, his first line of analysis was to look at how bent the trees were.
The 1980s saw a boom in wind energy in the state of California, driven by a number of federal and state incentives as well as an active environmental culture. Back then, the only way to really know how hard and often the wind blew was to put up a tower covered in sensors and measure. So, wind-farm developers concentrated their efforts on three areas — Tehachapi, Altamont Pass and San Gorgonio — and covered the places with towers to measure the wind.
“I still have some databases from back then and you look at them and say, ‘Oh my, they had 120 towers up,’ or something crazy,” Brower said. “That’s not how it’s done anymore.”
Even low-resolution regional maps did not exist until the early 1980s and the first national map was only published by the National Renewable Energy Laboratory (née Solar Energy Research Institute) in 1986. As you can see from the map above, it was more of a general guide than a series of detailed local estimates.
The real boom in wind data came with the availability of cheap computational power in the late 1990s. It was then that Brower’s company began being able to marry large-scale weather models with small-scale topographic models. They created a parallel process for crunching wind data and ran it on small, fast PCs to get supercomputer-level power at low cost. Then, they refined their estimates with data from 1,600 wind measurement towers.
The result is a much more accurate forecast. Truewind’s estimates of wind speed at a location have an uncertainty margin of 0.35 meters per second. Good wind sites have average wind speeds of between 6.5 and 10 m/s, though most onshore areas don’t get above 9. Perhaps more importantly, their estimates of how many kilowatt-hours a turbine at a location will produce are accurate to within 10 percent, Brower said.
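Those two accuracy figures are connected by the cube law. As a rough sketch (the 7 m/s site speed is an illustrative assumption, and an idealized v³ relationship is assumed; real turbine power curves flatten toward rated speed, which dampens this sensitivity in practice):

```python
def energy_error_from_speed_error(v, dv):
    """Relative error in available wind power implied by a speed
    error dv, under an idealized cube law: ((v+dv)^3 - v^3) / v^3.
    For small dv this is roughly 3 * dv / v."""
    return ((v + dv) ** 3 - v ** 3) / v ** 3

# At a hypothetical 7 m/s site, a 0.35 m/s speed uncertainty
# translates into this much uncertainty in raw wind power:
err = energy_error_from_speed_error(7.0, 0.35)
print(round(err * 100, 1))  # → 15.8 (percent)
```

In other words, a 5 percent speed error inflates to roughly 15 percent in raw wind power, which is why production estimates accurate to within 10 percent require more than just a good speed map: the shape of the turbine’s power curve matters too.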
The newest models are now sufficiently good that developers don’t need as much on-site data. They do still use towers to check the maps and models produced by companies like Truewind, but not nearly as many, which reduces the expense and time that it takes to execute a project.
“You might see 10 or 15 towers over an area that would have had 50 or 100 towers before,” he said.
The new data, including these maps and forecasting models, may not directly make wind farms cheaper, but the advances certainly make them easier to plan, develop and operate.
“I think of it more as greasing the wheels of the process more than producing a really big cost savings,” Brower said. “You reduce the friction, the transaction costs, and that enables you to get where you’re going faster.”
The better processes, along with state renewable-energy mandates, seem to be helping. In 2009, 10 gigawatts of wind capacity were installed in the United States, bringing the nation’s total to 35 gigawatts.
The data plays a more subtle role, too. In helping make the case that wind energy can play a very substantial role in supplying electricity, the new maps and estimates could help convince industrial and political leaders to support renewable energy, particularly in windy heartland states like Kansas, Montana and Nebraska.
Source: Huffington Post (http://huffingtonpost.com)