Friday, 19 February 2010

Johnald's Fantastical Daily Link Splurge


America’s Wind Energy Potential Triples in New Estimate

Posted: 19 Feb 2010 10:31 AM PST

[Image: U.S. wind resource map]

The amount of wind power that theoretically could be generated in the United States tripled in the newest assessment of the nation's wind resources.

Current wind technology deployed in areas that aren't environmentally protected could generate 37,000,000 gigawatt-hours of electricity per year, according to the new analysis, conducted by the National Renewable Energy Laboratory and the consulting firm AWS Truewind. The last comprehensive estimate came out in 1993, when Pacific Northwest National Laboratory pegged the wind energy potential of the U.S. at 10,777,000 gigawatt-hours.

Both numbers are far greater than the roughly 3,000,000 gigawatt-hours of electricity Americans currently consume each year. Wind turbines generated just 52,000 gigawatt-hours in 2008, the last year for which annual statistics are available.
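
For scale, here's a quick sanity check on those figures, written as a minimal Python sketch; the numbers are the ones cited above, with consumption rounded:

```python
# Back-of-the-envelope comparison of the figures cited above, in GWh/year.
new_potential = 37_000_000   # 2010 NREL/AWS Truewind estimate
old_potential = 10_777_000   # 1993 PNNL estimate
consumption   = 3_000_000    # approximate annual U.S. electricity consumption
wind_2008     = 52_000       # actual U.S. wind generation in 2008

print(f"New estimate is {new_potential / old_potential:.1f}x the 1993 figure")
print(f"Potential exceeds consumption {new_potential / consumption:.0f}-fold")
print(f"2008 wind output met {100 * wind_2008 / consumption:.1f}% of demand")
```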

Though new and better data was used to create the assessment, the big jump in potential generation reflects technological change in wind machines more than fundamental new knowledge about our nation's windscape.

Wind speed generally increases with height, and most wind turbines are taller than they used to be, standing at about 250 feet (80 meters) instead of 165 feet (50 meters). Turbines are also larger and more efficient than the designs used to calculate previous estimates.
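
The height effect compounds quickly, because the power available in the wind scales with the cube of its speed. Here is a minimal sketch using the textbook one-seventh power-law shear profile; the exponent and the 7 m/s baseline are generic assumptions, not figures from the assessment:

```python
# Wind shear power law: v2 = v1 * (h2 / h1) ** alpha.
# Power available in the wind scales as v ** 3.
alpha = 1 / 7   # typical shear exponent over open terrain (an assumption)
v50 = 7.0       # hypothetical average wind speed at 50 m, in m/s

v80 = v50 * (80 / 50) ** alpha
print(f"Speed at 80 m: {v80:.2f} m/s ({100 * (v80 / v50 - 1):.0f}% faster)")
print(f"Available power: {100 * ((v80 / v50) ** 3 - 1):.0f}% higher")
```

A roughly 7 percent gain in speed from the extra 100 feet of tower translates into more than 20 percent more available power, which is a large part of why taller machines make previously marginal sites developable.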

"Now we can develop areas that in [previous decades] wouldn't have been deemed developable," said Michael Brower, chief technology offier with AWS Truewind, which carried out the assessment. "It's like oil reserves. They tend to go up not because there is more oil in the ground but because the technology for accessing the oil gets better."

The new maps, above, are useful for would-be wind farm developers who need to find promising sites for their turbines. They want locations with high wind speeds, access to transmission lines, cheap land, and a host of other logistical advantages. The best versions of the Truewind maps, which must be purchased, have a resolution of 650 feet (200 meters), which is less than the spacing between modern machines. That means they can be used to provisionally site individual turbines on the ground.

[Image: earlier national wind resource maps]

Many estimates have been made of the wind energy potential of the United States and the Earth. John Etzler made one of the first way back in the 1830s. He used loose numerical analogies to sailing ships to calculate that "the whole extent of the wind's power over the globe amounts to about… 40,000,000,000,000 men's power."

The water-pumping windmill industry flourished in the latter half of the 19th century, but wind energy potential calculations did not advance past the back-of-the-envelope stage until after World War II. When Palmer Putnam tried to find the best site in Vermont for the first megawatt-scale wind turbine in the early 1940s, his first line of analysis was to look at how bent the trees were.

The 1980s saw a boom in wind energy in the state of California, driven by a number of Federal and state incentives as well as an active environmental culture. Back then, the only way to really know how hard and often the wind blew was to put up a tower covered in sensors and measure. So, wind farm developers concentrated their efforts on three areas — Tehachapi, Altamont Pass, and San Gorgonio — and covered the places with towers to measure the wind.

"I still have some databases from back then and you look at them and say, 'Oh my, they had 120 towers up,' or something crazy," Brower said. "That's not how it's done anymore."

Even low-resolution regional maps did not exist until the early 1980s and the first national map was only published by the National Renewable Energy Laboratory (née Solar Energy Research Institute) in 1986. As you can see from the map above, it was more of a general guide than a series of detailed local estimates.

The real boom in wind data came with the availability of cheap computational power in the late 1990s. It was then that Brower's company began to marry large-scale weather models with small-scale topographic models. They created a parallel process for crunching wind data and ran it on small, fast PCs to get supercomputer-level power at low cost. Then they refined their estimates with data from 1,600 wind measurement towers.
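
The divide-and-parallelize idea is easy to sketch. Below is a minimal, hypothetical illustration using Python's multiprocessing module; the tile values and the multiplicative terrain "speed-up" adjustment are invented stand-ins, not AWS Truewind's actual method:

```python
from multiprocessing import Pool

def downscale_tile(tile):
    """Combine a coarse weather-model wind speed with a fine-scale terrain
    adjustment for one map tile. The multiplicative speed-up factor is a
    toy stand-in for a real microscale flow model."""
    tile_id, coarse_speed_ms, terrain_speedup = tile
    return tile_id, coarse_speed_ms * terrain_speedup

if __name__ == "__main__":
    # Hypothetical tiles: (id, coarse model wind speed m/s, terrain factor).
    tiles = [(0, 6.8, 1.10), (1, 7.2, 0.95), (2, 6.5, 1.25), (3, 8.0, 1.02)]
    with Pool() as pool:                  # one worker process per CPU core
        for tile_id, speed in pool.map(downscale_tile, tiles):
            print(f"tile {tile_id}: estimated wind speed {speed:.2f} m/s")
```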

The result is a much more accurate picture. Truewind's estimates of wind speed at a location have an uncertainty margin of 0.35 meters per second. Good wind sites have average wind speeds of between 6.5 and 10 m/s, though most onshore areas don't get above 9. Perhaps more importantly, their estimates of how many kilowatt-hours a turbine at a location will produce are accurate to within 10 percent, Brower said.
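
To see why a 0.35 m/s margin matters, here is a minimal sketch that propagates it through a simplified turbine model. The Rayleigh wind-speed distribution, the cut-in/rated/cut-out speeds, and the 1.5 MW rating are generic textbook assumptions, not Truewind's actual model:

```python
import numpy as np

# Simplified turbine model; every parameter here is an illustrative
# assumption, not a figure from the AWS Truewind assessment.
P_RATED = 1500.0                                 # rated output, kW
V_CUT_IN, V_RATED, V_CUT_OUT = 3.0, 12.0, 25.0   # m/s

def power_kw(v):
    """Cubic ramp from cut-in to rated speed, flat output to cut-out."""
    ramp = P_RATED * (v**3 - V_CUT_IN**3) / (V_RATED**3 - V_CUT_IN**3)
    p = np.where((v >= V_CUT_IN) & (v < V_RATED), ramp, 0.0)
    return np.where((v >= V_RATED) & (v < V_CUT_OUT), P_RATED, p)

def annual_energy_mwh(mean_speed):
    """Expected annual energy assuming a Rayleigh wind-speed distribution."""
    sigma = mean_speed * np.sqrt(2 / np.pi)      # Rayleigh scale parameter
    v = np.linspace(0.0, 30.0, 3001)
    pdf = (v / sigma**2) * np.exp(-(v**2) / (2 * sigma**2))
    mean_power_kw = np.sum(power_kw(v) * pdf) * (v[1] - v[0])
    return mean_power_kw * 8760 / 1000           # kW -> MWh per year

base = annual_energy_mwh(7.0)
for dv in (-0.35, +0.35):
    e = annual_energy_mwh(7.0 + dv)
    print(f"mean speed {7.0 + dv:.2f} m/s: {e:,.0f} MWh/yr "
          f"({100 * (e / base - 1):+.1f}%)")
```

Because output rises steeply with wind speed, even a third of a meter per second of uncertainty at a 7 m/s site shifts the expected annual yield by several percent, which is why that margin, and the 10 percent yield accuracy, are the numbers developers care about.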

The newest models are now sufficiently good that developers don't need as much on-site data. They do still use towers to check the maps and models produced by companies like Truewind, but not nearly as many, which reduces the expense and time that it takes to execute a project.

"You might see 10 or 15 towers over an area that would have had 50 or 100 towers before," he said.

The new data, including these maps and forecasting models, may not directly make wind farms cheaper, but the advances certainly make them easier to plan for, develop, and operate.

"I think of it more as greasing the wheels of the process more than producing a really big cost savings," Brower said. "You reduce the friction, the transaction costs, and that enables you to get where you're going faster."

The better processes, along with state renewable energy mandates, seem to be helping. In 2009, 10 gigawatts of wind capacity were installed in the United States, bringing the nation's total to 35 gigawatts.

The data plays a more subtle role, too. By helping make the case that wind energy can supply a very substantial share of the nation's electricity, the new maps and estimates could help convince industrial and political leaders to support renewable energy, particularly in windy heartland states like Kansas, Montana, and Nebraska.

[Image: chart of U.S. wind energy potential]

1. Images: NREL/Truewind. 2. NREL. 3. Chart background: flickr/Wayfinder_73

WiSci 2.0: Alexis Madrigal's Twitter, Tumblr, and green tech history research site; Wired Science on Twitter and Facebook.

New Giant Prehistoric Fish Species Found Gathering Dust in Museums

Posted: 18 Feb 2010 12:05 PM PST

[Image: artist's rendering of Bonnerichthys]

A fresh look at forgotten fossils has revealed two new species of giant, filter-feeding fish that swam Earth's oceans for 100 million years, occupying the ecological niche now filled by whales and whale sharks.

Until now, that ancient niche was thought to be empty, and such fish to be a short-lived evolutionary bust.

"We knew these animals existed, but thought they were only around for 20 million years," said Matt Friedman, a University of Oxford paleobiologist."People assumed they weren't important, that they were an evolutionary failure that was around for a brief time and winked out. Now we realize that they had a long and illustrious evolutionary history."

[Image: Bonnerichthys fossils]

In a paper published Feb. 18 in Science, Friedman and five other paleobiologists describe Bonnerichthys gladius and Rhinconichthys taylori. Both belong to the pachycormids, an extinct group of immense fishes that fed by drifting slowly, mouths agape, sucking down plankton and other tiny aquatic life.

Prior to the paper's publication, giant filter-feeding pachycormids were known from fossils of a single species, Leedsichthys problematicus. (The species name derives from the fragmented state of its first fossils.) Leedsichthys was an impressive creature, reaching lengths of 30 and perhaps even 50 feet, but its fossils have been found only in western Europe and are between 160 and 145 million years old — seemingly a brief, unexceptional footnote to animal history.

However, during a chance visit by Friedman to the University of Kansas, researchers at its Natural History Museum told him of odd recoveries from a newly prepared fossil deposit: delicate plates and long rods of bone, jumbled beyond recognition. As Friedman put the pieces together, he realized that the plates were part of a jaw, and the rods were gills. That configuration was known from Leedsichthys, but this clearly belonged to a new species.

Working with other museums, Friedman found more examples of the species, which he dubbed B. gladius. They had been collected in the 19th century and mistakenly classified as Leedsichthys, or dismissed as uninteresting. By the time he was finished, Friedman had found B. gladius fossils as old as 172 million years and as young as 66 million years. In the dusty recesses of London's Natural History Museum, he also found another pachycormid species, R. taylori; it had been mischaracterized and forgotten by Gideon Mantell, the English paleontologist credited with starting the scientific study of dinosaurs.

Altogether, the fossils showed that pachycormids were not a footnote, but an evolutionary chapter that spanned more than 100 million years.

"That's longer than the duration of any living groups of feeders," said Friedman. "That's longer than the Cenozoic, when mammals ascended to ecological dominance."

The disappearance of B. gladius from the fossil record coincides with the Cretaceous-Paleogene mass extinction, which wiped out the dinosaurs and bequeathed terrestrial Earth to birds, mammals and insects. That extinction was likely caused by an asteroid strike, a period of prolonged volcanic activity that shrouded the planet in dust, or both, causing massive die-offs in bottom-of-the-food-chain plants.

With a diet based on photosynthesizing algae, the pachycormids "had the perfect profile of a victim and became extinct," wrote Lionel Cavin, a paleontologist at Geneva's Natural History Museum, in an accompanying commentary.

Ten million years after B. gladius disappeared, filter-feeding sharks and rays rose to prominence. Twenty-five million years after that, modern whales evolved. As described in another Science paper, the whales' evolution coincided with a rebirth of the photosynthetic algae that had once fed B. gladius and the other pachycormids.

Friedman plans to continue studying the pachycormids, and hopes his story will inspire other researchers.

"We've just flagged off a couple examples of these animals," he said. "We know there must be others in the fossil record. Often, when people are collecting fossils in the field, they leave behind the fish, because they're not thought to be important. We hope they keep them."

Images: 1) Robert Nicholls. 2) Bonnerichthys forefin/Matt Friedman. 3) Bonnerichthys jawbones and forefin/Matt Friedman.

Citations: "100-Million-Year Dynasty of Giant Planktivorous Bony Fishes in the Mesozoic Seas." By Matt Friedman, Kenshu Shimada, Larry D. Martin, Michael J. Everhart, Jeff Liston, Anthony Maltese and Michael Triebold. Science, Vol. 327 No. 5968, Feb. 18, 2010.

"On Giant Filter Feedes." By Lionel Cavin. Science, Vol. 327 No. 5968, Feb. 18, 2010.

"Climate, Critters, and Cetaceans: Cenozoic Drivers of the Evolution of Modern Whales." By Felix G. Marx and Mark D. Uhen. Science, Vol. 327 No. 5968, Feb. 18, 2010.

Brandon Keim's Twitter stream and reportorial outtakes; Wired Science on Twitter. Brandon is currently working on a book about ecological tipping points.

Winner (and Losers) of NASA’s Final Shuttle Patch Contest

Posted: 18 Feb 2010 11:44 AM PST

[Image: winning shuttle program patch design]

The final patch design for the Space Shuttle Program has been selected by a NASA committee from a pool of entries by NASA employees and contractors.

It beat out 84 other prospective patches to become the final commemorative token of the program that defined this generation of space travel.

"As the Space Shuttle Program has been an innovative, iconic gem in the history of American spaceflight, the overall shape of the patch and its faceted panels are reminiscent of a diamond or other fine jewel," wrote the winning artist, Blake Dumesnil a Hamilton Sundstrand employee, who works at Johnson Space Center.

The tradition of creating NASA program and mission patches was borrowed from the military. It began with the Gemini program of the 1960s.

"There's a long history of patches, so there's a very rich tradition of when someone says 'a mission patch' or 'a shuttle patch,' there's an idea that comes to mind," said Robert Pearlman, a space historian and creator of collectSPACE.com.

[Image: original Space Shuttle program patch]

NASA selected 15 patches as finalists, which were then voted on by NASA employees and winnowed down by a committee. The People's Choice winner above was also the judges' selection.

Like the other finalists, it hewed very closely to the visual tradition established by NASA's original Shuttle patch (right) and the more than 130 that have come since. But outside the top 15 selected by NASA, some would-be patch designers broke the mold and tried to create some innovative and, um … different, patches. Below are some of our favorite patches that didn't win. We present the work with the original descriptions by the artists (unnamed by NASA), along with awards for the grooviest, most macho, and most Soviet patches, among others.

[Image: patch entry 11]
Wired Science Award: Grooviest
"This was a Journey that will be remember for a life time. The goals that was accomplish and the lost of brave people dedicate there lives with the shuttle program. For years of training putting together this greatest technology of wonder. The ISS and Hubble, both fine work. With these's they will have more great discovery out there more far beyond the universe. With is picture is the memories of our lost and hard work."

[Image: patch entry 18]
Wired Science Award: Machoest
"The patch depicts a strong American arm with American flag sleeve, placing a Space Shuttle into orbit. In the background is the Earth and the SRBs and External Tank. The International Space Station is also in the background. The patch is a simple depiction of America's might provided by the most sophisticated spacecraft system built by man … yet!"

[Image: patch entry 20]
Wired Science Award: Abstractest
"The design is to have all five shuttles launching (triangles) with exhaust flames billowing in red/white covering the left side of the patch under the shuttles. Showing the Space telescope (rectangle) and Space Station (the other object) against a blue sky. If there is enough room on the patch a part of the earth could be added to the right side. The astronauts' names can be added to the outside of the circle. But as this is the last fight you may not want to add the names."

[Image: patch entry 66]
Wired Science Award: Lovingest
"The dove represents peace.
The nucleus symbol represents science.
The stars represent the 5 shuttles.
The shuttle with wings represents the fallen astronauts.
The purple heart represents the astronaut's bravery."

[Image: patch entry 46]
Wired Science Award: Microsoft Paintbrushiest
"• Major successes for Space Shuttle
• 3 symbols represent Spacelab science
• Hubble Telescope
• International Space Station
• Number of Flights – 139
• Name of all shuttles (test & operational)
• 1 star for each lost crew member
• History of shuttle 1977 – 2010
• Main Shuttle launch/landing point marked (KSC)
• Shuttle flying-off into history"

[Image: patch entry 85]
Wired Science Award: Sovietest
No description given.

WiSci 2.0: Alexis Madrigal's Twitter, Google Reader feed, and green tech history research site; Wired Science on Twitter and Facebook.

The First and Last Meeting of Everyone with a Fully Sequenced Genome

Posted: 18 Feb 2010 02:00 AM PST

[Image: LavaAmp portable DNA copier]

Nearly every person who has had their entire genome sequenced will gather in a single room near Boston on April 27. It's the last time this will ever happen.

Within a year, the dozens of people in this elite group will have been joined by a thousand or more people. Soon after that, hobbyists may be roaming the streets with handheld DNA analyzers, high school athletes may experiment with gene therapy to enhance their performance, and pharmacists might check our genetic records before filling prescriptions.

"There was a time that only guys in white labcoats had the credentials and training to operate computers," said Jason Bobe, co-organizer of the GET conference, where the fully sequenced group will meet."Nowadays, we're all experts to some degree. This is happening in genetics too."

Bobe hopes to recruit 100,000 people to donate their genetic information to create a public database for medical research.

[Image: Knome flash drive]

The next five years will bring massive genetics experiments and breakthroughs in personalized cancer treatment, according to Harvard University geneticist George Church. Doctors will test medications on stem cells derived from their patients to check whether they will work.

The first human genome sequence, finished in 2003, cost an estimated $2.7 billion. Today, the price has dropped below $1,500 for a complete sequence, and it's on the way to becoming so inexpensive that most everyone will be able to afford it.

But it's not clear how we will use all of that information. Personalized medicine may be the most important use of DNA analysis, but many industries will be affected by the plummeting costs of gene-reading equipment.

"Lets not overlook the ways that genomics will be incorporated into other aspects of our lives," Bobe said, "like our foods, our households, our backyards, consumer goods, our identities and social interactions."

The shelves of most big grocery stores are already lined with products that contain genetically modified vegetables. Students have used DNA bar code analysis to identify fake tuna in fancy sushi restaurants. And anyone can sign up for a dating website that matches people based on their genetic traits.

"Genetics know-how will have spread even faster than the rise of computers from obscurity in 1980 to access for everyone today, even in developing nations," Church said.

Access to the event, however, will be limited. Only 200 people can attend, and tickets will cost $999. But anyone will be able to watch video clips of the best discussions for free.

Images: 1) The LavaAmp is an experimental DNA-copying machine that could cost less than $100 and allow hobbyists to do genetic tests at home./Aaron Rowe. 2) A $68,500 genome sequence from Knome comes on one of these fancy flash drives./Knome

NASA Brings the Dark Side of the Sun to Your iPhone

Posted: 17 Feb 2010 06:28 PM PST

[Image: 3D Sun app screenshots]

As the sun reawakens from an anomalously quiet period, keep track of solar flares, sunspots and coronal mass ejections with a new iPhone app that puts the real-time status of the sun in your hand.

"This is more than cool," Dick Fisher, director of NASA's Heliophysics Division, said in a press release. "It's transformative. For the first time ever, we can monitor the sun as a living, breathing 3-dimensional sphere."

With the free 3D Sun app, you can set your phone to alert you when a new solar flare erupts, or watch video of a solar prominence or of a comet heading into the sun. You can also manipulate an image of the sun in three dimensions with your finger.

The data is streamed to Earth by NASA's twin STEREO spacecraft, which monitor the sun from two different spots, one ahead of Earth in its orbit and one behind. The two vantage points yield stereoscopic images that combine into a sort of three-dimensional view, much as our two eyes do.

[Image: 3D Sun app view]

The pair covers 87 percent of the sun's surface, effectively giving us a view of the "dark side" of the sun. That means anyone can see, on their phone, parts of the sun that even the most powerful telescopes on Earth can't.
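
That 87 percent figure is consistent with simple spherical geometry. Here is a minimal sketch; the roughly 133-degree spacecraft separation is my assumption for early 2010, inferred from the coverage number, and each spacecraft is idealized as seeing exactly one hemisphere:

```python
def coverage_fraction(separation_deg):
    """Fraction of a sphere covered by two hemispheric views whose centers
    are separated by the given angle at the sphere's center.

    The union of two overlapping hemispheres works out to 1/2 + angle/360.
    """
    return 0.5 + separation_deg / 360.0

print(f"{coverage_fraction(133):.0%} of the solar surface")  # -> 87%
```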

The STEREO spacecraft watch the extreme ultraviolet portion of the electromagnetic spectrum, because the most exciting solar phenomena show up best at these wavelengths.

"That's why the 3D sun looks false-color green,"said STEREO program scientist Lika Guhathakurta in a press release. "These are not white-light images."

The app was built by a team of programmers led by Tony Phillips, editor of http://science.nasa.gov/. The team plans to release 3D Sun 2.0, which will have higher-resolution images and data from more wavelengths.

This is the second free iPhone app NASA has released recently. NASA's first app delivers loads of space photos from the Hubble Space Telescope and other NASA missions; videos from NASA TV covering science updates, mission activity, rocket launches and other events; mission status updates; and live countdown clocks. You can also track the International Space Station with it.

How to Do the Ultimate Aging Study

Posted: 17 Feb 2010 01:27 PM PST

[Image: grandparents]

Longevity is one of the hottest areas of science, but there's a curious hole in the research: Scientifically speaking, nobody knows how to measure aging, much less predict reliably how people will respond to time's ravages.

After all, aging isn't just chronological. Some people are spry and nimble in their elder years. Others are afflicted by the diseases of aging — heart disease, diabetes, cancer, Parkinson's, Alzheimer's, dementia and stroke — by middle age.

Many researchers think those diseases are manifestations of a common underlying cause, known conversationally as aging but as yet undefined by science. They call for studies that would gather exhaustive clinical and genetic data from thousands of people over many years, hopefully identifying the biological mechanisms of growing both older and unhealthier.

"Unlike models of drug development for the diseases of aging, which have consensus endpoints to evaluate, we have not reached a consensus in aging," said gerontologist Don Ingram of the Pennington Biomedical Reseach Center. "We don't know how to predict how someone will function later in life, and we need to."

That such a basic gap exists seems counterintuitive. After all, longevity-enhancing research has never been so prominent. Following leads revealed by animals on calorically restricted diets — they tend to live longer, apparently because dietary stress triggers cell-protecting routines that prevent aging diseases — scientists have found genes and pathways that can be targeted by drugs.

Resveratrol, a natural compound that affects mitochondrial function and DNA repair, and its pharmaceutical derivatives have been used to prevent diabetes in obese mice. Now they're being tested in humans. Manipulations of the growth-regulating IGF-1 pathway have extended lifespans in lab animals. Rapamycin, used to suppress autoimmune responses in people receiving organ transplants, has extended the lives of elderly mice. It's now being tested in mice against specific diseases.

All these findings hint at a universal aging process, and the concept has finally gone mainstream. Longevity research earned a December U.S. News and World Report cover story, and a Time cover package this month. But these experimental results are preliminary, and only tweak pieces of a larger puzzle. Gerontologists say that to develop drugs that slow the aging process, they need to know far more about aging's biology.

"We need to have a set of thousands of people, representing all groups, that are closely followed on health measures. They'd be tested three or four times a year, for five or 10 years. Then you'd have a good sense of the trajectory of aging," said David Harrison, a Jackson Laboratory gerontologist who co-authored the landmark paper on rapamycin's life-extending potential.

According to Harrison, people enrolled in the proposed study could, after several years, opt to take rapamycin. That would let researchers see whether it works in people as it does in mice. If so, they'd also have a detailed account of resulting gene and protein changes, and insight into whether rapamycin works better in some people than others.

Rapamycin does have toxic side effects. Though treatment could be stopped immediately, safety couldn't be guaranteed. But as Harrison noted, "There are hundreds of clinics for rich people to take anti-aging treatments that are at best placebos, and they pay ungodly amounts of money. Wouldn't some people, at least, like to participate in a science-based study where they have their trajectory of aging measured and monitored?"

For now, such a study wouldn't be run by the Jackson Laboratory or a governmental funding agency, like the National Institutes of Health. It would likely take place under the auspices of a private foundation, with study participants footing much of the bill.

But take rapamycin out of the equation, and a long-term study of aging biomarkers would be suitable for institutional funding. Of course, it would still be expensive. But even a long-term study of aging in rodents would be useful, and it would also be more affordable, said Felipe Sierra, director of the National Institute on Aging's Division of Aging Biology.

"If we had a set of biomarkers that at 12 months of age predicted which mice would die younger or older, then we could shorten mouse studies to 12 months," said Sierra.

To get funding, such a study would have to overcome the mixed legacy of a decade-long project launched by the NIA in the late 1980s. Researchers looked for biological markers in rodents, but had neither the technology nor the understanding necessary to find them.

"We spent a lot of money and got nothing. Now it's taboo in our field," said Sierra. "But we weren't ready. Each one of us was looking for a single biomarker of aging. It turns out that there's no such thing. But with the advent of modern metabolomics and proteomics, it might be possible to do this."

Sierra noted that — at a moment of cheap gene sequencing and high-powered genome association studies, when desktop computers crunch terabytes of gene and protein data — the most reliable indicator of aging is still whether people look old. It's hardly scientific.

"The technology has advanced to the point where we should be able to try," said Sierra.

Image: Lynn Lin/Flickr

Brandon Keim's Twitter stream and reportorial outtakes; Wired Science on Twitter. Brandon is currently working on a book about ecological tipping points.
