Crocs Uncover

Bizarre Species

Thursday, January 20, 2011

Yellowstone Has Bulged as Magma Pocket Swells


Yellowstone National Park's supervolcano just took a deep "breath," causing miles of ground to rise dramatically, scientists report.

The simmering volcano has produced major eruptions—each a thousand times more powerful than Mount St. Helens's 1980 eruption—three times in the past 2.1 million years. Yellowstone's caldera, which covers a 25- by 37-mile (40- by 60-kilometer) swath of Wyoming, is an ancient crater formed after the last big blast, some 640,000 years ago.

Since then, about 30 smaller eruptions—including one as recent as 70,000 years ago—have filled the caldera with lava and ash, producing the relatively flat landscape we see today.

But beginning in 2004, scientists saw the ground above the caldera rise upward at rates as high as 2.8 inches (7 centimeters) a year.
The rate slowed between 2007 and 2010 to a centimeter a year or less. Still, since the start of the swelling, ground levels over the volcano have been raised by as much as 10 inches (25 centimeters) in places.

"It's an extraordinary uplift, because it covers such a large area and the rates are so high," said the University of Utah's Bob Smith, a longtime expert in Yellowstone's volcanism.

Scientists think a swelling magma reservoir four to six miles (seven to ten kilometers) below the surface is driving the uplift. Fortunately, the surge doesn't seem to herald an imminent catastrophe, Smith said. (Related: "Under Yellowstone, Magma Pocket 20 Percent Larger Than Thought.")

"At the beginning we were concerned it could be leading up to an eruption," said Smith, who co-authored a paper on the surge published in the December 3, 2010, edition of Geophysical Research Letters.

"But once we saw [the magma] was at a depth of ten kilometers, we weren't so concerned. If it had been at depths of two or three kilometers [one or two miles], we'd have been a lot more concerned."

Studies of the surge, he added, may offer valuable clues about what's going on in the volcano's subterranean plumbing, which may eventually help scientists predict when Yellowstone's next volcanic "burp" will break out.

Yellowstone Takes Regular Breaths

Smith and colleagues at the U.S. Geological Survey (USGS) Yellowstone Volcano Observatory have been mapping the caldera's rise and fall using tools such as global positioning systems (GPS) and interferometric synthetic aperture radar (InSAR), which gives ground-deformation measurements.

Ground deformation can suggest that magma is moving toward the surface before an eruption: The flanks of Mount St. Helens, for example, swelled dramatically in the months before its 1980 explosion. (See pictures of Mount St. Helens before and after the blast.)

But there are also many examples, including the Yellowstone supervolcano, where it appears the ground has risen and fallen for thousands of years without an eruption.

According to current theory, Yellowstone's magma reservoir is fed by a plume of hot rock surging upward from Earth's mantle. (Related: "New Magma Layer Found Deep in Earth's Mantle?")

When the amount of magma flowing into the chamber increases, the reservoir swells like a lung and the surface above expands upward. Models suggest that during the recent uplift, the reservoir was filling with 0.02 cubic miles (0.1 cubic kilometer) of magma a year.
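
Those two figures, the inflow rate and the caldera's footprint, permit a rough consistency check. The Python sketch below assumes the incoming magma simply lifts the caldera floor uniformly, a deliberate oversimplification of the elastic deformation models researchers actually fit:

```python
# Back-of-envelope check: can ~0.1 cubic kilometer of magma per year
# account for uplift of a few centimeters per year?
# Assumption: the inflow uniformly lifts the whole caldera floor; real
# studies use elastic models, so this is only an order-of-magnitude test.

inflow_km3_per_yr = 0.1         # modeled magma supply (from the article)
caldera_area_km2 = 40 * 60      # the 40- by 60-kilometer caldera footprint

uplift_cm_per_yr = inflow_km3_per_yr / caldera_area_km2 * 1e5  # km -> cm
print(f"Uniform uplift: ~{uplift_cm_per_yr:.1f} cm/yr")
# ~4 cm/yr: the same order as the observed peak of 7 cm/yr, which was
# concentrated over an area smaller than the full caldera.
```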

When the rate of increase slows, the theory goes, the magma likely moves off horizontally to solidify and cool, allowing the surface to settle back down.

Based on geologic evidence, Yellowstone has probably seen a continuous cycle of inflation and deflation over the past 15,000 years, and the cycle will likely continue, Smith said.

Surveys show, for example, that the caldera rose some 7 inches (18 centimeters) between 1976 and 1984 before dropping back about 5.5 inches (14 centimeters) over the next decade.

"These calderas tend to go up and down, up and down," he said. "But every once in a while they burp, creating hydrothermal explosions, earthquakes, or—ultimately—they can produce volcanic eruptions."

Yellowstone Surge Also Linked to Geysers, Quakes?

Predicting when an eruption might occur is extremely difficult, in part because the fine details of what's going on under Yellowstone are still undetermined. What's more, continuous records of Yellowstone's activity have been made only since the 1970s—a tiny slice of geologic time—making it hard to draw conclusions.

"Clearly some deep source of magma feeds Yellowstone, and since Yellowstone has erupted in the recent geological past, we know that there is magma at shallower depths too," said Dan Dzurisin, a Yellowstone expert with the USGS Cascades Volcano Observatory in Washington State.

"There has to be magma in the crust, or we wouldn't have all the hydrothermal activity that we have," Dzurisin added. "There is so much heat coming out of Yellowstone right now that if it wasn't being reheated by magma, the whole system would have gone stone cold since the time of the last eruption 70,000 years ago."

The large hydrothermal system just below Yellowstone's surface, which produces many of the park's top tourist attractions, may also play a role in ground swelling, Dzurisin said, though no one is sure to what extent.

"Could it be that some uplift is caused not by new magma coming in but by the hydrothermal system sealing itself up and pressurizing?" he asked. "And then it subsides when it springs a leak and depressurizes? These details are difficult."

And it's not a matter of simply watching the ground rise and fall. Different areas may move in different directions and be interconnected in unknown ways, reflecting the as yet unmapped network of volcanic and hydrothermal plumbing.

The roughly 3,000 earthquakes in Yellowstone each year may offer even more clues about the relationship between ground uplift and the magma chamber.

For example, between December 26, 2008, and January 8, 2009, some 900 earthquakes occurred in the area around Yellowstone Lake.

This earthquake "swarm" may have helped to release pressure on the magma reservoir by allowing fluids to escape, and this may have slowed the rate of uplift, the University of Utah's Smith said.

"Big quakes [can have] a relationship to uplift and deformations caused by the intrusion of magma," he said. "How those intrusions stress the adjacent faults, or how the faults might transmit stress to the magma system, is a really important new area of study."

Overall, USGS's Dzurisin added, "the story of Yellowstone deformation has gotten more complex as we've had better and better technologies to study it."

Star Clock BC


Move over, Bill Gates. It appears that the world's first PC was invented during biblical times. It was a device so sophisticated that with a turn of a hand crank, mathematical gears mapped the positions of planets and stars. Now, more than 100 years since the discovery, experts are still vying to understand how such an advanced technology could have existed 2,000 years ago.

Multiple Asteroid Strikes May Have Killed Mars’s Magnetic Field


Once upon a time, Mars had a magnetic field, just like Earth. Four billion years ago, it vanished, taking with it the planet’s chances of evolving life as we know it. Now scientists have proposed a new explanation for its disappearance.

A model of asteroids striking the red planet suggests that, while no single impact would have short-circuited the dynamo that powered its magnetism, a quick succession of 20 asteroid strikes could have done the job.

“Each one crippled a little bit,” said geophysicist Jafar Arkani-Hamed of the University of Toronto, author of the new study. “We believe those were enough to cripple, cripple, cripple, cripple until it killed all of the dynamo forever.”

Rocky planets like Earth, Mars and Mercury, and even the moon, get their magnetic fields from the movement of molten iron inside their cores, a process called convection. Packets of molten iron rise, cool and sink within the core, and generate an electric current. The planet’s spinning turns that current into a magnetic field in a system known as a dynamo.

Magnetic fields can shield a planet from the constant rain of high-energy particles carried in the solar wind by deflecting charged particles away from the surface. Some studies have suggested that Earth’s magnetic field could have protected early life forms from the sun’s most harmful radiation, allowing more complex life to develop. But traces of magnetism in the Martian surface reveal that the red planet lost its magnetic field some four billion years ago, leaving its atmosphere to be desiccated by the harsh solar wind.

Previous studies suggested that a massive impact could have shut down Mars’s dynamo by warming the mantle layer, disrupting the heat flow from the core to the mantle and shutting down convection. The fact that the crust of Mars’s younger impact craters is not magnetized supports this idea. Earlier computer models by geophysicist James Roberts of Johns Hopkins University showed that the largest known impacts on Mars could turn the mantle into a warm blanket, bringing the dynamo to a standstill.

But Arkani-Hamed’s new study in the Journal of Geophysical Research suggests that just one impact wouldn’t suffice. The dynamo would recover in less than one hundred million years. “The magnetic field should come back again,” he said.

To make his case, Arkani-Hamed modeled the heat that could have been produced when — according to some geophysicists — an asteroid the size of Texas hit Mars about 4.5 billion years ago, producing the biggest impact in our solar system’s history. Called the Borealis impact, it may have flattened Mars’s entire northern hemisphere.

This mega-impact would have flattened out the heat cycle inside the planet, too, snuffing out the dynamo within about 20,000 years. Without the cold compress of the mantle to siphon heat away from the core, convection wouldn’t have a chance.

But left alone, convection would have recovered in the outer parts of the core, and eventually penetrated deep and started the whole core churning again. The Borealis impact would have crippled the dynamo, but not killed it outright.

“If there were a dynamo at 4.5 billion years, it could cease, go away and regenerate after about 100 million years,” he said.

But perhaps several impacts in a row could do the job. The planet’s crater record shows that Mars suffered 20 impacts in quick succession between 4.2 and 3.9 billion years ago. In work to be presented at the Lunar and Planetary Science Conference in The Woodlands, Texas, this March, Arkani-Hamed teamed up with Roberts to show that just the five largest of these impacts could have shut down the magnetic field. The impacts came so rapidly that the dynamo had no time to recover before the next crippling blow arrived.
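
The timing argument is easy to check against the article's own numbers; a minimal sketch:

```python
# Timing check using figures quoted above: 20 large impacts between
# 4.2 and 3.9 billion years ago, versus a dynamo recovery time of
# roughly 100 million years (Arkani-Hamed's estimate).

n_impacts = 20
window_myr = (4.2 - 3.9) * 1000   # a 300-million-year window
recovery_myr = 100

mean_gap_myr = window_myr / n_impacts
print(f"Average gap between impacts: {mean_gap_myr:.0f} Myr "
      f"(dynamo needs ~{recovery_myr} Myr to restart)")
# ~15 Myr between blows -- far shorter than the recovery time,
# consistent with the dynamo being snuffed out for good.
```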

“This research is important because it shows that this scenario is plausible. It could have physically happened,” said Wesley Watters of Cornell University, who was not involved in the new research. “But to test this model versus another is enormously difficult to do.”

To really figure out when and how Mars lost its magnetic field, we’d need to know the ages of lots of Martian rocks with the same kind of precision with which we know them on Earth.

“We just don’t have that for Mars,” he said.

More Asteroids Could Have Made Life's Ingredients


A wider range of asteroids were capable of creating the kind of amino acids used by life on Earth, according to new NASA research.

Amino acids are used to build proteins, which are used by life to make structures like hair and nails, and to speed up or regulate chemical reactions. Amino acids come in two varieties that are mirror images of each other, like your hands. Life on Earth uses the left-handed kind exclusively. Since life based on right-handed amino acids would presumably work fine, scientists are trying to find out why Earth-based life favored left-handed amino acids.

In March 2009, researchers at NASA's Goddard Space Flight Center in Greenbelt, Md., reported the discovery of an excess of the left-handed form of the amino acid isovaline in samples of meteorites that came from carbon-rich asteroids. This suggests that perhaps left-handed life got its start in space, where conditions in asteroids favored the creation of left-handed amino acids. Meteorite impacts could have supplied this material, enriched in left-handed molecules, to Earth. The bias toward left-handedness would have been perpetuated as this material was incorporated into emerging life.
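
For reference, chiral imbalances of this sort are conventionally quantified as an enantiomeric excess; this is the standard definition, not a figure from the study:

$$\mathrm{ee} = \frac{[L] - [D]}{[L] + [D]} \times 100\%$$

where [L] and [D] are the measured abundances of the left- and right-handed forms. A perfectly even (racemic) mixture gives ee = 0, and the meteorite measurements described here correspond to positive values.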

In the new research, the team reports finding excess left-handed isovaline (L-isovaline) in a much wider variety of carbon-rich meteorites. "This tells us our initial discovery wasn't a fluke; that there really was something going on in the asteroids where these meteorites came from that favors the creation of left-handed amino acids," says Dr. Daniel Glavin of NASA Goddard. Glavin is lead author of a paper about this research published online in Meteoritics and Planetary Science January 17.

"This research builds on over a decade of work on excesses of left-handed isovaline in carbon-rich meteorites," said Dr. Jason Dworkin of NASA Goddard, a co-author on the paper.

"Initially, John Cronin and Sandra Pizzarello of Arizona State University showed a small but significant excess of L-isovaline in two CM2 meteorites. Last year we showed that L-isovaline excesses appear to track with the history of hot water on the asteroid from which the meteorites came. In this work we have studied some exceptionally rare meteorites which witnessed large amounts of water on the asteroid. We were gratified that the meteorites in this study corroborate our hypothesis," explained Dworkin.

L-isovaline excesses in these additional water-altered type 1 meteorites (i.e. CM1 and CR1) suggest that extra left-handed amino acids in water-altered meteorites are much more common than previously thought, according to Glavin. Now the question is what process creates extra left-handed amino acids. There are several options, and it will take more research to identify the specific reaction, according to the team.

However, "liquid water seems to be the key," notes Glavin. "We can tell how much these asteroids were altered by liquid water by analyzing the minerals their meteorites contain. The more these asteroids were altered, the greater the excess L-isovaline we found. This indicates some process involving liquid water favors the creation of left-handed amino acids."

Another clue comes from the total amount of isovaline found in each meteorite. "In the meteorites with the largest left-handed excess, we find about 1,000 times less isovaline than in meteorites with a small or non-detectable left-handed excess. This tells us that to get the excess, you need to use up or destroy the amino acid, so the process is a double-edged sword," says Glavin.

Whatever it may be, the water-alteration process only amplifies a small existing left-handed excess; it does not create the bias, according to Glavin. Something in the pre-solar nebula (a vast cloud of gas and dust from which our solar system, and probably many others, was born) created a small initial bias toward L-isovaline and presumably many other left-handed amino acids as well.

One possibility is radiation. Space is filled with objects like massive stars, neutron stars, and black holes, just to name a few, that produce many kinds of radiation. It's possible that the radiation encountered by our solar system in its youth made left-handed amino acids slightly more likely to be created, or right-handed amino acids a bit more likely to be destroyed, according to Glavin.

It's also possible that other young solar systems encountered different radiation that favored right-handed amino acids. If life emerged in one of these solar systems, perhaps the bias toward right-handed amino acids would be built in just as it may have been for left-handed amino acids here, according to Glavin.

The research was funded by the NASA Astrobiology Institute (NAI), which is administered by NASA's Ames Research Center in Moffett Field, Calif.; the NASA Cosmochemistry program, the Goddard Center for Astrobiology, and the NASA Post Doctoral Fellowship program. The team includes Glavin, Dworkin, Dr. Michael Callahan, and Dr. Jamie Elsila of NASA Goddard.

New Reactor Paves the Way for Efficiently Producing Fuel from Sunlight


Using a common metal most famously found in self-cleaning ovens, Sossina Haile hopes to change our energy future. The metal is cerium oxide -- or ceria -- and it is the centerpiece of a promising new technology developed by Haile and her colleagues that concentrates solar energy and uses it to efficiently convert carbon dioxide and water into fuels.
Solar energy has long been touted as the solution to our energy woes, but while it is plentiful and free, it can't be bottled up and transported from sunny locations to the drearier -- but more energy-hungry -- parts of the world. The process developed by Haile -- a professor of materials science and chemical engineering at the California Institute of Technology (Caltech) -- and her colleagues could make that possible.

The researchers designed and built a two-foot-tall prototype reactor that has a quartz window and a cavity that absorbs concentrated sunlight. The concentrator works "like the magnifying glass you used as a kid" to focus the sun's rays, says Haile.

At the heart of the reactor is a cylindrical lining of ceria. Ceria -- a metal oxide that is commonly embedded in the walls of self-cleaning ovens, where it catalyzes reactions that decompose food and other stuck-on gunk -- propels the solar-driven reactions. The reactor takes advantage of ceria's ability to "exhale" oxygen from its crystalline framework at very high temperatures and then "inhale" oxygen back in at lower temperatures.

"What is special about the material is that it doesn't release all of the oxygen. That helps to leave the framework of the material intact as oxygen leaves," Haile explains. "When we cool it back down, the material's thermodynamically preferred state is to pull oxygen back into the structure."

Specifically, the inhaled oxygen is stripped off of carbon dioxide (CO2) and/or water (H2O) gas molecules that are pumped into the reactor, producing carbon monoxide (CO) and/or hydrogen gas (H2). H2 can be used to fuel hydrogen fuel cells; CO, combined with H2, can be used to create synthetic gas, or "syngas," which is the precursor to liquid hydrocarbon fuels. Adding other catalysts to the gas mixture, meanwhile, produces methane. And once the ceria is oxygenated to full capacity, it can be heated back up again, and the cycle can begin anew.
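
Written out as chemistry, the cycle described above takes the standard two-step, nonstoichiometric form sketched below; this is a schematic, and the oxygen deficiency δ and the exact operating temperatures depend on the material and the reactor:

$$\mathrm{CeO_2} \;\xrightarrow{\ \text{high } T\ }\; \mathrm{CeO_{2-\delta}} + \tfrac{\delta}{2}\,\mathrm{O_2}$$

$$\mathrm{CeO_{2-\delta}} + \delta\,\mathrm{H_2O} \;\xrightarrow{\ \text{lower } T\ }\; \mathrm{CeO_2} + \delta\,\mathrm{H_2}$$

$$\mathrm{CeO_{2-\delta}} + \delta\,\mathrm{CO_2} \;\xrightarrow{\ \text{lower } T\ }\; \mathrm{CeO_2} + \delta\,\mathrm{CO}$$

The first line is the high-temperature "exhale"; the second and third are the lower-temperature "inhale" that strips oxygen from water or carbon dioxide, leaving H2 or CO behind.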

For all of this to work, the temperatures in the reactor have to be very high -- nearly 3,000 degrees Fahrenheit. At Caltech, Haile and her students achieved such temperatures using electrical furnaces. But for a real-world test, she says, "we needed to use photons, so we went to Switzerland." At the Paul Scherrer Institute's High-Flux Solar Simulator, the researchers and their collaborators -- led by Aldo Steinfeld of the institute's Solar Technology Laboratory -- installed the reactor on a large solar simulator capable of delivering the heat of 1,500 suns.

In experiments conducted last spring, Haile and her colleagues achieved the highest rates of CO2 dissociation yet reported, "by orders of magnitude," she says. The efficiency of the reactor was uncommonly high for CO2 splitting, in part, she says, "because we're using the whole solar spectrum, and not just particular wavelengths." And unlike in electrolysis, the rate is not limited by the low solubility of CO2 in water. Furthermore, Haile says, the high operating temperatures of the reactor mean that fast catalysis is possible, without the need for expensive and rare metal catalysts (cerium, in fact, is the most common of the rare earth metals -- about as abundant as copper).

In the short term, Haile and her colleagues plan to tinker with the ceria formulation so that the reaction temperature can be lowered, and to re-engineer the reactor to improve its efficiency. Currently, the system harnesses less than 1% of the solar energy it receives, with most of the energy lost as heat through the reactor's walls or by re-radiation through the quartz window. "When we designed the reactor, we didn't do much to control these losses," says Haile. Thermodynamic modeling by lead author and former Caltech graduate student William Chueh suggests that efficiencies of 15% or higher are possible.
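
For context on what that efficiency number measures: solar-to-fuel conversion is normally scored as the chemical energy of the fuel produced divided by the solar energy received, and it is assumed here that this standard definition underlies the quoted 1% and 15% figures:

$$\eta_{\text{solar-to-fuel}} = \frac{r_{\text{fuel}} \cdot \mathrm{HHV}_{\text{fuel}}}{P_{\text{solar}}}$$

where r_fuel is the fuel production rate, HHV_fuel its heating value, and P_solar the concentrated solar power entering the reactor.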

Ultimately, Haile says, the process could be adopted in large-scale energy plants, allowing solar-derived power to be reliably available during the day and night. The CO2 emitted by vehicles could be collected and converted to fuel, "but that is difficult," she says. A more realistic scenario might be to take the CO2 emissions from coal-powered electric plants and convert them to transportation fuels. "You'd effectively be using the carbon twice," Haile explains. Alternatively, she says, the reactor could be used in a "zero CO2 emissions" cycle: H2O and CO2 would be converted to methane, which would then fuel electricity-producing power plants that generate more CO2 and H2O, keeping the process going.

The work was funded by the National Science Foundation, the State of Minnesota Initiative for Renewable Energy and the Environment, and the Swiss National Science Foundation.

Robotic Ghost Knifefish Is 'Born'


Researchers at Northwestern University have created a robotic fish that can move from swimming forward and backward to swimming vertically almost instantaneously by using a sophisticated, ribbon-like fin.
The robot -- created after observing and creating computer simulations of the black ghost knifefish -- could pave the way for nimble robots that could perform underwater recovery operations or long-term monitoring of coral reefs.

Led by Malcolm MacIver, associate professor of mechanical and biomedical engineering at Northwestern's McCormick School of Engineering and Applied Science, the team's results are published in the Journal of the Royal Society Interface.

The black ghost knifefish, which is active at night in rivers of the Amazon basin, hunts for prey using a weak electric field around its entire body and moves both forward and backward using a ribbon-like fin on the underside of its body.

MacIver, a robotics expert who served as a scientific consultant for "Tron: Legacy" and is science advisor for the television series "Caprica," has studied the knifefish for years. Working with Neelesh Patankar, associate professor of mechanical engineering and co-author of the paper, he has created mechanical models of the fish in hopes of better understanding how the nervous system sends messages throughout the body to make it move.

Planning for the robot -- called GhostBot -- began when graduate student Oscar Curet, a co-author of the paper, observed a knifefish suddenly moving vertically in a tank in MacIver's lab.

"We had only tracked it horizontally before," said MacIver, a recent recipient of the Presidential Early Career Award for Scientists and Engineers. "We thought, 'How could it be doing this?'"

Further observations revealed that the fish uses only one traveling wave along the fin during horizontal motion (forward or backward, depending on the direction of the wave), but while moving vertically it uses two waves. One of these moves from head to tail, and the other moves from tail to head. The two waves collide and stop at the center of the fin.

The team then created a computer simulation that showed that when these "inward counterpropagating waves" are generated by the fin, horizontal thrust is canceled and the fluid motion generated by the two waves is funneled into a downward jet from the center of the fin, pushing the body up. The flow structure looks like a mushroom cloud with an inverted jet.
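
There is an idealized way to see the cancellation. If each fin wave is modeled as a simple sinusoid, a strong simplification of the real fin kinematics, the superposition of two counterpropagating waves is a standing wave that travels in neither horizontal direction:

$$\sin(kx - \omega t) + \sin(kx + \omega t) = 2\,\sin(kx)\cos(\omega t)$$

With no net horizontal travel in the combined waveform, the time-averaged horizontal thrust cancels, and the momentum the fin imparts to the fluid has nowhere to go but perpendicular to the fin, consistent with the downward jet seen in the simulation.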

"It's interesting because you're getting force coming off the animal in a completely unexpected direction that allows it to do acrobatics that, given its lifestyle of hunting and maneuvering among tree roots, makes a huge amount of sense," MacIver said.

The group then hired Kinea Design, a design firm founded by Northwestern faculty that specializes in human interactive mechatronics, and worked closely with its co-founder, Michael Peshkin, professor of mechanical engineering, to design and build a robot. The company fashioned a forearm-length waterproof robot with 32 motors giving independent control of the 32 artificial fin rays of the Lycra-covered fin. (That means the robot has 32 degrees of freedom; in comparison, industrial robot arms typically have fewer than 10.) Seven months and $200,000 later, the GhostBot came to life.

The group took the robot to Harvard University to test it in a flow tunnel in the lab of George V. Lauder, professor of ichthyology and co-author of the paper. The team measured the flow around the robotic fish by placing reflective particles in the water, then shining a laser sheet into the water. That allowed them to track the flow of the water by watching the particles, and the test showed the water flowing around the biomimetic robot just as computer simulations predicted it would.

"It worked perfectly the first time," MacIver said. "We high-fived. We had the robot in the real world being pushed by real forces."

The robot is also outfitted with an electrosensory system that works similarly to the knifefish's, and MacIver and his team hope to next improve the robot so it can autonomously use its sensory signals to detect an object and then use its mechanical system to position itself near the object.

Humans excel at creating high-speed, low-maneuverability technologies, like airplanes and cars, MacIver said. But studying animals provides a platform for creating low-speed, high-maneuverability technologies -- technologies that don't currently exist. Potential applications for such a robot include underwater recovery operations, such as plugging a leaking oil pipe, or long-term monitoring of oceanic environments, such as fragile coral reefs.

While the applied work on the robot moves ahead in the lab, the group is pursuing basic science questions as well. "The robot is a tool for uncovering the extremely complicated story of how to coordinate movement in animals," MacIver said. "By simulating and then performing the motions of the fish, we're getting insight into the mechanical basis of the remarkable agility of a very acrobatic, non-visual fish. The next step is to take the sensory work and unite the two."

Nanoscale Rope: Complex Nanomaterials That Assemble Themselves


Scientists at the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) have coaxed polymers to braid themselves into wispy nanoscale ropes that approach the structural complexity of biological materials.
Berkeley Lab scientists have developed a nanoscale rope that braids itself, as seen in this atomic force microscopy image of the structure at a resolution of one-millionth of a meter.

Their work is the latest development in the push to develop self-assembling nanoscale materials that mimic the intricacy and functionality of nature's handiwork, but which are rugged enough to withstand harsh conditions such as heat and dryness.

Although still early in the development stage, their research could lead to new applications that combine the best of both worlds. Perhaps they'll be used as scaffolds to guide the construction of nanoscale wires and other structures. Or perhaps they'll be used to develop drug-delivery vehicles that target disease at the molecular scale, or to develop molecular sensors and sieve-like devices that separate molecules from one another.

Specifically, the scientists created the conditions for synthetic polymers called polypeptoids to assemble themselves into ever more complicated structures: first into sheets, then into stacks of sheets, which in turn roll up into double helices that resemble a rope measuring only 600 nanometers in diameter (a nanometer is a billionth of a meter).

"This hierarchichal self assembly is the hallmark of biological materials such as collagen, but designing synthetic structures that do this has been a major challenge," says Ron Zuckermann, who is the Facility Director of the Biological Nanostructures Facility in Berkeley Lab's Molecular Foundry.

In addition, unlike normal polymers, the scientists can control the atom-by-atom makeup of the ropy structures. They can also engineer helices of specific lengths and sequences. This "tunability" opens the door for the development of synthetic structures that mimic biological materials' ability to carry out incredible feats of precision, such as homing in on specific molecules.

"Nature uses exact length and sequence to develop highly functional structures. An antibody can recognize one form of a protein over another, and we're trying to mimic this," adds Zuckermann.

Zuckermann and colleagues conducted the research at the Molecular Foundry, one of the five DOE Nanoscale Science Research Centers, which are premier national user facilities for interdisciplinary research at the nanoscale. Joining him were fellow Berkeley Lab scientists Hannah Murnen, Adrianne Rosales, Jonathan Jaworski, and Rachel Segalman. Their research was published in a recent issue of the Journal of the American Chemical Society.

The scientists worked with chains of bioinspired polymers called peptoids. Peptoids are structures that mimic peptides, which nature uses to form proteins, the workhorses of biology. Instead of using peptides to build proteins, however, the scientists are striving to use peptoids to build synthetic structures that behave like proteins.

The team started with a block copolymer, which is a polymer composed of two or more different monomers.

"Simple block copolymers self assemble into nanoscale structures, but we wanted to see how the detailed sequence and functionality of bioinspired units could be used to make more complicated structures," says Rachel Segalman, a faculty scientist at Berkeley Lab and professor of Chemical and Biomolecular Engineering at University of California, Berkeley.

With this in mind, the peptoid pieces were robotically synthesized, processed, and then added to a solution that fosters self assembly.

The result was a variety of self-made shapes and structures, with the braided helices being the most intriguing. The hierarchical structure of the helix, and its ability to be manipulated atom-by-atom, means that it could be used as a template for mineralizing complex structures on a nanometer scale.

"The idea is to assemble structurally complex structures at the nanometer scale with minimal input," says Hannah Murnen. She adds that the scientists next hope is to capitalize on the fact that they have minute control over the structure's sequence, and explore how very small chemical changes alter the helical structure.

Says Zuckermann, "These braided helices are one of the first forays into making atomically defined block copolymers. The idea is to take something we normally think of as plastic, and enable it to adopt structures that are more complex and capable of higher function, such as molecular recognition, which is what proteins do really well."

X-ray diffraction experiments used to characterize the structures were conducted at beamlines 8.3.1 and 7.3.3 of Berkeley Lab's Advanced Light Source, a national user facility that generates intense x-rays to probe the fundamental properties of substances. This work was supported in part by the Office of Naval Research.

Like Humans, Amoebae Pack a Lunch Before They Travel


Some amoebae do what many people do. Before they travel, they pack a lunch. In results of a study reported January 19 in the journal Nature, evolutionary biologists Joan Strassmann and David Queller of Rice University show that the long-studied social amoeba Dictyostelium discoideum (commonly known as a slime mold) increases its odds of survival through a rudimentary form of agriculture.

Research by lead author Debra Brock, a graduate student at Rice, found that some amoebae sequester their food--particular strains of bacteria--for later use.

"We now know that primitively social slime molds have genetic variation in their ability to farm beneficial bacteria as a food source," says George Gilchrist, program director in the National Science Foundation's Division of Environmental Biology, which funded the research. "But the catch is that with the benefits of a portable food source, comes the cost of harboring harmful bacteria."

After these "farmer" amoebae aggregate into a slug, they migrate in search of nourishment--and form a fruiting body, or a stalk of dead amoebae topped by a sorus, a structure containing fertile spores. Then they release the bacteria-containing spores to the environment as feedstock for continued growth.

The findings run counter to the presumption that all "Dicty" eat everything in sight before they enter the social spore-forming stage.

Non-farmer amoebae do eat everything, but farmers were found to leave food uneaten, and their slugs don't travel as far.

Perhaps because they don't have to.

The advantages of going hungry now to ensure a good food supply later are clear, as farmers are able to thrive in environments in which non-farmers find little food.

The researchers found that about a third of wild-collected Dicty are farmers.

Instead of consuming all the bacteria they encounter, these amoebae eat less and incorporate bacteria into their migratory systems.

Brock showed that carrying bacteria is a genetic trait by eliminating all living bacteria from four farmers and four non-farmers--the control group--by treating them with antibiotics.

All amoebae were grown on dead bacteria; tests confirmed that they were free of live bacteria.

When the eight clones were then fed live bacteria, the farmers all regained their abilities to seed bacteria colonies, while the non-farmers did not.

Dicty farmers are always farmers; non-farmers never learn.

Rice graduate student Tracy Douglas co-authored the paper with Brock, Queller and Strassmann. She confirmed that farmers and non-farmers belong to the same species and do not form a distinct evolved group.

Still, mysteries remain.

The researchers want to know what genetic differences separate farmers from non-farmers. They also wonder why farmer clones don't migrate as far as their counterparts.

It might be a consequence of bacterial interference, they say, or an evolved response, since farmers carry the seeds of their own food supply and don't need to go as far.

Also, some seemingly useless or even harmful bacteria are not consumed as food, but may serve an as-yet-undetermined function, Brock says.

That has implications for treating disease as it may, for instance, provide clues to the way tuberculosis bacteria invade cells, says Strassmann, infecting the host while resisting attempts to break them down.

The results demonstrate the importance of working in natural environments with wild organisms whose complex ties to their living environment have not been broken.

Tuesday, January 18, 2011

Mount Etna Erupts Overnight


Towering nearly 11,000 feet (3,350 meters) over the island of Sicily, Europe's tallest and most active volcano began trembling Tuesday afternoon, seismologists told the OurAmazingPlanet news site. Wednesday and Thursday saw flames and ash flung hundreds of yards into the sky, closing down area airports.

Lava heads for the sea as Mount Etna erupts Wednesday night on Sicily.

Though spectacular, this week's lava fountains aren't exactly a surprise. "We expected Etna to return to activity in this period," volcanologist Boris Behncke told OurAmazingPlanet. "There had been lots of premonitory signals."

The vent that spewed the lava pictured had put on smaller shows around Christmas and New Year's, Behncke added.

Lava pours from a pit crater high up Mount Etna Wednesday night. Despite its nearly constant activity, the Sicilian volcano rarely causes harm, since its eruptions occur so high up and its lava moves relatively slowly.

Hot Asphalt


Photograph by Gabriel Bouys, AFP/Getty Images

Lava from normally harmless Mount Etna sears a street and edges blisteringly close to a restaurant near the Sicilian city of Nicolosi in July 2001.

Mount Etna's most destructive eruption in history went on for four months in 1669, singeing a dozen villages, breaking through Catania's city walls, and finally steaming to a standstill in the sea.

Dark-Matter Galaxy Detected: Hidden Dwarf Lurks Nearby?


Richard A. Lovett in Seattle, Washington

for NGN
Published January 14, 2011

An entire galaxy may be lurking, unseen, just outside our own, scientists announced Thursday.

The invisibility of "Galaxy X"—as the purported body has been dubbed—may be due less to its apparent status as a dwarf galaxy than to its murky location and its overwhelming amount of dark matter, astronomer Sukanya Chakrabarti speculates.

Detectable only by the effects of its gravitational pull, dark matter is an invisible material that scientists think makes up more than 80 percent of the mass in the universe.

Chakrabarti, of the University of California, Berkeley, devised a technique similar to that used 160 years ago to predict the existence of Neptune, which was given away by the wobbles its gravity induced in Uranus's orbit.

Based on gravitational perturbations of gases on the fringes of our Milky Way galaxy, Chakrabarti came to her conclusion that there's a heretofore unknown dwarf galaxy about 260,000 light-years away.

With an estimated mass equal to only one percent of the mass of the Milky Way, Galaxy X is still the third largest of the Milky Way's satellite galaxies, Chakrabarti predicts. The two Magellanic Clouds are each about ten times larger.

If it exists, Galaxy X isn't likely to be composed entirely of dark matter.

It should also have a sprinkling of dim stars, Chakrabarti said. "These should provide enough light for astronomers to see it, now that they know where to look," she said.

The reason the dark matter galaxy hasn't yet been seen, she added, is because it lies in the same plane as the Milky Way disc. Clouds of gas and dust stand between us and Galaxy X, confounding telescopes.

Galaxy X Addresses Fundamental Problem

If Galaxy X's existence is confirmed, it would be a major step in verifying our understanding of how the universe condensed from primordial matter and energy after the big bang, Chakrabarti said.

Current theory correctly predicts the distribution of distant galaxies, she said. But it also predicts hundreds of dwarf galaxies around the Milky Way, and to date only a few dozen have been found.

This "missing satellite problem" she said, "is a fundamental problem in cosmology."

More Dark Galaxies Out There?

Galaxy X could soon lead to Galaxies Y and Z, according to Chakrabarti.

"This is basically a new method to render dark galaxies visible," she said, adding that her technique should be able to detect dim dwarf galaxies as small as a thousandth the mass of the Milky Way.

The new finding is a useful contribution to projects aiming to map the distribution of dark matter on the far edges of the universe, according to David Pooley, a Texas-based dark matter astronomer with Eureka Scientific, a private corporation that helps scientists secure research funding.

"All of these dark matter studies are really starting to map out the distribution of dark matter," said Pooley, who was not part of Chakrabarti's team. "Any information we get is extremely valuable."


Galaxy X: The Search Begins


Now that astronomers know where to look for Galaxy X, they should be able to find it, especially if they conduct the search in dust-penetrating infrared light, Chakrabarti said.

"Say you're looking for a car with very dim headlights, in the fog," she said. "If you know approximately where to look, you would have a better chance of finding it."

Chakrabarti hopes to do some looking herself within the next few months and is seeking to secure time at a large infrared telescope.

Even if Galaxy X isn't confirmed, she said, her findings will still shed new light on a shady subject.

The absence of X would mean there's some other oddity out there throwing off the calculations—perhaps an unexpected distribution pattern of the halo of dark matter thought to surround the Milky Way.

"We still stand to learn something very fundamental," she said.

Pterosaurs 10 Times Heavier than Biggest Birds


* Fossilized footprints left behind by pterosaurs were used to estimate the body weight of these flying reptiles.
* The new weight estimation technique determined that pterosaurs weighed up to 320 pounds.
* The method may be applied to dinosaurs, providing further evidence for how heavy these animals were.

The Museo Civico di Scienze Naturali's diorama shows the pterosaur Eudimorphodon ranzii. New calculations suggest these reptiles weighed up to 320 pounds.

Today's heaviest flying bird, the 48-pound Kori Bustard, is a lightweight compared to dinosaur-era flying reptiles known as pterosaurs, according to a new study that presents a novel method for estimating pterosaur weight.

Calculations based on footprints left behind by pterosaurs, also known as pterodactyls, reveal these animals weighed up to 320 pounds. The study, published in the journal Palaeogeography, Palaeoclimatology, Palaeoecology, is believed to be the first to use fossilized footprints to infer the body weight of an extinct animal.

Prior studies utilized everything from pterosaur bone mass to volumetric modeling to figure out how much the prehistoric "winged lizards" weighed.

"My study provides good independent evidence that large pterosaurs were quite heavy," author Tai Kubo told Discovery News.

Kubo, a paleontologist at the Fukui Prefectural Dinosaur Museum in Japan, began his study by collecting trackways from 17 species of living reptiles housed at Ueno Zoo in Tokyo. These animals included crocodiles, lizards, tortoises and a frog.

"I went to the zoo every week for two years," he said. "Since some reptiles, such as crocodiles, are very reluctant to move, and often it is very difficult to predict which way they will move, I had to just keep waiting for a whole day to collect one trackway."

After collecting the footprints, Kubo recorded the weight of each animal. He determined that body weight was linked to a quantity calculated from the sizes of an individual's fore and hind limb footprints, as measured from its tracks.

Kubo then took this mathematical formula and applied it to fossilized trackways that had previously been attributed to pterosaurs. He found that many pterosaurs were at least 10 times heavier than the heaviest modern flying birds.
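
Kubo's paper evidently builds a calibration from the zoo trackways and extrapolates it to fossil prints. The Python sketch below illustrates the general shape of such an approach with a simple log-log (allometric) fit; every variable, coefficient, and number is an illustrative placeholder, not a value from the study:

```python
# Sketch of a footprint-to-body-weight calibration, assuming a simple
# allometric (log-log) fit. All numbers below are invented placeholders.
import numpy as np

# Hypothetical calibration data from living animals:
footprint_area_cm2 = np.array([4.0, 12.0, 35.0, 90.0, 210.0])
body_mass_kg       = np.array([0.5, 2.8, 14.0, 55.0, 180.0])

# Fit log10(mass) = a * log10(area) + b
a, b = np.polyfit(np.log10(footprint_area_cm2), np.log10(body_mass_kg), 1)

def estimate_mass(area_cm2):
    """Predict body mass (kg) from footprint area (cm^2)."""
    return 10 ** (a * np.log10(area_cm2) + b)

# Apply the calibration to a hypothetical fossil pterosaur print:
print(f"Estimated mass: {estimate_mass(180.0):.0f} kg")
# ~150 kg -- the same order as the study's 320-pound figure, though
# again, the calibration data here are made up for illustration.
```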

Michael Habib, a Chatham University assistant professor of biology, told Discovery News that he thinks Kubo's new method for estimating pterosaur weight is "sensible and interesting." He indicated, however, that the jury is still out on just how heavy the largest pterosaurs really were.

Habib mentioned that prior estimates, using other methods, concluded that big pterosaurs weighed anywhere from 441 to 573 pounds.

"I suspect that the trackway method of body mass estimation will be made more precise in the future," Habib said.

Donald Henderson, curator of dinosaurs at the Royal Tyrrell Museum, wondered if Kubo's data might have been different if he had used footprints made by living birds instead of the other animals Kubo selected for his study.

"I think the largest pterosaurs -- things like Quetzalcoatlus -- reached several hundred kilograms," Henderson told Discovery News. "Unfortunately, we have nothing even beginning to resemble a complete skeleton for the largest pterosaurs, so their body shapes and volumes, and associated body masses, are not known with any certainty."

Kubo, however, holds that "footprints can be a good indicator of body weight."

In the future, he hopes to apply his weight estimation calculations to dinosaurs.

"Before applying the method to dinosaurs," Kubo said, "we have to obtain a lot of data about body weight and footprints of extant birds and mammals, since their postures are more similar to those of dinosaurs than those of extant reptiles and amphibians."

He added, "I think weight can be measured from any footprint, and the result can tell us a lot of things."

2,550-Year-Old Celtic Beer Recipe Resurrected


Early Celtic rulers of a community in what’s now southwestern Germany liked to party, staging elaborate feasts in a ceremonial center. The business side of their revelries was located in a nearby brewery capable of turning out large quantities of a beer with a dark, smoky, slightly sour taste, new evidence suggests.

Six specially constructed ditches previously excavated at Eberdingen-Hochdorf, a 2,550-year-old Celtic settlement, were used to make high-quality barley malt, a key beer ingredient, says archaeobotanist Hans-Peter Stika of the University of Hohenheim in Stuttgart. Thousands of charred barley grains unearthed in the ditches about a decade ago came from a large malt-making enterprise, Stika reports in a paper published online January 4 in Archaeological and Anthropological Sciences.

Stika bases that conclusion on a close resemblance of the ancient grains to barley malt that he made by reproducing several methods that Iron Age folk might have used. He also compared the ancient grains to malt produced in modern facilities. Upon confirming the presence of malt at the Celtic site, Stika reconstructed malt-making techniques there to determine how they must have affected beer taste.

The oldest known beer residue and brewing facilities date to 5,500 years ago in the Middle East, but archaeological clues to beer’s history are rare (SN: 10/2/04, p. 216).

At the Celtic site, barley was soaked in the specially constructed ditches until it sprouted, Stika proposes. Grains were then dried by lighting fires at the ends of the ditches, giving the malt a smoky taste and a darkened color. Lactic acid bacteria stimulated by slow drying of soaked grains, a well-known phenomenon, added sourness to the brew.

Unlike modern beers that are flavored with flowers of the hop plant, the Eberdingen-Hochdorf brew probably contained spices such as mugwort, carrot seeds or henbane, in Stika’s opinion. Beer makers are known to have used these additives by medieval times. Excavations at the Celtic site have yielded a few seeds of henbane, a plant that also makes beer more intoxicating.

“These additives gave Celtic beer a completely different taste than what we’re used to today,” Stika says.

Heated stones placed in liquefied malt during the brewing process — a common practice later in Europe — would have added a caramelized flavor to this fermented Celtic drink, he adds. So far, no fire-cracked stones have been found at Eberdingen-Hochdorf but they may have been used to heat pulpy malt slowly, a practice documented at later brewing sites, Stika says. He suspects that fermentation was triggered by using yeast-coated brewing equipment or by adding honey or fruit, which both contain wild yeasts.

The Celts were Iron Age tribes, loosely tied by language and culture, that inhabited much of Western Europe from about the 11th to the first century B.C.

In the same report Stika describes another tidbit for fans of malt beverage history: A burned medieval structure from the 14th century A.D., recently unearthed in Berlin during a construction project, contains enough barley malt to have brewed 500 liters of beer, the equivalent of nearly 60 cases.
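
That case conversion checks out, assuming a standard case of twenty-four 355-milliliter (12-ounce) bottles; the article does not specify the case size:

```python
# 500 liters of beer expressed in standard cases of 24 x 355 ml bottles.
liters = 500
liters_per_case = 24 * 0.355   # 8.52 L per case
print(f"{liters / liters_per_case:.0f} cases")  # ~59, i.e. "nearly 60"
```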

Classics professor Max Nelson of the University of Windsor in Canada, an authority on ancient beer, largely agrees with Stika’s conclusions. Malt-making occurred at Eberdingen-Hochdorf and malt was probably stored in the medieval Berlin building, Nelson says.

Other stages of brewing occurred either at these sites, as suggested by Stika, or nearby, in Nelson’s view.

“Stika’s experiments go a long way toward showing how precisely barley was malted in ancient times,” he remarks.

Beer buffs today would regard Celtic beer as a strange brew not only for its flavor but because it would have been cloudy, contained yeasty sediment and been imbibed at room temperature, Nelson notes.

Stika’s insights into the range of techniques and ingredients available to Celtic beer makers should inspire modern “extreme brewers” to try out the recipe that he describes, says anthropologist Bettina Arnold of the University of Wisconsin–Milwaukee.

Perhaps they’ll find out whether Roman emperor Julian, in a 1,600-year-old poem, correctly described Celtic beer as smelling “like a billy goat.”

Chandra Images Torrent of Star Formation


A new Chandra X-ray Observatory image of Messier 82, or M82, shows the result of star formation on overdrive. M82 is located about 12 million light years from Earth and is the nearest place to us where the conditions are similar to those when the Universe was much younger with lots of stars forming.
M82 is a so-called starburst galaxy, where stars are forming at rates that are tens or even hundreds of times higher than in a normal galaxy. The burst of star birth may be caused by a close encounter or collision with another galaxy, which sends shock waves rushing through the galaxy. In the case of M82, astronomers think that a brush with its neighbor galaxy M81 millions of years ago set off this torrent of star formation.

M82 is seen nearly edge-on with its disk crossing from about 10 o'clock to about 4 o'clock in this image from Chandra (where low-, medium-, and high-energy X-rays are colored red, green, and blue, respectively). Among the 104 point-like X-ray sources in the image, eight so far have been observed to be very bright in X-rays and undergo clear changes in brightness over periods of weeks and years. This means they are excellent candidates to be black holes pulling material from companion stars that are much more massive than the Sun. Only a handful of such binary systems are known in the Local Group of galaxies containing the Milky Way and M31.
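
For readers curious how such a composite is built: each band's photon-count map becomes one color channel. A minimal Python sketch with synthetic data follows; the band assignments match the caption above, but the array sizes and counts are invented for illustration and are not Chandra's actual data:

```python
# Assemble a three-color image from per-band X-ray count maps:
# low energy -> red, medium -> green, high -> blue.
import numpy as np

rng = np.random.default_rng(0)
low  = rng.poisson(5.0, (64, 64)).astype(float)   # toy low-energy counts
med  = rng.poisson(3.0, (64, 64)).astype(float)   # toy medium-energy counts
high = rng.poisson(1.0, (64, 64)).astype(float)   # toy high-energy counts

def stretch(img):
    """Scale a count map to [0, 1] for display."""
    return (img - img.min()) / (img.max() - img.min())

rgb = np.dstack([stretch(low), stretch(med), stretch(high)])
print(rgb.shape)  # (64, 64, 3): ready for, e.g., matplotlib's imshow
```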

Chandra observations are also important in understanding the rapid rate at which supernovas explode in starburst galaxies like M82. When the shock waves travel through the galaxy, they push on giant clouds of gas and dust, which causes them to collapse and form massive stars. These stars, in turn, use up their fuel quickly and explode as supernovas. These supernovas produce expanding bubbles of multimillion-degree gas that extend for millions of light years away from the galaxy's disk. These bubbles are seen as the large red areas to the upper right and lower left of the image.

NASA's Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program for NASA's Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory controls Chandra's science and flight operations from Cambridge, Mass.

Physicists Observe Exotic State in an Unconventional Superconductor


A new fractional vortex state observed in an unconventional superconductor may offer the first glimpse of an exotic state of matter predicted theoretically for more than 30 years. In a paper published in the January 14 issue of Science, University of Illinois physicists, led by Raffi Budakian, describe their observations of a new fractional vortex state in strontium ruthenium oxide (SRO). Such states may provide the basis for a novel form of quantum computing in which quantum information is encoded in the topological properties of a physical system.
"We've been on the trail of a state of matter called a half-quantum vortex for more than three years," said Budakian. "First proposed in the 1970s to exist in superfluid helium-3, a half-quantum vortex can be thought of as a 'texture' that arises from the spin phase of the superconducting order parameter."

Budakian's group investigated strontium ruthenium oxide (SRO), an unconventional superconductor that has been proposed as the solid-state analog of the A-phase of superfluid helium-3. Using state-of-the-art nanofabrication methods and exquisitely sensitive cantilever-based magnetometry techniques developed by the group, the researchers observed minute fluctuations in the magnetism of tiny rings of SRO.

"Strontium ruthenium oxide is a unique and fascinating material, and the half-quantum vortices that have been conjectured to exist in it are particularly interesting," said Anthony J. Leggett, the John D. and Catherine T. MacArthur Professor and Center for Advanced Study Professor of Physics, who shared the 2003 Nobel Prize in Physics for his work on superfluid helium-3. "It is believed that these half-quantum vortices in SRO may provide the basis for topological quantum computing. If this novel form of computing is eventually realized, this experiment will certainly be seen as a major milestone along the road there."

Budakian is an assistant professor of physics and a principal investigator in the Frederick Seitz Materials Research Laboratory at Illinois. Five years ago, he was instrumental in pioneering a technique, magnetic resonance force microscopy, to measure the force exerted on a micrometer-scale silicon cantilever by the spin of a single electron in a bulk material. He and his group have now adapted their ultrasensitive cantilever measurements to observe the magnetic behavior of SRO.

In the experiment, the researchers first fabricated a micron-sized ring of SRO and glued it to the tip of the silicon cantilever. How small are these rings? Fifty of them would fit across the width of a human hair. And the tips of the cantilevers are less than 2 μm wide.

"We take the high-energy physics approach to making these rings. First we smash the SRO, and then we sift through what's left," said Budakian.

The researchers first pulverize the large crystals of SRO into fragments, choose a likely micron-sized flake, and drill a hole in it using a focused beam of gallium ions. The resulting structure, which looks like a microscopic donut, is glued onto the sensitive silicon cantilever and then cooled to 0.4 degrees above absolute zero.

"Positioning the SRO ring on the cantilever is a bit like dropping one grain of sand precisely atop a slightly larger grain of sand," said Budakian, "only our 'grains of sand' are much smaller."

Budakian added that this is the first time such tiny superconducting rings have been fabricated from SRO.

Being able to make these rings is crucial to the experiment, according to Budakian, because the half-quantum vortex state is not expected to be stable in larger structures.

"Once we have the ring attached to the cantilever, we can apply static magnetic fields to change the 'fluxoid' state of the ring and detect the corresponding changes in the circulating current. In addition, we apply time-dependent magnetic fields to generate a dynamic torque on the cantilever. By measuring the frequency change of the cantilever, we can determine the magnetic moment produced by the currents circulating the ring," said Budakian.

"We've observed transitions between integer fluxoid states, as well as a regime characterized by 'half-integer' transitions," Budakian noted, "which could be explained by the existence of half-quantum vortices in SRO."

In addition to the advance in fundamental scientific understanding that Budakian's work provides, the experiment may be an important step toward the realization of a so-called "topological" quantum computer, as Leggett alluded.

Unlike a classical computer, which encodes information as bits whose values are either 0 or 1, a quantum computer would rely on the interaction among two-level quantum systems (e.g., the spins of electrons, trapped ions, or currents in superconducting circuits) to encode and process information. The massive parallelism inherent in quantal time evolution would provide rapid solutions to problems that are currently intractable, requiring vast amounts of time in conventional, classical machines.

For a functional quantum computer, the quantum bits or "qubits" must be strongly coupled to each other but remain sufficiently isolated from random environmental fluctuations, which cause the information stored in the quantum computer to decay -- a phenomenon known as decoherence. Currently, large-scale, international projects are underway to construct quantum computers, but decoherence remains the central problem for real-world quantum computation.

According to Leggett, "A rather radical solution to the decoherence problem is to encode the quantum information nonlocally; that is, in the global topological properties of the states in question. Only a very restricted class of physical systems is appropriate for such topological quantum computing, and SRO may be one of them, provided that certain conditions are fulfilled in it. One very important such condition is precisely the existence of half-quantum vortices, as suggested by the Budakian experiment."

This work was supported by the U.S. Department of Energy Office of Basic Energy Sciences (grant DEFG02-07ER46453) through the Frederick Seitz Materials Research Laboratory at the University of Illinois at Urbana-Champaign, and by the Grants-in-Aid for the "Topological Quantum Phenomena" program and the Global COE "Next Generation of Physics" program from Japan's Ministry of Education, Culture, Sports, Science and Technology.

ARkStorm: California’s Other 'Big One'


For emergency planning purposes, scientists unveiled a hypothetical California scenario that describes a storm that could produce up to 10 feet of rain, cause extensive flooding (in many cases overwhelming the state's flood-protection system) and result in more than $300 billion in damage.

The "ARkStorm Scenario," prepared by the U.S. Geological Survey and released at the ARkStorm Summit in Sacramento on Jan. 13-14, combines prehistoric geologic flood history in California with modern flood mapping and climate-change projections to produce a hypothetical, but plausible, scenario aimed at preparing the emergency response community for this type of hazard.

The USGS, the Federal Emergency Management Agency and the California Emergency Management Agency convened the two-day summit to engage stakeholders from across California to take action as a result of the scenario's findings, which were developed over the last two years by more than 100 scientists and experts.

"The ARkStorm scenario is a complete picture of what that storm would do to the social and economic systems of California," said Lucy Jones, chief scientist of the USGS Multi-Hazards Demonstration Project and architect of ARkStorm. "We think this event happens once every 100 or 200 years or so, which puts it in the same category as our big San Andreas earthquakes. The ARkStorm is essentially two historic storms (January 1969 and February 1986) put back to back in a scientifically plausible way. The model is not an extremely extreme event."

Jones noted that the largest damages would come from flooding -- the models estimate that almost one-fourth of the houses in California would experience some flood damage from this storm.

"The time to begin taking action is now, before a devastating natural hazard event occurs," said USGS Director, Marcia McNutt. "This scenario demonstrates firsthand how science can be the foundation to help build safer communities. The ARkStorm scenario is a scientifically vetted tool that emergency responders, elected officials and the general public can use to plan for a major catastrophic event to help prevent a hazard from becoming a disaster."

To define the impacts of the ARkStorm, the USGS, in partnership with the California Geological Survey, created the first statewide landslide susceptibility maps for California, the most detailed such maps ever made. The project also produced the first physics-based coastal storm modeling system for analyzing severe storm impacts (predicting wave height and coastal erosion) under present-day conditions and under various climate-change and sea-level-rise scenarios.

Because the scenario raised serious questions about existing national, state and local disaster policy and emergency management systems, ARkStorm became the theme of the 2010 Extreme Precipitation Symposium at the U.C. Davis John Muir Institute of the Environment, attracting more than 200 leaders in meteorology and flood management. ARkStorm is part of an effort to create a National Real-Time Flood Mapping initiative to improve flood management nationwide, and it provided a platform for emergency managers, meteorologists and hydrologists to work together on a scaling system for West Coast storms.

"Cal EMA is proud to partner with the USGS in this important work to protect California from disasters," said Cal EMA Acting Secretary Mike Dayton. "In order to have the most efficient and effective plans and response capabilities, we have to have the proper science to base it on. Californians are better protected because of the scientific efforts of the United States Geological Survey."

According to FEMA Region IX Director Nancy Ward, "The ARkStorm report will prove to be another invaluable tool in engaging the whole of our community in addressing flood emergencies in California. It is entirely possible that flood control infrastructure and mitigation efforts could be overwhelmed by the USGS ARkStorm scenario, and the report suggests ways forward to limit the damage that is sure to result."

The two-day summit included professional flood managers, emergency managers, first responders, business continuity managers, forecasters, hydrologists and decision makers. Many of the scientists responsible for coordinating the ARkStorm scenario presented the science behind the scenario, including meteorology, forecasting, flood modeling, landslides and physical and economic impacts.

The ARkStorm Scenario is the second scenario from the USGS Multi-Hazards Demonstration Project led by Jones, which earlier created the ShakeOut earthquake scenario. More information about the ARkStorm Summit is online (http://urbanearth.usgs.gov/arkstorm-summit/). The ARkStorm Scenario, USGS Open-File Report 2010-1312, is also online (http://pubs.usgs.gov/of/2010/1312/).



Magnetic Pill to Guide Drugs Through the Gut


Do you want that in a pill or a shot? A pill, thank you, but most patients never have that choice. The problem with administering many medications orally is that a pill often will not dissolve at exactly the right site in the gastrointestinal tract where the medicine can be absorbed into the bloodstream. A new magnetic pill system developed by Brown University researchers could solve the problem by safely holding a pill in place in the intestine, wherever it needs to be.

The scientists describe the harmless operation of their magnetic pill system in rats online the week of Jan. 17 in the Proceedings of the National Academy of Sciences. Applied to people in the future, said senior author Edith Mathiowitz, the technology could provide a new way to deliver many drugs to patients, including those with cancer or diabetes. It could also act as a powerful research tool to help scientists understand exactly where in the intestine different drugs are best absorbed.

"With this technology you can now tell where the pill is placed, take some blood samples and know exactly if the pill being in this region really enhances the bioavailability of the medicine in the body," said Mathiowitz, professor of medical science in Brown's Department of Molecular Pharmacology, Physiology, and Biotechnology. "It's a completely new way to design a drug delivery system."

The two main components of the system are a conventional-looking gelatin capsule that contains a tiny magnet, and an external magnet that can precisely sense the force between it and the pill and vary that force, as needed, to hold the pill in place. The external magnet can sense the pill's position, and because the pill is opaque to X-rays, the researchers were also able to see it in the rats' bodies during their studies.

Safety first

The system is not the first attempt to guide pills magnetically, but it is the first in which the forces on the pill are controlled tightly enough to be safe inside the body. The researchers designed the system to sense the pill's position and hold it in place with a minimum of force.

"The most important thing is to be able to monitor the forces that you exert on the pill in order to avoid damage to the surrounding tissue," said Mathiowitz. "If you apply a little more than necessary force, your pill will be pulled to the external magnet, and this is a problem."

To accomplish this, the team, including lead author and former graduate student Bryan Laulicht, took careful measurements and built an external magnet system with sophisticated computer control and feedback mechanisms.

"The greatest challenges were quantifying the required force range for maintaining a magnetic pill in the small intestines and constructing a device that could maintain intermagnetic forces within that range," said Laulicht, who is now a postdoctoral scholar at MIT.

Even after holding a pill in place for 12 hours in the rats, the system applied a pressure on the intestinal wall that was less than 1/60th of what would be damaging.
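The paper does not publish control code, but the logic Laulicht describes, keeping the intermagnetic force inside a safe band, can be caricatured in a few lines of Python. The force limits and gain below are invented placeholders, not values from the study:

```python
# Illustrative feedback clamp: hold the pill with the minimum force needed,
# never exceeding a damage threshold. All numbers are assumed placeholders.

F_MIN = 0.2e-3   # weakest force that still holds the pill in place (N)
F_MAX = 1.0e-3   # force above which tissue damage becomes a risk (N)

def update_magnet(current, measured_force, gain=0.05):
    """Trim the external electromagnet current so the measured
    pill-magnet force stays inside the safe band [F_MIN, F_MAX]."""
    if measured_force > F_MAX:
        return current * (1 - gain)   # back off before tissue is stressed
    if measured_force < F_MIN:
        return current * (1 + gain)   # tighten grip before the pill drifts
    return current                    # already in the safe band
```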

The next step in the research is to begin delivering drugs using the system and testing their absorption, Mathiowitz and Laulicht said.

"Then it will move to larger animal models and ultimately into the clinic," Laulicht said. "It is my hope that magnetic pill retention will be used to enable oral drug delivery solutions to previously unmet medical needs."

In addition to Mathiowitz and Laulicht, authors on the paper include Brown researchers Nicholas Gidmark and Anubhav Tripathi. Brown University funded the research.

The risk of a massive "superstorm"


A group of more than 100 scientists and experts say in a new report that California faces the risk of a massive "superstorm" that could flood a quarter of the state's homes and cause $300 billion to $400 billion in damage. Researchers point out that the potential scale of destruction in this storm scenario is four or five times the amount of damage that could be wrought by a major earthquake.

It sounds like the plot of an apocalyptic action movie, but scientists with the U.S. Geological Survey warned federal and state emergency officials that California's geological history shows such "superstorms" have happened in the past, and should be added to the long list of natural disasters to worry about in the Golden State.
California has not seen a cataclysmic storm of this kind in 150 years. Geological Survey director Marcia K. McNutt told the New York Times that a 300-mile stretch of the Central Valley was inundated in 1861-62. The floods were so bad that the state capital had to be moved temporarily from Sacramento to San Francisco, and Governor Leland Stanford had to take a rowboat to his own inauguration, the report notes. Even larger storms occurred in roughly the years 212, 440, 603, 1029, 1418 and 1605, according to geological evidence.

The risk is growing, scientists say, because rising atmospheric temperatures have generally made weather patterns more volatile.

The scientists built a model showing that such a storm could last for more than 40 days and dump 10 feet of rain on the state. The storm would be fed by an "atmospheric river" moving water "at the same rate as 50 Mississippis discharging water into the Gulf of Mexico," according to the AP. Winds could reach 125 miles per hour, and landslides could compound the damage, the report notes.
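As a sanity check on those figures (our arithmetic, using an outside estimate of the Mississippi's mean discharge, not numbers from the report):

```python
# Rough cross-check: does "50 Mississippis for 40-plus days" match the
# scenario's rainfall scale? Mississippi discharge and California's area
# are outside estimates; treat this as order-of-magnitude only.
mississippi = 17_000              # m^3/s, approximate mean discharge
flow = 50 * mississippi           # the scenario's atmospheric river
volume_m3 = flow * 40 * 86_400    # total water over 40 days, ~2.9e12 m^3

ca_area_m2 = 4.24e11              # California, ~424,000 km^2
print(f"~{volume_m3 / ca_area_m2:.1f} m averaged statewide")  # ~6.9 m
# The scenario's 10 feet (~3 m) falls on the hardest-hit areas, so the
# two figures agree in scale.
```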
Such a superstorm is hypothetical but not improbable, climate researchers warn. "We think this event happens once every 100 or 200 years or so, which puts it in the same category as our big San Andreas earthquakes," Geological Survey scientist Lucy Jones said in a press release.

Federal and state emergency management officials convened a conference about emergency preparations for possible superstorms last week.

Wednesday, January 12, 2011

'Welfare Robots' to Ease Burden in Graying Japan


* The Yurina Care Robot is a robotic bed that can move a person into just about any position.
* A robot wheelchair automatically moves beside a walking person.
* A robotic arm that attaches to a wheelchair can pick up objects.
* A robot predicts a person's wishes by reading their faces.

Robotic wheelchairs, mechanical arms and humanoid waiters are among the cutting-edge inventions on show at a robotics fair in Japan, a country whose population is ageing rapidly.

To ease the burden in a nation with one of the world's highest life expectancies, engineers have come up with technologies to make life easier for the elderly and disabled, and their caregivers.

A new robot wheelchair developed at Saitama University near Tokyo doesn't need to be propelled manually by the user or pushed by a caregiver but can instead automatically move beside a walking person.

"Imagine if you enter a store having somebody push your wheelchair," said one of the researchers, Yoshinori Kobayashi, assistant professor for information science and technology at the university.

"Your relationship with him or her may not look equal to a store clerk or onlookers. You would fail to be treated as an independent person."

The wheelchair is fitted with a camera and sensors that detect both the caregiver and obstacles ahead to safely guide the vehicle, which can also be stopped or overridden with a joystick by the person in the wheelchair.

Kobayashi, pointing to a shortage of caregivers in Japan, said the team is working on a system that would allow more than one automated wheelchair to follow a caregiver at the same time.
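Saitama has not released the wheelchair's software, but the "move beside a walking person" behavior can be caricatured as a simple proportional controller. Everything below, from the coordinate frame to the gains, is an illustrative assumption rather than the team's actual algorithm:

```python
# Illustrative sketch: keep the chair at a fixed lateral offset from a
# tracked pedestrian. Coordinates are in the chair's frame (x forward,
# y left); the offset and gains are assumed placeholders.
import math

def follow_beside(px, py, heading, offset=0.8, k_v=0.6, k_w=1.2):
    """Return (linear, angular) velocity commands toward a point
    'offset' meters to the walking person's right."""
    tx = px + offset * math.sin(heading)   # target point beside the person
    ty = py - offset * math.cos(heading)
    v = k_v * tx                   # drive forward toward the target
    w = k_w * math.atan2(ty, tx)   # steer toward its bearing
    return v, w
```

A real system would fuse camera and range-sensor data, handle obstacles, and respect the rider's joystick override, exactly the extra machinery the Saitama team describes.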

Another innovation to make life easier for the disabled is the "RAPUDA" robotic arm, a "portable manipulator" being developed by Woo-Keun Yoon of the National Institute of Advanced Industrial Science and Technology.

Patients who are largely paralysed but can move some part of their body, such as a fingertip, a toe, or their neck, will be able to control the arm and utilise its reach and strength.

The arm can be attached to wheelchairs, tables and other objects and extended to up to one metre (three feet) to pick up objects, helping improve quality of life for the disabled, said Yoon.

"It's important that you can have tea when you want to, or pick up what you like in a shop," Yoon said. "Otherwise you have to ask people many times... and end up saying 'thank you' many times and exhausting yourself."

Another invention from Saitama University, still in the development phase, is a robot that is being designed to serve its human masters and predict their wishes by literally reading their faces.

With its female appearance and dressed, manga-cartoon-style, in a French maid uniform, the humanoid on display at the fair can move about with the help of a Segway people mover hidden under its full-length skirt.

Sensors and software inside the machine are designed to detect when a human being is looking at it and to respond by, for example, bringing the person a drink, said Kobayashi, adding that the robot-maid could one day work in homes for the elderly and restaurants.

Learning about the complexities of human behaviour will be a challenge, said Kobayashi, adding: "Humanoids need to learn how they should behave so as not to leave one particular person in a group unattended."

Japanese researchers have come up with other "welfare robot" devices in recent years, including the "YURINA", made by Japan Logic Machine, a wheeled robot that can lift bed-ridden people with its arms.

After receiving instructions via a touch panel embedded in its head, the robot can lift a patient's legs to change their diapers, or place the patient in a hammock to carry them safely to a bath.

"Elderly care requires a lot of heavy physical work," said marketing official Yoshitaka Takata. "This robot can be a lot of help."

Japanese people are living longer than ever, with the average life expectancy now a world-record 86.44 years for women and 79.59 years for men, the health and welfare ministry said Monday.

Deep-Sea Robots to Hunt Minerals


* Deep-sea mining robots will explore the ocean floor for rare minerals.
* Experts believe that as some minerals become scarcer worldwide, exploiting hard-to-reach underwater deposits will become feasible.

Resource-poor Japan plans to use deep-sea mining robots to exploit rare earths and precious metals on the ocean floors around the island nation within a decade, according to a media report.

The state-backed Japan Oil, Gas and Metals National Corp (JOGMEC) plans to deploy the remote-controlled robots at depths of up to 2,000 meters (6,600 feet), the Yomiuri Shimbun said without naming sources.

Experts believe that as some minerals become scarcer worldwide, exploiting hard-to-reach underwater deposits and pumping them up to mother ships will become feasible, despite the huge challenges, the daily said.

Japan and its Asian high-tech rivals are scrambling to secure rare earths and other minerals needed for products from fuel-efficient hybrid cars and batteries to cellphones and liquid crystal display televisions.

The JOGMEC project will focus on seabed volcanoes, where so-called hydrothermal vents belch out minerals, near the Izu and Ogasawara island chain, south of Tokyo, and the southwestern Okinawa islands, the report said.

Japanese experts believe the bottom of the ocean could also yield precious metals such as silver and gold and supplies of what they see as a potential next-generation fuel -- methane hydrate, also dubbed "fire ice".

The project would cost about 30 billion yen ($360 million) and is expected to start production in about 10 years, the report said.

Japan, which has long been one of the world's largest importers of industrial materials, is believed to have abundant underwater resources estimated to be worth about 200 trillion yen ($2 trillion), the Yomiuri said.

The project is expected to help Japan secure its own mineral and energy resources amid soaring prices of industrial commodities and tighter restrictions on rare earth exports by dominant producer China.

In 2009 the Japan Agency for Marine-Earth Science and Technology announced plans to send robotic submarines to study areas near seabed volcanoes.

Under that strategy, Japan eyed exploring the seabed within its exclusive economic zone, an area extending 200 nautical miles (370 kilometers) offshore or to the halfway point between Japan and neighboring countries.

Fastest Spinning Dust Found; Solves Cosmic "Fog" Puzzle


Published January 11, 2011

Small specks of dust found in our Milky Way galaxy are the fastest twirlers yet—spinning more than ten billion times a second, astronomers announced today.

Scientists found the tiny grains—each just 10 to 50 atoms wide—using the European Space Agency's recently launched Planck spacecraft. The find helps solve the mystery of a diffuse microwave "fog" in our galaxy that has puzzled astronomers for decades.

The odd radiation has long been associated with dense, dusty clouds between stars, but its exact source was unclear.

According to the Planck team, the new data suggest that some dust particles in these interstellar clouds are constantly being struck by fast-moving atoms and ultraviolet photons.

The nonstop bombardments set the grains spinning, and their ultrafast rotation causes the grains to glow at much higher microwave frequencies than dust found elsewhere in the universe.
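The arithmetic behind that glow is pleasingly direct: a grain carrying a fixed electric dipole radiates at roughly its rotation rate, so the quoted spin speed lands squarely in the microwave band. A quick back-of-the-envelope sketch (standard physics, not Planck team code):

```python
# A spinning grain with a fixed electric dipole emits at roughly its
# rotation frequency, so 1e10 rotations/s means ~10 GHz microwaves.
spin_rate = 1.0e10                       # rotations per second
frequency_hz = spin_rate                 # dipole emission ~ spin rate
wavelength_cm = 3.0e10 / frequency_hz    # c (cm/s) / frequency
print(f"{frequency_hz / 1e9:.0f} GHz, ~{wavelength_cm:.0f} cm waves")
```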

Understanding the different behaviors of space dust could help astronomers figure out exactly how stars and planets begin to take shape, said Planck team member Peter Martin.

"Most of the heavier elements that eventually go into building planets—and even you and me—spent most of their life in this universe as dust particles," said Martin, a professor of astronomy at the University of Toronto.

"Planck is giving us some of the most detailed surveys of our galaxy's gas and dust structure and distribution, which we think can give us hints to the birthing process of stars and even the way galaxies like ours can form."

Scraping Bugs off the Cosmic Windshield

Finding the source of the microwave fog will ultimately help the Planck team refine its studies of the cosmic microwave background, or CMB, the relic radiation of the big bang, emitted more than 13 billion years ago.

Planck launched in May 2009, and its main mission is to study the CMB. But even though this radiation permeates the universe, the faint glow can be tricky to detect.

Similar wavelengths from sources closer to Earth need to be weeded out for scientists to be sure they're getting an accurate picture of the CMB.

"If we neglect their different emissions, then we get greater errors in our background measurements," said Charles Lawrence, Planck team member and a cosmologist at the Jet Propulsion Laboratory (JPL) in Pasadena, California.

"It's like having different bugs splattered on the windshield of your car and blocking your view outside of things in the distance," Lawrence said.

"We need a clear window on the universe free of any foreground sources of emission so we can accurately measure the background radiation."


Coldest, Biggest Things in the Universe


In addition to the spinning dust, Planck's initial data on non-CMB sources—released today at a meeting of the American Astronomical Society in Seattle, Washington—revealed a menagerie of unusual structures.

For example, Planck found the largest population yet of cold cores, the coldest known objects in the universe.

These clouds of frigid dust and gas inside galaxies have average temperatures of just 7 Kelvin (-447 degrees F, or -266 degrees C). Such cold clouds are hotbeds of star formation, because dense dust helps keep gases cool, allowing the gases to collapse and begin forming stars.

Inside cold cores, stars are at their very earliest stages of formation and are therefore barely detectable as heat sources.

"Each and every one of the over 900 cold cores in the newly released catalog are potentially the sites for the very youngest star formation," said Douglas Scott, a professor at the University of British Columbia in Canada and an investigator for the Planck mission.

"The very cool thing with this announcement is that we have an all-sky survey of these objects, which can now be followed up with other telescopes, where we can get intimate details of stellar birth."

Planck's supersensitivity to microwaves also allowed scientists to discover new examples of the biggest objects in the universe: The craft's initial data revealed nearly 190 galaxy clusters, including some that were previously unknown.

When the CMB travels through hot gas surrounding a galaxy cluster, the gas shifts the radiation's energy level. This shift leaves a distinctive spectral signature known as the Sunyaev-Zeldovich effect.
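For the technically inclined, the thermal version of that effect has a compact standard form: the fractional temperature shift is dT/T = y(x coth(x/2) - 4), with x = h*nu/(k*T_cmb) and y the Compton parameter measuring the hot gas along the line of sight. The sketch below uses y = 1e-4, a typical rich-cluster value assumed for illustration, not a Planck measurement:

```python
# Standard thermal Sunyaev-Zeldovich distortion: dT/T = y*(x*coth(x/2) - 4),
# with x = h*nu/(k*T_cmb). y = 1e-4 is a typical cluster value, assumed.
import math

h, k, T_cmb = 6.626e-34, 1.381e-23, 2.725   # SI units
y = 1e-4

def dT_microkelvin(nu_ghz):
    x = h * nu_ghz * 1e9 / (k * T_cmb)
    return y * (x / math.tanh(x / 2) - 4) * T_cmb * 1e6

for nu in (100, 150, 217, 353):
    print(f"{nu} GHz: {dT_microkelvin(nu):+6.0f} uK")
# Clusters appear as cold spots below ~217 GHz and hot spots above it --
# the distinctive spectral signature Planck hunts for.
```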

With Planck able to track this effect, hopes are high that the telescope will discover thousands more galaxy clusters.

"We are talking about the biggest gravitationally bound objects in the universe," JPL's Lawrence said. "So by studying them we learn about how galaxies and structures on the large scale in the universe get together."

Cahokia


At its peak in the mid-12th century, Cahokia was by far the largest native community in North America. Smaller settlements shared similar features, including platform mounds, large plazas, and protective stockades.

Why Earthlings Obsess Over Alien Penetration


What do aliens want? Science fiction and UFO abduction stories kind of make you wonder why aliens bother coming here at all. They descend upon Earth for some nominal reason—enslavement of the masses, a study of our physiology, a quick natural resource grab, whatever—and in the end the humans whomp them. They return to the stars, caudal appendage tucked between their legs, with a begrudging acknowledgment that those bald apes may be primitive, but their indomitable humanity makes them impossible to overcome.

Preposterous! In real life, humans are sucky and weak. If metropolis-sized spacecraft actually plopped out of the skies, we’d toss over the keys to the planet in a nanosecond and surf away on waves of our own fear-pee. But we keep envisioning it going some other way. Meanwhile, we depict aliens doing really weird stuff. Slurping a sleeping lady onto a flying saucer? Tractor-beaming some hapless loser into and out of a spaceship orifice over and over again? Sending a giant alien downtown to smoosh people into jelly smears like hamsters under a hot lady’s stiletto heel? Probing?

That’s right: We basically think aliens will want to have sex with us, and we expect all sorts of penetration—some of it pleasant, a lot of it nonconsensual, a bit of it merely kinky. Once your body is snatched, the real “alien invasion” begins.

I know what you’re thinking: What about the nice aliens? Your naïveté is adorable. And dangerous. Look at Starman: “Hey, Karen Allen! I look like your dead husband. Wanna do it? Whoa, look at the time! I gotta go, sweetie! This planet is … uh … too hot … yeah, I need to get to a colder climate or something. Here’s a floating Ben Wa ball. And a baby.” In Cocoon, a lady made out of light rapes Steve Guttenberg’s soul in a pool. And is it that crazy to think that a movie called Close Encounters is really a space-age Red Shoe Diary? Just imagine John Williams’ five-tone riff on a buttery saxophone. The possibilities of E.T.’s erect, luminous finger are too numerous and disturbing to list. One word, though: “Oooooouuuuuuuch.”

This vision is truly irrational. It is almost certainly a product of our fear of the unknown. I mean, that’s a long way to travel to have sex with a different species. How far would you go to hook up with a zebra?

Unless … One thing could explain all this alien misbehavior: intergalactic fetish porn. Earth is the San Fernando Valley of the Milky Way. We’re just bestiality to horny higher life-forms. Think about it! How else could they fund all those invasions? It would take a multibillion-credit industry to make the prohibitive cost of interstellar travel worth it.

So next time you’re puzzling your way through the hot human-on-alien action of a movie like Skyline or Battle: Los Angeles, ask yourself: Why do those people make alien-fighting so complicated? If the extraterrestrial entertainment industry is anything like ours, we humans shouldn’t have to worry about uploading viruses into the alien defense grid, or sending a hate-filled Mel Gibson after the Visitors with a bucket of water. Fellow earthlings: Don’t play their game. Should a hairless being with no mouth and giant eyes drag you onto some two-bit star cruiser and start pulling out tools, simply refuse to sign a video release. All my Hollywood experience tells me that this rarely used legal loophole is the only way you will be returned to your car in the desert unprobed. You might never win a Galactic Adult Video News Award, but you can sleep easy knowing that nowhere out past the final frontier is a Rigelian cephalopod stimulating its sex-lump to your undignified violation on the space-Internet.

Audi’s Electrifying R8 e-Tron



Gorgeous is the only way to describe Audi’s awesome R8 e-Tron electric supercar, and we don’t need much excuse to show it off. Especially when it’s wrapped in matte black aluminum and carbon fiber with subtle yellow highlights like this beauty Audi showed off at its tastefully Kubrickian booth at CES.

We got a tour of the outside, but the Audions wouldn’t let us film the inside. Trust us though: it’s gorgeous. And this car’s got some beast to go with the beauty: Audi claims it’ll do zero to 100 km/h in 4.8 seconds, put down more than 300 horsepower and get around 150 miles per charge. It’s a quattro, too, with one motor for each wheel.
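A quick sanity check on that sprint figure (our arithmetic, not Audi's):

```python
# Average acceleration implied by 0-100 km/h in 4.8 seconds.
v = 100 / 3.6        # 100 km/h in m/s
a = v / 4.8          # mean acceleration over the sprint
print(f"{a:.1f} m/s^2, about {a / 9.81:.2f} g")   # ~5.8 m/s^2, ~0.59 g
```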

Audi plans to build a few hundred R8 e-Trons and sell them to select (well-heeled) buyers in 2012. So far, the only action it’s seen is on the track at Le Mans last May and some covert test spins.

But the car is pretty much finalized except for one detail: Like all electrics, it’s practically silent, and the engineers have yet to decide on an “appropriate noise profile.” We recommended installing some speakers behind the wheels and blasting Rammstein on a loop.

Biggest-Ever Night Sky Image Released to Public


SEATTLE — The Sloan Digital Sky Survey collaboration released the largest-ever digital color image of the sky today.

The new image consists of 1.2 trillion pixels and covers a third of the night sky, capturing half a billion individual stars and galaxies. Every yellow dot in the image (above) is a galaxy, and zooming in on each dot reveals a galaxy’s detailed structure and individual star-forming regions.
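Those two numbers pin down the image's pixel scale, and a quick back-of-the-envelope check (ours, not the collaboration's) lands right on the SDSS camera's known 0.396-arcsecond pixels:

```python
# Pixel scale implied by 1.2 trillion pixels over a third of the sky
# (the full sky is about 41,253 square degrees).
import math

pixels = 1.2e12
area_deg2 = 41_253 / 3
pixel_deg = 1 / math.sqrt(pixels / area_deg2)   # assumes square pixels
print(f"{pixel_deg * 3600:.2f} arcsec per pixel")   # ~0.39 arcsec
```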

“We have that sort of detail for this entire area,” said astronomer Michael Blanton of New York University at a press conference here at the American Astronomical Society meeting. “It’s not just really big, it’s also really useful.”

The survey has been taking multicolored images of the sky with a single 2.5-meter telescope in New Mexico since 1998, releasing the photos to the public almost immediately. Its sharp shots of millions of galaxies laid the foundations for projects such as Galaxy Zoo, Google Sky and World Wide Telescope.

The first two iterations of the survey, called SDSS-I and SDSS-II, covered part of the sky called the northern galactic cap (bottom right in the image above). SDSS-III ran from July 2008 to December 2009 and completed the survey by covering the entire southern galactic cap (bottom left).

“What makes this a really special moment is that this release really completes the mission of the SDSS camera that’s been going on for 11 years,” Blanton said. The camera that took images in the part of the electromagnetic spectrum visible to human eyes stopped running in December 2009.

But SDSS will continue to take data in other wavelengths for several years. The new publicly available data dump (which is about 30 terabytes in size) includes spectra of millions of galaxies up to 7 billion light-years away, which will help astronomers understand the history of the universe and the effects of dark energy on cosmic structure. A survey called BOSS, which began in 2009 and will run until 2014, will convert the new 2-D image into the largest 3-D map of the distant universe.
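How does a spectrum turn a flat image into a 3-D map? Redshift: the farther away a galaxy is, the more its spectral lines are shifted. For nearby galaxies the conversion is just Hubble's law, which the sketch below uses with an assumed Hubble constant; BOSS's actual pipeline uses a full cosmological model rather than this shortcut:

```python
# Hubble's-law shortcut for turning a measured redshift into a distance.
# Valid only for z << 1; H0 = 70 km/s/Mpc is assumed for illustration.
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s per Mpc, assumed

def distance_mpc(z):
    return C_KM_S * z / H0

print(f"z = 0.1  ->  ~{distance_mpc(0.1):.0f} Mpc")   # ~430 Mpc
```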

SDSS-III also collected spectra from millions of stars as part of a survey called SEGUE-2, which will help astronomers identify streams of stars that the Milky Way stole from smaller satellite galaxies. Astronomers believe large galaxies formed by cannibalizing smaller galaxies that got too close. The remains of these hapless galaxies show up as streams of stars that still move as a flock through the Milky Way.

“For the parts of the sky where we have SEGUE-2 data, we can make a more complete census of these streams, and get a better idea of how our galaxy grew by seeing the remnants of galaxies that were torn apart by our own galaxy,” said astronomer Connie Rockosi of the University of California, Santa Cruz.

“This is going to be a really unique reference for the next decade or longer,” Blanton said. “It’s a true legacy data set.”

Image: M. Blanton and the SDSS-III