Wednesday, June 30, 2010
The skull of the 12–13 million-year-old sperm whale fossil found off the coast of Peru measures an astounding 10 feet long.
Unlike modern sperm whales, this one had teeth in both jaws and might have eaten like a killer whale.
The 10-foot sperm whale skull fossil is the largest ever found.
Its main food might have been baleen whales.
The massive skull and jaw of a 13-million-year-old sperm whale has been discovered eroding from the windblown sands of a coastal desert of Peru.
The extinct cousin of the modern sperm whale is the first fossil to rival modern sperm whales in size -- although this is a very different beast, say whale evolution experts.
"We could see it from very far," said paleontologist Olivier Lambert of the Muséum National d'Histoire Naturelle in Paris, France, who led the team which found the fossil.
The giant 3-meter (10-foot) skull of what's been dubbed Leviathan melvillei (in honor of the author of "Moby Dick") was found with teeth in its top and bottom jaws up to 36 centimeters (14 inches) long. The discovery is reported in the July 1 issue of the journal Nature.
Living sperm whales have teeth only in their lower jaws and are specialized to feed on giant squid, Lambert explained. They suck down squid like large spaghetti noodles rather than catch the prey with their teeth. The much toothier fossil sperm whale, however, may have eaten more like an outsized orca, or killer whale: chomping great big bites out of its prey.
"These are very unusual attributes," said cetacea evolution expert Ewan Fordyce of the University of Otago in New Zealand. "It's remarkably big. That is unexpected."
Another sign that this ancient whale had a killer bite is the large hole in the skull to accommodate a large jaw muscle.
"This was a hunting predator that took chunks out of prey," said Fordyce.
It most likely fed on baleen whales, Lambert and his colleagues report, and lived in the same waters as the monster-sized shark called Carcharocles megalodon.
To learn more about its eating habits, Fordyce said it would be useful to look at the microscopic wear patterns on the teeth. If the wear lines are horizontal, it probably sucked in prey like today's whales. But if the wear lines are vertical, it would suggest a biter, like the orca.
"Many fossil sperm whales have been found in the past," said Lambert. "Most have been much smaller than modern sperm whales."
There have also been earlier discoveries of isolated fossil teeth from large sperm whales, said Lambert. Those made it clear to researchers that a bigger animal was out there waiting to be found. And now they have found it.
"I think it's a great advance," said Fordyce of the discovery.
The fossil appears to also be a distant relative of today's sperm whales, said Fordyce, rather than a direct ancestor.
There are more than 3,000 known nudibranch species, and scientists estimate there are another 3,000 yet to be discovered. So-called Spanish dancers, like this one off the coast of New South Wales, Australia, boast some distinctions over other nudibranchs: First, they can be enormous, reaching a foot and a half (46 centimeters) long, while most nudibranchs are finger-size. Second, they can swim, a skill most of their cousins lack.
Nudibranchs are hermaphroditic, carrying both male and female reproductive organs. Mating pairs fertilize one another and lay up to two million eggs in coils, ribbons, or tangled clumps, as this purple-painted Hypselodoris is doing.
Generally oblong in shape, nudibranchs can be thick or flattened, long or short, ornately colored or drab to match their surroundings. Some max out at a quarter of an inch (6 millimeters), while others can reach a foot (30 centimeters) long or more.
Nudibranchs' unique lives and body chemistry may harbor breakthroughs that could benefit mankind. Scientists are attempting to derive pharmaceuticals from their chemical armory and get clues to learning and memory from their simple nervous systems.
For the first time, a team of astronomers has succeeded in investigating the earliest phases of the evolutionary history of our home Galaxy, the Milky Way. The scientists, from the Argelander Institute for Astronomy at Bonn University and the Max-Planck Institute for Radioastronomy in Bonn, deduce that the early Galaxy went from smooth to clumpy in just a few hundred million years.
The team publish their results in the journal Monthly Notices of the Royal Astronomical Society.
Led by Professor Dr. Pavel Kroupa, the researchers looked at the spherical groups of stars (globular clusters) that lie in the halo of the Milky Way, outside the more familiar spiral arms where the Sun is found. They each contain hundreds of thousands of stars and are thought to have formed at the same time as the 'proto-Galaxy' that eventually evolved into the Galaxy we see today.
Globular star clusters can be thought of as fossils from the earliest period of the history of the Galaxy and the astronomers found that they left a hint of the conditions under which they formed. The stars of the clusters condensed out of a cloud of molecular gas (relatively cool hydrogen), not all of which was used up in their formation. The residual gas was expelled by the radiation and winds coming from the freshly hatched population of stars.
"Due to this ejection of gas, the globular clusters expanded and thereby lost the stars that formed at their boundaries. This means that the present shape of the clusters was directly influenced by what happened in the early days of their existence," explains Michael Marks, PhD student of Professor Kroupa and lead author on the new paper.
The clusters were also shaped by the forming Milky Way and the Bonn scientists calculated exactly how the proto-Galaxy affected its smaller neighbours. Their results show that the gravitational forces exerted on the star clusters by the proto-Milky Way appear to increase with the metal content of their member stars (in astronomy 'metals' in stars are elements heavier than helium).
"The amount of iron, for example, in a star is therefore an age indicator. The more recently a star cluster was born, the higher the proportion of heavy elements it contains," adds Marks. But since the globular clusters are more or less the same age, these age differences can't be large. In order to explain the variation in the forces exerted on different globular clusters, the structure of the Milky Way had to change rapidly within a short time.
The giant gas cloud from which the Milky Way formed had to evolve from an overall smooth structure into a clumpy object in less than a few hundred million years in order to increase the strength of the forces significantly. This timespan corresponds to the astronomically short duration in which the proto-galaxy-sized gas cloud collapsed under its own gravity. In parallel, the globular clusters formed successively within the collapsing cloud. The material from which the somewhat younger globular clusters formed, and which according to the results of this investigation felt stronger attractive forces, was previously enriched with heavy elements by fast-evolving stars in the older clusters.
Prof. Kroupa summarises their results. "In this picture we can elegantly combine the observational and theoretical results and understand why later forming, more metal-rich clusters experienced stronger force fields. On the back of this work, for the first time we have a detailed insight into the earliest evolutionary history of our Galaxy."
Many of the Milky Way's ancient stars are remnants of other smaller galaxies torn apart by violent galactic collisions around five billion years ago, according to researchers at Durham University.
Scientists at Durham's Institute for Computational Cosmology and their collaborators at the Max Planck Institute for Astrophysics, in Germany, and Groningen University, in Holland, ran huge computer simulations to recreate the beginnings of our galaxy.
The simulations revealed that the ancient stars, found in a stellar halo of debris surrounding the Milky Way, had been ripped from smaller galaxies by the gravity generated by colliding galaxies.
Cosmologists predict that the early Universe was full of small galaxies which led short and violent lives. These galaxies collided with each other leaving behind debris which eventually settled into more familiar looking galaxies like the Milky Way.
The researchers say their finding supports the theory that many of the Milky Way's ancient stars had once belonged to other galaxies instead of being the earliest stars born inside the galaxy when it began to form about 10 billion years ago.
The research, funded in the UK by the STFC, appears in the Monthly Notices of the Royal Astronomical Society.
Lead author Andrew Cooper, from Durham University's Institute for Computational Cosmology, said: "Effectively we became galactic archaeologists, hunting out the likely sites where ancient stars could be scattered around the galaxy.
"Our simulations show how different relics in the galaxy today, like these ancient stars, are related to events in the distant past.
"Like ancient rock strata that reveal the history of Earth, the stellar halo preserves a record of a dramatic primeval period in the life of the Milky Way which ended long before the Sun was born."
The computer simulations started from the Big Bang, around 13 billion years ago, and used the universal laws of physics to simulate the evolution of dark matter and the stars.
These simulations are the most realistic to date, capable of zooming into the very fine detail of the stellar halo structure, including star "streams" -- which are stars being pulled from the smaller galaxies by the gravity of the dark matter.
One in one hundred stars in the Milky Way belongs to the stellar halo, which is much larger than the galaxy's familiar spiral disk. These stars are almost as old as the Universe.
Professor Carlos Frenk, Director of Durham University's Institute for Computational Cosmology, said: "The simulations are a blueprint for galaxy formation.
"They show that vital clues to the early, violent history of the Milky Way lie on our galactic doorstep.
"Our data will help observers decode the trials and tribulations of our galaxy in a similar way to how archaeologists work out how ancient Romans lived from the artefacts they left behind."
The research is part of the Aquarius Project, which uses the largest supercomputer simulations to study the formation of galaxies like the Milky Way.
Aquarius was carried out by the Virgo Consortium, involving scientists from the Max Planck Institute for Astrophysics in Germany, the Institute for Computational Cosmology at Durham University, UK, the University of Victoria in Canada, the University of Groningen in the Netherlands, Caltech in the USA and Trieste in Italy.
Durham's cosmologists will present their work to the public as part of the Royal Society's 350th anniversary 'See Further' exhibition, held at London's Southbank Centre until Sunday, July 4.
The highlight of their 'Cosmic Origins' exhibit is an award winning 3-D movie describing how the Milky Way formed. Visitors to the exhibit can also create their own star streams by colliding galaxies with an interactive 3-D simulation.
To the untrained eye, University of Colorado at Boulder Research Associate Craig Lee's recent discovery of a 10,000-year-old wooden hunting weapon might look like a small branch that blew off a tree in a windstorm.
Nothing could be further from the truth, according to Lee, of CU-Boulder's Institute of Arctic and Alpine Research, who found the atlatl dart, a spear-like hunting weapon, melting out of an ice patch high in the Rocky Mountains close to Yellowstone National Park.
Lee, a specialist in the emerging field of ice patch archaeology, said the dart had been frozen in the ice patch for 10 millennia. Climate change has increased global temperatures and accelerated the melting of permanent ice fields, he said, exposing organic materials that have long been entombed in the ice.
"We didn't realize until the early 2000s that there was a potential to find archaeological materials in association with melting permanent snow and ice in many areas of the globe," Lee said. "We're not talking about massive glaciers, we're talking about the smaller, more kinetically stable snowbanks that you might see if you go to Rocky Mountain National Park."
As glaciers and ice fields continue to melt at an unprecedented rate, increasingly older and significant artifacts -- as well as plant material, animal carcasses and ancient feces -- are being released from the ice that has gripped them for thousands of years, he said.
Over the past decade, Lee has worked with other researchers to develop a geographic information system, or GIS, model to identify glaciers and ice fields in Alaska and elsewhere that are likely to hold artifacts. They pulled together biological and physical data to find ice fields that may have been used by prehistoric hunters to kill animals seeking refuge from heat and insect swarms in the summer months.
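The kind of GIS screening described above can be sketched as a simple scoring pass over candidate sites. The criteria, weights, and sample patches below are illustrative assumptions for the sake of the example, not the actual model Lee's team built:

```python
# Hypothetical sketch of GIS-style ice-patch screening: score candidate
# patches by combining physical and biological layers, then survey the
# highest-scoring sites first. All criteria here are assumptions.

def patch_score(elevation_m, north_facing, near_game_habitat):
    """Return a simple 0-3 suitability score for a candidate ice patch."""
    score = 0
    if elevation_m > 3000:        # high, cold sites keep ice year-round
        score += 1
    if north_facing:              # shaded aspects melt more slowly
        score += 1
    if near_game_habitat:         # animals sought ice to escape heat and insects
        score += 1
    return score

candidates = [
    {"name": "Patch A", "elevation_m": 3400, "north_facing": True,  "near_game_habitat": True},
    {"name": "Patch B", "elevation_m": 2100, "north_facing": False, "near_game_habitat": True},
]
ranked = sorted(
    candidates,
    key=lambda p: patch_score(p["elevation_m"], p["north_facing"], p["near_game_habitat"]),
    reverse=True,
)
print([p["name"] for p in ranked])
```

A real model would replace these booleans with raster layers (elevation, aspect, historical snow cover, game-trail data), but the ranking idea is the same.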
"In these instances, what we're finding as archaeologists is stuff that was lost," Lee said. "Maybe you missed a shot and your weapon disappeared into the snowbank. It's like finding your keys when you drop them in snow. You're not going to find them until spring. Well, the spring hasn't come until these things started melting for the first time, in some instances, in many, many thousands of years."
The dart Lee found was from a birch sapling and still has personal markings on it from the ancient hunter, according to Lee. When it was shot, the 3-foot-long dart had a projectile point on one end, and a cup or dimple on the other end that would have attached to a hook on the atlatl. The hunter used the atlatl, a throwing tool about two feet long, for leverage to achieve greater velocity.
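The leverage the atlatl provides can be made concrete with a little arithmetic: at the same arm rotation rate, the dart's release speed scales with the distance from shoulder to the tip of the throwing tool (v = ωr). The arm length and rotation rate below are rough illustrative assumptions; only the ~2-foot atlatl length comes from the article:

```python
# Why an atlatl helps: release speed is angular speed times lever length.
# omega and arm length are assumed values for illustration.

omega = 15.0   # arm angular speed, rad/s (assumed)
arm = 0.7      # shoulder-to-hand distance, meters (assumed)
atlatl = 0.6   # roughly the two-foot atlatl from the article, meters

v_hand = omega * arm              # bare-handed throw
v_atlatl = omega * (arm + atlatl) # same swing, longer lever

print(f"hand throw: {v_hand:.1f} m/s, with atlatl: {v_atlatl:.1f} m/s")
print(f"speed gain: {(arm + atlatl) / arm:.2f}x")
```

Under these assumptions the same swing nearly doubles the dart's launch speed, which is the leverage effect the hunters exploited.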
Later this summer Lee and CU-Boulder student researchers will travel to Glacier National Park to work with the Salish, Kootenai and Blackfeet tribes and researchers from the University of Wyoming to recover and protect artifacts that may have recently melted out of similar locations.
"We will be conducting an unprecedented collaboration with our Native American partners to develop and implement protocols for culturally appropriate scientific methods to recover and protect artifacts we may discover," he said.
Quick retrieval of any organic artifacts like clothing, wooden tools or weapons is necessary to save them, because once thawed and exposed to the elements they decompose quickly, he said.
An estimated 10 percent of Earth's land surface is covered with perennial snow, glaciers and ice fields, providing plenty of opportunities for exploration, Lee said. However, once organic artifacts melt out of the ice, they could be lost forever.
"Ninety-five percent of the archaeological record that we usually base our interpretations on is comprised of chip stone artifacts, ground stone artifacts, maybe old hearths, which is a fire pit, or rock rings that would have been used to stabilize a house," Lee said. "So we really have to base our understanding about ancient times on these inorganic materials. But ice patches are giving us this window into organic technology that we just don't get in other environments."
The first experimental evidence showing how atmospheric nitrogen can be incorporated into organic macromolecules is being reported by a University of Arizona team.
The finding indicates what organic molecules might be found on Titan, the moon of Saturn that scientists think is a model for the chemistry of pre-life Earth.
Earth and Titan are the only known planetary-sized bodies that have thick, predominantly nitrogen atmospheres, said Hiroshi Imanaka, who conducted the research while a member of UA's chemistry and biochemistry department.
How complex organic molecules become nitrogenated in settings like early Earth or Titan's atmosphere is a big mystery, Imanaka said.
"Titan is so interesting because its nitrogen-dominated atmosphere and organic chemistry might give us a clue to the origin of life on our Earth," said Imanaka, now an assistant research scientist in the UA's Lunar and Planetary Laboratory. "Nitrogen is an essential element of life."
However, not just any nitrogen will do. Nitrogen gas must be converted to a more chemically active form of nitrogen that can drive the reactions that form the basis of biological systems.
Imanaka and Mark Smith converted a nitrogen-methane gas mixture similar to Titan's atmosphere into a collection of nitrogen-containing organic molecules by irradiating the gas with high-energy UV rays. The laboratory set-up was designed to mimic how solar radiation affects Titan's atmosphere.
Most of the nitrogen moved directly into solid compounds, rather than gaseous ones, said Smith, a UA professor and head of chemistry and biochemistry. Previous models predicted the nitrogen would move from gaseous compounds to solid ones in a lengthier stepwise process.
Titan looks orange in color because a smog of organic molecules envelops the moon. The particles in the smog will eventually settle down to the surface and may be exposed to conditions that could create life, said Imanaka, who is also a principal investigator at the SETI Institute in Mountain View, Calif.
However, scientists don't know whether Titan's smog particles contain nitrogen. If some of the particles are the same nitrogen-containing organic molecules the UA team created in the laboratory, conditions conducive to life are more likely, Smith said.
Laboratory observations such as these indicate what the next space missions should look for and what instruments should be developed to help in the search, Smith said.
Imanaka and Smith's paper, "Formation of nitrogenated organic aerosols in the Titan upper atmosphere," is scheduled for publication in the Early Online edition of the Proceedings of the National Academy of Sciences the week of June 28. NASA provided funding for the research.
The UA researchers wanted to simulate conditions in Titan's thin upper atmosphere because results from the Cassini Mission indicated "extreme UV" radiation hitting the atmosphere created complex organic molecules.
Therefore, Imanaka and Smith used the Advanced Light Source, a synchrotron at Lawrence Berkeley National Laboratory in Berkeley, Calif., to shoot high-energy UV light into a stainless steel cylinder containing nitrogen-and-methane gas held at very low pressure.
The researchers used a mass spectrometer to analyze the chemicals that resulted from the radiation.
Simple though it sounds, setting up the experimental equipment is complicated. The UV light itself must pass through a series of vacuum chambers on its way into the gas chamber.
Many researchers want to use the Advanced Light Source, so competition for time on the instrument is fierce. Imanaka and Smith were allocated one or two time slots per year, each of which was for eight hours a day for only five to 10 days.
For each time slot, Imanaka and Smith had to pack all the experimental equipment into a van, drive to Berkeley, set up the delicate equipment and launch into an intense series of experiments. They sometimes worked more than 48 hours straight to get the maximum out of their time on the Advanced Light Source. Completing all the necessary experiments took years.
It was nerve-racking, Imanaka said: "If we miss just one screw, it messes up our beam time."
At the beginning, he only analyzed the gases from the cylinder. But he didn't detect any nitrogen-containing organic compounds.
Imanaka and Smith thought there was something wrong in the experimental set-up, so they tweaked the system. But still no nitrogen.
"It was quite a mystery," said Imanaka, the paper's first author. "Where did the nitrogen go?"
Finally, the two researchers collected the bits of brown gunk that gathered on the cylinder wall and analyzed them with what Imanaka called "the most sophisticated mass spectrometer technique."
Imanaka said, "Then I finally found the nitrogen!"
Imanaka and Smith suspect that such compounds are formed in Titan's upper atmosphere and eventually fall to Titan's surface. Once on the surface, they contribute to an environment that is conducive to the evolution of life.
Tuesday, June 29, 2010
In recent years, there's been some uncertainty as to how we should deal with a nasty-looking asteroid tumbling toward Earth. If we're to believe the movies, we need to throw our nuclear arsenal at the offending space rock. But more recently, there have been some very strong arguments for more subtle asteroid deflection techniques.
Going against recommendations that nuclear explosions not be used to destroy an asteroid on a collision course with Earth, physicist David Dearborn of the Lawrence Livermore National Laboratory has turned the "softly, softly" approach on its head.
Yes, prepare the missile silos again, it's time to detonate a 100 megaton firework.
Dearborn spoke at the semiannual meeting of the American Astronomical Society (AAS) last month after developing several computer models of nuclear detonations on or near different types of asteroids.
His results weren't very surprising: nuclear explosions work and they are a good way to protect the Earth from asteroid doom.
But isn't this result obvious? Doesn't the nuclear missile handbook have a chapter called "Alternative Uses: Blowing Up Doomsday Asteroids"? Bruce Willis taught us this lesson ages ago.
Actually, Dearborn didn't set out to confirm what we were all thinking; he has come up with a strategy for the best use of nuclear weapons, one that might be quite useful should astronomers spot an extinction-level asteroid on our doorstep.
Dearborn has outlined two general guidelines:
1) If the asteroid is small, and we have a few decades to deal with the threat, it would be best to detonate a nuke next to the asteroid to nudge it slightly off course.
2) If the asteroid is big, and we only have a few weeks' notice, detonate a nuke on the asteroid, hopefully ripping it to shreds.
The main advantage of a nuclear explosion is that a massive amount of energy is released rapidly, making the technique highly efficient; a nuke can be used to nudge, knock, slow down, speed up or destroy the space rock.
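A back-of-the-envelope calculation shows why option 1 favors long lead times: even a tiny velocity change, applied decades in advance, accumulates into an enormous miss distance. The numbers below are illustrative assumptions, not mission parameters:

```python
# Toy deflection math: a small along-track velocity change delta-v,
# left to act for many years, shifts the asteroid's position by
# roughly delta-v * time. Values are illustrative only.

SECONDS_PER_YEAR = 3.156e7

def miss_distance_km(delta_v_m_s, lead_time_years):
    """Approximate distance (km) the asteroid drifts off its original track."""
    return delta_v_m_s * lead_time_years * SECONDS_PER_YEAR / 1000.0

# A 1 cm/s nudge with 20 years of lead time:
print(f"{miss_distance_km(0.01, 20):,.0f} km")  # about one Earth radius of margin
```

With only weeks of warning the same nudge buys almost nothing, which is why Dearborn's second guideline switches from nudging to outright destruction.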
Personally, while I agree with Dearborn that nukes are our most powerful defense against asteroids, there are many other factors to consider.
Firstly, what if the resulting explosion rips the asteroid to shreds, but big chunks of asteroid then rain down on Earth, blanket bombing entire continents? Choosing whether to get hit by one big asteroid or a shower of smaller (but still rather big) asteroid chunks isn't a choice I'd want to make.
Secondly, what if the USA fired a missile at a medium-sized asteroid, only to deflect it into China? Wars have started over much less.
Thirdly, we don't have a very good understanding of the structure of asteroids. An asteroid composed of very loose rock held together by mutual gravity (a "rubble pile") will behave very differently from a solid, metallic asteroid when faced with a huge explosion. President Obama's plan to send NASA astronauts to an asteroid to study it up close suddenly seems like a good idea.
Also, a recent study showed that a direct hit by a nuclear weapon might rip the offending asteroid apart, but it could re-form if the bomb wasn't big enough.
So what is it to be? Do we use nuclear weapons? Or do we find more subtle ways to deflect asteroids?
Unfortunately, it is highly unlikely there will be a "live" nuclear test in space, so it is doubtful we'll know the true impact of a nuclear asteroid strike.
But there's one thing for certain -- and many experts agree -- if we have less than a decade to deflect a killer asteroid, I'd be the first in line to press the Big Red Button. Nuclear weaponry might be our only choice.
A centuries-old latex ball made by the Olmec in what is now Mexico (file photo).
Ancient civilizations in much of Mexico and Central America were making different grades of rubber 3,000 years before Charles Goodyear "stabilized" the stuff in the mid-19th century, new research suggests.
The Aztec, Olmec, and Maya of Mesoamerica are known to have made rubber using natural latex—a milky, sap-like fluid found in some plants. Mesoamerica extends roughly from central Mexico to Honduras and Nicaragua.
Ancient rubber makers harvested latex from rubber trees and mixed it with juice from morning glory vines, which contains a chemical that makes the solidified latex less brittle.
By mixing rubber using different proportions of the two ingredients, researchers at the Massachusetts Institute of Technology found that tweaking the formula led to rubber products with different properties.
Some of the rubber came out more bouncy, suggesting it may have been used to make balls for the legendary Mesoamerican ball games.
As described in ancient Maya texts, ball games often had religious meaning—pitting good against evil. It's thought the ball games sometimes ended in human sacrifice, most famously in ritual decapitation.
Other latex-to-morning glory proportions created more durable rubber, such as what might have been used in Aztec rubber sandals, which were described by Spanish conquistadors but have never been found by archaeologists.
A 50-50 blend of morning glory juice and latex created rubber with maximum bounciness, while a 75-25 mix of latex and morning glory made the most durable material.
Latex a "Funky White Liquid"
The initial discovery of rubber from latex and morning glory isn't so far-fetched, noted study co-author Michael Tarkanian, a researcher with MIT's Center for Materials Research in Archaeology and Ethnology.
Morning glory plants tend to grow near rubber trees, and both plants were considered sacred in several Mesoamerican cultures. Morning glory, for example, was also used in religious ceremonies for its hallucinogenic properties.
To make their re-creations as accurate as possible, Tarkanian and colleague Dorothy Hosler harvested latex from rubber trees and juice from morning glory vines growing in Mexico.
The first challenge was getting the ingredients back to the lab. Latex isn't regulated by U.S. Customs, and so there are no official papers for carrying it into the country, Tarkanian explained, "which is a problem when you're bringing this funky white liquid across the border in Nalgene bottles."
Once safely back in the U.S., the researchers met a new hurdle: The latex needed to be warm.
"The process always worked in Mexico, but not in the air-conditioned labs at MIT," Tarkanian said. When the mixture was too cold, the molecules simply didn't bond.
Today most rubber is treated through a process called vulcanization, which cooks liquid latex with sulfur to increase strength and elasticity.
Aztec Captains of Industry?
According to Tarkanian, it's no surprise cultures such as the Aztec were making advanced versions of rubber. Despite their common depiction as primitive, violent people, the Aztec had a spirit of scientific inquiry, as shown by their experiments in metallurgy and other industries, he said.
"Their science, engineering, and development skills would have led them to try different combinations" of substances when making rubber, Tarkanian said.
Once the raw ingredients had been mixed, he added, the rubber took about ten minutes to form and another five minutes to harden, giving rubber workers just a few minutes to shape the final product.
Tarkanian and Hosler mostly created sheets of rubber in the lab, but they did make a few rubber balls during one session.
"At the end of the semester, we played the Mesoamerican ball game. The losing team was beheaded," Tarkanian quipped.
The Mesoamerican-rubber paper will appear in an upcoming issue of the journal Latin American Antiquity.
Space is a big place, and even with their giant telescopes, astronomers just can’t cover it all. This is where you come in. Yes, you.
Astronomy is one of the few scientific fields where amateur scientists can, and frequently do, make significant contributions. But space scientists are now increasingly looking to people with little or no training for help with their research. Sometimes they are looking for free labor for tasks that humans can still do better than computers, like identifying different types of galaxies. Other times it's numbers of eyes on the sky or feet on the ground they're after. But more and more, they are finding ways to get regular citizens involved.
Amateur astronomers and regular folks have already had an impact on the science by making observations of fleeting cosmic phenomena that would have otherwise gone unnoticed.
When an asteroid or a comet hit Jupiter in July 2009 and then again earlier this month, amateur astronomers in Australia and the Philippines were the first to notice. Amateurs have invented new telescopes, kept tabs on variable stars and discovered comets. And you don’t even need any fancy equipment.
“We can learn a lot from someone taking a cellphone video of a meteor as it burns up in the atmosphere,” said Bill Cooke of NASA’s Meteoroid Environment Office.
But what if you’re not the lucky one who is in the right place at the right time? You are still needed. Citizen scientists have also become crucial for helping astronomers with one of their most intractable problems: too much data, too little time.
Here are some astronomy projects you can take part in right now, while you wait for your iPhone to capture a meteor.
Hunt for Meteorites
Last month, NASA tried to recruit meteorite hunters when cameras at NASA's Marshall Space Flight Center recorded the path of a meteor from its home in the asteroid belt to just 23 miles above the Earth's surface. The 60-pound rock is thought to have smashed into the ground near Scottsboro, Alabama, on May 18.
“This is the first time our cameras picked up something we thought produced meteorites on the ground,” Cooke said. “If we find the one in Scottsboro, we know exactly where it came from.”
Knowing both the path the meteorite took and what it’s made of would give scientists a complete picture of the rock’s life, and they were anxious to find it. But after two days of searching, NASA’s meteorite basket came up empty.
So Cooke called on the masses. NASA issued a press release on May 20 asking anyone who found a funny rock near Scottsboro to call them.
“People in the public contribute a lot to meteor science,” Cooke said. “My hope was that it landed on somebody’s farm, and they thought, ‘Where the heck did that rock come from?’”
So far nobody has found the rock Cooke is after.
A sharp view of the starry sky is difficult, because the atmosphere constantly distorts the image. TU/e researcher Roger Hamelinck developed a new type of telescope mirror, which quickly corrects the image. His prototypes are needed for future large telescopes, but also give old telescopes a sharper view.
The atmosphere contains 'bubbles' of hot and cold air, each with its own refractive index, which distort the image. As a result, the light reaching ground-based telescopes is distorted. Hamelinck's system tackles this problem with a deformable mirror in the telescope. Under this ultrathin mirror there are actuators, which can quickly create bumps and dimples in the mirror wherever necessary. These bumps and dimples correct the continuously changing distortion created by the atmosphere. This is of crucial importance to the new generation of large telescopes in particular. Hamelinck: "In principle, larger telescopes also have a higher resolution, but attaining optimal optical quality is hampered by the atmosphere. Therefore you absolutely need these corrections."
The principle of the 'adaptive deformable mirror' has been known for some fifty-odd years, but was long limited by the available technology. The actuators of earlier systems, for instance, generated a great deal of heat, which made the systems themselves a source of distortion. "Contrary to the old systems, this new system has an ultrathin mirror, so that very little power is needed for its deformation," Hamelinck explains. "In combination with the efficient, electromagnetic reluctance actuators, this reduces the heat generation of the system to a very low level. Thanks to this, no active cooling is required." Hamelinck's working prototype has a five-centimeter diameter. Given that the design is scalable and expandable with modules, the system is suited to very large telescopes, such as the future 42-meter E-ELT (European Extremely Large Telescope). The E-ELT is to be fitted, among other instruments, with an adaptive mirror of 2.4 meters.
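The adaptive-optics idea behind such a mirror can be sketched as a feedback loop: measure the wavefront error, then push the actuators against it so the residual distortion shrinks. A real system uses a wavefront sensor and hundreds of actuators; the four-element arrays and the gain value below are illustrative assumptions:

```python
# Toy adaptive-optics loop: each iteration commands the actuators to
# cancel a fraction of the currently measured residual distortion.

def correct(residual, mirror_shape, gain=0.5):
    """One control step: deform the mirror against the measured residual."""
    return [m - gain * r for m, r in zip(mirror_shape, residual)]

error = [0.8, -0.3, 0.5, -0.6]   # fixed distortion from atmospheric 'bubbles'
mirror = [0.0, 0.0, 0.0, 0.0]    # mirror starts flat

for _ in range(10):
    residual = [e + m for e, m in zip(error, mirror)]  # what the sensor sees
    mirror = correct(residual, mirror)

# After a few iterations the residual distortion is driven near zero:
print([round(e + m, 4) for e, m in zip(error, mirror)])
```

With a gain of 0.5 the residual halves on every pass, so ten iterations shrink it by a factor of about a thousand; real atmospheric distortion changes continuously, which is why the loop must run fast.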
Research institute TNO is so enthusiastic about Hamelinck's work that the institute is going to bring it to market -- not only for new telescopes, but also for existing ones. "It can be built into any telescope in the world," says Ben Braam, business developer Space & Science at TNO. "When you turn on the system, the image is suddenly enhanced. As if the telescope has finally put on spectacles." Affordable spectacles, in Braam's opinion: "I'm thinking in terms of fifty to one hundred thousand euros. That is relatively cheap for that world."
Admittedly, the system does not correct for everything. Clouds continue to be a problem, for example. Consequently the best places for telescopes are still locations where one can enjoy a clear, cloudless sky most of the time. That would exclude the Netherlands, then.
University of Michigan aquatic ecologist Donald Scavia and his colleagues say this year's Gulf of Mexico "dead zone" is expected to be larger than average, continuing a decades-long trend that threatens the health of a $659 million fishery.
The 2010 forecast, released by the U.S. National Oceanic and Atmospheric Administration (NOAA), calls for a Gulf dead zone of between 6,500 and 7,800 square miles, an area roughly the size of Lake Ontario.
The most likely scenario, according to Scavia, is a Gulf dead zone of 6,564 square miles, which would make it the Gulf's 10th-largest oxygen-starved, or hypoxic, region on record. The average size over the past five years was about 6,000 square miles.
It is unclear what impact, if any, the Deepwater Horizon oil spill will have on the size of this year's Gulf dead zone because numerous factors are at work, the researchers say.
"We're not certain how this will play out. But one fact is clear: The combination of summer hypoxia and toxic-oil impacts on mortality, spawning and recruitment is a one-two punch that could seriously diminish valuable Gulf commercial and recreational fisheries," said Scavia, Special Counsel to the U-M President for Sustainability, director of the Graham Sustainability Institute, and a professor at the School of Natural Resources and Environment.
Farmland runoff containing fertilizers and livestock waste -- some of it from as far away as the Corn Belt -- is the main source of the nitrogen and phosphorus that cause the annual Gulf of Mexico hypoxia zone. Each year in late spring and summer, these nutrients flow down the Mississippi River and into the Gulf, fueling explosive algae blooms there.
When the algae die and sink, bottom-dwelling bacteria decompose the organic matter, consuming oxygen in the process. The result is an oxygen-starved region in bottom and near-bottom waters: the dead zone.
This year, the situation is complicated by uncertainties related to the Gulf oil spill.
If sufficient oil reaches the area typically subject to summer hypoxia, the size of this summer's Gulf dead zone could increase for two reasons: microbial breakdown of oil -- which consumes oxygen -- and the oil's potential to reduce diffusion of oxygen from the air into the water, the process that normally replenishes oxygen levels in the water column, Scavia said.
On the other hand, the presence of oil could restrict the growth of hypoxia-fueling algae, helping to limit the size of the Gulf dead zone.
The five largest Gulf dead zones on record have occurred since 2001. The biggest occurred in 2002 and measured 8,484 square miles.
"The growth of these dead zones is an ecological time bomb. Without determined local, regional and national efforts to control them, we are putting major fisheries at risk," Scavia said.
The computer models that generate hypoxia forecasts have been used to determine the nutrient-reduction targets required to shrink the size of the Gulf dead zone. The models rely on U.S. Geological Survey estimates of the amount of nitrogen feeding into the Gulf from the Mississippi and Atchafalaya rivers.
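At their core, such forecast models map a spring nutrient load to an expected hypoxic area. The sketch below uses a simple linear relationship with made-up coefficients (chosen only so the output lands near the article's most-likely figure of 6,564 square miles); the operational forecasts are calibrated statistical models maintained by the research team, not this formula.

```python
# Hypothetical illustration only: the coefficients below are invented
# for demonstration and are not from the NOAA/USGS models.
def dead_zone_area(nitrate_load_kt, intercept=1000.0, slope=47.0):
    """Predict hypoxic area (square miles) from May nitrate load
    in thousand metric tons."""
    return intercept + slope * nitrate_load_kt

# The USGS estimated ~118,000 metric tons of May nitrate in 2010
print(round(dead_zone_area(118.0)))  # 6546 square miles
```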
Hypoxia is of particular concern because it threatens valuable commercial and recreational Gulf fisheries. In 2008, the dockside value of commercial fisheries was $659 million. The 24 million fishing trips taken in 2008 by more than 3 million recreational fishers further contributed well over $1 billion to the Gulf economy, according to NOAA.
"As with weather forecasts, this prediction uses multiple models to predict the range of the expected size of the dead zone," said Robert Magnien, director of NOAA's Center for Sponsored Coastal Ocean Research. "The strong track record of these models reinforces our confidence in the link between excess nutrients from the Mississippi River and the dead zone."
The 2010 spring nutrient load transported to the northern Gulf of Mexico is about 11 percent less than the average over the last 30 years, said Matt Larsen, USGS associate director for water.
"An estimated 118,000 metric tons of nitrogen, in the form of nitrate, were transported in May 2010 to the northern Gulf," Larsen said.
The Gulf hypoxia research team is supported by NOAA's Center for Sponsored Coastal Ocean Research and includes scientists from the University of Michigan, Louisiana State University and the Louisiana Universities Marine Consortium.
The official size of the 2010 Gulf hypoxic zone will be announced following a NOAA-supported monitoring survey led by the Louisiana Universities Marine Consortium, July 24 through Aug. 2.
As you go about your day-to-day activities, tiny bubbles of nitrogen come and go inside your tissues. This is not a problem unless you happen to experience large changes in ambient pressure, such as those encountered by scuba divers and astronauts. During large, fast pressure drops, these bubbles can grow and lead to decompression sickness, popularly known as "the bends."
A study in the Journal of Chemical Physics, which is published by the American Institute of Physics (AIP), may provide a physical basis for the existence of these bubbles, and could be useful in understanding decompression sickness.
A physiological model that accounts for these bubbles is needed both to protect against and to treat decompression sickness. There is a problem though. "These bubbles should not exist," says author Saul Goldman of the University of Guelph in Ontario, Canada.
Because they are believed to be composed mostly of nitrogen, while the surrounding atmosphere consists of both nitrogen and oxygen, the pressure of the bubbles should be less than that of the surrounding atmosphere. But if this were so, they would collapse.
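The instability argument can be made concrete with the Young-Laplace relation, which says the gas pressure inside a small spherical bubble exceeds the surrounding pressure by 2γ/r, where γ is the surface tension and r the radius. The quick calculation below (the values are illustrative, not from the paper) shows why a free micron-sized bubble in a simple liquid is not expected to persist: its internal pressure far exceeds ambient, driving its gas back into solution.

```python
# Young-Laplace: gas pressure inside a spherical bubble exceeds the
# surrounding liquid pressure by 2*gamma/r (gamma = surface tension).
def internal_pressure(p_ambient_pa, gamma_n_per_m, radius_m):
    return p_ambient_pa + 2.0 * gamma_n_per_m / radius_m

# A 1-micron bubble under atmospheric pressure, with water-like
# surface tension (~0.07 N/m); illustrative values only.
p = internal_pressure(101_325.0, 0.07, 1e-6)
print(f"{p:.0f} Pa")  # 241325 Pa, more than double atmospheric
```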
"We need to account for their apparent continuous existence in tissues in spite of this putative pressure imbalance," says Goldman.
If, as is widely believed, decompression sickness is the result of the growth of pre-existing gas bubbles in tissues, those bubbles must be sufficiently stable to have non-negligible half-lives. The proposed explanation involves modeling body tissues as soft elastic materials that have some degree of rigidity. Previous models have focused on bubble formation in simple liquids, which differ from elastic materials in having no rigidity.
Using the soft-elastic tissue model, Goldman finds pockets of reduced pressure in which nitrogen bubbles can form and have enough stability to account for a continuous presence of tiny bubbles that can expand when the ambient pressure drops. Tribonucleation, the formation of new gas bubbles when submerged surfaces separate rapidly, provides the physical mechanism: the rapid separation of adhering surfaces produces momentary negative pressures at the plane of separation. So while these tiny bubbles in elastic media are metastable and do not last indefinitely, they are replaced periodically. According to this picture, tribonucleation is the source, and finite half-lives the sink, for the continuous generation and loss of small gas bubbles in tissues.
A Purdue University researcher has found a sort of fountain of youth for tomatoes that extends their shelf life by about a week.
Avtar Handa, a professor of horticulture, found that adding a yeast gene increases production of a compound that slows aging and delays microbial decay in tomatoes. Handa said the results, published in the early online version of The Plant Journal, likely would transfer to most fruits.
"We can inhibit the aging of plants and extend the shelf life of fruits by an additional week for tomatoes," Handa said. "This is basic fundamental knowledge that can be applied to other fruits."
The organic compound spermidine is a polyamine and is found in all living cells. Polyamines' functions aren't yet fully understood. Handa and Autar Mattoo, a research plant physiologist with the U.S. Department of Agriculture's Agricultural Research Service and collaborator in the research, had shown earlier that polyamines such as spermidine and spermine enhance nutritional and processing quality of tomato fruits.
"At least a few hundred genes are influenced by polyamines, maybe more," Mattoo said. "We see that spermidine is important in reducing aging. It will be interesting to discover what other roles it can have."
Savithri Nambeesan, who was a graduate student in Handa's laboratory, introduced the yeast spermidine synthase gene, which led to increased production of spermidine in the tomatoes. Fully ripe tomatoes from those plants lasted about eight days longer before showing signs of shriveling compared with non-transgenic plants. Decay and rot symptoms associated with fungi were delayed by about three days.
"It increased the quality of the fruit," Handa said. "If a tomato goes to market, people won't buy it if it has started to shrivel. If we can stop that wrinkling, we can extend the market time of the fruit."
Mattoo said the finding could have implications for areas that don't often get fresh fruit.
"Shelf life is a major problem for any produce in the world, especially in countries such as in Southeast Asia and Africa that cannot afford controlled-environment storage," Mattoo said.
Handa said tomato growers and possibly other fruit growers could use the finding soon if they wanted through either transgenic plants or natural breeding methods.
"We can add this gene to the tomatoes or look at natural variation and select the cultivars that already have a high level of this gene's expression," Handa said.
Handa and Mattoo will continue to study polyamines to discover how they control biological functions in fruits.
The US-Israel Binational Agricultural Research and Development Fund, the USDA Initiative for Future Agricultural Food Systems, and the Purdue Research Foundation funded the research.
viernes, 25 de junio de 2010
Currently, more than two billion LCD screens are nearing the end of their lives. Chances are that you have a few yourself, but if you're like most Americans, you probably won't recycle them.
Electronic waste is a serious problem. Toxic chemicals, including lead, cadmium and mercury, pose an environmental hazard to soil, should they leach into it. And hazardous materials, such as arsenic and acid, are used by people in developing countries to extract valuable metals from circuit boards and wires, which they sell for income. It's a good thing researchers are finding new ways to use old LCD screens.
In many research labs, scientists are trying to find uses for e-waste, such as turning the materials into Olympic medals and using old components to turn algae into a biofuel.
And now researchers at the University of York's Department of Chemistry have found a way to turn electronic waste from LCD screens into an anti-microbial substance that destroys infection-causing bacteria such as Escherichia coli, some strains of Staphylococcus aureus and other unpronounceable, yet dangerous, types of bacteria.
Polyvinyl alcohol (PVA) is a key component of LCD televisions. It's also a chemical compound that is compatible with the human body.
Andrew Hunt and his colleagues had to cool and then heat PVA, dehydrate it with ethanol, and add a dash of silver nanoparticles to enhance the material's anti-microbial properties. The final product could be used in hospital cleaning solutions to help reduce infections. According to the University of York press release, the product "could also be used in pills and dressings that are designed to deliver drugs to particular parts of the body."
Hunt and his team confess that more work needs to be done. Regulatory agencies still must guarantee that silver nanoparticles are suitable for human health applications.
But since LCD screens are the fastest growing source of electronic waste in the European Union, it's good to know that people are working on ways to diminish the potential hazards.
We will have more reporting on electronic waste when we launch a new Wide Angle on June 28. Get excited!
Rapid warming 40,000 years ago led to an increase in atmospheric methane concentration, and the culprit has now been identified: a research team from the University of Bern and the German Alfred Wegener Institute for Polar and Marine Research in the Helmholtz Association discovered that the increase was caused mainly by wetlands in high northern latitudes. This result refutes an alternative theory discussed among experts, the so-called "clathrate gun hypothesis," which assumed that large amounts of methane were released from ocean sediments, raising atmospheric methane concentrations and thus driving rapid climate warming.
Earlier measurements on ice cores showed that the atmospheric methane concentration changed drastically in parallel with the rapid climate changes that occurred during the last ice age. Those climate changes – so-called Dansgaard-Oeschger events – were characterised by sudden warming and an increase in methane concentration. However, it was not yet clear to what extent the climate changes 40,000 years ago led to the methane increase, or vice versa. Climate researchers from the Universities of Bern and Copenhagen and from the Alfred Wegener Institute for Polar and Marine Research in Bremerhaven now conclude that the methane increase at that time was largely due to higher methane emissions from wetlands. As the researchers report in the current issue of the journal Science, these natural methane sources produced more methane, especially in high northern latitudes, in response to the warming. Through their study the researchers also refute another controversial hypothesis, which claimed that large amounts of methane stored as clathrate in the ocean sediment along the continental margins were released and triggered the rapid warming.
The scientists stress, however, that the climate conditions 40,000 years ago are not comparable to the current climate evolution. "Our results do not imply that methane or other greenhouse gases play no role for climate change. Our study reflects natural climate conditions during the last ice age, long before mankind affected global climate by emitting greenhouse gases. At that time climate warming caused an increase in methane concentration, generating in turn a more substantial greenhouse effect. Nowadays, additional methane and carbon dioxide are artificially emitted into the atmosphere by human activities and are the main driver of the observed climate warming."
Ongoing studies of the Alfred Wegener Institute in Arctic permafrost regions take on greater importance in view of these research results.
Novel analytical method: Clear isotopic "fingerprints"
In nature, a few methane molecules (CH4) contain a carbon or hydrogen atom with one extra neutron and are therefore slightly heavier. Methane from wetland sources has fewer molecules with the heavier hydrogen isotope than methane produced in the ocean. Accordingly, the marine and terrestrial methane sources have unique "isotopic fingerprints". Using these fingerprints, it is possible to quantify the emissions of both sources. A novel analytical method for taking these "fingerprints", developed at the University of Bern and the Alfred Wegener Institute, allowed the international team of scientists to come up with the unambiguous results now published in Science.
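The quantification step is essentially a two-endmember mass balance: the measured isotopic signature of atmospheric methane is a mixture of the wetland and marine fingerprints, and solving the mixing equation yields each source's share. The δD values below are invented for illustration; the paper's actual endmember values differ.

```python
def wetland_fraction(delta_obs, delta_wetland, delta_marine):
    """Two-endmember isotope mass balance:
    delta_obs = f * delta_wetland + (1 - f) * delta_marine,
    solved for the wetland fraction f."""
    return (delta_obs - delta_marine) / (delta_wetland - delta_marine)

# Illustrative deltaD values in per mil (not the paper's numbers)
f = wetland_fraction(delta_obs=-290.0, delta_wetland=-320.0,
                     delta_marine=-190.0)
print(round(f, 2))  # 0.77: wetlands dominate in this toy example
```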
The European Space Agency's Venus Express is helping planetary scientists investigate whether Venus once had oceans. If it did, it may even have begun its existence as a habitable planet similar to Earth.
These days, Earth and Venus seem completely different. Earth is a lush, clement world teeming with life, whilst Venus is hellish, its surface roasting at temperatures higher than those of a kitchen oven.
But underneath it all the two planets share a number of striking similarities. They are nearly identical in size and now, thanks to ESA's Venus Express orbiter, planetary scientists are seeing other similarities too.
"The basic composition of Venus and Earth is very similar," says Håkan Svedhem, ESA Venus Express Project Scientist. Just how similar is what planetary scientists from around the world will be discussing this week at a conference in Aussois, France.
One difference stands out: Venus has very little water. Were the contents of Earth's oceans to be spread evenly across the world, they would create a layer 3 km deep. If you were to condense the amount of water vapour in Venus' atmosphere onto its surface, it would create a global puddle just 3 cm deep.
Yet there is another similarity here. Billions of years ago, Venus probably had much more water. Venus Express has certainly confirmed that the planet has lost a large quantity of water into space.
It happens because ultraviolet radiation from the Sun streams into Venus' atmosphere and breaks up the water molecules into atoms: two hydrogens and one oxygen. These then escape to space.
Venus Express has measured the rate of this escape and confirmed that roughly twice as much hydrogen is escaping as oxygen, so water is believed to be the source of these escaping ions. It has also shown that a heavy form of hydrogen, called deuterium, is progressively enriched in the upper reaches of Venus's atmosphere, because heavier hydrogen finds it harder to escape the planet's grip.
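The inference from the 2:1 escape ratio is simple stoichiometry: each H2O molecule supplies two hydrogen atoms for every oxygen atom, so if water is the source, hydrogen should leave at roughly twice the oxygen rate. A minimal sketch of that consistency check (the rates and tolerance here are hypothetical, not Venus Express measurements):

```python
# If escaping atoms originate from H2O, hydrogen should escape at
# roughly twice the oxygen rate (two H atoms per O atom in water).
def consistent_with_water(h_escape_rate, o_escape_rate, rel_tol=0.2):
    ratio = h_escape_rate / o_escape_rate
    return abs(ratio - 2.0) <= rel_tol * 2.0

# Hypothetical measured rates (atoms per second, arbitrary scale)
print(consistent_with_water(1.9e25, 1.0e25))  # True: ratio ~1.9
print(consistent_with_water(1.0e25, 1.0e25))  # False: ratio 1.0
```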
"Everything points to there being large amounts of water on Venus in the past," says Colin Wilson, Oxford University, UK. But that does not necessarily mean there were oceans on the planet's surface.
Eric Chassefière, Université Paris-Sud, France, has developed a computer model that suggests the water was largely atmospheric and existed only during the very earliest times, when the surface of the planet was completely molten. As the water molecules were broken into atoms by sunlight and escaped into space, the subsequent drop in temperature probably triggered the solidification of the surface. In other words: no oceans.
Although it is difficult to test this hypothesis it is a key question. If Venus ever did possess surface water, the planet may possibly have had an early habitable phase.
Even if true, Chassefière's model does not preclude the possibility that colliding comets brought additional water to Venus after the surface crystallised, and that these impacts created bodies of standing water in which life may have been able to form.
There are many open questions. "Much more extensive modelling of the magma ocean-atmosphere system and of its evolution is required to better understand the evolution of the young Venus," says Chassefière.
When creating those computer models, the data provided by Venus Express will prove crucial.
A research team led by Brown University has documented dozens of channels carved by melted water from glaciers located in the midlatitude region of Mars. The glaciofluvial valleys were carved in Mars' most recent epoch, the team reports, supporting the idea that the Red Planet was home to diverse watery environments in its recent past. Results are published in Icarus.
Planetary scientists have uncovered telltale signs of water on Mars -- frozen and liquid -- in the earliest period of the Red Planet's history. A new claim, made public this month, is that a deep ocean covered some of the northern latitudes.
But the evidence for water grows much more scant after the Noachian era, which ended 3.5 billion years ago. Now Brown University planetary geologists have documented running water that sprang from glaciers throughout the Martian middle latitudes as recently as the Amazonian epoch, several hundred million years ago. These glaciofluvial valleys were, in essence, tributaries of water created when enough sunlight reached the glaciers to melt a thin layer on the surface. This, the Brown researchers write, led to "limited surface melting" that formed channels that ran for several kilometers and could be more than 150 feet wide.
The finding is "more than 'Yes, we found water,'" said Caleb Fassett, postdoctoral research associate in geological sciences and lead author of the paper published in Icarus. "What we see now is there's this complex history of different environments where water is being formed."
Caleb Fassett, with Brown research analyst James Dickson, professor James Head III, and geologists from Boston University and Portland State University, analyzed 15,000 images snapped by the Context Camera (CTX) aboard the Mars Reconnaissance Orbiter to compile the first survey of glaciofluvial valleys on Mars. The survey was sparked by a glaciofluvial valley that Dickson, Fassett, and Head spotted within the Lyot crater, located in the planet's middle latitudes. The team, in a paper last year in Geophysical Research Letters, dated that meltwater-inspired feature to the Amazonian.
In his survey, Fassett found dozens of other Amazonian-era ice deposits that spawned supraglacial and proglacial valleys, most of them located on the interior and exterior of craters in Mars' midlatitude belt. "The youthfulness (of the features) is surprising," he said. "We think of [post-Noachian] Mars as really, really cold and really, really dry, so the fact that these exist, in those kinds of conditions, is changing how we view the history of water on the planet."
What makes the finding even more intriguing is that the Brown planetary scientists can study what they believe are similar conditions on Earth. Teams from Brown and Boston University have visited the Antarctic Dry Valleys for years, where the surfaces of glaciers melt during the austral summer, sparking enough meltwater to carve a channel. The team will return to the Dry Valleys later this year to continue the study of this microclimate.
"It's sort of crazy," said Dickson, a member of the Brown team who stayed in the Dry Valleys for three months last year. "You're freezing cold and there's glacial ice everywhere, and it gets just warm enough that you get a river."
Fassett plans to search for more glaciofluvial valleys as more images come from the CTX, which has mapped roughly 40 percent of the planet.
Contributing authors include Joseph Levy of Portland State (who earned his Ph.D. at Brown last year) and James Marchant of Boston University. The research was funded by NASA.
Researchers from the Wyss Institute for Biologically Inspired Engineering at Harvard University, Harvard Medical School and Children's Hospital Boston have created a device that mimics a living, breathing human lung on a microchip. The device, about the size of a rubber eraser, acts much like a lung in a human body and is made using human lung and blood vessel cells.
Because the lung device is translucent, it provides a window into the inner workings of the human lung without having to invade a living body. It has the potential to be a valuable tool for testing the effects of environmental toxins, absorption of aerosolized therapeutics and the safety and efficacy of new drugs. Such a tool may help accelerate pharmaceutical development by reducing the reliance on current models, in which testing a single substance can cost more than $2 million.
"The ability of the lung-on-a-chip device to predict absorption of airborne nanoparticles and mimic the inflammatory response triggered by microbial pathogens, provides proof-of-principle for the concept that organs-on-chips could replace many animal studies in the future," says Donald Ingber, senior author on the study and founding director of Harvard's Wyss Institute.
The paper appears in the June 25 issue of Science.
Room to breathe
Until now, tissue-engineered microsystems have been limited either mechanically or biologically, says Ingber, who is also the Judah Folkman Professor of Vascular Biology at Harvard Medical School and Children's Hospital Boston. "We really can't understand how biology works unless we put it in the physical context of real living cells, tissues and organs."
With every human breath, air enters the lungs, fills microscopic air sacs called alveoli and transfers oxygen through a thin, flexible, permeable membrane of lung cells into the bloodstream. It is this membrane -- a three-layered interface of lung cells, a permeable extracellular matrix and capillary blood vessel cells -- that does the lung's heavy lifting. What's more, this lung-blood interface recognizes invaders such as inhaled bacteria or toxins and activates an immune response.
The lung-on-a-chip microdevice takes a new approach to tissue engineering by placing two layers of living tissues -- the lining of the lung's air sacs and the blood vessels that surround them -- across a porous, flexible boundary. Air is delivered to the lung lining cells, a rich culture medium flows in the capillary channel to mimic blood and cyclic mechanical stretching mimics breathing. The device was created using a novel microfabrication strategy that uses clear rubbery materials. The strategy was pioneered by another Wyss core faculty member, George Whitesides, the Woodford L. and Ann A. Flowers University Professor at Harvard University.
"We were inspired by how breathing works in the human lung through the creation of a vacuum that is created when our chest expands, which sucks air into the lung and causes the air sac walls to stretch," says first author Dan Huh, a Wyss technology development fellow at the Institute. "Our use of a vacuum to mimic this in our microengineered system was based on design principles from nature."
To determine how well the device replicates the natural responses of living lungs to stimuli, the researchers tested its response to inhaled living E. coli bacteria. They introduced bacteria into the air channel on the lung side of the device and at the same time flowed white blood cells through the channel on the blood vessel side. The lung cells detected the bacteria and, through the porous membrane, activated the blood vessel cells, which in turn triggered an immune response that ultimately caused the white blood cells to move to the air chamber and destroy the bacteria.
"The ability to recreate realistically both the mechanical and biological sides of the in vivo coin is an exciting innovation," says Rustem Ismagilov, professor of chemistry at the University of Chicago, who specializes in biochemical microfluidic systems.
The team followed this experiment with a "real-world application of the device," says Huh. They introduced a variety of nano-scaled particles (a nanometer is one-billionth of a meter) into the air sac channel. Some of these particles exist in commercial products; others are found in air and water pollution. Several types of these nanoparticles entered the lung cells and caused the cells to overproduce free radicals and to induce inflammation. Many of the particles passed through the model lung into the blood channel, and the investigators discovered that mechanical breathing greatly enhanced nanoparticle absorption. Benjamin Matthews, Harvard Medical School assistant professor in the Vascular Biology Program at Children's Hospital Boston, verified these new findings in mice.
"Most importantly, we learned from this model that the act of breathing increases nanoparticle absorption and that it also plays an important role in inducing the toxicity of these nanoparticles," Huh says.
"This lung-on-a-chip is neat and merges a number of technologies in an innovative way," says Robert Langer, MIT Institute professor. "I think it should be useful in testing the safety of different substances on the lung and I can also imagine other related applications, such as in research into how the lung functions."
According to Ismagilov, it's too early to predict how successful this field of research will be. Still, "the potential to use human cells while recapitulating the complex mechanical features and chemical microenvironments of an organ could provide a truly revolutionary paradigm shift in drug discovery," he says.
The investigators have not yet demonstrated the system's capability to mimic gas exchange between the air sac and bloodstream, a key function of the lungs, but, says Huh, they are exploring this now.
The Wyss Institute team is also working to build other organ models, such as a gut-on-a-chip, as well as bone marrow and even cancer models. Further, they are exploring the potential for combining organ systems.
For example, Ingber is collaborating with Kevin Kit Parker, associate professor at Harvard University's School of Engineering and Applied Sciences and another Wyss core faculty member, who has created a beating heart-on-a-chip. They hope to link the breathing lung-on-a-chip to the beating heart-on-a-chip. The engineered organ combination could be used to test inhaled drugs and to identify new and more effective therapeutics that lack adverse cardiac side effects.
This research was funded by the National Institutes of Health, the American Heart Association and the Wyss Institute for Biologically Inspired Engineering at Harvard University.
A University of Southampton archaeologist and Oxford Archaeology have found evidence that Neanderthals were living in Britain at the start of the last ice age, 40,000 years earlier than previously thought.
Commissioned by Oxford Archaeology, the University of Southampton's Dr Francis Wenban-Smith discovered two ancient flint hand tools at the M25/A2 road junction at Dartford in Kent, during an excavation funded by the Highways Agency. The flints are waste flakes from tool manufacture; the tools themselves would almost certainly have been used mostly for cutting up dead animals. Tests on the sediment burying the flints show they date from around 100,000 years ago, proving Neanderthals were living in Britain at this time. The country was previously assumed to have been uninhabited during this period.
"I couldn't believe my eyes when I received the test results. We know that Neanderthals inhabited Northern France at this time, but this new evidence suggests that as soon as sea levels dropped, and a 'land bridge' appeared across the English Channel, they made the journey by foot to Kent," says Francis.
Early pre-Neanderthals inhabited Britain before the last ice age, but were forced south by a previous glaciation about 200,000 years ago. When the climate warmed up again between 130,000 and 110,000 years ago, they couldn't get back because, as today, the raised sea level of the Channel blocked their path. This discovery shows they returned to our shores much earlier than the 60,000 years ago that previous evidence suggested.
"The fieldwork uncovered a significant amount of activity at the Dartford site in the Bronze Age and Roman periods, but it is deeper trenches excavated through much older sediments which have yielded the most interesting results -- shedding light on a long period when there was assumed to have been an absence of early man from Britain," comments Oxford Archaeology Project Manager David Score.
One theory is that Neanderthals may have been attracted back to Kent by the flint-rich chalk downs visible from France. These supported herds of mammoth, rhino, horse and deer -- an important source of food in sub-arctic conditions.
"These are people who had no real shelter -- no houses, not even caves, so we can only speculate that by the time they returned, they had developed physiologically to cope with the cold, as well as developing behavioural strategies such as planning winter stores and making good use of fire," says Dr Francis Wenban-Smith.
The last glacial period (or ice age) occurred between around 110,000 and 10,000 years ago, but this was interspersed with fluctuations when the climate temporarily warmed. It is unclear whether Neanderthal colonisation of North Western Europe and Britain was related to these minor fluctuations. Dr Wenban-Smith believes more evidence is needed to date their occupations more accurately, to show how many were living in Kent at this time, how far they roamed into Britain and how long they stayed. The Channel is also a critical area for further research, with the buried landscape between Boulogne and Newhaven -- provisionally christened "Boulognia" -- possibly containing the crucial evidence.
The excavation was carried out prior to construction work on the scheme by the Costain Group PLC. The archaeological investigations were designed by Jacobs Engineering U.K. Ltd, in consultation with Kent County Council. Other results from the project include the discovery of a woolly rhino tooth in the floodplain gravels of the River Darent, dated at around 40,000 years old.
jueves, 24 de junio de 2010
Sharks Carrying Drug-Resistant "Bacterial Monsters"
Flushed medicines spawning antibiotic-resistant bacteria in oceans, study says.
Our leftover medicines are spawning drug-resistant "bacterial monsters" that thrive inside sharks, scientists say.
The finding suggests antibiotics such as penicillin may be leaching into the environment and spurring drug-resistant bacteria to evolve and multiply in the oceans.
"Bacteria have sex, basically. They can transfer genetic material," said study leader Mark Mitchell, professor of veterinary clinical medicine at the University of Illinois at Urbana-Champaign.
Mitchell and colleagues found antibiotic-resistant bacteria in seven species of shark—such as bull sharks, lemon sharks, and nurse sharks—as well as the redfish Sciaenops ocellatus. The fish live in coastal waters off Belize, Florida, Louisiana, and Massachusetts.
Though random mutations could account for some of the drug resistance, there is ample evidence that these bacteria have a human origin, he noted.
"What do people do with antibiotics when they don't finish them? They flush them down the toilet [or] put them in the garbage," Mitchell said.
Trashed Medicines Making Monsters
Bacteria exposed to the drugs develop resistance, Mitchell said, so "we have the risk of creating these bacterial monsters."
These monsters may cause particularly virulent illnesses in sharks and fish. But the researchers are also concerned the resistant bacteria will find their way back into the human food chain.
Though sharks aren't a staple in the human diet, we eat what they eat—crab, shrimp, and other fish. So people should be aware of these risks and handle food appropriately to avoid infection, Mitchell cautioned. "I will eat things like sushi," he said. "But knowing there are those types of risks, I'm going to try and get it from healthy, wild-caught fish, where there might be more of a minimal exposure [to drugs]."
Findings appear this month in the Journal of Zoo and Wildlife Medicine.
The largest of all mollusks, the giant clam prefers the warm waters around Australia's Great Barrier Reef.
The giant clam gets only one chance to find a nice home. Once it fastens itself to a spot on a reef, there it sits for the rest of its life.
These bottom-dwelling behemoths are the largest mollusks on Earth, capable of reaching 4 feet (1.2 meters) in length and weighing more than 500 pounds (227 kg). They live in the warm waters of the South Pacific and Indian Oceans.
Giant clams achieve their enormous proportions by consuming the sugars and proteins produced by the billions of algae that live in their tissues. In exchange, they offer the algae a safe home and regular access to sunlight for photosynthesis, basking by day below the water's surface with their fluted shells open and their multi-colored mantles exposed. They also use a siphon to draw in water to filter and consume passing plankton.
Giant clams have a wildly undeserved reputation as man-eaters, with South Pacific legends describing clams that lie in wait to trap unsuspecting swimmers or swallow them whole. No account of a human death by giant clam has ever been substantiated, and scientists say its adductor muscles, used to close the shell, move far too slowly to take a swimmer by surprise. Even the largest specimen would simply retreat into its shell rather than attempt to sample human prey.
The adductor muscle of the giant clam is actually considered a delicacy, and overharvesting of the species for food, shells, and the aquarium trade has landed it on at least one group's "vulnerable" list.
Cosmologists at UCL are a step closer to determining the mass of the elusive neutrino particle, not by using a giant particle detector, but by gazing up into space.
Although it has been shown that a neutrino has a mass, it is vanishingly small and extremely hard to measure -- a neutrino is capable of passing through a light year (about six trillion miles) of lead without hitting a single atom.
New results using the largest ever survey of galaxies in the universe puts total neutrino mass at no larger than 0.28 electron volts -- less than a billionth of the mass of a single hydrogen atom. This is one of the most accurate measurements of the mass of a neutrino to date.
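The comparison with hydrogen is easy to verify with one line of arithmetic, using the standard rest-mass energy of a hydrogen atom (about 939 MeV, i.e. roughly 9.39e8 eV):

```python
# Sanity check: compare the 0.28 eV neutrino-mass bound with the
# rest-mass energy of a hydrogen atom (~9.39e8 eV).
neutrino_bound_ev = 0.28     # upper limit on total neutrino mass from the survey
hydrogen_mass_ev = 9.39e8    # hydrogen atom rest mass, ~939 MeV

ratio = neutrino_bound_ev / hydrogen_mass_ev
print(f"neutrino / hydrogen mass ratio: {ratio:.2e}")
assert ratio < 1e-9  # "less than a billionth", as stated
```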
The research is due to be published in an upcoming issue of the journal Physical Review Letters, and will be presented at the Weizmann:UK conference at UCL on 22-23 June 2010. It resulted from the PhD thesis of Shaun Thomas, supervised by Prof. Ofer Lahav and Dr. Filipe Abdalla.
Professor Ofer Lahav, Head of UCL's Astrophysics Group, said: "Of all the hypothetical candidates for the mysterious Dark Matter, so far neutrinos provide the only example of dark matter that actually exists in nature. It is remarkable that the distribution of galaxies on huge scales can tell us about the mass of the tiny neutrinos."
The work is based on the principle that the huge abundance of neutrinos (there are trillions passing through you right now) has a large cumulative effect on the matter of the cosmos, which naturally forms into "clumps" of groups and clusters of galaxies. As neutrinos are extremely light, they move across the universe at great speeds, which has the effect of smoothing this natural "clumpiness" of matter. By analysing the distribution of galaxies across the universe (i.e. the extent of this "smoothing-out" of galaxies), scientists are able to work out the upper limits of neutrino mass.
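The size of this smoothing effect can be estimated with a textbook cosmological rule of thumb (an illustrative approximation, not the paper's actual likelihood analysis): massive neutrinos suppress small-scale clustering by roughly ΔP/P ≈ −8 f_ν, where f_ν = Ω_ν/Ω_m is the neutrino fraction of matter and Ω_ν h² ≈ Σm_ν / 93.14 eV.

```python
# Rough illustration of how a total neutrino mass translates into a
# suppression of small-scale clustering (textbook approximation only).
def power_suppression(sum_m_nu_ev, h=0.7, omega_m=0.3):
    """Fractional suppression of the small-scale matter power spectrum."""
    omega_nu = sum_m_nu_ev / 93.14 / h**2  # standard relation: Omega_nu h^2 = sum(m_nu)/93.14 eV
    f_nu = omega_nu / omega_m              # neutrino fraction of total matter
    return -8.0 * f_nu

# At the quoted 0.28 eV bound the suppression is on the order of 15-20 percent.
print(f"{power_suppression(0.28):+.1%}")
```

The assumed h = 0.7 and Ω_m = 0.3 are generic 2010-era values, not taken from the study.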
Central to this new calculation is the existence of the largest ever 3D map of galaxies, called Mega Z, which covers over 700,000 galaxies recorded by the Sloan Digital Sky Survey and allows measurements over vast stretches of the known universe.
The cosmologists at UCL were able to estimate distances to galaxies using a new method that measures the colour of each galaxy. By combining this enormous galaxy map with information from the temperature fluctuations in the afterglow of the Big Bang, called the Cosmic Microwave Background radiation, they were able to put one of the smallest upper limits yet on the mass of the neutrino.
Dr. Shaun Thomas commented: "Although neutrinos make up less than 1% of all matter they form an important part of the cosmological model. It's fascinating that the most elusive and tiny particles can have such an effect on the Universe."
Dr. Filipe Abdalla added: "This is one of the most effective techniques available for measuring the neutrino masses. It gives us great hope of finally obtaining a measurement of the mass of the neutrino in the years to come."
The authors are confident that a larger survey of the Universe, such as the international Dark Energy Survey they are working on, in which UCL is heavily involved, will yield an even more accurate weight for the neutrino, potentially an upper limit of just 0.1 electron volts.
The work was funded in part by the Science and Technologies Facilities Council and the Royal Society.
Researchers set fine nets over coral just before mass spawning events in order to collect the eggs and sperm as they were broadcast into the water column. Gametes were collected and returned to the lab within an hour to be raised under controlled conditions. (Credit: Iliana Baums laboratory, Penn State University)
Discoveries about tropical coral reefs are expected to be invaluable in efforts to restore the corals, which are succumbing to bleaching and other diseases at an unprecedented rate as ocean temperatures rise worldwide. The research gives new insights into how the scientists can help to preserve or restore the coral reefs that protect coastlines, foster tourism, and nurture many species of fish. The research, which will be published in the journal PLoS One, was accomplished by an international team whose leaders include Iliana Baums, an assistant professor of biology at Penn State University.
The team focused on one of the most abundant reef-building species in the Caribbean, Montastraea faveolata, known as the mountainous star coral. Though widespread, this species is listed as endangered on the Red List of the International Union for the Conservation of Nature because its numbers have declined significantly -- in recent years, up to 90 percent of the population has been lost in some areas.
Discovering how corals respond to ocean warming is complicated because corals serve as hosts to algae. The algae live in the coral and feed on its nitrogen wastes. Through photosynthesis, the algae then produce the carbohydrates that feed the coral. When this complex and delicate symbiosis is upset by a rise in ocean temperature, the coral may expel the algae in a phenomenon known as coral bleaching, which may cause the death of both algae and coral. The challenge is to figure out why some corals cope with the heat stress better than others.
"We decided to focus on coral larvae because the successful dispersal and settlement of larvae is key to the survival of reefs," explains Baums. "Also, since free-swimming larvae do not yet have symbiotic algae, we can record the expression of different genes in our samples and know that we are looking at the molecular response of the coral itself to heat stress."
Star coral broadcasts eggs and sperm into the water column in mass spawning events, which occur in the Caribbean a few days after the full moon in August. Fertilization occurs quickly when the larvae reach the surface, and then they drift for as much as two weeks before settling on the hard surfaces where they will spend the rest of their lives. Free-swimming larvae are especially vulnerable to ecological changes because they have limited energetic reserves. Scientifically, studying coral larvae has distinct advantages over documenting the response of adult coral to thermal stress.
Logistically, however, studying larvae scientifically is not so easy. "We have to find suitable reefs with known, and therefore roughly predictable, spawning habits," explains Baums. "These reefs have to be close enough to shore that we can get into the water and out to the corals within the first hour of spawning, which always happens at night. When we see that the corals are about to spawn, we set up nets over coral colonies to catch the fragile gametes before they can reach the surface, then we rush back to shore to set up controlled matings and get the young corals back into aquarium tanks before they die." Once spawning started, the scientists worked nearly around the clock for a few days. If they failed to capture enough larvae, or if the larvae died in captivity, the experiment could not be repeated until the following year.
The team successfully collected spawn from two populations of mountainous star coral, one off Key Largo, Florida, and one off Puerto Morelos, Mexico. Keeping spawn from the two sites separate, the scientists allowed fertilization to occur in captivity, then raised the embryos at different temperatures. They recorded the developmental stage and gene expression in the embryos between 12 and 48-to-50.5 hours after fertilization, comparing embryos raised at normal temperatures with those raised at temperatures 1-to-2 degrees Celsius higher.
The embryos from Florida and Mexico developed similarly in the first 50 hours, with the high-temperature embryos maturing only slightly faster than those raised at normal temperatures. Strikingly, the batches raised at higher temperatures contained many more irregular, misshapen embryos. For example, after 46 hours, fully 50 percent of the high-temperature embryos from Florida were deformed, compared with none of the normal-temperature embryos. The Mexican samples showed the same pattern, but those embryos were less strongly affected by the elevated temperature. Although both populations belong to the same species, they responded differently to heat stress, showing genetic variability within the species.
In addition to examining the physical appearance of embryos as they developed, the team extracted RNA from approximately 1,500 embryos from each location to see how much of each of 1,300 molecular products were being transcribed at a given time. Genes that were transcribed in different amounts between high-temperature and normal-temperature samples were called differentially expressed genes. Twenty-four hours after fertilization, embryos from the same site showed similar gene expression profiles regardless of the temperature at which they were raised. As the time since fertilization increased, the samples showed more and more differentially expressed genes, 458 in all. Of the 218 differentially expressed genes that were sensitive to temperature, almost none were shared between the two locations on the first day of sampling, but by the second day, roughly 25 percent were shared between samples from Mexico and Florida. By 48 hours, thermal stress -- not sampling location -- became the dominant factor influencing gene expression. At that point, the gene expression of coral subjected to similar temperatures clustered together regardless of their place of origin.
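Operationally, "differentially expressed" means a gene's transcript abundance differs between conditions by more than some threshold. A toy sketch of that bookkeeping, with simulated data and an assumed 2-fold cutoff (the study's own statistics were more involved):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy expression matrix: 1,300 genes x 3 replicates for each condition.
# The values and the 2-fold cutoff are illustrative, not from the study.
normal = rng.lognormal(mean=2.0, sigma=0.3, size=(1300, 3))
heated = normal * rng.lognormal(mean=0.0, sigma=0.5, size=(1300, 1))

# A gene counts as differentially expressed if its mean abundance
# changes by more than 2-fold in either direction (|log2 FC| > 1).
log2_fold_change = np.log2(heated.mean(axis=1) / normal.mean(axis=1))
de_genes = np.flatnonzero(np.abs(log2_fold_change) > 1.0)
print(f"{de_genes.size} of 1300 genes pass the 2-fold cutoff")
```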
The team then classified the differentially expressed genes into functional groups and found that the genes most sensitive to temperature changes were primarily those involved in cell proliferation, growth, and development. The genes that varied according to location of origin were most often involved in cell adhesion, protein degradation, and protein biosynthesis.
"Our study shows that the response of larvae to changing conditions depends upon where the parent colonies lived," says Baums. "Clearly the coral larvae from Mexico and Florida respond differently to heat stress, even though they belong to the same species, showing adaptations to local conditions. The two populations have different adaptive potential."
Baums said she is excited by the clear evidence of local adaptations in populations that this study documented. Previous work by Baums and her colleagues has included experiments in restoring damaged coral reefs by creating larvae from controlled genetic crosses, growing them in captivity until they settle onto ceramic tiles, and then transplanting them into selected areas to replenish damaged reefs. Some crosses survive in higher-temperature water better than others, some survive in captivity better than others, and some settle more reliably onto the prepared tiles that are used to form or restore colonies. The new information from the current study will be invaluable in restoration work.
"Variation among populations in gene expression offers the species as a whole a better chance of survival under changing conditions," Baums said. "We might be able to screen adult populations for their ability to produce heat-resistant larvae and focus our conservation efforts on those reefs."
Scientists are reporting development of a new use for magnetic levitation, or "maglev," the futuristic technology best known for enabling high-speed passenger trains to float above the tracks. In ACS' bi-weekly Journal of Agricultural and Food Chemistry, they describe putting maglev to use in an inexpensive sensor for analyzing food, water, and other beverages.
George Whitesides and colleagues note that measurements of a substance's density are important in the food industry, health care, and other settings because they provide key information about a substance's chemical composition. Density measurements, for instance, can determine the sugar content of soft drinks, the amount of alcohol in wine, or whether irrigation water contains too much salt to use on a farmer's field. Existing devices for making those measurements are far from ideal, and a need exists for simpler, less expensive, easy-to-use technology.
The scientists describe development of a special sensor that uses maglev to meet those needs, suspending solid or liquid samples with the aid of magnets to measure their density. About the size of an ice cube, the sensor consists of a fluid-filled container with magnets at each end. Samples of different materials can be placed inside, and the distance they migrate through the fluid provides a measure of their density. The scientists showed that the device could quickly estimate the salt content of different water samples and the relative fat content in different kinds of milk, cheese, and peanut butter. "Potential applications of maglev may include evaluating the suitability of water for drinking or irrigation, assessing the content of fat in foods and beverages, or monitoring processing of grains (e.g., removing husk or drying)," the article notes.
The electron orbits a phosphorus atom embedded in the silicon lattice, shown in silver. The undisturbed electron density distribution, calculated from the quantum mechanical equations of motion is shown in yellow. A laser pulse can modify the electron’s state so that it has the density distribution shown in green. Our first laser pulse, arriving from the left, puts the electron into a superposition of both states, which we control with a second pulse, also from the left, to give a pulse which we detect, emerging to the right. The characteristics of this "echo" pulse tell us about the superposition we have made. (Credit: UCL)
The remarkable ability of an electron to exist in two places at once has been controlled in the most common electronic material -- silicon -- for the first time. The research findings -- published in Nature by a UK-Dutch team from the University of Surrey, UCL (University College) London, Heriot-Watt University in Edinburgh, and the FOM Institute for Plasma Physics near Utrecht -- mark a significant step towards the making of an affordable "quantum computer."
According to the research paper, the scientists have created a simple version of Schrödinger's cat -- which is paradoxically simultaneously both dead and alive -- in the cheap and simple material out of which ordinary computer chips are made.
"This is a real breakthrough for modern electronics and has huge potential for the future," explained Professor Ben Murdin, Photonics Group Leader at the University of Surrey. "Lasers have had an ever increasing impact on technology, especially for the transmission of processed information between computers, and this development illustrates their potential power for processing information inside the computer itself. In our case we used a far-infrared, very short, high intensity pulse from the Dutch FELIX laser to put an electron orbiting within silicon into two states at once -- a so-called quantum superposition state. We then demonstrated that the superposition state could be controlled so that the electrons emit a burst of light at a well-defined time after the superposition was created. The burst of light is called a photon echo; and its observation proved we have full control over the quantum state of the atoms."
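The superposition Professor Murdin describes can be illustrated with a generic two-level model (a minimal sketch, not the experiment's full physics): a resonant "pi/2 pulse" rotates the electron's ground state into an equal superposition of the two orbital states.

```python
import numpy as np

# Generic two-level illustration of a quantum superposition:
# the electron starts in the ground orbital state |0>.
ground = np.array([1.0, 0.0], dtype=complex)

def pulse(theta):
    """Rotation by angle theta about the x-axis of the Bloch sphere."""
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

# A pi/2 pulse puts the electron into both states at once.
superposition = pulse(np.pi / 2) @ ground
probs = np.abs(superposition) ** 2
print(probs)  # equal weight, ~[0.5, 0.5], in each orbital state
```

A second pulse (as in the photon-echo experiment) would rotate this state again, which is what lets the timing of the emitted light reveal the superposition.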
And the development of a silicon based "quantum computer" may be only just over the horizon. "Quantum computers can solve some problems much more efficiently than conventional computers -- and they will be particularly useful for security because they can quickly crack existing codes and create un-crackable codes," Professor Murdin continued. "The next generation of devices must make use of these superpositions to do quantum computations. Crucially our work shows that some of the quantum engineering already demonstrated by atomic physicists in very sophisticated instruments called cold atom traps, can be implemented in the type of silicon chip used in making the much more common transistor."
Professor Gabriel Aeppli, Director of the London Centre for Nanotechnology added that the findings were highly significant to academia and business alike. "Next to iron and ice, silicon is the most important inorganic crystalline solid because of our tremendous ability to control electrical conduction via chemical and electrical means," he explained. "Our work adds control of quantum superpositions to the silicon toolbox."