Monday, February 28, 2011
Project Icarus is an ambitious five-year study into launching an unmanned spacecraft to an interstellar destination. Headed by the Tau Zero Foundation, a non-profit group of scientists dedicated to interstellar spaceflight, Icarus is working to develop a spacecraft that can travel to a nearby star. Dr. Robert Adams, study lead in the Advanced Concepts Office at the NASA Marshall Space Flight Center and Lead Designer for the Mission Analysis and Performance Module for Project Icarus, explains the realities behind getting an interstellar spacecraft to its destination.
One critical feature of any interstellar voyage is the "mission analysis," which organizes science objectives, plots movement through space and determines abort and correction scenarios. Some refer to this effort as "trajectory analysis" or "astrodynamics" but it encompasses more than simply plotting a path through the heavens.
Complex missions, like the one the Icarus team is tackling, involve integration of multiple spacecraft stages, gravity assists around planets and stars, correctional maneuvers, and other tools of the trade.
The mission analyst must juggle all of these options to find the most efficient path to the target, while balancing the cost and risk needs of the mission.
Vast Distances, Epic Timescales
Clearly, the major challenge that one needs to overcome for any interstellar mission is the vast distances that separate the stars. Consider, for example, the nearest star system to the sun: Alpha Centauri. At 4.4 light years away, Alpha Centauri is a binary system, with stars A and B revolving about a common center of mass and a third star, Proxima Centauri, orbiting the pair.
Using current technology, a one-way trip to the Alpha Centauri system would take approximately 75,000 years. To put this in perspective, that's about one hundred and fifty times the amount of time that has passed since Columbus discovered America.
Now let's consider how fast we'd like to get there. Ideally, even a very long-duration mission should have a maximum trip time of about 50 years. A young man or woman could join the mission team immediately after leaving college and still be alive, nearing the end of their career, when the probe reaches its target.
Having someone carry the torch throughout ensures some continuity to the project. Icarus is tasked with designing an interstellar mission that would reach the target solar system in under one hundred years, and this requirement in itself raises yet more remarkable challenges -- specifically the creation of an organization that could endure for that length of time.
With a maximum 100 year mission time, our vehicle must achieve a velocity of roughly 0.1 c (10 percent of the speed of light) to reach even the closest stars in this time frame. This is about a thousand times faster than any craft ever built.
For convenience, let's give ourselves a maximum top speed of 0.15 c so that we can put several target star systems in play. Achieving a velocity of 15% the speed of light is daunting, but not impossible using technology we have access to today.
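The travel-time figures above are easy to check with a back-of-the-envelope calculation. A minimal sketch, assuming the 4.4 light-year distance quoted above and a Voyager-class escape speed of roughly 17 km/s as the stand-in for "current technology":

```python
# Back-of-the-envelope travel times to Alpha Centauri.
C = 299_792_458.0     # speed of light, m/s
DIST_LY = 4.4         # distance to Alpha Centauri, light-years

def trip_time_years(speed_fraction_of_c: float) -> float:
    """Trip time in years at a constant fraction of c.

    Relativistic effects are ignored, a fair approximation below ~0.2 c.
    """
    return DIST_LY / speed_fraction_of_c

# Chemical-rocket era: a Voyager-class probe leaves the sun at ~17 km/s.
voyager_fraction = 17_000.0 / C

print(f"At 17 km/s: {trip_time_years(voyager_fraction):,.0f} years")
print(f"At 0.10 c:  {trip_time_years(0.10):.1f} years")
print(f"At 0.15 c:  {trip_time_years(0.15):.1f} years")
```

The first figure lands near the ~75,000-year estimate quoted above, while 0.10 c and 0.15 c give trip times of roughly 44 and 29 years, which is why 0.15 c puts several target systems within the 50-year goal.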
In future Project Icarus articles, the team will discuss various propulsion schemes in more detail. Also to come are the challenges in protecting the vehicle as it continuously slams into interstellar dust at extreme velocities. In this article, we will focus our attention on the most efficient trajectory to reach our target velocity.
Finally, we must strongly consider deceleration methods when we reach our target star, since this is one of the objectives of the Icarus mission. At 15 percent the speed of light, a spacecraft would buzz through a typical solar system in a matter of hours. A 50 year trip that culminates in less than a day's observation and acquisition of data is… unsatisfying. So if possible, we'd like to find a way to slow down the spacecraft and stay a while.
There are a few tools in the bag that we pull out for any difficult space mission. Staging our vehicle allows us to cast off mass like tanks and engines when they're no longer needed. Gravity assists also allow us to gain velocity by making close passes to planets in our solar system. We take a little mission time to fly by a planet, say Jupiter, and whip around it like a NASCAR driver "slingshotting" around a competitor. This method of accelerating space vehicles has been used since 1959, when the Soviet probe Luna 3 photographed the far side of the moon.
However, these usual tricks, useful as they are, are not sufficient to meet the challenges of the Icarus mission because of the incredible velocities we need to achieve. Here we need to bring back a very old and largely forgotten maneuver.
First described by Hermann Oberth in 1927, the two-burn escape maneuver can be very effective for this mission. Consider a spacecraft orbiting a much more massive body like the sun. Oberth described how the spacecraft could reverse its thrust, decelerating so that it drops closer to the massive body.
When the spacecraft reaches the closest point to the body, it flips around and makes a "hard burn" to accelerate as much as possible. Calculations by the author show that such a maneuver can achieve 2-3 times the velocity possible without it. It is important to appreciate that this is very different from the famous "gravity assist" maneuver.
Seems like getting something for free, doesn't it? And we all know that the universe is notoriously stingy with freebies. However, this maneuver does not violate any laws of physics.
Note that the fuel in the vehicle is also in orbit around the massive body. When the vehicle burns its propellant at the lowest point of the orbit, where it is moving fastest, the spent propellant is left behind in a much lower (less energetic) orbit, and the energy it gives up goes into the vehicle. Oberth realized that releasing the chemical or nuclear energy of a fuel does not tap all of the propellant's available energy: the propellant also has mechanical (kinetic and potential) energy that can be released -- thus accelerating the vehicle -- by use of his maneuver.
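The arithmetic behind the maneuver can be sketched with the standard energy bookkeeping for a burn at closest approach. The perihelion distance and burn size below are illustrative assumptions, not Icarus design numbers:

```python
from math import sqrt

MU_SUN = 1.327e20   # Sun's gravitational parameter, m^3/s^2

def oberth_vinf(r_p: float, dv: float) -> float:
    """Hyperbolic excess speed after a burn of size dv at perihelion r_p.

    Assumes the craft arrives on a near-parabolic (barely bound) orbit,
    so it passes perihelion at local escape speed. Energy bookkeeping:
    v_inf^2 = (v_esc + dv)^2 - v_esc^2 = dv^2 + 2*v_esc*dv.
    """
    v_esc = sqrt(2.0 * MU_SUN / r_p)   # local escape speed at r_p
    return sqrt(dv**2 + 2.0 * v_esc * dv)

r_p = 2.1e9    # ~3 solar radii: a deep, hot perihelion pass (assumed)
dv = 50e3      # a 50 km/s burn (illustrative)

v_inf = oberth_vinf(r_p, dv)
print(f"escape speed at perihelion: {sqrt(2*MU_SUN/r_p)/1e3:.0f} km/s")
print(f"departure speed with Oberth burn: {v_inf/1e3:.0f} km/s "
      f"(vs {dv/1e3:.0f} km/s for the same burn far from the sun)")
```

For this illustrative case the same 50 km/s of propulsive effort yields roughly 195 km/s of departure speed, a multiplier of about four, in the spirit of the 2-3x figure quoted above. The multiplier shrinks as the burn grows large compared with the local escape speed, which is one reason the deep solar dive matters.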
To Jupiter, the Sun, then the Stars
By using Oberth's maneuver around the sun, we can anticipate that a vehicle that can achieve a velocity of 5 percent the speed of light will actually blast out of the solar system at our target 15 percent the speed of light. However there are still a number of issues to be investigated.
What is the spacecraft's maximum acceleration? How close can the spacecraft actually get to the sun without suffering dire radiation and heating effects? How many stages will the vehicle need to have? And attaining 5 percent the speed of light before Oberth's maneuver is still a very challenging task. Future articles will discuss the propulsion systems that have potential to reach these velocities.
A possible mission profile using the options mentioned above would start in Low Earth Orbit (LEO) where the Icarus probe is built. Using conventional propulsion systems such as liquid (chemical) rockets, the craft begins its journey to Jupiter. The chemical rockets will be jettisoned immediately after this Trans-Jovian Injection (TJI).
Jupiter's gravity will drag the Icarus probe into a new trajectory perpendicular to the plane of the solar system. Additionally, this gravity assist will shorten the Icarus orbit so that it starts falling back towards the sun. A short burst from its fusion engines will tighten the probe's orbit, so that years later the probe will fly deep inside the sun's atmosphere, the solar corona.
A set of fuel tanks will be jettisoned at Jupiter after the short burst. As the probe nears its closest approach to the sun, it will start firing at maximum thrust for weeks, or even months. The probe will have to use some of its propulsive force to steer towards the sun as it accelerates. The probe will swing around the sun while continuing to accelerate, jettisoning multiple sets of drop tanks along the way. The thrusting will not stop until the probe achieves the target velocity.
The probe will continue to thrust in short bursts on its long journey to the target star. These bursts will make up for any drag caused by interstellar gas and dust, and allow minor course corrections as needed. As the probe approaches the target star, it will conduct the Oberth maneuver in reverse, thrusting at maximum as it swings around the target star. If possible, the probe will slow sufficiently to be captured around the target star, extending the time the probe has to study the star system.
It is not yet clear if the Icarus team will be able to find the right combination of propulsive technologies and mission options that will enable the full acceleration and deceleration at the target star. And many of the mission elements described above will be traded against numerous other options.
The Icarus team is committed to finding the mission profile that will allow humanity to take those first few steps out to the stars.
What could be worse than a meteorite hitting you? Two meteorites hitting you at the same time!
As shown in the above HiRISE image, this is exactly what happened on Mars. These two impact craters were formed simultaneously, but how do we know that?
If one meteor smashed into the planet, followed by another impact at a later date, one of the craters would overlap the other. But for this double impact to look so symmetrical, they had to have impacted at the same time.
Is this a case of simple luck? Did the Cosmos decide to throw two space rocks at Mars into the same place at the same time? Unlikely. It was most probably caused by one object that split in two when entering the Martian atmosphere. That way, both halves (coincidentally of approximately the same size, in this case) impacted right next to each other, creating this fascinating double-impact crater, sharing one crater rim.
Interestingly, as noted on the HiRISE mission website, we know of some very oddly shaped asteroids and comets that could split into two when hitting a planetary atmosphere. Remember asteroid Itokawa (that was visited by the Japanese Hayabusa probe) with the double-lobed, "rubble pile" shape? The loose consistency of Itokawa would most likely cause it to disintegrate and break apart on hitting a planet, creating simultaneous impact craters.
Also, comet Hartley 2 with its signature "dog-bone" shape could break into two, creating a double impact like this Mars example.
But why stop at a double-impact crater when you could have a triple-impact crater? Mars has one of those too! It's less defined than the double-impact crater shown above, probably because this triple-impact crater is older and thus more eroded, but it's impressive all the same:
Images: A double-impact crater on Mars as imaged by the HiRISE camera on NASA's Mars Reconnaissance Orbiter (top), a triple-impact crater also imaged by HiRISE (bottom). Credit: NASA/JPL/University of Arizona.
* A male dog that lived 7,000 years ago in Siberia ate human food and was buried as though he were a human.
* The Husky-like dog's remains suggest he worked alongside humans throughout his lifetime.
* A wolf was also ritualistically buried nearby, perhaps serving as a protector to humans in the afterlife.
Burial remains of a dog that lived over 7,000 years ago in Siberia suggest the male Husky-like animal probably lived and died similar to how humans did at that time and place, eating the same food, sustaining work injuries, and getting a human-like burial.
"Based on how northern indigenous people understand animals in historic times, I think the people burying this particular dog saw it as a thinking, social being, perhaps on par with humans in many ways," said Robert Losey, lead author of a study about the dog burial, which has been accepted for publication in the Journal of Anthropological Archaeology.
"I think the act of treating it as a human upon its death indicates that people knew it had a soul, and that the mortuary rites it received were meant to ensure that this soul was properly cared for," added Losey, an associate professor of anthropology at the University of Alberta.
For the study, Losey collaborated with excavation director Vladimir Bazaliiskii and researchers Sandra Garvie-Lok, Mietje Germonpre, Jennifer Leonard, Andrew Allen, Anne Katzenberg, and Mikhail Sablin. Bazaliiskii found the buried dog at the Shamanka cemetery near Lake Baikal, Siberia.
"Just like the humans in the cemetery, the dog was buried with other items, (such as) a long spoon made of antler," Losey said.
The dog was carefully laid to rest lying on his right side in a grave pit that, at other levels, also contained five partial human skeletons.
DNA and stable isotope analysis determined the animal was indeed a dog and that he ate exactly what humans at the site consumed: fish, freshwater seal meat, deer, small mammals, and some plant foods.
The canine's life, as well as that of the people, wasn't easy, though.
"The dog's skeleton, particularly its vertebral spines, suggests that it was repeatedly used to transport loads," Losey explained. "This could have included carrying gear on its back that was used in daily activities like hunting, fishing, and gathering plant foods and firewood. The dog also could have been used to transport gear for the purposes of relocating settlements on a seasonal basis."
Additional fractures suggest the dog suffered numerous blows during its lifetime, possibly from the feet of red deer during hunting outings. The researchers cannot rule out that humans hit the dog, but its older age at burial, food provisions, and more suggest otherwise.
From the same general time period, the scientists also found a wolf burial at a site called Lokomotiv near the Irkut and Angara rivers in Siberia.
The wolf, which did not consume human-provided foods, appears to have died of old age. Its remains were found wrapped around a human skull. There is no evidence the wolf interacted with the person when alive.
"Perhaps the burial of the wolf with the human head placed between its feet was done to send the spirit or soul of the wolf with this particular human to the afterlife, perhaps as its protector," Losey said.
Susan Crockford, adjunct professor of anthropology at the University of Victoria and author of the book "A Practical Guide to In Situ Dog Remains for the Field Archaeologist" (2009), told Discovery News that she was "surprised to see the description of the wolf/human interment. That is definitely unusual."
Crockford isn't supporting any particular interpretation of the burials just yet, however, since she said, "There can be many reasons for the ritual treatment of dogs, including ones we might never imagine."
Photograph by Laurent Ballesta
The coelacanth was thought to have gone extinct with the dinosaurs. Rediscovered in 1938, it is chronicled here in a rare photographic account.
It's not every day that a living fossil shows up in a fisherman's net.
But that's what happened in 1938, when a South African museum curator named Marjorie Courtenay-Latimer spied a bizarre creature with thick scales, unusual fins, and an extra lobe on its tail, amid an otherwise ordinary haul of fish. Though she didn't know it straightaway, Courtenay-Latimer had rediscovered the coelacanth, which was assumed to have died out at the end of the Cretaceous period but somehow outlasted many of its prehistoric peers, dwelling deep in the ocean, undisturbed—and undetected—for eons.
Since this chance sighting, Latimeria chalumnae have been found in several pockets in the Indian Ocean. No one knows how many there are—maybe as few as 1,000 or as many as 10,000. Because of the depth of their habitat, they have mainly been photographed by submersibles and remotely operated vehicles. Divers first documented the fish in 2000; in January and February 2010, a specially trained team dived deep to take pictures of a small colony in Sodwana Bay, South Africa.
From the earliest days of aviation, pilots have relied upon paper maps to help find their way. Even in an era of GPS and advanced avionics, you still see pilots lugging around 20 pounds or more of charts. But those days are numbered, because maps are giving way to iPads.
The Federal Aviation Administration is allowing charter company Executive Jet Management to use Apple’s tablet as an approved alternative to paper charts. The authorization follows three months of rigorous testing and evaluation of the iPad and Mobile TC, a map app developed by aviation chartmaker Jeppesen.
The latest decision applies only to Executive Jet Management, but it has implications for all of aviation. By allowing the company’s pilots to use the Apple iPad as a primary source of information, the FAA is acknowledging the potential for consumer tablets to become avionics instruments.
The iPad has been popular with pilots of all types since its introduction last year. But until now, it could not be used in place of traditional paper charts or FAA-approved devices such as more expensive, purpose-built electronic flight bags. The iPad was OK for reference, but not as a pilot’s sole source of information. The new FAA authorization changes all that.
To receive FAA authorization, Jeppesen and Executive Jet Management went through a rigorous approval process. It included rapid-decompression testing from a simulated altitude of 51,000 feet and ensuring the tablet will not interfere with critical navigation or electronic equipment.
Executive Jet tested the iPad and Mobile TC in 10 aircraft flown by 55 pilots during 250 flights.
The first thought many pilots, not to mention passengers, will have is: What happens if the iPad or the app crashes?
Jeff Buhl, Jeppesen’s product manager for the Mobile TC app, says the Apple iOS operating system and the app proved “extremely stable” during testing. In the “unlikely” event of a crash, he says, it takes but a moment to get them running again.
“The recovery time for an application crashing or the OS crashing is extremely rapid,” Buhl says. During the evaluation period with the FAA, the production app did not crash. But even if it did, Buhl says it’s ready to go again “in 4-6 seconds from re-launch to previous state.”
The FAA says each individual operator — in this case Executive Jet Management — must develop specific procedures for dealing with system or software crashes and other issues. Under the authorization, Executive Jet Management will require a second approved electronic device, which most likely will be another iPad, in the cockpit.
Although this authorization applies to just one company, it is a milestone for all operators, including major airlines, because it opens the door for them to embrace the iPad. Though any company wishing to follow Executive Jet’s lead will have to endure equally rigorous scrutiny by the FAA.
Agency spokesman Les Dorr says the process is no different from what is required for any other electronic device (.pdf) used to display navigation information.
“As far as the iPad is concerned, we do that on a case-by-case basis when an airline applies to be able to use it,” Dorr says.
The FAA is already seeing more requests to use the iPad in the cockpit. Alaska Airlines began testing the iPad back in November, and about 100 pilots are currently evaluating the device, according to spokeswoman Marianne Lindsey. She says that in addition to the convenience, there is a practical weight-saving aspect to using the iPad as well: “It’s replaced about 25 pounds of manuals and charts.”
Jeppesen’s director of portfolio management, Tim Huegel, says several carriers are looking into using the iPad and TC Mobile, and with the FAA granting one approval, it should become increasingly easy for others to follow Executive Jet’s lead.
“We’ll be able to reuse a lot of the documentation and the lessons learned working with Executive Jet Management to help our commercial customers as they now begin to pursue FAA authorization,” he says.
Mobile TC includes charts for visual flight rules as well as for instrument flight rules, which are more commonly used by commercial operators. The app only shows an electronic version of the paper charts Jeppesen has been producing for years, but Huegel says future versions could incorporate the iPad’s GPS capability.
He sees a day when tablets provide “door-to-door management” of a pilot’s information, from crew scheduling to weather information to navigation charts.
Photo: The Mobile TC app on an iPad. (Jim Merithew)
The ultra-dense remains of the galaxy’s youngest supernova are full of bizarre quantum matter.
Two new studies show for the first time that the core of the neutron star in Cassiopeia A is a superfluid, a friction-free state of matter that normally only exists in ultra-cold laboratory settings.
“The interior of neutron stars is one of the best kept secrets of the universe,” said astrophysicist Dany Page of the National Autonomous University of Mexico, lead author of a paper in the Feb. 25 Physical Review Letters describing the state of the star. “It looks like we broke one of them.”
Cassiopeia A (Cas A) was a massive star 11,000 light-years away whose explosion was observed from Earth about 330 years ago. The supernova left behind a tiny, compact body called a neutron star, in which matter is so densely packed that electrons and protons are forced to fuse into neutrons. Neutron star material is some of the most extreme matter in the universe. Just a teaspoonful of neutron star stuff weighs about 6 billion tons.
The neutron star in Cas A was first spotted in 1999, shortly after the Chandra X-Ray Observatory began scanning the sky for objects that emit X-rays.
Last year, astronomers Craig Heinke of the University of Alberta and Wynn Ho of the University of Southampton noticed something odd: The neutron star was cooling down at an alarmingly fast rate. In just 10 years, the star had cooled from 2.12 million degrees to 2.04 million degrees, a drop of 4 percent.
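As a quick sanity check, the quoted temperature drop works out as follows (temperatures as reported above):

```python
t_start = 2.12e6   # temperature a decade earlier, kelvin (as reported)
t_end = 2.04e6     # temperature at the later observation, kelvin

# Fractional cooling over the ten-year span, as a percentage.
drop_percent = (t_start - t_end) / t_start * 100
print(f"Temperature drop over ~10 years: {drop_percent:.1f}%")
```

That comes to about 3.8 percent, which rounds to the 4 percent figure quoted, a startlingly fast decline for an object expected to cool over millennia.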
Theoretical models predicted that neutron stars should cool slowly as the neutrons inside decayed into electrons, protons and nearly-massless particles called neutrinos that flee the star quickly, taking heat with them.
But ordinary neutron decay is too slow. Two competing groups of physicists, one led by Page and one including Heinke and Ho, saw that something else must be going on in Cas A.
Almost simultaneously, both teams came to the same solution: The matter inside the neutron star is converting to a superfluid as astronomers watch. Heinke and Ho’s paper will appear in the Monthly Notices of the Royal Astronomical Society.
Here’s how it works: Normally, the laws of quantum mechanics dictate that a collection of neutrons can get only so cold, but no colder. But at extremely cold temperatures in the lab, or the extremely high pressures inside a neutron star, pairs of neutrons can link up. Together, the neutron pairs relax into the lowest energy state quantum physics allows, and convert to a superfluid.
“A superfluid is essentially a macroscopic quantum liquid, in which if you take any given particle in the fluid, it’s moving in essentially the same way as the particles around it,” said Bennett Link of the University of Montana, who was not involved in the new studies. “The whole system behaves as a quantum system even though it’s large in size.”
Superfluids flow without friction. On Earth, they can climb walls and escape from airtight containers. When the particles in a superfluid are charged, the fluid is a superconductor, which carries electricity with no resistance.
As the neutrons and protons in the neutron star link up to form superfluids, they release massive amounts of neutrinos. The mass exodus of neutrinos fleeing Cas A explains the rapid cooling, the physicists conclude.
The idea that neutron stars should contain superfluids had been around since the 1950s. Page and colleagues had even predicted theoretically that the core of Cas A in particular should be a superfluid.
“We knew that it was there, our models had it all included before, but we did not have the data to actually hang our coats on,” said Madappa Prakash of Ohio University, a coauthor on Page’s paper.
Page didn’t expect that superfluidity would actually show itself in Cas A. When he learned that Heinke and Ho had seen the star’s temperature drop precipitously, “I jumped and my head hit the ceiling,” he said.
Both teams knew the other group was working on the same idea, and raced in friendly competition to publish their theory first. Page’s team ended up winning the race by one day. Heinke and Ho were waiting for one more observation from Chandra, taken in November 2010, before submitting their paper for publication.
The papers differ only in the details. The two teams made different assumptions about how hot the neutrons were to begin with, so their calculations for the temperature at which the superfluid state is possible are different.
Both teams predict that Cas A will continue to cool down over the next 10 years.
“That allows people to test it against alternative hypotheses, such as, it’s some kind of episodic thing,” Link said. “If it’s still cooling at the same rate, that would give evidence for their hypothesis, that we are actually seeing a superfluid form.”
X-ray Image: NASA/CXC/UNAM/Ioffe/D.Page,P.Shternin et al; Optical Image: NASA/STScI; Illustration: NASA/CXC/M.Weiss
Stars are balls of glowing gas, with a nearly spherical shape. Accordingly, one would expect that when some stars explode as supernovae at the end of their lives, the resulting colossal fireballs should share this spherical symmetry. However, recent investigations are revealing that some of these events are not round. New data gathered at Calar Alto Observatory reinforce this surprising finding.
As we know from the Sun, stars are nearly perfect spheres of glowing gas. One might expect a star to retain this shape even when dramatic events happen during its lifetime. Therefore, both the slow, steady stellar winds from massive stars and the cataclysmic explosions called supernovae, in which some stars end their lives, were assumed to be symmetric: quasi-spherical clouds of matter expelled into space.
However, recent developments in the observation of supernovae are providing increasing evidence that the explosion of a (nearly round) star can result in a strongly deformed fireball.
Supernovae of various kinds
The most powerful stellar explosions are called supernovae. Their amazing luminosity makes them visible over huge intergalactic distances. Some supernovae arise as a result of the interaction of peculiar stars, white dwarfs, with other stars placed very close to them. These are the so-called thermonuclear supernovae. Other explosions, core-collapse or gravitational supernovae, happen when very massive stars die. These stars have consumed the fuel that makes them shine, the energy source that supports their internal structure against the tendency to shrink and collapse due to the pull of gravity. They suffer an energy crisis that leads to an extremely violent collapse and, after that, to an explosion of apocalyptic intensity.
We are now interested in one specific sub-class of gravitational supernovae: those labelled as "Type IIn supernovae." So far only three of them have been observed with techniques capable of providing information on the shape of the explosions. But, interestingly enough, in all three cases strong evidence of an asymmetric fireball has been found! The most recent of these studies was conducted by an international team of astronomers led by F. Patat (ESO, Garching, Germany), who observed supernova 2010jl in November 2010 using Calar Alto telescopes and instruments.
Supernova 2010jl scrutinized
Supernova 2010jl appeared in the constellation Leo during the first days of November 2010. Its host galaxy was UGC 5189A, a strangely shaped specimen, an example of a galaxy in strong tidal interaction with some neighbouring galaxies. Such interaction usually leads to an intense formation of new stars, the most massive of which will later explode as gravitational supernovae. The distance to UGC 5189A is estimated to be some 160 million light-years (49 megaparsecs). This means that, although the event was seen in November 2010, the explosion really took place 160 million years ago.
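The two distance figures given for UGC 5189A are consistent, as a quick unit conversion shows (using 1 parsec ≈ 3.2616 light-years):

```python
LY_PER_PC = 3.2616          # light-years per parsec

distance_mly = 160.0        # distance in millions of light-years
distance_mpc = distance_mly / LY_PER_PC   # megaparsecs
print(f"{distance_mly:.0f} million light-years = {distance_mpc:.0f} Mpc")
```

The result is about 49 megaparsecs, matching the figure in parentheses above; the light-year figure also gives the light-travel time directly, which is why the explosion actually happened 160 million years ago.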
Patat's team observed this explosion using a specific technique, called spectropolarimetry, which makes it possible to infer information on the shape of an object even though the object itself appears as a simple, tiny point at the telescope. They made use of the spectropolarimetric capabilities of the instrument CAFOS attached to the Zeiss 2.2 m Calar Alto reflector. In the course of these observations, the researchers analysed in detail the excellent performance of this instrument, which allowed them to deduce interesting details about the process of the stellar explosion.
Light propagates through space as a wave, a vibration of the electromagnetic field that can be compared to the waves produced when a stone is dropped on the surface of water. But water waves imply only vertical movements of the surface (up and down), while natural light waves oscillate in all possible planes: up-down, left-right, and all intermediate combinations; none of them is preferred over the others. Several physical mechanisms can, however, lead to an emission of light in which one of the oscillation directions is dominant: in these cases we speak of polarized light. All processes leading to polarization imply the existence of privileged directions in the emitter, i.e. a certain degree of asymmetry. The observations of SN 2010jl show, in the researchers' words, that light from the supernova "appears to be polarized at a very significant level across the whole spectral range; […] the level of polarization measured in SN 2010jl (~2%) is indicative of a substantial asphericity, of axial ratio ≤0.7."
Where does the asymmetry come from?
Gravitational supernovae arise from massive stars. In the case of SN 2010jl, it has been estimated that the parent star had a mass around thirty times that of the Sun, if not larger. Such heavy stars live wildly, consuming their resources rapidly and shining for only a few million years (short compared with the Sun's estimated total lifespan of some ten thousand million years). The intense energy output tears material out from the stellar surface. So, the star continuously emits not only energy but also some amount of matter, atomic and subatomic particles that constitute the stellar wind and form an envelope around the star. When the final hour comes and the star explodes as a supernova, the expanding fireball collides with this envelope, and emits light due to processes that happen both inside the hot gas and at the contact surface between the hot gas and the envelope.
In SN 2010jl, the processes responsible for the polarization of light are due to the interaction with the envelope. So, the question arises: is the asymmetry caused by an intrinsically non-spherical explosion, or are we facing a more symmetrical fireball interacting with an elongated envelope? In any case, both the explosion and the envelope come from the same almost spherical star. Rotation and magnetic fields are no doubt involved in the generation of the asymmetry, but further studies are needed to clarify this point. Calar Alto telescopes and instruments will be ready to help in this effort.
The Austrian research group led by physicist Rainer Blatt suggests a fundamentally novel architecture for quantum computation. They have experimentally demonstrated quantum antennas, which enable the exchange of quantum information between two separate memory cells located on a computer chip. This offers new opportunities to build practical quantum computers.
The researchers have published their work in the scientific journal Nature.
Six years ago scientists at the University of Innsbruck realized the first quantum byte -- a quantum computer with eight entangled quantum particles, a record that still stands. "Nevertheless, to make practical use of a quantum computer that performs calculations, we need a lot more quantum bits," says Prof. Rainer Blatt, who, with his research team at the Institute for Experimental Physics, created the first quantum byte in an electromagnetic ion trap. "In these traps we cannot string together large numbers of ions and control them simultaneously."
To solve this problem, the scientists have started to design a quantum computer based on a system of many small registers, which have to be linked. To achieve this, Innsbruck quantum physicists have now developed a revolutionary approach based on a concept formulated by theoretical physicists Ignacio Cirac and Peter Zoller. In their experiment, the physicists electromagnetically coupled two groups of ions over a distance of about 50 micrometers. Here, the motion of the particles serves as an antenna. "The particles oscillate like electrons in the poles of a TV antenna and thereby generate an electromagnetic field," explains Blatt. "If one antenna is tuned to the other one, the receiving end picks up the signal of the sender, which results in coupling." The energy exchange taking place in this process could be the basis for fundamental computing operations of a quantum computer.
Antennas amplify transmission
"We implemented this new concept in a very simple way," explains Rainer Blatt. In a miniaturized ion trap a double-well potential was created, trapping the calcium ions. The two wells were separated by 54 micrometers. "By applying a voltage to the electrodes of the ion trap, we were able to match the oscillation frequencies of the ions," says Blatt.
"This resulted in a coupling process and an energy exchange, which can be used to transmit quantum information." A direct coupling of two mechanical oscillations at the quantum level has never been demonstrated before. In addition, the scientists show that the coupling is amplified by using more ions in each well. "These additional ions function as antennas and increase the distance and speed of the transmission," says Rainer Blatt, who is excited about the new concept. This work constitutes a promising approach for building a fully functioning quantum computer.
"The new technology offers the possibility to distribute entanglement. At the same time, we are able to target each memory cell individually," explains Rainer Blatt. The new quantum computer could be based on a chip with many micro traps, where ions communicate with each other through electromagnetic coupling. This new approach represents an important step towards practical quantum technologies for information processing.
The quantum researchers are supported by the Austrian Science Fund FWF, the European Union, the European Research Council and the Federation of Austrian Industries Tyrol.
Sulfur is the sixth most abundant element on Earth and plays a key role in many geological and biological processes. A French-German team including CNRS (1) and the Université Paul Sabatier has identified, on the basis of laboratory measurements, a novel form of sulfur present in geological fluids: the S3- ion. The discovery calls existing theories about the geological transport of sulfur into question, and could provide ways of identifying new deposits of precious metals such as gold and copper.
These findings are published in the Feb. 25, 2011 issue of the journal Science.
Until now, geochemists believed that inside Earth, only two forms of molecules contained sulfur: sulfides (based on H2S or S2-) and sulfates (based on H2SO4 or SO42-). Yet they had no way of directly plunging a probe into the hydrothermal fluids (2) that flow through rocks to verify this theory. To get round this problem and test their hypothesis, the French-German team first created fluids similar to those in Earth's crust and mantle, i.e. aqueous solutions containing elementary sulfur (S) and thiosulfates (molecules containing the S2O32- ion). They then used a diamond anvil cell to bring the fluids to the temperatures and pressures found at depths of several kilometers (several hundred degrees and tens of thousands of atmospheres).
The researchers used an optical method known as Raman spectroscopy to identify the chemical species, and they were astounded to discover not two, but three forms of sulfur, the third being the trisulfur ion S3-. This was a double surprise: although S3- was already known to chemists (it is found in sulfur-containing silicate glass and ultramarine pigments for instance), it had never been observed in an aqueous solution.
The detection of S3- during these experiments means that sulfur must be considerably more mobile in hydrothermal fluids in Earth's crust than was previously thought. This is because, unlike sulfides and sulfates, which attach to minerals as soon as they appear in fluids, S3- proves to be extremely stable in the aqueous phase. In other words, below ground these ions must flow for long distances in dissolved form, taking with them the noble metals to which they may be bound. This chemical species may therefore be the main metal transporting agent in two major types of gold and copper deposits: Archaean greenstone belts (3) and subduction zone magmas.
This discovery could provide additional indicators in the search for new deposits, by helping geologists to identify the pathways along which metals travel prior to forming veins. In addition, the presence of S3- in hydrothermal fluids could affect sulfur isotope fractionation models (a sort of equivalent to the carbon-14 dating technique), which until now have taken no account of this chemical species. These new findings could for instance help scientists to find out more about the geological conditions in Earth's crust and on its surface shortly after the appearance of life.
1. Laboratoire 'Géosciences Environnement Toulouse' (CNRS/Université Paul Sabatier/IRD) and Bayerisches Geoinstitut/University of Bayreuth.
2. A hydrothermal fluid is a natural hot, aqueous fluid whose temperature usually exceeds 100°C.
3. These rocks formed during the Archaean era, between -4 and -2.5 billion years ago.
Friday, February 25, 2011
Even a regional nuclear war could spark "unprecedented" global cooling and reduce rainfall for years, according to U.S. government computer models.
Widespread famine and disease would likely follow, experts speculate.
During the Cold War a nuclear exchange between superpowers—such as the one feared for years between the United States and the former Soviet Union—was predicted to cause a "nuclear winter."
In that scenario hundreds of nuclear explosions spark huge fires, whose smoke, dust, and ash blot out the sun for weeks amid a backdrop of dangerous radiation levels. Much of humanity eventually dies of starvation and disease.
Today, with the United States the only standing superpower, nuclear winter is little more than a nightmare. But nuclear war remains a very real threat—for instance, between developing-world nuclear powers, such as India and Pakistan.
To see what climate effects such a regional nuclear conflict might have, scientists from NASA and other institutions modeled a war involving a hundred Hiroshima-level bombs, each packing the equivalent of 15,000 tons of TNT—just 0.03 percent of the world's current nuclear arsenal.
The researchers predicted the resulting fires would kick up roughly five million metric tons of black carbon into the upper part of the troposphere, the lowest layer of the Earth's atmosphere.
In NASA climate models, this carbon then absorbed solar heat and, like a hot-air balloon, quickly lofted even higher, where the soot would take much longer to clear from the sky.
Reversing Global Warming?
The global cooling caused by these high carbon clouds wouldn't be as catastrophic as a superpower-versus-superpower nuclear winter, but "the effects would still be regarded as leading to unprecedented climate change," research physical scientist Luke Oman said during a press briefing Friday at a meeting of the American Association for the Advancement of Science in Washington, D.C.
Earth is currently in a long-term warming trend. After a regional nuclear war, though, average global temperatures would drop by 2.25 degrees F (1.25 degrees C) for two to three years afterward, the models suggest.
At the extreme, the tropics, Europe, Asia, and Alaska would cool by 5.4 to 7.2 degrees F (3 to 4 degrees C), according to the models. Parts of the Arctic and Antarctic would actually warm a bit, due to shifted wind and ocean-circulation patterns, the researchers said.
After ten years, average global temperatures would still be 0.9 degree F (0.5 degree C) lower than before the nuclear war, the models predict.
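The Fahrenheit and Celsius anomalies quoted above are consistent with each other; converting a temperature difference needs only the 9/5 scale factor, since the +32 offset applies to absolute temperatures, not to anomalies. A quick sketch, using the values reported in the article:

```python
def delta_c_to_f(delta_c):
    """Convert a temperature difference from Celsius to Fahrenheit.

    A difference needs only the 9/5 scale factor; the +32 offset
    applies to absolute temperatures, not to anomalies.
    """
    return delta_c * 9 / 5

# Anomalies reported in the article (degrees Celsius)
print(delta_c_to_f(1.25))  # global average drop: 2.25 F
print(delta_c_to_f(3))     # regional extreme, low end: 5.4 F
print(delta_c_to_f(4))     # regional extreme, high end: 7.2 F
print(delta_c_to_f(0.5))   # residual cooling after ten years: 0.9 F
```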
Years Without Summer
For a time Earth would likely be a colder, hungrier planet.
"Our results suggest that agriculture could be severely impacted, especially in areas that are susceptible to late-spring and early-fall frosts," said Oman, of NASA's Goddard Space Flight Center in Greenbelt, Maryland.
"Examples similar to the crop failures and famines experienced following the Mount Tambora eruption in 1815 could be widespread and last several years," he added. That Indonesian volcano ushered in "the year without summer," a time of famines and unrest.
All these changes would also alter circulation patterns in the tropical atmosphere, reducing precipitation by 10 percent globally for one to four years, the scientists said. Even after seven years, global average precipitation would be 5 percent lower than it was before the conflict, according to the model.
In addition, researcher Michael Mills, of the National Center for Atmospheric Research in Colorado, found large decreases in the protective ozone layer, leading to much more ultraviolet radiation reaching Earth's surface and harming the environment and people.
"The main message from our work," NASA's Oman said, "would be that even a regional nuclear conflict would have global consequences."
Before succumbing to her legendary death-by-snake in 30 B.C., Cleopatra VII, last queen of Egypt, gave birth to twins.
Alexander Helios and Cleopatra Selene II were born in 40 B.C., two of the eight children sired by Roman general Mark Antony during his lifetime.
As it happens, the asteroid 216 Kleopatra also had twins: Two small moons were recently found orbiting the space rock.
Kleopatra the asteroid was discovered in 1880, waaaay out in the so-called main belt, between the orbits of Mars and Jupiter.
Using Earth-based telescopes in Chile and Puerto Rico, scientists discovered in 2000 that Kleopatra has two lobes connected by a long midsection, reminiscent of a dog's bone—albeit one that's 135 miles (217 kilometers) long.
Red Rover, Red Rover, let Kleo come over?
Radar picture courtesy Stephen Ostro et al. (JPL), Arecibo Radio Telescope, NSF, NASA
The radar images that revealed her curves also told scientists that Kleopatra is most likely made of metal, with a loose, gravelly surface texture. But the asteroid's interior structure remained a mystery.
In 2008 higher resolution pictures taken with a telescope in Hawaii confirmed the dog-bone shape and uncovered the two orbiting offspring, each about five miles (eight kilometers) wide.
Just as children's behavior can say a lot about a parent, the motions of these moons can be used to decipher the properties of the ancient asteroid.
Writing in this month's issue of the journal Icarus, astronomers based in California, Texas, and France carefully charted the orbits of the two small moons.
Given the moons' paths, sizes, shapes, and masses, the team could determine that Kleopatra must have a density of 3.6 grams per cubic centimeter.
Assuming the asteroid really is metallic, that means it's not very dense for its size ... somewhere between 30 and 50 percent empty space.
In other words, it seems mighty Kleopatra is just a porous pile of rubble.
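That porosity estimate follows from comparing the bulk density with the density of solid metal. A minimal sketch, assuming illustrative solid iron-nickel densities of roughly 5 to 7 grams per cubic centimeter (the exact values the study authors used are not given in this article):

```python
def porosity(bulk_density, solid_density):
    """Fraction of empty space implied when an object's bulk density
    is lower than the density of its solid constituent material."""
    return 1.0 - bulk_density / solid_density

KLEOPATRA_BULK = 3.6  # g/cm^3, derived from the moons' orbits

# Illustrative solid densities for iron-nickel material (assumption,
# not from the paper); these roughly bracket the 30-50 percent figure.
for solid in (5.0, 6.0, 7.0):
    print(f"solid {solid} g/cm^3 -> {porosity(KLEOPATRA_BULK, solid):.0%} empty space")
```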
This makes perfect sense, the study authors say. Kleopatra likely has its odd shape because it's a product of multiple collisions.
To start, a solid metallic asteroid was smashed to smithereens in a violent collision with another of its kind millions to billions of years ago. The debris coalesced into a rubble-pile asteroid.
A second, angled impact sent the hunk of loose material spinning rapidly about a hundred million years ago, causing the body to stretch into its current shape and, over time, to cast off the moons.
Feb. 25, 2011 -- Brown-eyed, bearded, furrow faced, and tired: this is how Ötzi the Iceman might have looked, according to the latest reconstruction based on 20 years of research and investigations.
Created by two Dutch experts, Alfons and Adrie Kennis, the model was produced with the latest in forensic mapping technology, using three-dimensional images of the mummy's skull as well as infrared and tomographic images.
The new reconstruction shows a prematurely old man, with deep-set eyes, sunken cheeks, a furrowed face and ungroomed beard and hair.
Although he looks tired, Ötzi has vivid brown eyes. Indeed, recent research on the 5,300-year-old mummy has shown that the Stone Age man did not have blue eyes as previously thought.
Believed to have died around the age of 45, Ötzi was about 1.60 meters (5 foot, 3 inches) tall and weighed 50 kilograms (110 pounds).
The model will go on display beginning March 1 to Jan. 15, 2012, at the South Tyrol Museum of Archaeology in Bolzano, Italy.
Called "Ötzi 20," the exhibition celebrates the 20th anniversary of the mummy’s discovery. The Iceman’s frozen body was found in a melting glacier in the Ötzal Alps -- hence the Ötzi name -- on Sept. 19, 1991.
By Rossella Lorenzi
See it now, folks! Between the Bearded Lady and the Monster Spider -- the World's Smallest Computer!
Seriously, the world's smallest computer. It barely covers the “N” on a penny. The prototypical sensor device developed by researchers at the University of Michigan is intended to monitor eye pressure for glaucoma patients. It connects wirelessly to other computers and is charged with a solar cell, needing just 1.5 hours of sunlight or 10 hours of indoor light to reach full power.
The mini-computer has potential for a plethora of applications, from sensing pollution and monitoring structural integrity to tracking and surveillance -- virtually any way one could think of to make an object “smart.” Furthermore, researchers can control the size and shape of the computer's antenna, dictating how it communicates with other devices. So the chip's radio doesn't need tuning from the outside; a number of the tiny computers could automatically start talking to each other as soon as they turn on, as long as they are built to pick up the same frequency.
The size of this gadget may seem shocking, but a computer science prophet, Gordon Bell, predicted in the 1970s the evolution of computer “classes,” such that this revelation is, perhaps, to be expected. Bell's Law says that roughly every decade a new class of computer -- think PC, netbook, smartphone -- enters the stage and changes the industry. This law is often presented as a corollary to Moore's Law, which states that about every two years, the number of transistors that can fit on an integrated circuit -- essentially the computing power of a chip -- doubles.
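The compounding implied by that doubling rule is easy to make concrete; a minimal sketch of Moore's Law as stated above, normalized to a starting count of 1:

```python
def transistors_after(years, start=1.0, doubling_period=2.0):
    """Transistor count after a span of years, doubling every
    doubling_period years (Moore's Law as stated above)."""
    return start * 2 ** (years / doubling_period)

# One decade of doubling every two years is a 32-fold increase --
# roughly the generational jump between Bell's computer classes.
print(transistors_after(10))  # 32.0
```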
If Daft Punk were writing this blog, they might have titled it “Smaller, Better, Faster, Stronger.”
Greg Chen/University of Michigan
Probably one of the highest-risk, highest-reward activities in modern science is being conducted by a very small group of astronomers: the search for signals from extraterrestrial civilizations (SETI). Because they are trying to answer a purely hypothetical question, SETI astronomers certainly have detractors who wonder whether the pursuit is worth even a modest investment.
But answering the question “are we alone?” would have a profound cultural and theological impact on our view of our place in the universe.
A panel of experts pondering this question at the recent meeting of the American Association for the Advancement of Science (AAAS) in Washington, D.C., came down at opposite ends of the universe.
Emphasizing that radio and optical searches are growing exponentially, Seth Shostak of the SETI Institute predicted contact with E.T. within 20 years, “if our precepts are correct.” In other words, SETI observations over the past two years have cast a bigger net over the galaxy than in the previous 50 years of searching.
Howard Smith of the Harvard-Smithsonian Center for Astrophysics was downright dour, however. He reiterated his strident thesis that was picked up by a British tabloid two weeks earlier: There's nobody out there. Intelligent life is highly improbable. Or, at least, it’s highly improbable we’ll ever find it, he said.
Miracles and the Fermi Paradox
The shortcoming of Smith’s hypothesis is that it is blatantly pre-Copernican thinking -- that Earth holds a special place in the universe. His conclusions subtly flirt with the idea we are the only fruit of God’s handiwork. And, in that context, he is eager to emphasize our critical need for stewardship over this planet. "We are probably alone and will have to solve our own problems," he said.
Astronomical discoveries over the past 400 years have consistently reasserted the Copernican Principle -- the latest being the Kepler Space Telescope’s harvest of over 1,200 planets orbiting other stars.
As Smith tried to whittle away at the number of potentially habitable Kepler exoplanets, Shostak couldn’t resist taking a shot. Extrapolating from the Kepler data, he estimated that there are at least 10 trillion trillion Earth-like planets in the entire universe. “You would have to believe in miracles if E.T. did not exist!” he asserted.
Where's Darth Vader?
Smith countered with the Fermi Paradox: if alien civilizations were all around us, some would be smart enough to travel faster than light, and they’d be here by now. So, if aliens exist at all, they are not that clever. Nor have they been able to come and conquer us, which would have some statistical probability in a universe teeming with infinite worlds. There’s gotta be at least one Darth Vader out there somewhere.
The esteemed Harvard science historian, Owen Gingerich, dismissed this debate by simply saying, “We cannot extrapolate from just one example of intelligent life.”
Nevertheless, this dialogue leaves me as optimistic as ever about finding E.T. In fact, I would say it is a 50/50 bet that SETI tells us that “we are not alone” before the great space observatories needed for conclusively finding Earth II are ever built. That is, assuming aliens use radio or optical transmissions for saying “hi.”
But what would happen next?
Show Me Your God And I'll Show You Mine
Shostak’s optimism is tempered by his belief that the first signal detected will not be readable, because larger radio telescopes with better time resolution would be needed to tease out frequency or amplitude modulation. And, even if that is accomplished, decoding the message content may remain elusive for many generations.
We will simply know that we are not alone. This will permanently change the trajectory of our world view in ways similar to the Copernican revolution, discovery of the New World, or Darwinism.
The AAAS participants pondered how finding E.T. would impact the great world religions. Surveys show that only 10 percent of religious people think that such a discovery would challenge their view of God. In fact, the popular evangelist Billy Graham believed in extraterrestrials.
The teachings of Islam are a bit ambivalent on this question, said Nidhal Guessoum of the American University of Sharjah, United Arab Emirates. The Koran says that because Allah is omnipotent, creation is ongoing in a universe full of grandeur. The Koran also describes Allah as “Lord of the Worlds,” and implies there are other Earths in the heavens.
But the Koran also paints an ultra-anthropic view of the universe. Humans are Allah’s lieutenants and put smack-dab at the center of his creation.
The Apple Test
The existence of E.T. would be more problematic in Christian theology.
In the “fall from Eden” as described in Genesis, the entire universe is cursed because of the Original Sin of Adam and Eve (a basic tenet of Catholicism). A sentient being living 10,000 light-years away may not take too kindly to this idea. Imagine: the alien is supposed to believe that it’s doomed to death and judgment because a small-cranium naked biped living on a subgiant rocky planet once bit into a spheroid of carbohydrates, sugars, and water.
The essence of Christianity is redemption through God’s sacrifice of his only son. Because aliens are not descended from Adam and Eve, must they be saved separately too? Or did they pass the Apple Test?
Jennifer Wiseman of NASA’s Goddard Space Flight Center is optimistic that finding E.T. would exemplify the greatness of God. “We would have a wider view of creation that embraces and integrates religious and non-religious ideas.”
Smith said that the precepts of an all-powerful creator in Judaism would accommodate the idea of life beyond Earth.
So, our first question for the aliens might be: “Got God?”
PayPal has frozen the account of a group that has been raising money for the legal defense of accused WikiLeaks source Bradley Manning, citing a failure to meet PayPal’s requirement for nonprofit groups.
According to Courage to Resist, a military veterans advocacy group that has been raising donations for Manning’s defense, PayPal froze the account after the group refused to link its PayPal account to its checking account, which would give the online payment provider access to funds in the checking account.
“We exchanged numerous e-mails and phone calls with the legal department and the office of executive escalations of PayPal,” said Jeff Paterson in a press release. “They said they would not unrestrict our account unless we authorized PayPal to withdraw funds from our organization’s checking account by default. Our accounting does not allow for this type of direct access by a third party, nor do I trust PayPal as a business entity with this responsibility given their punitive actions against WikiLeaks — an entity not charged with any crime by any government on Earth.”
The advocacy group has been raising funds for the Manning Support Network and has so far paid Manning’s defense attorney at least $50,000 from money that it raised on the soldier’s behalf. Paterson did not respond to a call for comment, but said in the press release that his group opened the PayPal account in 2006.
A spokesman for PayPal took issue with how the advocacy group has characterized the matter, saying this was not about Courage to Resist’s support for Manning. Company policy requires all nonprofit organizations with 501(c)(3) status to link their PayPal account to a bank account. That provides a clear audit trail if the IRS or another government agency ever raises questions about an organization’s nonprofit status.
“It’s pretty normal practice to be honest,” said PayPal spokesman Anuj Nayar. “It doesn’t normally cause the issues that it has caused in this case. We were very surprised to see the press release.”
He acknowledged that linking to the account allows PayPal to withdraw funds from the bank account as well, but said this is never done without authorization and is generally done only when PayPal has determined that a merchant or organization is engaged in fraud.
He added that the frozen account was not specifically a Bradley Manning legal defense account.
“The release makes it very much sound like they have a legal defense fund connected to PayPal and that’s what we’ve turned off,” Nayar said. “But there is no PayPal account for a Manning legal defense fund. It’s a Courage to Resist account.”
Asked why, if the Courage to Resist account was opened in 2006, PayPal hadn’t raised the issue of linking it to a bank account earlier, Nayar did not have an immediate response. He said only that nonprofit organizations are allowed to open accounts easily and quickly.
“We don’t limit them prior to opening and saying they’re a nonprofit before allowing them to open an account,” he said.
With regard to PayPal’s assertion that it’s only following company policy, Courage to Resist says it repeatedly requested and was refused formal documentation from PayPal describing its policy.
“They opted to apply an exceptional hurdle for us to clear in order to continue as a customer, whereas we have clearly provided the legally required information and verification,” the group wrote.
Manning’s defense is expected to cost about $115,000. In addition to the funds raised by Courage to Resist, WikiLeaks — after a protracted delay — contributed $15,100 to Manning’s defense in January.
Last December, PayPal froze the account of the Germany-based Wau Holland foundation, which manages the bulk of donations to WikiLeaks. PayPal asserted at the time that WikiLeaks was in violation of its terms of service.
“PayPal has permanently restricted the account used by WikiLeaks due to a violation of the PayPal Acceptable Use Policy, which states that our payment service cannot be used for any activities that encourage, promote, facilitate or instruct others to engage in illegal activity,” read a statement on PayPal’s website. “We’ve notified the account holder of this action.”
PayPal didn’t indicate the nature of the illegal activity that WikiLeaks allegedly promoted, but the move against WikiLeaks came after the site began publishing 250,000 State Department cables believed to have been obtained from Manning during the time he worked as an Army intelligence analyst in Iraq. Manning was arrested last May and is currently in custody at the U.S. Marine Corps’ brig in Quantico, Virginia, awaiting a hearing in his case.
Apple recently announced a new policy regarding digital subscription fees for publications distributed on the company’s various iPrefixed devices: “Gimme.”
No, no, I’m being unfair again, oversimplifying and abridging Apple’s actual stance in pursuit of cheap laughs. I told myself I’d stop doing that. So here, in Apple’s own words, is the actual new policy regarding digital subscriptions, and the fees thereof:
“Gimme the money.”
In legal terms, this means that companies and individuals selling subscriptions to their digital offerings — including America’s sweetheart, Wired magazine — must hand over 30 percent of the take, ideally in a burlap sack with a large dollar sign printed on it.
Furthermore — and this is what’s really braising everyone’s beef cheeks — you’re not allowed to offer your customers better deals outside Apple’s well-tended hegemony. In fact, your app can’t even direct users to a site where they’re able to buy your wares without Uncle Steve holding out his hand and clearing his throat meaningfully.
“This is actually good for everyone,” explained Apple’s junior vice president in charge of public relations, Some Liar. “This benefits consumers most of all,” he lied. “This isn’t about money, it’s about creating a consistent user experience.”
Then he died and went straight to hell, which is where liars go.
Many publishers reacted to the announcement by saying the terms would force them out of the digital-content business and back into print publishing, which is extremely profitable and will never become obsolete.
One reason Apple can charge such fees with no more shame than a sex mime is that so far the iPad’s most fearsome competition in the tablet market is the Motorola FabuPad, a $1,200 device that most industry experts agree is actually a Wooly Willy toy.
“You know, that thing where there’s some magnet shavings and you make it look like [Wooly] Willy has hair,” explained Ray Stevette, a senior consultant at The Sparkleshine Group.
Other entries into the tablet arena, unveiled at the recent Mobile World Congress in Barcelona, Spain, include the Samsung SlowPoke, the Microsoft RememberWhenPeopleUsedToBeScaredOfUs and the BlackBerry Torte, which is an actual blackberry torte.
The highlight of the show was the HP BestPad, which nobody was allowed to hold, touch or look at from any vantage point other than the two approved Press Holes. When a reporter for CNET pointed out that some glitter glue was flaking off the BestPad onto the supporting podium, the exhibit was closed, and all present were threatened.
The upshot of all this is that the iPad is set to remain the top-selling tablet device at least until someone figures out what tablet devices are actually for. This gives Steve Jobs the sort of leverage that would make Archimedes drool.
Jobs knows it, too. In his latest official statement, he said: “I am Plouton, the ancient god of wealth and the underworld. All who pass through my gates must pay the toll, and all must pass through my gates. My realm is eternal and inescapable. You are my chattel. Ha ha ha ha ha ha ha! [emphasis added].”
How severe can climate change become in a warming world? Worse than anything we've seen in written history, according to results of a study recently appearing in the journal Science.
An international team of scientists led by Curt Stager of Paul Smith's College, New York, has compiled four dozen paleoclimate records from sediment cores in Lake Tanganyika and other locations in Africa.
The records show that one of the most widespread and intense droughts of the last 50,000 years or more struck Africa and Southern Asia 17,000 to 16,000 years ago.
Between 18,000 and 15,000 years ago, large amounts of ice and meltwater entered the North Atlantic Ocean, causing regional cooling but also major drought in the tropics, says Paul Filmer, program director in the National Science Foundation's (NSF) Division of Earth Sciences, which funded the research along with NSF's Division of Atmospheric and Geospace Sciences and its Division of Ocean Sciences.
"The height of this time period coincided with one of the most extreme megadroughts of the last 50,000 years in the Afro-Asian monsoon region with potentially serious consequences for the Paleolithic humans that lived there at the time," says Filmer.
The "H1 megadrought," as it's known, was one of the most severe climate trials ever faced by anatomically modern humans.
Africa's Lake Victoria, now the world's largest tropical lake, dried out, as did Lake Tana in Ethiopia, and Lake Van in Turkey.
The Nile, Congo and other major rivers shriveled, and Asian summer monsoons weakened or failed from China to the Mediterranean, meaning the monsoon season carried little or no rainwater.
What caused the megadrought remains a mystery, but its timing suggests a link to Heinrich Event 1 (or "H1"), a massive surge of icebergs and meltwater into the North Atlantic at the close of the last ice age.
Previous studies had implicated southward drift of the tropical rain belt as a localized cause, but the broad geographic coverage in this study paints a more nuanced picture.
"If southward drift were the only cause," says Stager, lead author of the Science paper, "we'd have found evidence of wetting farther south. But the megadrought hit equatorial and southeastern Africa as well, so the rain belt didn't just move--it also weakened."
Climate models have yet to simulate the full scope of the event.
The lack of a complete explanation opens the question of whether an extreme megadrought could strike again as the world warms and de-ices further.
"There's much less ice left to collapse into the North Atlantic now," Stager says, "so I'd be surprised if it could all happen again--at least on such a huge scale."
Given what such a catastrophic megadrought could do to today's most densely populated regions of the globe, Stager hopes he's right.
Stager also holds an adjunct position at the Climate Change Institute, University of Maine, Orono.
Co-authors of the paper are David Ryves of Loughborough University in the United Kingdom; Brian Chase of the Institut des Sciences de l'Evolution de Montpellier in France and the Department of Archaeology, University of Bergen, Norway; and Francesco Pausata of the Geophysical Institute, University of Bergen, Norway.
Physicists at the National Institute of Standards and Technology (NIST) have for the first time coaxed two atoms in separate locations to take turns jiggling back and forth while swapping the smallest measurable units of energy. By directly linking the motions of two physically separated atoms, the technique has the potential to simplify information processing in future quantum computers and simulations.
Described in a paper published Feb. 23 by Nature, the NIST experiments enticed two beryllium ions (electrically charged atoms) to take turns vibrating in an electromagnetic trap, exchanging units of energy, or quanta, that are a hallmark of quantum mechanics. As little as one quantum was traded back and forth in these exchanges, signifying that the ions are "coupled" or linked together. These ions also behave like objects in the larger, everyday world in that they are "harmonic oscillators" similar to pendulums and tuning forks, making repetitive, back-and-forth motions.
"First one ion is jiggling a little and the other is not moving at all; then the jiggling motion switches to the other ion. The smallest amount of energy you could possibly see is moving between the ions," explains first author Kenton Brown, a NIST post-doctoral researcher. "We can also tune the coupling, which affects how fast they exchange energy and to what degree. We can turn the interaction on and off."
The experiments were made possible by a novel, one-layer ion trap cooled to minus 269 C (minus 452 F) with a liquid helium bath. The ions, 40 micrometers apart, float above the surface of the trap. In contrast to a conventional two-layer trap, the surface trap features smaller electrodes and can position ions closer together, enabling stronger coupling. Chilling to cryogenic temperatures suppresses unwanted heat that can distort ion behavior.
The energy swapping demonstrations begin by cooling both ions with a laser to slow their motion. Then one ion is cooled further to a motionless state with two opposing ultraviolet laser beams. Next the coupling interaction is turned on by tuning the voltages of the trap electrodes. In separate experiments reported in Nature, NIST researchers measured the ions swapping energy at levels of several quanta every 155 microseconds and at the single quantum level somewhat less frequently, every 218 microseconds. Theoretically, the ions could swap energy indefinitely until the process is disrupted by heating. NIST scientists observed two round-trip exchanges at the single quantum level.
To detect and measure the ions' activity, NIST scientists apply an oscillating pulse to the trap at different frequencies while illuminating both ions with an ultraviolet laser and analyzing the scattered light. Each ion has its own characteristic vibration frequency; when excited, the motion reduces the amount of laser light absorbed. Dimming of the scattered light tells scientists an ion is vibrating at a particular pulse frequency.
To turn on the coupling interaction, scientists use electrode voltages to tune the frequencies of the two ions, nudging them closer together. The coupling is strongest when the frequencies are closest. The motions become linked due to the electrostatic interactions of the positively charged ions, which tend to repel each other. Coupling associates each ion with both characteristic frequencies.
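The exchange just described has a close classical analog: two identical oscillators joined by a weak spring hand their motion back and forth at a rate set by the coupling strength. The Python sketch below (all parameters illustrative, not the NIST values) integrates such a pair and shows the energy that starts in one oscillator migrating almost entirely to the other after one swap period:

```python
import numpy as np

# Classical analog of the experiment: two identical harmonic oscillators
# joined by a weak spring. Energy placed in one oscillator migrates to the
# other and back at a "beat" rate set by the coupling -- the same
# back-and-forth exchange the ions show, minus the quantum discreteness.
k, kappa, dt = 1.0, 0.05, 0.01      # spring constant, coupling, time step
x = np.array([1.0, 0.0])            # oscillator 1 starts displaced
v = np.zeros(2)                     # both start at rest

def accel(x):
    # restoring force plus weak mutual coupling (unit masses)
    return -k * x - kappa * (x - x[::-1])

# time for a full energy swap: pi / (omega_antisym - omega_sym)
t_swap = np.pi / (np.sqrt(k + 2 * kappa) - np.sqrt(k))

a = accel(x)
for _ in range(int(t_swap / dt)):   # velocity-Verlet integration
    x = x + v * dt + 0.5 * a * dt**2
    a_new = accel(x)
    v = v + 0.5 * (a + a_new) * dt
    a = a_new

e = 0.5 * v**2 + 0.5 * k * x**2     # per-oscillator energy
print(e)                            # nearly all energy now in oscillator 2
```

Increasing `kappa` shortens the swap period, the classical counterpart of the tunable exchange rate the researchers control with electrode voltages; the quantum experiment differs in that the traded energy comes in discrete quanta.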
The new experiments are similar to the same NIST research group's 2009 demonstration of entanglement -- a quantum phenomenon linking properties of separated particles -- in a mechanical system of two separated pairs of vibrating ions. However, the new experiments coupled the oscillators' motions more directly than before and, therefore, may simplify information processing. In this case the researchers observed quantum behavior but did not verify entanglement.
The new technique could be useful in a future quantum computer, which would use quantum systems such as ions to solve problems that are intractable today. For example, quantum computers could break today's most widely used data encryption codes. Direct coupling of ions in separate locations could simplify logic operations and help correct processing errors. The technique is also a feature of proposals for quantum simulations, which may help explain the mechanisms of complex quantum systems such as high-temperature superconductors.
The demonstration also suggests that similar interactions could be used to connect different types of quantum systems, such as a trapped ion and a particle of light (photon), to transfer information in a future quantum network. For example, a trapped ion could act as a "quantum transformer" between a superconducting quantum bit (qubit) and a qubit made of photons.
University of British Columbia geophysicists are offering a new explanation for seismic tremors accompanying volcanic eruptions that could advance forecasting of explosive eruptions such as recent events at Mount Pinatubo in the Philippines, Chaiten Volcano in Chile, and Mount St. Helens in Washington State.
All explosive volcanic eruptions are preceded and accompanied by tremors that last from hours to weeks, and a remarkably consistent range of tremor frequencies has been observed by scientists before and during volcanic eruptions around the world.
However, the underlying mechanism for these long-lived volcanic earthquakes has never been determined. Most proposed explanations depend on the shape of the volcanic conduit -- the 'vent' or 'pipe' through which lava passes -- or on the gas content of the erupting magma, characteristics that vary greatly from volcano to volcano and are impossible to determine during or after volcanic activity.
Published this week in the journal Nature, the new model developed by UBC researchers is based on physical properties that most experts agree are common to all explosive volcanic systems, and applies to all shapes and sizes of volcanoes.
"All volcanoes feature a viscous column of dense magma surrounded by a compressible and permeable sheath of magma, composed mostly of stretched gas bubbles," says lead author Mark Jellinek, an associate professor in the UBC Department of Earth and Ocean Sciences.
"In our model, we show that as the center 'plug' of dense magma rises, it simply oscillates, or 'wags,' against the cushion of gas bubbles, generating tremors at the observed frequencies."
"Forecasters have traditionally seen tremors as an important -- if somewhat mysterious -- part of a complicated cocktail of observations indicative of an imminent explosive eruption," says Jellinek, an expert in Geological Fluid Mechanics. "Our model shows that in systems that tend to erupt explosively, the emergence and evolution of the tremor signal before and during an eruption is based on physics that are uniform from one volcano to another."
"The role of tremors in eruption forecasting has become tricky over the past decade, in part because understanding processes underlying their origin and evolution prior to eruption has been increasingly problematic," says Jellinek. "Because our model is so universal, it may have significant predictive power for the onset of eruptions that are dangerous to humans."
The research was co-led by Prof. David Bercovici of Yale University and was supported by the Canadian Institute for Advanced Research, the Natural Sciences and Engineering Research Council of Canada, and the U.S. National Science Foundation.
"Super skin" is what Stanford researcher Zhenan Bao wants to create. She's already developed a flexible sensor that is so sensitive to pressure it can feel a fly touch down. Now she's working to add the ability to detect chemicals and sense various kinds of biological molecules. She's also making the skin self-powering, using polymer solar cells to generate electricity. And the new solar cells are not just flexible, but stretchable -- they can be stretched up to 30 percent beyond their original length and snap back without any damage or loss of power.
Super skin, indeed.
"With artificial skin, we can basically incorporate any function we desire," said Bao, a professor of chemical engineering. "That is why I call our skin 'super skin.' It is much more than what we think of as normal skin."
The foundation for the artificial skin is a flexible organic transistor, made with flexible polymers and carbon-based materials. To allow touch sensing, the transistor contains a thin, highly elastic rubber layer, molded into a grid of tiny inverted pyramids. When pressed, this layer changes thickness, which changes the current flow through the transistor. The sensors have from several hundred thousand to 25 million pyramids per square centimeter, corresponding to the desired level of sensitivity.
To sense a particular biological molecule, the surface of the transistor has to be coated with another molecule to which the first one will bind when it comes into contact. The coating layer only needs to be a nanometer or two thick.
"Depending on what kind of material we put on the sensors and how we modify the semiconducting material in the transistor, we can adjust the sensors to sense chemicals or biological material," she said.
Bao's team has successfully demonstrated the concept by detecting a certain kind of DNA. The researchers are now working on extending the technique to detect proteins, which could prove useful for medical diagnostics purposes.
"For any particular disease, there are usually one or more specific proteins associated with it -- called biomarkers -- that are akin to a 'smoking gun,' and detecting those protein biomarkers will allow us to diagnose the disease," Bao said.
The same approach would allow the sensors to detect chemicals, she said. By adjusting aspects of the transistor structure, the super skin can detect chemical substances in either vapor or liquid environments.
Regardless of what the sensors are detecting, they have to transmit electronic signals to get their data to the processing center, whether it is a human brain or a computer.
Having the sensors run on the sun's energy makes generating the needed power simpler than using batteries or hooking up to the electrical grid, allowing the sensors to be lighter and more mobile. And having solar cells that are stretchable opens up other applications.
A recent research paper by Bao, describing the stretchable solar cells, will appear in an upcoming issue of Advanced Materials. The paper details the ability of the cells to be stretched in one direction, but she said her group has since demonstrated that the cells can be designed to stretch along two axes.
The cells have a wavy microstructure that extends like an accordion when stretched. A liquid metal electrode conforms to the wavy surface of the device in both its relaxed and stretched states.
"One of the applications where stretchable solar cells would be useful is in fabrics for uniforms and other clothes," said Darren Lipomi, a graduate student in chemical engineering in Bao's lab and lead author of the paper.
"There are parts of the body, at the elbow for example, where movement stretches the skin and clothes," he said. "A device that was only flexible, not stretchable, would crack if bonded to parts of machines or of the body that extend when moved." Stretchability would be useful in bonding solar cells to curved surfaces without cracking or wrinkling, such as the exteriors of cars, lenses and architectural elements.
The solar cells continue to generate electricity while they are stretched out, producing a continuous flow of electricity for data transmission from the sensors.
Bao said she sees the super skin as much more than a super mimic of human skin; it could allow robots or other devices to perform functions beyond what human skin can do.
"You can imagine a robot hand that can be used to touch some liquid and detect certain markers or a certain protein that is associated with some kind of disease and the robot will be able to effectively say, 'Oh, this person has that disease,'" she said. "Or the robot might touch the sweat from somebody and be able to say, 'Oh, this person is drunk.'"
Finally, Bao has figured out how to replace the materials used in earlier versions of the transistor with biodegradable materials. Now, not only will the super skin be more versatile and powerful, it will also be more eco-friendly.
Discovery of Oldest Northern North American Human Remains Provides New Insights Into Ice-Age Culture
Scientists have discovered the cremated skeleton of a Paleoindian child in the remains of an 11,500-year-old house in central Alaska. The findings reveal a slice of domestic life that has been missing from the record of the region's early people, who were among the first to settle the Americas.
The discovery, by Ben Potter of the University of Alaska Fairbanks and colleagues, appears in the 25 February issue of the journal Science.
"The site is truly spectacular in all senses of the word," Potter said. "The cremation is quite significant, but the context of the find is important too."
In contrast to the temporary hunting camps and other specialized work sites that have produced much of the evidence of North America's early habitation, the newly discovered house appears to have been a seasonal home, used during the summer. Its inhabitants, who included women and children, foraged for fish, birds and small mammals nearby, according to Potter's team.
"Before this find we knew people were hunting large game like bison or elk with sophisticated weapons, but most of the sites we had to study were hunting camps. But here we know there were young children and females. So, this is a whole piece of the settlement system that we had virtually no record of," he said.
"As part of the Beringian Land Bridge, Alaska was an important crossroads for the Old and the New Worlds. This study makes an important contribution to our understanding of the early inhabitants of Beringia and their culture," said Brooks Hanson, Deputy Editor, Physical Sciences, at Science.
The young child probably died -- it's not clear how -- before being cremated in a large pit in the center of the home. This pit had many purposes, including cooking and waste disposal. After the cremation, the pit was sealed up and the house was abandoned, the researchers report.
The name of the site where this discovery took place, "Upward Sun River," is a translation of a nearby Athabaskan placename, Xaasaa Na'. The site lies within a dune field in the boreal forest of the Tanana lowlands. The child has been named Xaasaa Cheege Ts'eniin (or Upward Sun River Mouth Child) by the local Native community, the Healy Lake Tribe.
The house's floor was dug about 27 centimeters below the original ground surface. Colored stains in the sediment suggest that poles may have been used to support the walls or roof, though it's not clear what the latter would have been made of. The entire house has not yet been fully excavated, so its total size is still unknown.
The pit at the center was oval-shaped and about 45 centimeters deep. In sediment layers beneath the skeleton, the researchers found bones of salmon, ground squirrels, ptarmigan and other small animals. The skeleton was a particular surprise, since no human remains older than a few hundred years have ever been found in Subarctic Alaska.
Only about 20 percent of the burned skeleton was preserved. The remains don't reveal the child's sex, but they do include teeth, which allowed the researchers to conclude the child was around three years old. The remains showed no signs of injury or illness, though that isn't surprising, since most health problems don't leave traces in bones.
Potter's team didn't find any objects that were clearly grave goods. The researchers did excavate two pieces of red ochre along with the skeleton, but their significance is unclear. While red ochre has been part of burials around the world, it also has many other uses.
This lack of symbolic objects is typical for a mobile hunter-gatherer society like the one at Upper Sun River, according to Potter. It should not be interpreted as a sign that the child's death was treated casually, Potter said.
"All the evidence indicates that they went through some effort. The burial was within the house. If you think of the house as the center of many residential activities -- cooking, eating, sleeping -- and the fact that they abandoned the house soon after the cremation, this is pretty compelling evidence of the careful treatment of the child," Potter said.
While the findings certainly provoke questions about the story of this particular death, for Potter and other archeologists, the site is perhaps even more valuable for what it says broadly about the lifestyles of the early people who lived in the region.
Although many of the specifics are still under debate, researchers generally believe that the first people in North America came across the Bering Land Bridge from Siberia some time near the end of the last ice age, around 13,000 years ago or earlier. Archaeological evidence from this time period is scanty, however, especially in the northern regions adjacent to the Bering Sea, known as Beringia.
Scientists have discovered only a handful of houses from the continent's first 2,000 years of human occupation. And, except for the one at Upward Sun River, those houses are in the lower 48 states or at Ushki Lake in Siberia. Ushki Lake also includes the only known burial site from this time period in Beringia.
The stone tools from contemporaneous sites in central Alaska fit into a category known as microblade technology, which consists of small, stone, razor-blade-like pieces set into larger organic points. In contrast, the better-known Clovis people of central North America did not make microblades. In fact, the stone artifacts, along with the house structure and the types of animal remains found at Upward Sun River, appear more similar to those of Siberia's Ushki Lake than to anything from the lower 48 states.
"We've got this basic technological organization system that links Alaska with the Old World," Potter said.
Researchers have debated whether the people in central Alaska during the late Pleistocene and early Holocene were all part of one larger cultural group or whether they belonged to different groups. The tools and other remains at Upward Sun River, and their similarities to some others in the region, support the former scenario, Potter and his colleagues say.
Differences exist among the sites, but these may reflect this people's versatility, with different members carrying out different tasks, such as hunting large game or foraging for small mammals and birds, during different times of year, the researchers argue.
Throughout the excavation, Potter's group worked closely with leaders of the Healy Lake Tribe and other native groups that live near the Upward Sun River site.
"Our consultation with the local Native groups is not only an ethical imperative in archaeology today, it has been a fulfilling and productive partnership, from my perspective," said Potter.
"We strove to be diligent with full and open negotiations from the time of discovery and before, and we have worked together to build a foundation for continued work on this find and for future discoveries."
This research was supported by the National Science Foundation.
Planet Formation in Action? Astronomers May Have Found First Object Clearing Its Path in Natal Disc Surrounding a Young Star
Using ESO's Very Large Telescope an international team of astronomers has been able to study the short-lived disc of material around a young star that is in the early stages of making a planetary system. For the first time a smaller companion could be detected that may be the cause of the large gap found in the disc. Future observations will determine whether this companion is a planet or a brown dwarf.
Planets form from the discs of material around young stars, but the transition from dust disc to planetary system is rapid and few objects are caught during this phase. One such object is T Chamaeleontis (T Cha), a faint star in the small southern constellation of Chamaeleon that is comparable to the Sun, but very near the beginning of its life. T Cha lies about 350 light-years from Earth and is only about seven million years old. Up to now no forming planets have been found in these transitional discs, although planets in more mature discs have been seen before.
"Earlier studies had shown that T Cha was an excellent target for studying how planetary systems form," notes Johan Olofsson (Max Planck Institute for Astronomy, Heidelberg, Germany), one of the lead authors of two papers in the journal Astronomy & Astrophysics that describe the new work. "But this star is quite distant and the full power of the Very Large Telescope Interferometer (VLTI) was needed to resolve very fine details and see what is going on in the dust disc."
The astronomers first observed T Cha using the AMBER instrument and the VLT Interferometer (VLTI). They found that some of the disc material formed a narrow dusty ring only about 20 million kilometres from the star. Beyond this inner disc they found a region devoid of dust, with the outer part of the disc beginning about 1.1 billion kilometres from the star.
Nuria Huélamo (Centro de Astrobiología, ESAC, Spain), the lead author of the second paper takes up the story: "For us the gap in the dust disc around T Cha was a smoking gun, and we asked ourselves: could we be witnessing a companion digging a gap inside its protoplanetary disc?"
However, finding a faint companion so close to a bright star is a huge challenge and the team had to use the VLT instrument NACO in a novel and powerful way, called sparse aperture masking, to reach their goal. After careful analysis they found the clear signature of an object located within the gap in the dust disc, about one billion kilometres from the star -- slightly further out than Jupiter is within our Solar System and close to the outer edge of the gap. This is the first detection of an object much smaller than a star within a gap in the planet-forming dust disc around a young star. The evidence suggests that the companion object cannot be a normal star, but it could be either a brown dwarf surrounded by dust or, most excitingly, a recently formed planet.
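For scale, the distances above can be restated in astronomical units (1 AU is about 149.6 million kilometres, the Earth-Sun distance). The short conversion below is a sanity check on the "slightly further out than Jupiter" comparison, since Jupiter orbits at about 5.2 AU:

```python
AU_KM = 1.496e8    # kilometres per astronomical unit

# Distances quoted in the article, restated in Solar-System terms
inner_ring = 20e6 / AU_KM    # narrow dusty ring
outer_disc = 1.1e9 / AU_KM   # inner edge of the outer disc
companion = 1.0e9 / AU_KM    # newly detected companion

print(f"ring {inner_ring:.2f} AU, outer disc {outer_disc:.1f} AU, "
      f"companion {companion:.1f} AU")
# ring lies well inside Mercury's 0.39 AU orbit;
# companion sits beyond Jupiter's 5.2 AU, near the gap's outer edge
```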
Huélamo concludes: "This is a remarkable joint study that combines two different state-of-the-art instruments at ESO's Paranal Observatory. Future observations will allow us to find out more about the companion and the disc, and also understand what fuels the inner dusty disc."
The transitional discs can be spotted because they give off less radiation at mid-infrared wavelengths. The clearing of the dust close to the star and the creation of gaps and holes can explain this missing radiation. Recently formed planets may have created these gaps, although there are also other possibilities.
T Cha is a T Tauri star, a very young star that is still contracting towards the main sequence.
The astronomers used the AMBER instrument (Astronomical Multi-BEam combineR) and the VLTI to combine the light from all four of the 8.2-metre VLT Unit Telescopes and create a "virtual telescope" 130 metres across.
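The 130-metre baseline matters because an interferometer's angular resolution scales roughly as wavelength divided by baseline. The sketch below estimates that limit, assuming observation in the 2.2-micron near-infrared band (an illustrative choice based on the band mentioned below; the actual AMBER setup may differ):

```python
# Diffraction-limit estimate: theta ~ wavelength / baseline
wavelength = 2.2e-6        # metres (near-infrared K band, illustrative)
baseline = 130.0           # metres ("virtual telescope" from the note)

theta_rad = wavelength / baseline                 # ~1.7e-8 radians
theta_mas = theta_rad * 206264.8 * 1000           # radians -> milliarcseconds

# Linear size this angle subtends at T Cha's ~350 light-year distance
d_m = 350 * 9.461e15                              # light-years -> metres
scale_km = theta_rad * d_m / 1000

print(f"resolution ~{theta_mas:.1f} mas, ~{scale_km:.1e} km at the star")
```

The result, a few milliarcseconds (tens of millions of kilometres at T Cha's distance), is coarser than the 20-million-kilometre inner ring, which is why such compact structure is in practice characterized by fitting models to the interferometric data rather than by direct imaging.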
NACO (or NAOS-CONICA in full) is an adaptive optics instrument attached to ESO's Very Large Telescope. Thanks to adaptive optics, astronomers can remove most of the blurring effect of the atmosphere and obtain very sharp images. The team used NACO in a novel way, called sparse aperture masking (SAM), to search for the companion. This is a type of interferometry that, rather than combining the light from multiple telescopes as the VLTI does, uses different parts of the mirror of a single telescope (in this case, the mirror of the VLT Unit Telescope 4). This new technique is particularly good for finding faint objects very close to bright ones. VLTI/AMBER is better suited to studying the structure of the inner disc and is less sensitive to the presence of a distant companion.
The astronomers searched for the companion using NACO in two different spectral bands -- at around 2.2 microns and at 3.8 microns. The companion is only seen at the longer wavelength, which means that the object is either cool, like a planet, or a dust-shrouded brown dwarf.
Brown dwarfs are objects between stars and planets in size. They are not massive enough to fuse hydrogen in their cores but are larger than giant planets such as Jupiter.