Crocs Uncover

Bizarre Species

Saturday, April 24, 2010

World War III: Google vs. Governments


In how many ways did Google respond to this week’s letter (PDF) from the data protection authorities of ten countries criticizing the company’s approach to privacy?

The first response came in the form of a statement from Google’s official PR operation.

Predictably, it was blander than bland. “We try very hard to be upfront about the data we collect, and how we use it,” the statement suggested. “Of course we do not get everything 100 percent right — that is why we acted so quickly on [Google] Buzz.”

This is the happy-clappy voice of Google’s PR operation, a vast exercise in passive aggression designed to prop up the perception that Eric, Larry and Sergey really, really, really, don’t want to do evil.

Much more interesting was the verbal assessment a PR representative offered up to the Wall Street Journal.

“We have discussed all these issues publicly many times before and have nothing to add to the letter,” the spokesperson told the Journal.

Here, I think we’re getting closer to what Google really feels. The tone is tetchy. The underlying message is clear: Google would like the letter’s signatories to go forth and multiply.

Yet this response is also a problem for Google. The letter it received was signed by data protection bureaucrats in Canada, France, Germany, Israel, Italy, Ireland, the Netherlands, New Zealand, Spain and the UK.

You’ll notice that the US isn’t on the list. So ask yourself how the spokesperson who talked to the Journal would have responded if the US had been among the signatories.

Would she have taken the time to discuss “all these issues” once again? You bet she would. Would she have suggested that responding to the letter was pointless? I doubt it.

At this point, it’s fairly obvious that Google has made at least three mistakes.

First, there’s the use of corporate weasel words in the original statement. When Google uses this voice to talk about important issues, it sounds just like most other large companies. The desire to disagree with something without provoking an equal and opposite reaction is a very specific form of corporate cowardice. My sense is that Google — as it grows larger — is starting to do this more often.

Next, there’s the US-centric view of the world that leads a spokeswoman to dismiss the letter during a discussion with the Wall Street Journal.

Google is not immune to the deep parochialism of Silicon Valley. Yet this is a company that generates over half of its revenues outside the U.S.

On this basis, it’s surely madness to dismiss the arguments of ten states whose representatives are deeply dissatisfied with Google’s position on privacy.

If you doubt this, listen to the supporting commentary offered by Jacob Kohnstamm, the chairman of the European Commission’s Article 29 working party on data protection. The letter, Kohnstamm told journalists, represents a “last warning to the online world”.

Finally, there’s the tedium-filled hope that the awkward squad will simply shut up and go away.

This, of course, will not happen. In the end, Google will become a regulated quasi-utility. It’s easy to suggest why this should happen, but profoundly difficult to imagine how. Yet where there’s a will, the political elites will eventually find a way.

The key challenge for Google involves slowing down the process. Arguably, this now matters more to shareholders than new product development. Regulated companies make smaller profits than you’d otherwise expect. For a company whose shares still trade at 25 times earnings, this is a fate to be avoided.

In this respect, Google has generally performed well so far. Unlike Microsoft in the 1990s, the company hasn’t allowed indignation to tip over into public aggression toward its opponents.

In no small part, I suspect that this is attributable to Eric Schmidt, who witnessed Microsoft’s fatal weakness for confrontation at close quarters as the chief executive of Novell and CTO of Sun Microsystems during the 1990s.

When dealing with critics in public, Schmidt frequently behaves as if he has listened to their complaints. This is a habit that Bill Gates never acquired.

Schmidt also delivers bullshit with aplomb. When he announced the launch of Fast Flip, he made it sound like the future of news. Clearly, it wasn’t. Yet tokens like these slow down the pursuing pack, sowing division among the ranks.

Subtlety comes naturally to Schmidt. Asked to respond to the latest of Rupert Murdoch’s copyright tirades a few weeks ago, Schmidt remarked dryly that it’s best to look at Rupert’s comments “in context of a business negotiation”.

This response worked well, combining economic rationalism with a tease for the assembled journalists and criticism of Murdoch’s vaudeville style.

In many ways, it’s a shame that Schmidt can’t be seconded to run Google’s PR operation. Arguably, he’d add more value to the company in this role.

This week, for example, Schmidt would probably have drawn attention to what I initially fancied to be Google’s third response to the privacy broadside it received on Monday.

On Google’s corporate pages this week, there was no missing the news about the Government Requests Tool, a new initiative from the office of David Drummond, the company’s top lawyer.

Among other things, the Requests Tool clarifies how frequently governments ask Google to hand over users’ personal data (mostly to assist with law enforcement investigations). The results can be interpreted as an index of governmental willingness to invade privacy.

Here’s the league table for the period between July and December last year for the signatories of Tuesday’s letter:

UK 3rd 1,166 requests

France 5th 846 requests

Italy 6th 550 requests

Germany 7th 458 requests

Spain 8th 324 requests

The Netherlands 13th 67 requests

Canada 19th 41 requests

Israel 20th 30 requests

The connection between these rankings and the letter from data protection bureaucrats is clear. Eight of the ten governments that complain so vociferously about Google’s privacy policies also appear on this list of governments that take a keen interest in their citizens’ private communications.

The point is worth arguing. Governments might be keen on regulating Google, but they also represent a huge threat to privacy in their own right. Google itself acts as a restraining influence on their behaviour. This week, the company confirmed that it sometimes “refuses to produce information” or “tries to narrow” the requests for data made by governments.

If Eric Schmidt had been running Google’s PR operation, I suspect he would have used this league table as a riposte to the privacy commissioners.

But he isn’t, and it wasn’t. The fact that the privacy letter and Drummond’s tool became news stories on the same day appears to have been coincidence.

We know this because the ham-fisted spokeswoman who talked to the Wall Street Journal described the timing as “ironic”.

The best PR operators grab passing circumstances and impose order upon them. This week, Google failed to do either.

NASA Solar Observatory's First Shots


A huge loop of material shooting up from the sun's surface in March was one of the first events witnessed by NASA's Solar Dynamics Observatory. Known as a prominence eruption, the loop was born from a relatively cold cloud of plasma, or charged gas, tenuously tethered to the sun's surface by magnetic forces. Such clouds can erupt dramatically when they break free of the sun's unstable hold.

"We are all living in the outer atmosphere of a star. Its variability influences Earth, the other planets, and the whole solar system," Richard Fisher, NASA's director of heliophysics, said today at a press conference. For example, strong solar eruptions called coronal mass ejections can send bursts of charged particles streaming toward Earth, where they can overload our planet's magnetic shield, knocking out satellite communications and power grids.



Since launching the Solar Dynamics Observatory, or SDO, in February, mission managers have been powering up and calibrating the craft. Today NASA unveiled the first pictures and video taken by the SDO's suite of instruments, which were designed to show the full range of the sun's magnetic activity in unprecedented detail.

Although the Solar Dynamics Observatory isn't the only solar probe in orbit, it is "the most advanced spacecraft ever built to study the sun," said NASA spokesperson Dwayne Brown. Billed as the Hubble of heliophysics, the SDO "will change textbooks," Brown predicts.

Cat's Paw Nebula: Celestial Cat's Hidden Secrets Revealed


The Cat's Paw Nebula, NGC 6334, is a huge stellar nursery, the birthplace of hundreds of massive stars. In a magnificent new ESO image taken with the Visible and Infrared Survey Telescope for Astronomy (VISTA) at the Paranal Observatory in Chile, the glowing gas and dust clouds obscuring the view are penetrated by infrared light and some of the Cat's hidden young stars are revealed.
Towards the heart of the Milky Way, 5500 light-years from Earth in the constellation of Scorpius (the Scorpion), the Cat's Paw Nebula stretches across 50 light-years. In visible light, gas and dust are illuminated by hot young stars, creating strange reddish shapes that give the object its nickname. A recent image by ESO's Wide Field Imager (WFI) at the La Silla Observatory (eso1003) captured this visible light view in great detail. NGC 6334 is one of the most active nurseries of massive stars in our galaxy.

VISTA, the latest addition to ESO's Paranal Observatory in the Chilean Atacama Desert, is the world's largest survey telescope (eso0949). It works at infrared wavelengths, seeing right through much of the dust that is such a beautiful but distracting aspect of the nebula, and revealing objects hidden from the sight of visible light telescopes. Visible light tends to be scattered and absorbed by interstellar dust, but the dust is nearly transparent to infrared light.

VISTA has a main mirror that is 4.1 metres across and it is equipped with the largest infrared camera on any telescope. It shares the spectacular viewing conditions with ESO's Very Large Telescope (VLT), which is located on the nearby summit. With this powerful instrument at their command, astronomers were keen to see the birth pains of the big young stars in the Cat's Paw Nebula, some nearly ten times the mass of the Sun. The view in the infrared is strikingly different from that in visible light. With the dust obscuring the view far less, they can learn much more about how these stars form and develop in their first few million years of life. VISTA's very wide field of view allows the whole star-forming region to be imaged in one shot with much greater clarity than ever before.

The VISTA image is filled with countless stars of our Milky Way galaxy overlaid with spectacular tendrils of dark dust that are seen here fully for the first time. The dust is sufficiently thick in places to block even the near-infrared radiation to which VISTA's camera is sensitive. In many of the dusty areas, such as those close to the centre of the picture, features that appear orange are apparent -- evidence of otherwise hidden active young stars and their accompanying jets. Further out though, slightly older stars are laid bare to VISTA's vision, revealing the processes taking them from their first nuclear fusion along the unsteady path of the first few million years of their lives.

The VISTA telescope is now embarking on several big surveys of the southern sky that will take years to complete. The telescope's large mirror, high quality images, sensitive camera and huge field of view make it by far the most powerful infrared survey telescope on Earth. As this striking image shows, VISTA will keep astronomers busy analysing data they could not have otherwise acquired. This cat is out of the bag.

Car Steered With Eyes, Computer Scientists Demonstrate


"Keep your eyes on the road!" Scientists at Freie Universität working under the computer science professor Raúl Rojas have given a completely new meaning to this standard rule for drivers: Using software they developed, they can steer a car with their eyes.
On the site of the former Berlin Tempelhof Airport, the head of the project, Raúl Rojas, and his team from the Artificial Intelligence Group recently demonstrated how they can steer a vehicle equipped with complex electronics using their eyes alone. More than 60 journalists from around the world were there to watch.

Information about the Software: EyeDriver

The eyeDriver software is a prototype application for steering the research vehicle Spirit of Berlin using eye movements. The software was designed by computer scientists at Freie Universität Berlin in collaboration with the company SMI (SensoMotoric Instruments). The driver's eye movements are captured and converted into control signals for the steering wheel. The speed is controlled separately and is not handled by eyeDriver. The software demonstrates that a vehicle can be steered by eye movements alone.

The HED4 solution by SMI is used for detecting and tracking the eye movements. It is a converted bicycle helmet equipped with two cameras and an infrared LED, as well as a laptop computer running special software. One of the cameras points forward in the same direction as the person wearing the helmet (the scene camera), while the other films one eye of the wearer (the eye camera). The infrared light supports the eye camera and is directed at the eye under observation. A transparent mirror that reflects only infrared light gives the eye camera a reasonable viewing angle without limiting the wearer's ability to see. After a brief calibration, the software on the HED4's laptop can not only capture the position of the pupil in the eye-camera image but also calculate the point in the scene-camera image that the wearer is looking at. These coordinates (the viewing position) are transmitted via an ordinary LAN to the onboard computer in the Spirit of Berlin, where the eyeDriver software receives them at regular intervals and uses them to control the steering wheel. The driver can choose between two modes: "free ride" and "routing."

In the "free ride" mode the viewing positions are linked directly with the steering wheel motor. That means that the x-coordinates of the viewing position are used to calculate the desired position of the steering wheel. The further the driver looks to the left or right, the further the steering wheel is turned in that direction. The speed of the vehicle is set in advance and kept constant, as long as the position of the gaze is recognized. In case it is not possible to detect which direction the driver is looking in, for example, if the driver's eyes are closed, the vehicle brakes automatically.

In the "routing" mode, the Spirit of Berlin steers autonomously most of the time. Only where there is a fork in the road, or an intersection, the car stops and asks the driver to select the next route. This requires the wearer of the helmet to look to the left or right for three seconds. If the driver's gaze lingers long enough in one direction, the eyeDriver software confirms acoustically that the choice has been accepted. The decision is communicated to the planner in the vehicle. Then the artificial intelligence in the Spirit of Berlin can plan the route accordingly and continue to run independently.

The Autonomous Vehicle Project

Prof. Dr. Raúl Rojas is a professor of artificial intelligence at the Institute of Computer Science at Freie Universität Berlin. He gained international success with his soccer robots, the "FU-Fighters." They were world champions twice in the small-size league. Since 2006 Prof. Rojas and his team have been designing technologies related to autonomous vehicles. As part of this project, they developed the research vehicle Spirit of Berlin, making it to the semifinals in the DARPA Urban Challenge in California in 2007.

In the fall of 2009, as part of the same series of innovative vehicle-steering experiments based on the test vehicle, the computer scientists Tinosch Ganjineh and Miao Wang developed iDriver, which makes it possible to steer the research car using an iPhone. This series is now complemented by the eyeDriver software, developed primarily by Miao Wang and David Latotzky in cooperation with the firm SMI. Both developments are sub-projects; the core of the research remains autonomous driving.

The AutoNOMOS Project

Since November 2009 the team around Prof. Rojas has been working on the further development of autonomous and semi-autonomous cars in the AutoNOMOS project, headed by Tinosch Ganjineh. The project is funded for two years by the German Ministry of Education and Research (BMBF) in its ForMaT (Forschung für den Markt im Team) program. The project aims to make a significant contribution to the development of accident-free, efficient, and environmentally friendly mobility. AutoNOMOS is a modular system for the operation of autonomous or semi-autonomous cars. Using AutoNOMOS, it will be possible to detect impending dangers on roads, highways, and crossings (lane changes, traffic jams, rights of way) at an early stage so that accidents can be prevented. Once the technology is ready, it will be introduced first on private property and eventually in public traffic.

The Spirit of Berlin

The Spirit of Berlin is an autonomous vehicle that has been designed and built by the Artificial Intelligence Group at Freie Universität Berlin since 2007. It is a car that can drive without a human driver. A conventional car (a 2000 Dodge Grand Caravan) was equipped with sensors, computers, and actuators. The sensors collect information about the immediate environment; using this information, the software on the computers decides what to do, and the resulting action is then carried out by mechanical actuators.

Monday, April 12, 2010

Google Searches For Key To Energy Savings


Google's PowerMeter can show homeowners their real-time energy usage on a smart phone. But the energy monitoring software needs hardware--a hook-up to the electrical power system--in order to work.

Deep in the dark of the Minnesota night, some appliance was kicking on to rob Ed Kohler of hard-earned cash. He'd look later and see nighttime energy spikes reported by PowerMeter, Google software that monitors home electrical use.

“All the lights were out, but something's cycling,” said the 36-year-old Kohler, marketing manager at a Minneapolis web-development firm. “So I think about it and, aha, figure out it's the refrigerator.”

A 19-year-old refrigerator, a real energy hog by today's standards. It was easy to calculate that a new, energy-efficient model would pay for itself.
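A back-of-the-envelope payback estimate of the kind Kohler describes is easy to reproduce. Every number in the sketch below is assumed purely for illustration; none of them comes from the article.

```python
# Back-of-the-envelope payback estimate of the kind described above.
# Every figure here is an assumption for illustration, not data from the article.

old_kwh_per_year = 1400    # assumed consumption of an old refrigerator
new_kwh_per_year = 450     # assumed consumption of an efficient new model
price_per_kwh = 0.12       # assumed electricity price, dollars per kWh
new_fridge_cost = 700.0    # assumed purchase price in dollars

annual_savings = (old_kwh_per_year - new_kwh_per_year) * price_per_kwh
payback_years = new_fridge_cost / annual_savings

print(f"Annual savings: ${annual_savings:.2f}")
print(f"Payback period: {payback_years:.1f} years")
```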

Kohler’s revelation is typical of “Aha!” moments that consumers enjoy when they can monitor their energy use, say Google executives. PowerMeter is an early hint at how new technology can give home dwellers more control over their energy use. It was developed by the search giant’s charitable arm, Google.org, which has made energy one of its prime areas of focus. And PowerMeter is free, easy to use and available to anyone worldwide to install.

If only it were that simple.

As software, PowerMeter can’t provide the homeowner with energy use data unless it is linked to the home's electrical power system—and that requires a piece of hardware. But it will take the utilities that deliver electricity to homes years--maybe a decade--to blanket the country with new “smart” meters that can gather and transmit useful data. Then, the power companies must decide how to transmit that data to customers--perhaps through software like Google's that is being tested by several utilities in the United States and Europe, or perhaps through software and hardware being developed by other companies, like Microsoft, Intel, and a number of start-ups.

Utilities cautious on smart meters

From the standpoint of the utilities, the meters raise a myriad of technical questions, not the least of which is just making sure the darn things are accurate. So they are moving forward cautiously. Too cautiously, in the view of Google and 45 companies and organizations that sent a letter to President Obama this week urging that the administration set a goal of giving every U.S. household and business access to “timely, useful and actionable” information on their energy use.

“By giving people the ability to monitor and manage their energy consumption, for instance, via their computers, phones or other devices, we can unleash the forces of innovation in homes and businesses,” the letter said.

The smart meter advocates say if U.S. households saved 15 percent on their energy use by 2020, the greenhouse gas savings would be equivalent to taking 35 million cars off the road and would save consumers $46 billion on their energy bills.

Like a lot of early PowerMeter users, Kohler couldn't wait for smart meters to arrive. He shelled out $200 for a device that measures his energy use, and he installed it himself. Called The Energy Detective, or TED, and manufactured by a small Charleston, South Carolina firm, Energy Inc., the hardware’s transmitter wires directly into a home’s incoming power. (It’s a job the company’s web site suggests might be best handled by an electrician.) The TED has been popular since Google announced it would be the first device that U.S. consumers could buy and link to PowerMeter. Another device, sold by AlertMe of Cambridge, England is available in the United Kingdom.

To encourage the development of more devices, Google last month opened its software to other hardware manufacturers, saying early reports suggest the application achieves its goals.

"It's quite an amazing thing for something, frankly, as boring as electricity use in the home," said Dan Reicher, director of Google’s climate change and energy initiatives. "Everybody learns something and can act pretty quickly."

Meanwhile, millions of other consumers worldwide are getting smart meters connected by their electric utilities. It's a push that gained steam in the United States with federal stimulus money, part of a broad program to add intelligence across the nation's energy system. Still, only about 10 percent of U.S. households have smart meters. And it will take another five years to get even half the nation's households wired, say market analysts at Parks Associates.

Even today, some with smart meters are often blind to the potential benefits. Many don't know they can access the data being gathered on their home energy use, and their meters won't work with PowerMeter or other consumer-friendly software just yet.

Pacific Gas and Electric, for example, has deployed more smart meters than any other U.S. utility, monitoring 5.2 million gas and electric lines in California. The company plans to double that number to cover all 15 million California customers in about two years. Customers one day could get Web and email alerts about high energy use and adjust their behavior -- for example, shifting their clothes drying to the evening. That could save them money if their utility charges higher rates in peak-demand hours.

Worries About Security, Accuracy

"But a lot of these features are some time away," said Paul Moreno, a PG&E spokesman.

For now, residential customers need to go to PG&E's website where they can see the energy numbers. And executives have said they're concerned about working with third parties like Google until common standards emerge for handling and protecting the data.

Further south, a small pilot program at San Diego Gas & Electric is cautiously testing the Google software. SDG&E is also rolling out smart meters, with about 700,000 installed and plans to finish hooking up its 1.4 million customers next year. Later this year, smart meter customers will be able to access their data at the utility's website.

But the advantage of software like PowerMeter is that it paints usage patterns onto a user's Google home page, where people are more likely to see and respond to them, says Alex Kim, director of customer innovation at SDG&E. "We think we have a pretty nice website, but we don't think it's going to be their home page," he said.

So far, SDG&E has limited the Google trial to only about 150 customers, testing PowerMeter's security, usability, and utility. Utilities have reason to be careful. Regulators in Texas and California are investigating complaints that new smart meters are miscalculating or wrongly transmitting energy use for some homes.

So Google isn't depending only on the utilities. The company sees equal promise in devices like the TED bought by Kohler. "More are coming, and they're going to be better and cheaper," said Google's Reicher.

Reicher also installed TED in his own Northern California home, which doesn't yet have a smart meter. Besides, unlike the utilities, which typically provide day-old data after downloading it overnight, TED's monitor displays real-time data because it's linked directly to the home's electrical system.

The moving numbers can be captivating, Reicher said. His 6-year-old son watched as he started to make toast. The meter spiked, and Reicher explained the toaster required electricity that came through the walls.

"It was this wonderful teaching moment," Reicher said. "The light bulb went on in his head about how all this works."

Octopus vs. Sea Lion—First Ever Video

Near-Death Experiences Explained?


Near-death experiences are tricks of the mind triggered by an overload of carbon dioxide in the bloodstream, a new study suggests.

Many people who have recovered from life-threatening injuries have said they experienced their lives flashing before their eyes, saw bright lights, left their bodies, or encountered angels or dead loved ones.

In the new study, researchers investigated whether different levels of oxygen and carbon dioxide—the main blood gases—play a role in the mysterious phenomenon.

The team studied 52 heart attack patients who had been admitted to three major hospitals and were eventually resuscitated. Eleven of the patients reported near-death experiences.

During cardiac arrest and resuscitation, blood gases such as CO2 rise or fall because of the lack of circulation and breathing.

"We found that in those patients who experienced the phenomenon, blood carbon-dioxide levels were significantly higher than in those who did not," said team member Zalika Klemenc-Ketis, of the University of Maribor in Slovenia.


CO2 Only Common Factor in Near-Death Experiences

Other factors, such as a patient's sex, age, or religious beliefs—or the time it took to revive them—had no bearing on whether the patients reported near-death experiences.

The drugs used during initial treatment—a suggested explanation for near-death experiences after heart attacks—also didn't seem to correlate with the sensations, according to the study authors.

How carbon dioxide might actually interact with the brain to produce near-death sensations was beyond the scope of the study, so for now "the exact pathophysiological mechanism for this is not known," Klemenc-Ketis said.

However, people who have inhaled excess carbon dioxide or have been at high altitudes, which can raise the blood's CO2 concentrations, have been known to have sensations similar to near-death experiences, she said.

A Glimpse of the Afterlife?

The study is among the first to find a direct link between carbon dioxide in the blood and near-death experiences, or NDEs, said Christopher French, a psychologist at the Anomalistic Psychology Research Unit of the University of London, who was not involved in the new research.

The hospital study bolsters previous lab work done in the 1950s that found "the effects of hypercarbia [abnormally high levels of CO2 in the blood] were very similar to what we would now recognise as NDEs," French said in an email.

The research also supports the argument that anything that disinhibits the brain—damages the brain's ability to manage impulses—can produce near-death sensations, he said. Physical brain injury, drugs, and delirium have all been associated with a disinhibited state, and CO2 overload is another potential trigger.

Still, not all scientists are convinced: "The one difficulty in arguing that CO2 is the cause is that in cardiac arrests, everybody has high CO2 but only 10 percent have NDEs," said neuropsychiatrist Peter Fenwick of the Institute of Psychiatry at Kings College London.

What's more, in heart attack patients, Fenwick said, "there is no coherent cerebral activity which could support consciousness, let alone an experience with the clarity of an NDE."

The main alternative is that near-death experiences are "evidence of consciousness becoming separated from the physical substrate of the brain, possibly even a glimpse of an afterlife," the University of London's French noted.

But for him, at least, "the latest results argue strongly against such a hypothesis."

Hiding out Behind the Milky Way


A leggy cosmic creature comes out of hiding in a new infrared view from NASA's Wide-field Infrared Survey Explorer, or WISE. The spiral beauty, called IC 342 and sometimes the "hidden galaxy," is shrouded behind our own galaxy, the Milky Way.
Stargazers and professional astronomers have a hard time seeing the galaxy through the Milky Way's bright band of stars, dust and gas. WISE's infrared vision cuts through this veil, offering a crisp view.

In a spiral galaxy like IC 342, dust and gas are concentrated in the arms. The denser pockets of gas trigger the formation of new stars, as represented here in green and yellow. The core, shown in red, is also bursting with young stars, which are heating up dust. Stars that appear blue reside within our Milky Way, between us and IC 342.

This galaxy has been of great interest to astronomers because it is relatively close. However, determining its distance from Earth has proven difficult due to the intervening Milky Way. Astronomer Edwin Hubble first thought the galaxy might belong to our own Local Group of galaxies, but current estimates now place it farther away, at about 6.6 to 11 million light-years.

This image was made from observations by all four infrared detectors aboard WISE. Blue and cyan represent infrared light at wavelengths of 3.4 and 4.6 microns, which is primarily light from stars. Green and red represent light at 12 and 22 microns, which is primarily emission from warm dust.
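For readers curious how such a four-band false-colour composite is assembled, here is a schematic sketch. It is not the WISE processing pipeline; the array names, the simple linear stretch, and the exact channel mixing are assumptions for illustration.

```python
# Schematic sketch of assembling a false-colour composite from four infrared
# bands, following the colour assignments described above. The input arrays
# and scaling are placeholders, not the actual WISE pipeline.

import numpy as np

def normalize(band):
    """Stretch a band to the 0..1 range with a simple linear stretch."""
    band = band.astype(float)
    lo, hi = band.min(), band.max()
    return (band - lo) / (hi - lo + 1e-12)

def wise_style_composite(w1_34um, w2_46um, w3_12um, w4_22um):
    """Blue/cyan from the 3.4 and 4.6 micron bands (mostly starlight),
    green from 12 microns and red from 22 microns (mostly warm dust).
    The cyan contribution is simplified here by folding both short bands
    into the blue channel."""
    blue = normalize(w1_34um + w2_46um)
    green = normalize(w3_12um)
    red = normalize(w4_22um)
    return np.dstack([red, green, blue])  # H x W x 3 RGB image

# Example with random placeholder data:
bands = [np.random.rand(64, 64) for _ in range(4)]
rgb = wise_style_composite(*bands)
print(rgb.shape)  # (64, 64, 3)
```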

Scientists Explore Origins of 'Supervolcanoes' on the Sea Floor: Ancient Goliaths Blamed for Multiple Mass Extinctions


"Supervolcanoes" have been blamed for multiple mass extinctions in Earth's history, but the cause of their massive eruptions is unknown.
Despite their global impact, the eruptions' origin and triggering mechanisms have remained unexplained. New data obtained during a recent Integrated Ocean Drilling Program (IODP) expedition in the Pacific Ocean may provide clues to unlocking this mystery.

To explore the origins of these seafloor giants, scientists drilled into a large, 145 million-year-old underwater volcanic mountain chain off the coast of Japan.

IODP Expedition 324: Shatsky Rise Formation took place onboard the scientific ocean drilling vessel JOIDES Resolution from September 4 to November 4, 2009. Preliminary results of the voyage are emerging.

"'Supervolcanoes' emitted large amounts of gases and particles into the atmosphere, and re-paved the ocean floor," says Rodey Batiza, marine geosciences section head in the National Science Foundation (NSF)'s Division of Ocean Sciences, which co-funded the research.

The result?

"Loss of species, increased greenhouse gases in the atmosphere, and changes in ocean circulation," says Batiza.

In fall 2009, an international team of scientists participating in IODP Expedition 324 drilled five sites in the ocean floor. They studied the origin of the 145 million-year-old Shatsky Rise volcanic mountain chain.

Located 1,500 kilometers (930 miles) east of Japan, Shatsky Rise measures roughly the size of California.

This underwater mountain chain is one of the largest supervolcanoes in the world: the top of Shatsky Rise lies three and a half kilometers (about two miles) below the sea's surface, while its base plunges to nearly six kilometers (four miles) beneath the surface.

Shatsky Rise is composed of layers of hardened lava, with individual lava flows that are up to 23 meters (75 feet) thick.

"Seafloor supervolcanoes are characterized by the eruption of enormous volumes of lava," says William Sager of Texas A&M University, who led the expedition with co-chief scientist Takashi Sano of Japan's National Museum of Nature and Science in Tokyo. "Studying their formation is critical to understanding the processes of volcanism, and the movement of material from Earth's interior to its surface."

About a dozen supervolcanoes exist on Earth; some are on land, while others lie at the bottom of the ocean. Those found on the seafloor are often referred to as large oceanic plateaus.

Current scientific thinking suggests that these supervolcanoes were caused by eruptions over a period of a few million years or less--a rapid pace in geologic time.

Each of these supervolcanoes produced several million cubic kilometers of lava--about three hundred times the volume of all the Great Lakes combined--dwarfing the volume of lava produced by the largest present-day volcanoes in places like Hawaii.
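A quick consistency check of that comparison is straightforward, assuming a commonly cited combined Great Lakes volume of roughly 22,700 cubic kilometres (a figure not given in the article).

```python
# Quick consistency check of the comparison above. The combined Great Lakes
# volume (~22,700 cubic kilometres) is an assumed, commonly cited figure,
# not a number taken from the expedition's results.

great_lakes_km3 = 22_700
lava_km3 = 300 * great_lakes_km3
print(f"{lava_km3:,} km^3, roughly {lava_km3 / 1e6:.1f} million cubic kilometres")
# ~6.8 million km^3, consistent with "several million cubic kilometers" of lava.
```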

Since the 1960s, geologists have debated the formation and origin of these large oceanic plateaus. The mystery lies in the origin of the magma, molten rock that forms within the Earth.

A magma source rising from deep within the Earth has a different chemical composition than magma that forms just below Earth's crust. Some large oceanic plateaus show signs of a deep-mantle origin. Others exhibit chemical signatures indicative of magma from a much shallower depth.

The IODP Shatsky Rise expedition focused on deciphering the relationship between supervolcano formation and the boundaries of tectonic plates, crucial to understanding what triggers supervolcano formation.

A widely-accepted explanation for oceanic plateaus is that they form when magma in the form of a "plume head" rises from deep within the Earth to the surface.

An alternative theory suggests that large oceanic plateaus can originate at the intersection of three tectonic plates, known as a "triple junction."

Shatsky Rise could play a key role in this debate, because it formed at a triple junction. However, it also displays characteristics that could be explained by the plume head model.

"Shatsky Rise is one of the best places in the world to study the origin of supervolcanoes," says Sager. "What makes Shatsky Rise unique is that it's the only supervolcano to have formed during a time when Earth's magnetic field reversed frequently."

This process creates "magnetic stripe" patterns in the seafloor. "We can use these magnetic stripes to decipher the timing of the eruption," says Sager, "and the spatial relationship of Shatsky Rise to the surrounding tectonic plates and triple junctions."

Sediments and microfossils collected during the expedition indicate that parts of the Shatsky Rise plateau were at one time at or above sea level, and formed an archipelago during the early Cretaceous period (about 145 million years ago).

Shipboard lab studies show that much of the lava erupted rapidly, and that Shatsky Rise formed at or near the equator.

As analyses continue, data collected during this expedition will help scientists resolve the 50-year-old debate about the origin and nature of large oceanic plateaus.

The JOIDES Resolution is one of the primary research vessels of IODP, an international marine research program dedicated to advancing scientific understanding of the Earth through drilling, coring, and monitoring the subseafloor. The vessel is operated by the U.S. Implementing Organization of IODP, consisting of the Consortium for Ocean Leadership, Texas A&M University, and Lamont-Doherty Earth Observatory of Columbia University.

IODP is supported by two lead agencies: the U.S. National Science Foundation and Japan's Ministry of Education, Culture, Sports, Science, and Technology.

Additional program support comes from the European Consortium for Ocean Research Drilling (ECORD), the Australian-New Zealand IODP Consortium (ANZIC), India's Ministry of Earth Sciences, the People's Republic of China (Ministry of Science and Technology), and the Korea Institute of Geoscience and Mineral Resources.

Hawaiian Submarine Canyons Are Hotspots of Biodiversity and Biomass for Seafloor Animal Communities


Underwater canyons have long been considered important habitats for marine life, but until recently, only canyons on continental margins had been intensively studied. Researchers from Hawaii Pacific University (HPU) and the University of Hawaii at Manoa (UHM) have now conducted the first extensive study of canyons in the oceanic Hawaiian Archipelago and found that these submarine canyons support especially abundant and unique communities of megafauna (large animals such as fish, shrimp, crabs, sea cucumbers, and sea urchins), including 41 species not observed in other habitats in the Hawaiian Islands.
The research is published in the March issue of the journal Marine Ecology.

The researchers used both visual and video surveys from 36 submersible dives (using UHM's Hawaii Undersea Research Laboratory submersibles Pisces IV and Pisces V) to characterize slope and canyon communities of animals at depths of 350-1500 meters along the margins of four islands of the Hawaiian Archipelago. The coastlines of Oahu and Molokai were selected as examples of high, mountainous islands with large supplies of terrestrial and marine organic matter which can be exported down slopes and canyons to provide food to deep-sea communities. Nihoa Island and Maro Reef were chosen to represent low islands and atolls that are likely to export less organic matter to feed the deep-sea fauna.

Eric Vetter, the lead author of this paper and a Professor of Marine Biology from HPU, had previously studied four canyon systems off the coast of California and found that the productive waters along southern California had resulted in the delivery and accumulation of substantial amounts of organic material. "Craig Smith (Professor of Oceanography at UHM and co-author of this study) and I wondered if the same dramatic contrast in benthic food resources between canyon and non-canyon settings seen in continental margins would also occur on tropical oceanic islands," says Vetter. "We reasoned that the low productivity in tropical regions would result in reduced source material for organic enrichment in canyons and that steep bathymetry combined with curving coastlines would limit the area over which material could be transported to individual canyons. On the other hand, we thought that any amount of organic enrichment might have a measurable effect given the very low background productivity."

Canyon systems can enhance abundance and diversity of marine life by providing more varied and complex physical habitats, and by concentrating organic detritus moving along shore and downslope. On most continental margins, the continental shelf and slope are dominated by soft, low-relief sediments; in contrast, canyons crossing these margins often have steep slopes, rocky outcrops, and faster currents that can support fauna with diverse habitat requirements. The margins of oceanic islands generally are steeper than continental margins, making the physical contrast between canyon and non-canyon habitats potentially less dramatic than along continents. "We wanted to learn, given all of these differences, if tropical oceanic islands would be regions of special biological significance, particularly in terms of productivity and biodiversity," says Vetter.

To conduct the research off Oahu, Molokai, and Maro Reef, Vetter, Smith and UHM Doctoral student Fabio De Leo took turns in the submersibles counting marine life on the ocean bottom using visual transects and recording results into a voice recorder. The survey off of Nihoa Island was conducted using a video recorder attached to the submersible. The results of the 36 surveys showed that the highly mobile megafauna (like fish, sharks, shrimp and squid) were much more abundant in the canyons than on the open slopes at all depths studied. This suggests that canyons provide an especially good habitat for mobile species that are able to feed on accumulated organic matter but can escape the physical disturbances in canyons resulting from high currents and mobile sediments (e.g., migrating sand ripples).

"Perhaps the biggest surprise of this study was the large number of species, 41, that we found only in canyon habitats," says Smith. "This suggests that canyons support a substantial specialized fauna that would not exist in the Hawaiian archipelago in the absence of canyons. Thus, submarine canyons are contributing uniquely to biodiversity in the islands and merit careful attention for environmental protection and management."

The elevated abundance and biodiversity of megafauna (especially highly mobile species of fish and crustaceans) in canyons suggests that those environments experience greater food availability, and may provide critical habitat for commercially important bottom fish and invertebrate stocks. "From a conservation standpoint, these regions would be ideal candidates to become Marine Protected Areas (MPAs), especially due to the higher turnover of species diversity when compared to regular slopes," says De Leo. "If we prove that the animals are using the canyons as a feeding ground because the organic debris accumulates there and nowhere else outside the canyons, that's another argument for an MPA." Adds Smith, "if we allow the Hawaiian canyons to be overexploited or impacted by human activities such as dredge-spoil dumping, there is likely to be a significant drop in biodiversity in the deep waters of Hawaii. Clearly, canyon habitats merit special attention for inclusion in Hawaiian MPAs."

The team's future studies of the Hawaiian canyon fauna include analyses of the stable carbon and nitrogen isotopes in shrimp, urchins and other bottom feeders to identify their main food sources. Because different potential food sources, such as land plants, seafloor algae and phytoplankton, often have different stable isotope "signatures," these analyses will help the researchers to understand what exactly is fueling the rich animal assemblages in the canyons. "We need biochemical proof that the canyons are really channeling this type of material," says De Leo. "Carbon and nitrogen isotopic signatures could tell the difference between the detrital plant material and the phytoplankton material pools, so you can see if the animal in the canyon is eating phytoplankton cells coming from pelagic production or macroalgae coming from the coast."

Vetter says that their current research also involves formulating conceptual models that will allow the researchers to predict which features of different canyon systems influence biological patterns, including animal abundance and diversity. "De Leo's PhD research will include data on megafauna and macrofauna (smaller animals living in the sediments) patterns in canyons along the U.S. West Coast, Hawaii, and New Zealand, which should make important strides here."

This research was supported by the National Oceanic and Atmospheric Administration (NOAA) Ocean Exploration Office, by the Hawaii Undersea Research Laboratory in the School of Ocean and Earth Science and Technology at UHM, and by the Census of Diversity of Abyssal Marine Life (CeDAMar). A CAPES-Fulbright fellowship supported De Leo.

World's Deepest Known Undersea Volcanic Vents Discovered


A British scientific expedition has discovered the world's deepest undersea volcanic vents, known as 'black smokers', 3.1 miles (5000 metres) deep in the Cayman Trough in the Caribbean. Using a deep-diving vehicle remotely controlled from the Royal Research Ship James Cook, the scientists found slender spires made of copper and iron ores on the seafloor, erupting water hot enough to melt lead, nearly half a mile deeper than anyone has seen before.
Deep-sea vents are undersea springs where superheated water erupts from the ocean floor. They were first seen in the Pacific three decades ago, but most are found between one and two miles deep. Scientists are fascinated by deep-sea vents because the scalding water that gushes from them nourishes lush colonies of deep-sea creatures, which has forced scientists to rewrite the rules of biology. Studying the life-forms that thrive in such unlikely havens is providing insights into patterns of marine life around the world, the possibility of life on other planets, and even how life on Earth began.

The expedition to the Cayman Trough is being run by Drs Doug Connelly, Jon Copley, Bramley Murton, Kate Stansfield and Professor Paul Tyler, all from Southampton, UK. They used a robot submarine called Autosub6000, developed by engineers at the National Oceanography Centre (NOC) in Southampton, to survey the seafloor of the Cayman Trough in unprecedented detail. The team then launched another deep-sea vehicle called HyBIS, developed by team member Murton and Berkshire-based engineering company Hydro-Lek Ltd, to film the world's deepest vents for the first time.

"Seeing the world's deepest black-smoker vents looming out of the darkness was awe-inspiring," says Copley, a marine biologist at the University of Southampton's School of Ocean and Earth Science (SOES) based at the NOC and leader of the overall research programme. "Superheated water was gushing out of their two-storey high mineral spires, more than three miles deep beneath the waves." He added: "We are proud to show what British underwater technology can achieve in exploring this frontier -- the UK subsea technology sector is worth £4 billion per year and employs 40 000 people, which puts it on a par with our space industry."

The Cayman Trough is the world's deepest undersea volcanic rift, running across the seafloor of the Caribbean. The pressure three miles deep at the bottom of the Trough -- 500 times normal atmospheric pressure -- is equivalent to the weight of a large family car pushing down on every square inch of the creatures that live there, and on the undersea vehicles that the scientists used to reveal this extreme environment. The researchers will now compare the marine life in the abyss of the Cayman Trough with that known from other deep-sea vents, to understand the web of life throughout the deep ocean. The team will also study the chemistry of the hot water gushing from the vents, and the geology of the undersea volcanoes where these vents are found, to understand the fundamental geological and geochemical processes that shape our world.
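The 500-atmospheres figure can be checked with the standard hydrostatic relation P = ρgh. The sketch below uses an assumed typical seawater density; it is an illustration, not the expedition's own calculation.

```python
# Rough check of the pressure figure quoted above using the hydrostatic
# relation P = rho * g * h. The density is an assumed typical value for
# seawater; this is an illustration, not the team's calculation.

rho = 1025.0      # assumed seawater density, kg/m^3
g = 9.81          # gravitational acceleration, m/s^2
depth_m = 5000.0  # roughly three miles

pressure_pa = rho * g * depth_m
atmospheres = pressure_pa / 101325.0
psi = pressure_pa / 6894.76

print(f"~{atmospheres:.0f} atmospheres, ~{psi:.0f} pounds per square inch")
# ~500 atmospheres, i.e. several thousand pounds pressing on every square inch,
# which is the scale of load the "family car" comparison is gesturing at.
```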

"We hope our discovery will yield new insights into biogeochemically important elements in one of the most extreme naturally occurring environments on our planet," says geochemist Doug Connelly of the NOC, who is the Principal Scientist of the expedition.

"It was like wandering across the surface of another world," says geologist Bramley Murton of the NOC, who piloted the HyBIS underwater vehicle around the world's deepest volcanic vents for the first time. "The rainbow hues of the mineral spires and the fluorescent blues of the microbial mats covering them were like nothing I had ever seen before."

The expedition will continue to explore the depths of the Cayman Trough until 20th April.

In addition to the scientists from Southampton, the team aboard the ship includes researchers from the University of Durham in the UK, the University of North Carolina Wilmington and the University of Texas in the US, and the University of Bergen in Norway. The expedition members are also working with colleagues ashore at Woods Hole Oceanographic Institution and Duke University in the US to analyse the deep-sea vents.

The expedition is part of a research project funded by the UK Natural Environment Research Council to study the world's deepest undersea volcanoes. The research team will return to the Cayman Trough for a second expedition using the UK's deep-diving remotely-operated vehicle Isis, once a research ship is scheduled for the next phase of their project.

Additional information

(1) The expedition aboard the RRS James Cook began in Port of Spain, Trinidad on 21st March and ends in Montego Bay, Jamaica on 21st April. It is part of a £462k research project funded by the UK Natural Environment Research Council.

(2) The RRS James Cook is the UK's newest ocean-going research ship, operated by the Natural Environment Research Council. The current expedition is the 44th voyage of the ship.

Researchers Shed Light on Ancient Assyrian Tablets


A cache of cuneiform tablets unearthed by a team led by a University of Toronto archaeologist has been found to contain a largely intact Assyrian treaty from the early 7th century BCE.
"The tablet is quite spectacular. It records a treaty -- or covenant -- between Esarhaddon, King of the Assyrian Empire and a secondary ruler who acknowledged Assyrian power. The treaty was confirmed in 672 BCE at elaborate ceremonies held in the Assyrian royal city of Nimrud (ancient Kalhu). In the text, the ruler vows to recognize the authority of Esarhaddon's successor, his son Ashurbanipal," said Timothy Harrison, professor of near eastern archaeology in the Department of Near & Middle Eastern Civilizations and director of U of T's Tayinat Archaeological Project (TAP).

"The treaties were designed to secure Ashurbanipal's accession to the throne and avoid the political crisis that transpired at the start of his father's reign. Esarhaddon came to power when his brothers assassinated their father, Sennacherib."

The 43 by 28 centimetre tablet -- known as the Vassal Treaties of Esarhaddon -- contains about 650 lines and is in a very fragile state. "It will take months of further work before the document will be fully legible," added Harrison. "These tablets are like a very complex puzzle, involving hundreds of pieces, some missing. It is not just a matter of pulling the tablet out, sitting down and reading. We expect to learn much more as we restore and analyze the document."

The researchers hope to glean information about Assyria's imperial relations with the west during a critical period, the early 7th century BCE. It marked the rise of the Phrygians and other rival powers in highland Anatolia -- now modern-day Turkey -- along the northwestern frontier of the Assyrian empire, and coincided with the divided monarchy of Biblical Israel, as well as an era of increased contact between the Levantine peoples of the Eastern Mediterranean and Egypt, as well as the Greeks of the Aegean world.

The cache of tablets -- which date back to the Iron Age -- were unearthed in August 2009 during excavations at the site of an ancient temple at Tell Tayinat, located in southeastern Turkey. A wealth of religious paraphernalia -- including gold, bronze and iron implements, libation vessels and ornately decorated ritual objects -- was also uncovered.

TAP is an international project, involving researchers from a dozen countries, and more than 20 universities and research institutes. It operates in close collaboration with the Ministry of Culture of Turkey, and provides research opportunities and training for both graduate and undergraduate students. The project is funded by the Social Sciences and Humanities Research Council of Canada and the Institute for Aegean Prehistory (INSTAP), and receives support from the University of Toronto.

Tuesday, April 6, 2010

Robotics



Sony Dream Robot (SDR-4X)


Sony Dream Robot SDR-4X is a humanoid robot that can walk, move, and even dance. It is only 23 inches tall and weighs 14 pounds.

Powered by two RISC processors, the SDR-4X is capable of 38 separate degrees of movement (the older SDR-3X had only 26). The SDR-4X has a real-time adaptive motion control system and better communication technology, including facial recognition and complex speech recognition.

The newer SDR-4X II has three RISC processors, advanced motion control, the illusion of higher intelligence (a vocabulary of approximately 20,000 words), and better safety and identification features. The SDR-4X II was introduced on April 2, 2003; let's see when a newer version appears.

HRP-2m Choromet



The Choromet is expected to be available from General Robotics in September at a price of less than $5,000. It is about 13-3/4 inches tall and is capable of walking upright on two legs. Four companies in Japan have created this relatively low-cost, user-programmable humanoid robot targeting educational and research applications. The HRP-2m Choromet uses technology from Japan's National Institute of Advanced Industrial Science and Technology (AIST) and is user-programmable thanks to open software running on a user-space real-time Linux implementation. AIST hopes Choromet's ability to run software-based movement programs on a real-time Linux platform will enable researchers and schools to experiment with the effectiveness of humanoid robot motion pattern applications. The Choromet is based on several technologies developed by AIST, including a business-card-sized SBC (single-board computer) with a 240MHz SH-4 processor and 32MB of RAM; "ARTLinux," an operating system that provides a user-space real-time Linux environment; and humanoid motion application software based on OpenHRP (Humanoid Robotics Project). Other Choromet features include triaxial force sensors on the legs, an accelerometer and gyroscope in the trunk, and real-time sensor feedback. More info at http://linuxdevices.com/news/NS8377820601.html

ASIMO Humanoid Robot



ASIMO (Advanced Step in Innovative MObility) is a bipedal humanoid robot from Honda. This robot has been evolving since its inception in 1986. The current version of ASIMO is 1.2 meters tall and weighs 43 kg. This size enables ASIMO to actually perform tasks within the realm of a human living environment. It also walks in a smooth fashion which closely resembles that of a human being.
Its advanced walking technology combines Predicted Movement Control (predicting the next move and shifting the center of gravity accordingly) with existing walking control know-how to create i-WALK (intelligent real-time flexible walking) technology, permitting smooth changes of direction. The latest updates on the ASIMO robot are available at Honda.


BigDog the Mule



Boston Dynamics has created BigDog, a robotic pack mule which, according to the company, is "the most advanced quadruped robot on Earth!" BigDog can carry four infantry backpacks while keeping its balance over different types of terrain. A freaky video (28MB WMV) shows BigDog walking through mud, over slippery snow, over rocks, and up a hill, and even keeping itself from toppling over when it is kicked. BigDog stands about waist high, can carry 165 lbs (75 kg), can walk through most terrain, and can always keep its balance. The on-board computer controls locomotion, monitors external and internal sensors, and keeps the robot balanced. A gasoline engine powers the hydraulic system that actuates the legs.



http://www.botmag.com/

All for One and One for All: Computer Model Reveals Neurons Coordinating Their Messaging, Yielding Clues to How the Brain Works



There is strength in numbers if you want to get your voice heard. But how do you get your say if you are in the minority? That's a dilemma faced not only by the citizens of a democracy but also by some neurons in the brain.
Although they only account for a fraction of the synapses in the visual cortex, neurons in the thalamus get their message across loud and clear by coordination -- simultaneously hitting the "send" button -- according to a computer simulation developed by researchers at the Salk Institute for Biological Studies.

Their findings, published in the April 2, 2010 issue of the journal Science, hold important clues to how the brain encodes and processes information, which can be applied to a wide variety of applications, from understanding psychiatric disorders to the development of novel pharmaceuticals and new ways of handling information by computers or communication networks.

Historically, neuroscientists have been limited to recording the activity of single brain cells, which led to the widely accepted view that neurons communicate with each other through volleys of electrical spikes and that they increase the average rate of spiking to "speak up."

But communication between neurons is not limited to one-on-one interactions. Instead, any given cell receives signals from hundreds of cells, which send their messages through thousands of synapses, specialized junctions that allow signals to pass from one neuron to the next.

"Unfortunately, we don't have the technology yet to actually measure what all these neurons are saying to the recipient cell, which would require recording simultaneously from hundreds of cells, " says graduate student and first author Hsi-Ping Wang. "For this reason, nobody could answer a very basic question that's been puzzling neuroscientist for decades, which is: 'How many neurons or synapses does it take to reliably send a signal from point A to point B?'"

This question is particularly pressing for the thalamus, the central switchboard that processes and distributes incoming sensory information to all parts of the cortex. Thalamic input only accounts for five percent of the signals that so-called spiny stellate cells in the cortex receive, even though they drive a good portion of activity throughout the cerebral cortex.

"That is a paradox," says Howard Hughes Medical Institute investigator Terrence J. Sejnowski, Ph.D., professor and head of the Computational Neurobiology Laboratory. "How can so few synapses have such a big impact? If the average spiking rate were the determining factor, thalamic input would be drowned out by the other 95 percent of the inputs from other cortical cells."

Based on the assumption that the brain cares about the reliability and precision of spikes, Sejnowski's team developed a realistic computer model of a spiny stellate cell and the signals it receives through its roughly 6,000 synapses. "We found that it is not the number of spikes that's relevant but rather how many spikes arrive at the same time," says Sejnowski.

"Surprisingly, our model predicts that it only takes about 30 synapses out of 6,000 firing simultaneously to create extremely reliable signaling," explains Wang, "and our prediction lines up with currently available in vivo measurements and understanding. You could have all 6,000 synapses firing at the same time, but it would be a waste of resources."

The researchers hope that their findings will give them new insight into the holy grail of neurobiology: decoding the neural code or language of the brain. If the eye receives the same visual information under identical conditions over and over again, one would expect that the signal, the series of generated spikes or bits, is essentially the same.

"But it's not known whether that happens under natural conditions, and it's technically very difficult to measure, " says senior researcher and co-author Donald Spencer, Ph.D. "That's where the power of computational neurobiology really comes to bear on otherwise intractable questions."

"Applying theories of engineering to the study of the brain has helped us gain new insight into how neurons communicate with each other," says Wang. "On the other hand, there are certain things that the brain does in unique ways that are completely different from how computers work. A better understanding of the brain allows us the capture these algorithms and could very well affect the things engineers do in everyday life."

Jean-Marc Fellous, an associate professor in the Department of Psychology and Applied Mathematics at the University of Arizona, also contributed to the work.

The work was supported by the Howard Hughes Medical Institute.

An Archaeological Mystery in a Half-Ton Lead Coffin



In the ruins of a city that was once Rome's neighbor, archaeologists last summer found a 1,000-pound lead coffin.
Who or what is inside is still a mystery, said Nicola Terrenato, the University of Michigan professor of classical studies who leads the project -- the largest American dig in Italy in the past 50 years.

The sarcophagus will soon be transported to the American Academy in Rome, where engineers will use heating techniques and tiny cameras in an effort to gain insights about the contents without breaking the coffin itself.

"We're very excited about this find," Terrenato said. "Romans as a rule were not buried in coffins to begin with and when they did use coffins, they were mostly wooden. There are only a handful of other examples from Italy of lead coffins from this age -- the second, third or fourth century A.D. We know of virtually no others in this region."

This one is especially unusual because of its size.

"It's a sheet of lead folded onto itself an inch thick," he said. "A thousand pounds of metal is an enormous amount of wealth in this era. To waste so much of it in a burial is pretty unusual."

Was the deceased a soldier? A gladiator? A bishop? All are possibilities, some more remote than others, Terrenato said. Researchers will do their best to examine the bones and any "grave goods" or Christian symbols inside the container in an effort to make a determination.

"It's hard to predict what's inside, because it's the only example of its kind in the area," Terrenato said. "I'm trying to keep my hopes within reason."

Human remains encased in lead coffins tend to be well preserved, if difficult to get to. Researchers want to avoid breaking into the coffin. The amount of force necessary to break through the lead would likely damage the contents. Instead, they will first use thermography and endoscopy. Thermography involves heating the coffin by a few degrees and monitoring the thermal response. Bones and any artifacts buried with them would have different thermal responses, Terrenato said. Endoscopy involves inserting a small camera into the coffin. But how well that works depends on how much dirt has found its way into the container over the centuries.

If these approaches fail, the researchers could turn to an MRI scan -- an expensive option that would involve hauling the half-ton casket to a hospital.

The dig that unearthed this find started in summer 2009 and continues through 2013. Each year, around 75 researchers from around the nation and world, including a dozen U-M undergraduate students, spend two months on the project at the ancient city of Gabii (pronounced "gabby").

The site of Gabii, situated on undeveloped land 11 miles east of Rome in modern-day Lazio, was a major city that pre-dates Rome but seems to have waned as the Roman Empire grew.

Studying Gabii gives researchers a glimpse into pre-Roman life and offers clues to how early Italian cities formed. It also allows them broader access to more substantial archaeological layers or strata. In Rome, layers of civilization were built on top of each other, and archaeologists are not able or allowed to disturb them.

"In Rome, so often, there's something in the way, so we have to get lucky," Terrenato said. "In Gabii, they should all be lucky spots because there's nothing in the way."

Indeed, Terrenato and others were surprised to find something as significant as this coffin so soon.

"The finding of the lead coffin was exhilarating," said Allison Zarbo, a senior art history major who graduates this spring.

Zarbo didn't mind that, after the researchers dug up the coffin, they had to pile the dirt back on to hide it from looters overnight.

"The fact that we had to fill the hole was not so much of a burden as a relief!" Zarbo said. "For academia to lose priceless artifacts that have been found fully in context would be very damaging to our potential knowledge."

Students spent most of their time pick-axing, shoveling, and manning the wheelbarrows, said Bailey Benson, a junior who is double majoring in classical archaeology and art history.

"By the end of the day, not even a 20-minute shower can remove all the dirt and grime you get covered in," Benson said. "It's hard but satisfying work. How many people can say they uncovered an ancient burial?"

This research is funded in part by the National Geographic Society. The managing director of the project is Jeffrey Becker, assistant professor of classics at McMaster University. The field director leading the coffin studies is independent researcher Anna Gallone. The Italian State Archaeological Service (Soprintendenza di Roma) is authorizing and facilitating the project.

Ancient Fossil Flea-Like Creature: Rare Body Parts Find Provides Vital Clues to Identity


The fossil, illustrated without the shell and showing the soft-parts, including limbs and eyes. (Credit: David J. Siveter, Derek E. G. Briggs, Derek J. Siveter and Mark D. Sutton)

A geologist from the University of Leicester is part of a team that has uncovered an ancient water flea-like creature from 425 million years ago -- only the third of its kind ever to be discovered in ancient rocks.

Professor David Siveter, of the Department of Geology at the University of Leicester, worked with Professor Derek Siveter at the Oxford University Museum of Natural History, Professor Derek Briggs at Yale University, USA, and Dr Mark Sutton at Imperial College to make the rare discovery.

The specimen, which was found in rocks in Herefordshire, represents a new species of ostracod, and has been named Nasunaris flata. Like water-fleas and shrimps, ostracods belong to the group of animals called Crustacea. The find is important because the fossil has been found with its soft parts preserved inside the shell.

Today its descendants are common, inhabiting ponds, rivers, lakes and many parts of the seas and oceans, having first appeared on Earth about 500 million years ago.

Geologists find ostracods useful for reconstructing past environments -- the type of ostracod found in a rock sample can, for example, help build a picture of ancient conditions such as water depth and salinity.

The study is published in the Proceedings of the Royal Society B and in Planet Earth, the online journal of the Natural Environment Research Council.

Professor David Siveter said: "Most fossil ostracod species are known only from their shells. You need exceptional conditions to preserve the soft body -- there are only two other known examples of ancient fossil ostracods where the complete soft parts of the animal are preserved along with the shell."

Professor Siveter and colleagues were able to identify the 5mm-long fossil from its body and appendages preserved inside the shell, including the antennae and a set of paired eyes.

The ostracod was so well preserved that the team managed to spot the Bellonci organ, a sensory structure observed in modern species which protrudes from the middle eye at the front of the head. 'This is the first time the Bellonci organ has been observed in fossil ostracods,' says David Siveter.

Had the soft body parts not been preserved, the scientists would probably have misidentified the fossil based on the shell record alone, says Professor Siveter.

Scientists to Unearth Ice Age Secrets from Preserved Tree Rings


New Zealand's Kauri trees can measure up to four metres wide and live for up to 2,000 years. (Credit: Chris Turney; Image courtesy of University of Oxford)

Oxford University is involved in a research project to unearth 30,000-year-old climate records before they are lost forever. The rings of preserved kauri trees, hidden in New Zealand's peat bogs, hold the secret to climate fluctuations spanning back to the end of the last Ice Age.
The team, led by Exeter University, has been awarded a grant from the Natural Environment Research Council to carry out carbon dating and other analyses of the kauri tree rings. The trees store an immense amount of information about rapid and extreme climate change in the past. For instance, wide ring widths are associated with cool dry summer conditions. The scientists believe their findings will help us understand what future climate change may bring.

Tree rings are now known to be an excellent resource for extracting very precise and detailed data on atmospheric carbon from a particular time period. Therefore this study could help plug a large gap in our knowledge of climate change by extending historical weather records that only date back to the mid-nineteenth century.

There is nowhere else in the world with such a rich resource of ancient wood that spans such a large period of time. The ancient kauri logs are of enormous dimensions, up to several metres across, and have the potential to provide new detailed information about rapid, extreme and abrupt climate changes at a time when there was significant human migration throughout the globe.

While various records exist for historic climate change, such as those derived from ice cores, there is no easy way of correlating these records. The research will focus on the last 30,000 years, but some trees date back 130,000 years. The period towards the end of the last Ice Age is particularly difficult to understand.

This unique archive of kauri trees is likely to be lost within the next ten years because the timber is so highly-prized for furniture, arts and crafts. Kauri (Agathis australis) are conifer trees buried in peat bogs across northern New Zealand. Trees can measure up to four metres wide and live for up to 2,000 years. As well as containing information on past climates, they could also shed light on environmental and archaeological change.

Samples from a network of sites with buried trees will be collected in New Zealand and taken back to the UK laboratories for preparation and analysis at Exeter and then radiocarbon measurement at Oxford.

Professor Christopher Ramsey, from the School of Archaeology at the University of Oxford, said: 'This gives us a unique opportunity to increase our knowledge of the earth's climate and human responses to it at the end of the last Ice Age. The radiocarbon measurements should give us important new data that will help us to understand interactions between the atmosphere and the oceans during this period when there was rapid and dynamic change. Equally exciting is the prospect it will give us of more precise dating of archaeological sites from this period -- illuminating the only window we have onto how humans responded to these major changes in the environment.'

Lead researcher Professor Chris Turney of the University of Exeter said: 'We are facing a race against the clock to gather the information locked inside these preserved trees. It is fantastic to have this funding so we are able to gather this information before it is lost forever. While it will be fascinating to find out more about the earth 30,000 years ago, perhaps more importantly we will have a better appreciation of the challenges of future climate change.'