Thursday, December 30, 2010
* During his candidacy, Barack Obama heavily employed mobile technology and the Internet to get the word out.
* Now as president, Obama has made several changes in government that bring technology to the foreground.
Obama campaigned heavily on the promise of an open, transparent government. Now that he's in, has he kept his promise?
In 2008, candidate Barack Obama changed the campaign game, heavily employing mobile technology and the Internet to raise funds and build his popularity. Citizens texted "HOPE" to spread the word. YouTube videos helped clear up the Jeremiah Wright controversy and brought in outside supporters like Obama Girl. These Internet-based efforts, much like Kennedy's use of television, helped candidate Obama race through the primaries and into the White House. But what about President Obama?
It's been two years since the President was elected, and despite his declining approval ratings, there's no denying that he has made technological contributions to the government, the presidency and the country's day-to-day operations.
But has Obama’s high-tech vision fully carried over from his campaign to his presidency? Decide for yourself. We offer you five of the current administration’s technology initiatives.
The Country’s First CIO and CTO
The Obama administration’s goals of inspiring technological innovation and increasing accessibility for all Americans call for specialized leadership. Thus, to help develop and implement 21st-century technology policy, the president appointed the country’s first-ever Chief Technology Officer and Chief Information Officer, Aneesh Chopra and Vivek Kundra, respectively, both of whom have been kept busy.
“The president chose to elevate the technology role by naming my position to be an assistant to the president,” Chopra said. “Every piece of paper that comes before the president on policy matters gets a thorough review for considering how technology will interface with that policy.”
In May 2009, Kundra helped launch Data.gov, a public database of government information. Additionally, he has been in charge of setting up the federal government’s first cloud computing portal, which allows for easier data sharing between government agencies and departments.
Chopra has been instrumental in getting the Open Government Directive on its feet, and has focused heavily on using technology to make health care more accessible and affordable.
“In the Obama administration, technology, data and innovation have been critical components of the President’s priorities from day one,” said Chopra.
Open Government Directive
President Obama campaigned heavily on the promise of an open, transparent government. On his first day in office, he took the first step towards increased accountability by signing the “Transparency and Open Government” memorandum, directing federal agencies to improve communications and collaborative efforts between the federal government and the people.
In December 2009, the administration unveiled the Open Government Directive, which set a timeline for all executive agencies to publish high-value data sets on the web. A staple of the directive, Data.gov, serves as a repository for public government information -- from unemployment statistics to aviation accident reports.
Criticism of Obama’s online portals revolves around accessibility. Many data sets require special software to read, and most don’t exactly pique the public’s interest. According to Tom Glaisyer, a Knight media fellow at the New America Foundation, the government information initiative is still in its beginning stage. Businesses and developers are still figuring out how to organize the information to provide meaning for most Americans.
“Larger numerical data sets are a good component, but they aren’t the whole story,” Glaisyer said.
Besides transparency, another goal of the Open Government Directive is to increase public engagement in executive affairs. Chairman of the Council of Economic Advisers Austan Goolsbee’s whiteboard visualizations appear to be a step in the right direction. Meanwhile, sites like Challenge.gov, where citizens can contribute ideas for federal agencies, and Federalregister.gov, which opens up the doors to the nation’s federal newspaper, are designed to increase participation. But they’re just a start.
“We need to develop more ways to engage effectively with government employees,” Glaisyer said. “It requires as much face-to-face engagement as online.”
Expanding Internet Access
In this Information Age, the Internet gives billions of people instant access to banking services, health information, shopping and education. But approximately 100 million Americans -- most making relatively low incomes -- remain without Internet access.
“As more and more things move online, if you don’t have access, if you don’t know how to use it -- you’re going to be at a disadvantage,” said Ben Lennett, senior policy analyst of the Open Technology Initiative at the New America Foundation, headquartered in the nation’s capital.
There is hope for those stranded in an Internet wasteland, though. As part of the American Recovery and Reinvestment Act of 2009, $7.2 billion has been set aside to provide all Americans with an affordable, 100 megabit-per-second Internet connection by 2020. To bring so-called broadband to the masses, the government will make grants available to rural areas so that they can build up the necessary infrastructure that will expand computer center capacity and support sustainable broadband initiatives.
While the broadband plan’s implementation is still in its infancy, many are questioning whether access can really be made “affordable” in a market dominated by just a few Internet providers. As basic economics will tell you, consumers have a lot more power when they have more choices.
Some policy analysts maintain that an open access policy is necessary to keep costs down. This would require one company to build out a network on which rival companies could compete and invest. Without such competition, affordable access may not find its way to many Americans’ homes.
“Unfortunately, the FCC has ignored the competition question,” Lennett said. “Clearly, there is a correlation between competition and price. Until that becomes a national priority, I’m not sure where we can go from there.”
New Technology for the Deaf and Blind
The emergence of new technology presents new challenges for millions of blind and deaf Americans. Thus, President Obama signed the 21st Century Communications and Video Accessibility Act on October 20, 2010.
The law aims to provide deaf and blind people with the same level of access to new technologies as their fellow non-disabled Americans. For example, one provision requires all mobile companies to make web browsers, email and text messaging on smartphones easy to use for the visually and hearing impaired.
Paul Schroeder, vice president of programs and policy at the American Foundation for the Blind in New York City, helped write the legislation. Besides smartphones, Schroeder and other disability-rights advocates also pushed to increase ease of access to television programming, DVD menus, and program guides on cable television.
“We want to use these technologies right alongside our non-disabled peers,” Schroeder said. “We hope this new law will set forth directions for companies to make their products more accessible and set the rules of the road.”
Promoting Green Energy Technology
President Obama is making great strides in promoting and investing in renewable energy technology and energy efficiency programs.
The Recovery Act allocated about $70 billion for energy-related programs, including weatherization assistance and research and development in vehicle technologies, biomass, fuel cells, geothermal technologies, and solar and wind energy.
For example, the Cape Wind Farm, off the coast of Massachusetts, remains one of the most notable, yet controversial, clean energy projects approved by the Obama administration. On May 17, 2010, Secretary of the Interior Ken Salazar signed the lease for the country’s first-ever offshore wind project, which had been fighting for approval for ten years.
And it’s not just the sea the Obama administration has its eyes on. On October 6, 2010 Salazar signed the first of six leases for large-scale solar energy projects on public land. Unfortunately, most of these loan-guarantee projects and research grants come with heavy paperwork and take some time to implement.
“It’s going to take time to expand the capacity [of renewable energy]. But we have very explicit goals the President is holding the Department of Energy to, including expanding the capacity of renewable energy sources,” said Chopra.
* Fluid dynamics, the study of how water, air and other fluids move, is behind some of the latest animation feats.
* Mathematical equations now determine how an object will move on screen in a realistic way.
The next time you take in a movie, you may be getting a lesson in cutting-edge physics without even knowing it. Hollywood has embraced the complex field of fluid dynamics, the study of how water, air, smoke and other fluids move, in a big way, allowing filmmakers to create realistic scenes of turbulent oceans and falling buildings -- not to mention the quirks of Jeff Bridges' face.
"It used to be that the story was limited by the technology," said Doug Roble, creative director at Digital Domain, a Hollywood special effects studio. "Now we're getting to the point where there are no limits. If I want to have Mount Everest fall into the ocean and catch on fire, we can do that. And the audience will buy that it is happening."
Computer scientists like Roble are using new kinds of software programs that harness mathematical algorithms that describe chaotic scenes of nature. These same kinds of mathematical formulas can be used to describe and then animate the bending of steel girders, as seen when downtown Los Angeles collapses in last year's blockbuster "2012."
Instead of just drawing a steel girder from a building and pushing it around, Roble says mathematical equations now determine how the object will move on screen.
"In order to get the physics right, the mathematics is very stiff," Roble said. "So in order to simulate it accurately, you have to take extremely small time steps to move the simulation forward. If it's too fast, the simulation will explode, negative signs start appearing, and your simulation won't mimic reality."
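Roble's point about stiffness can be seen in a few lines of code. The sketch below is an illustrative assumption, not Digital Domain's actual software: it uses a simple stiff decay equation and the most basic solver (explicit Euler). With a small enough time step the simulated value decays smoothly toward zero, as the real physics would; past the stability limit, the value flips sign and grows without bound -- the "explosion" Roble describes.

```python
# Toy stiff equation: dy/dt = -k*y with k = 50. The exact solution decays to 0.
# Explicit Euler is only stable here when dt < 2/k = 0.04; larger steps blow up.

def euler(dt, steps, k=50.0, y0=1.0):
    """Advance dy/dt = -k*y with the explicit Euler method."""
    y = y0
    for _ in range(steps):
        y += dt * (-k * y)  # explicit Euler update
    return y

stable   = euler(dt=0.01, steps=400)  # 4 seconds of simulated time
unstable = euler(dt=0.05, steps=80)   # same 4 seconds, but too-large steps

print(stable)    # near zero: decays like the real solution
print(unstable)  # enormous magnitude: the simulation has "exploded"
```

This is why, as Roble says, accurate simulation forces extremely small time steps (or, in practice, more sophisticated implicit solvers).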
Canadian computer scientist Robert Bridson wrote about this emerging field of physics and animation recently in the journal Science. Bridson's company Exotic Matter has a long list of Hollywood film projects, including the recent "Harry Potter and the Half-Blood Prince," "Hellboy" and "Quantum of Solace." He's an expert of sorts in creating realistic smoke, water, fire, hair, skin and clothing.
"(Digital filmmaking) has a lot in common with foundational work with applied mathematics and computational physics," Bridson said from New Zealand, where he's working on the latest Tolkien film, "The Hobbit." "People will look at a phenomenon of interest and come up with equations to describe what they are seeing. A lot of that is now going on in film."
Bridson said creating realistic sea foam and ocean spray has become the latest challenge for math-based special effects designers. He notes particularly complex shots of both a giant wave and a lion's rippling fur in the just-released fantasy film "The Chronicles of Narnia: The Voyage of the Dawn Treader," which made heavy use of new fluid dynamics-based programs.
Driving the marriage of math and movie-making are the rising expectations of film directors and the availability of the cheap computing power needed to run the software. That means big special effects no longer equal big budgets.
"We will start seeing more low-budget independent types of shops producing extraordinary effects," Bridson said. "'District 9' cost $30 million. Compared to the budgets of what other science fiction films need, it was pretty cheap."
Using computer simulation of solid and fluid dynamics is both cheaper for the director and less dangerous for human actors, Bridson said.
Both Roble and Bridson say the next step for creative programmers is creating digital doubles for human actors. By using a fully animated digital character, a director will be able to redo difficult scenes without rebuilding a set or requiring dangerous stunts.
In the recent "Tron" sequel, filmmakers wanted to recreate actor Jeff Bridges' character to resemble what he looked like 30 years ago. Using new software, they mapped Bridges' current face with a set of points, then transferred it to images taken from Bridges circa 1982. The data was crunched using software that wasn't available a few years ago, Roble explained.
"The human face is extraordinarily tough," Roble said. "Right now, the research community is focused on muscles and skin."
Skype now does video-calling on iOS devices. The new update to Skype’s iPhone/iPad/iPod Touch app, version 3.0, allows users to make video calls between their iDevices, as well as with desktop computers — in other words, with any other Skype user. The calls can be placed over both Wi-Fi and 3G.
To make and receive video calls, you'll need an iPhone 3GS or better, running iOS 4. If your device has both front- and rear-facing cameras, you can use either. The 3GS, which lacks a front-facing camera, can obviously use only the rear one.
And if you have an iPad or a last-gen iPod Touch? You’re not left out, even though your device doesn’t have a camera. You can still receive video calls, but of course you can’t send any video.
Skype has a big advantage over FaceTime, Apple’s own video-calling app, as pretty much everyone already uses Skype. FaceTime requires a camera-equipped iPhone or iPod Touch, or a Mac running beta software. And it only works over Wi-Fi.
This is big news, especially for people wanting to replace computers with iPads. If a camera-equipped iPad goes on sale this year, as expected, then people like my parents could ditch their hard-to-administer PC for an iPad.
There is still one limitation to Skype’s iPad version of the software. This update, despite adding video, still requires you to pixel-double the app to get a full-screen view. Hopefully Skype’s next update will bring us video in the iPad’s full, native resolution.
Studying how bacteria incorporate foreign DNA from invading viruses into their own regulatory processes, Thomas Wood, professor in the Artie McFerrin Department of Chemical Engineering at Texas A&M University, is uncovering the secrets of one of nature's most primitive immune systems.
His findings, which appear in Nature Communications, a multidisciplinary publication dedicated to research in all areas of the biological, physical and chemical sciences, shed light on how bacteria have throughout the course of millions of years developed resistance to antibiotics by co-opting the DNA of their natural enemies -- viruses.
The battle between bacteria and bacteria-eating viruses, Wood explains, has been going on for millions of years, with viruses attempting to replicate themselves by -- in one approach -- invading bacteria cells and integrating themselves into the chromosomes of the bacteria. When this happens, a bacterium makes a copy of its chromosome, which includes the virus particle. The virus then can choose at a later time to replicate itself, killing the bacterium -- similar to a ticking time bomb, Wood says.
However, things can go radically wrong for the virus because of random but abundant mutations that occur within the chromosome of the bacterium. Having already integrated itself into the bacterium's chromosome, the virus is subject to mutation as well, and some of these mutations, Wood explains, render the virus unable to replicate and kill the bacterium.
With this new diverse blend of genetic material, Wood says, a bacterium not only overcomes the virus' lethal intentions but also flourishes at a greater rate than similar bacteria that have not incorporated viral DNA.
"Over millions of years, this virus becomes a normal part of the bacterium," Wood says. "It brings in new tricks, new genes, new proteins, new enzymes, new things that it can do. The bacterium learns how to do things from this.
"What we have found is that with this new viral DNA that has been trapped over millions of years in the chromosome, the cell has created a new immune system," Wood notes. "It has developed new proteins that have enabled it to resist antibiotics and other harmful things that attempt to oxidize cells, such as hydrogen peroxide. These cells that have the new viral set of tricks don't die or don't die as rapidly."
Understanding the significance of viral DNA to bacteria required Wood's research team to delete all of the viral DNA on the chromosome of a bacterium, in this case bacteria from a strain of E. coli. Wood's team, led by postdoctoral researcher Xiaoxue Wang, used what in a sense could be described as "enzymatic scissors" to "cut out" the nine viral patches, which amounted to precisely removing 166,000 nucleotides. Once the viral patches were successfully removed, the team examined how the bacterium cell changed. What they found was a dramatically increased sensitivity to antibiotics by the bacterium.
While Wood studied this effect in E. coli bacteria, he says similar processes have taken place on a massive, widespread scale, noting that viral DNA can be found in nearly all bacteria, with some strains possessing as much as 20 percent viral DNA within their chromosome.
"To put this into perspective, for some bacteria, one-fifth of their chromosome came from their enemy, and until our study, people had largely neglected to study that 20 percent of the chromosome," Wood says. "This viral DNA had been believed to be silent and unimportant, not having much impact on the cell.
"Our study is the first to show that we need to look at all bacteria and look at their old viral particles to see how they are affecting the bacteria's current ability to withstand things like antibiotics. If we can figure out how the cells are more resistant to antibiotics because of this additional DNA, we can perhaps make new, effective antibiotics."
Clues to future climate may be found in the way that an ordinary drinking glass shatters. A study appearing in the Proceedings of the National Academy of Sciences finds that microscopic particles of dust, emitted into the atmosphere when dirt breaks apart, follow similar fragment patterns to broken glass and other brittle objects.
The research, by National Center for Atmospheric Research (NCAR) scientist Jasper Kok, suggests there are several times more dust particles in the atmosphere than previously thought, since shattered dirt appears to produce an unexpectedly high number of large dust fragments.
The finding has implications for understanding future climate change because dust plays a significant role in controlling the amount of solar energy in the atmosphere. Depending on their size and other characteristics, some dust particles reflect solar energy and cool the planet, while others trap energy as heat.
"As small as they are, conglomerates of dust particles in soils behave the same way on impact as a glass dropped on a kitchen floor," Kok says. "Knowing this pattern can help us put together a clearer picture of what our future climate will look like."
The study may also improve the accuracy of weather forecasting, especially in dust-prone regions. Dust particles affect clouds and precipitation, as well as temperatures.
The research was supported by the National Science Foundation, which sponsors NCAR.
Kok's research focused on a type of airborne particle known as mineral dust. These particles are usually emitted when grains of sand are blown into soil, shattering dirt and sending fragments into the air. The fragments can be as large as about 50 microns in diameter, or about the thickness of a fine strand of human hair.
The smallest particles, which are classified as clay and are as tiny as 2 microns in diameter, remain in the atmosphere for about a week, circling much of the globe and exerting a cooling influence by reflecting heat from the Sun back into space. Larger particles, classified as silt, fall out of the atmosphere after a few days. The larger the particle, the more it will tend to have a heating effect on the atmosphere.
Kok's research indicates that the ratio of silt particles to clay particles is two to eight times greater than represented in climate models.
Since climate scientists carefully calibrate the models to simulate the actual number of clay particles in the atmosphere, the paper suggests that models most likely err when it comes to the number of silt particles. Most of these larger particles swirl in the atmosphere within about 1,000 miles of desert regions, so adjusting their quantity in computer models should generate better projections of future climate in desert regions, such as the southwestern United States and northern Africa.
Additional research will be needed to determine whether future temperatures in those regions will increase more or less than currently indicated by computer models.
The study results also suggest that marine ecosystems, which draw down carbon dioxide from the atmosphere, may receive substantially more iron from airborne particles than previously estimated. The iron enhances biological activity, benefiting ocean food chains, including plants that take up carbon during photosynthesis.
In addition to influencing the amount of solar heat in the atmosphere, dust particles also get deposited on mountain snowpacks, where they absorb heat and accelerate melt.
Glass and dust: Common fracture patterns
Physicists have long known that certain brittle objects, such as glass or rocks, and even atomic nuclei, fracture in predictable patterns. The resulting fragments follow a certain range of sizes, with a predictable distribution of small, medium, and large pieces. Scientists refer to this type of pattern as scale invariance or self-similarity.
Physicists have devised mathematical formulas for the process by which cracks propagate in predictable ways as a brittle object breaks. Kok theorized that it would be possible to use these formulas to estimate the range of dust particle sizes. He turned to a 1983 study by Guillaume d'Almeida and Lothar Schüth from the Institute for Meteorology at the University of Mainz in Germany that measured the particle size distribution of arid soil.
By applying the formulas for fracture patterns of brittle objects to the soil measurements, Kok determined the size distribution of emitted dust particles. To his surprise, the formulas described measurements of dust particle sizes almost exactly.
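The self-similar fracture idea can be illustrated with a toy model (an assumption for illustration only; Kok's actual scaling theory is analytic, not a simulation): split a clump at a random point, and give every resulting fragment the same chance of splitting again, regardless of its size. Because the rule doesn't depend on scale, the final fragments span a broad, self-similar mix of small, medium and large pieces, and the total mass is conserved.

```python
import random

def fragment(size, pieces, p_split=0.8, depth=0, max_depth=12):
    """Split a fragment at a random point, or stop and record it.

    The splitting probability is independent of fragment size, which is
    what produces the scale-invariant mix of piece sizes.
    """
    if depth >= max_depth or random.random() > p_split:
        pieces.append(size)
        return
    cut = random.uniform(0.1, 0.9)  # random fracture point
    fragment(size * cut, pieces, p_split, depth + 1, max_depth)
    fragment(size * (1 - cut), pieces, p_split, depth + 1, max_depth)

random.seed(42)
pieces = []
fragment(50.0, pieces)  # start from a clump about 50 microns across

print(len(pieces))               # many fragments, large and small
print(round(sum(pieces), 6))     # mass is conserved: the pieces sum back to 50.0
```

Histogramming `pieces` on logarithmic bins would show the characteristic heavy tail of fragmentation: a few large pieces alongside many small ones.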
"The idea that all these objects shatter in the same way is a beautiful thing, actually," Kok says. "It's nature's way of creating order in chaos."
About the article
Title: "A scaling theory for the size distribution of emitted dust aerosols suggests that climate models underestimate the size of the global dust cycle"
When the Black Hole Was Born: Astronomers Identify the Epoch of the First Fast Growth of Black Holes
Most galaxies in the universe, including our own Milky Way, harbor super-massive black holes varying in mass from about one million to about 10 billion times the mass of our sun. To find them, astronomers look for the enormous amount of radiation emitted by gas which falls into such objects during the times that the black holes are "active," i.e., accreting matter. This gas "infall" into massive black holes is believed to be the means by which black holes grow.
Now a team of astronomers from Tel Aviv University, including Prof. Hagai Netzer and his research student Benny Trakhtenbrot, has determined that the era of first fast growth of the most massive black holes occurred when the universe was only about 1.2 billion years old -- not two to four billion years old, as was previously believed -- and they're growing at a very fast rate.
The results will be reported in a new paper soon to appear in The Astrophysical Journal.
The oldest are growing the fastest
The new research is based on observations with some of the largest ground-based telescopes in the world: "Gemini North" on top of Mauna Kea in Hawaii, and the "Very Large Telescope" on Cerro Paranal in Chile. The data obtained with the advanced instrumentation on these telescopes show that the black holes that were active when the universe was 1.2 billion years old are about ten times smaller than the most massive black holes that are seen at later times. However, they are growing much faster.
The measured rate of growth allowed the researchers to estimate what happened to these objects at much earlier as well as much later times. The team found that the very first black holes, those that started the entire growth process when the universe was only several hundred million years old, had masses of only 100-1000 times the mass of the sun. Such black holes may be related to the very first stars in the universe. They also found that the subsequent growth period of the observed sources, after the first 1.2 billion years, lasted only 100-200 million years.
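The back-extrapolation described above can be sketched as a one-line calculation. Every number below is an illustrative assumption drawn loosely from the article (a ~1,000-solar-mass seed several hundred million years after the Big Bang, reaching roughly a billion solar masses by 1.2 billion years), not the paper's fitted values. Under simple exponential growth, the implied e-folding time falls out directly.

```python
import math

# Illustrative assumptions, not the study's measured values:
m_seed  = 1.0e3          # seed mass in solar masses, at ~0.5 billion years
m_later = 1.0e9          # mass at 1.2 billion years (~10x below the biggest holes)
dt_yr   = 1.2e9 - 0.5e9  # elapsed time in years

# Exponential growth: M(t) = M0 * exp(t / tau)  =>  tau = dt / ln(M / M0)
tau = dt_yr / math.log(m_later / m_seed)
print(round(tau / 1e6), "Myr per e-folding")  # roughly 51 Myr with these inputs
```

The point of the sketch is only that knowing a mass and a growth rate at one epoch lets astronomers run the clock both backward (to the seeds) and forward (to today's giants).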
The new study is the culmination of a seven-year project at Tel Aviv University designed to follow the evolution of the most massive black holes and compare them with the evolution of the galaxies in which such objects reside.
Other researchers on the project include Prof. Ohad Shemmer of the University of North Texas, who took part in the earlier stage of the project as a Ph.D student at Tel Aviv University, and Prof. Paulina Lira, from the University of Chile.
Rodents get a bad rap as vermin and pests because they seem to thrive everywhere. They have been one of the most common mammals in Africa for the past 50 million years.
From deserts to rainforests, rodents flourished in prehistoric Africa, making them a stable and plentiful source of food, says paleontologist Alisa J. Winkler, an expert on rodent and rabbit fossils. Now rodent fossils are proving their usefulness to scientists as they help shed light on human evolution.
Rodents can corroborate evidence from geology and plant and animal fossils about the ancient environments of our human ancestors and other prehistoric mammals, says Winkler, a research professor at Southern Methodist University.
"Rodents are often known in abundance, and there are many different kinds from a number of famous hominid and hominoid localities," says Winkler. "Many paleoanthropologists are very interested in the faunal and ecological context in which our own species evolved."
Rodents: World's most abundant mammal -- and Africa's too
Rodents -- rats, mice, squirrels, porcupines, gerbils and others -- are the largest order of living mammals, constituting 42 percent of the total mammalian diversity worldwide. That's according to data drawn from the research literature in an analysis by Winkler and her paleontology colleagues Christiane Denys, of the Museum National d'Histoire Naturelle in Paris, and D. Margaret Avery of the Iziko South African Museum in Cape Town.
Their review documents more than 130 formally named genera in "Fossil Rodents of Africa," the first comprehensive summary and distribution analysis of Africa's fossil rodents since 1978.
The analysis is a chapter in the new 1008-page scientific reference book "Cenozoic Mammals of Africa" (University of California Press, 2010), the first comprehensive scientific review of Africa's fossil mammals in more than three decades. The book comprises 48 chapters by 64 experts, summarizing and interpreting the published fossil research to date of Africa's mammals, tectonics, geography, climate and flora of the past 65 million years.
Rodents are human's best friend?
Rodents have been around much longer than humans or human ancestors in Africa, with the earliest from northern Africa dating from about 50 million years ago. Today scientists are aware of 14 families of rodents in Africa.
Winkler cites locales where fossils of the sharp-toothed, gnawing creatures have been found relevant to our human ancestors:
* Ethiopia's Middle Awash, where some fossils date to when the chimpanzee and human lines split 4 million to 7 million years ago and where the famous "Ardi" primate was discovered;
* Tanzania's Olduvai Gorge, dubbed the "Cradle of Mankind";
* The Tugen Hills and Lake Turkana sites of Kenya, where important human ancestor fossils have been discovered;
* In younger southern African cave faunas dating to the Stone Age.
Their fossils also have been found in other older Eastern Africa sites, where apes and humans have been linked to the monkey lineage.
“At many of these sites, identification of Africa’s rodents provides important corroborating information on the ecology of the locales and on environmental change through time,” the authors write.
Rodent diversity likely underestimated; more fossils than scientists
The diversity of ancient Africa's rodents most likely has been underestimated, say the authors. Just how much isn't known, though, because the quantity of rodent fossils being discovered far exceeds the handful of scientists who specialize in identifying and studying the specimens.
That diversity continues to expand. The last exhaustive analysis of Africa's rodents was carried out by R. Lavocat in 1978. At that time scientists recorded 54 genera, 76 fewer than those documented by Winkler, Denys and Avery in their analysis.
Winkler and her colleagues summarize the distribution and ecology of existing rodent families, as well as the systematics, biochronology and paleobiogeography of rodent families in Africa's fossil record. The diversity they document reflects "the wide variety of habitats present on the continent" and paints a picture of Africa's paleoecology.
Given the huge rodent diversity in modern Africa, "it is likely that such an extensive fauna was also present in the past," the scientists write.
Tremendous diversity reflects wide variety of habitats
An example of that relationship is the scaly-tailed flying squirrel, an exclusively African group of forest-dwelling rodents that are not related to true squirrels. They are well known from about 18 million to 20 million years ago in eastern Africa, Winkler says, suggesting the presence of closed habitats, such as forests. That corroborates other evidence of forests from fossil animals, plants and geology, she says.
"Although there are even older scaly-tailed flying squirrels known from the currently arid regions of northern Africa," says Winkler, "they do not appear to have been gliders, as are most current forms, and the question of when members of the group first developed gliding locomotion still remains."
Funding for "Cenozoic Mammals of Africa" came from the Swedish Research Council; the University of Michigan's College of Literature, Science, and the Arts, and Museum of Paleontology; and the Regents of the University of California.
Winkler is in the Huffington Department of Earth Sciences at SMU, and is also an assistant professor at the University of Texas Southwestern Medical Center, Dallas.
Gamma-ray bursts are among the most energetic events in the Universe, but some appear curiously faint in visible light. The biggest study to date of these so-called dark gamma-ray bursts, using the 2.2-meter MPG/ESO telescope at La Silla in Chile, has found that these explosions don't require exotic explanations. Their faintness is now explained by a combination of causes, the most important of which is the presence of dust between the Earth and the explosion.
Gamma-ray bursts (GRBs), fleeting events that last from less than a second to several minutes, are detected by orbiting observatories that can pick up their high energy radiation. Thirteen years ago, however, astronomers discovered a longer-lasting stream of less energetic radiation coming from these violent outbursts, which can last for weeks or even years after the initial explosion. Astronomers call this the burst's afterglow.
While all gamma-ray bursts have afterglows that give off X-rays, only about half of them were found to give off visible light, with the rest remaining mysteriously dark. Some astronomers suspected that these dark afterglows could be examples of a whole new class of gamma-ray bursts, while others thought that they might all be at very great distances. Previous studies had suggested that obscuring dust between the burst and us might also explain why they were so dim.
"Studying afterglows is vital to further our understanding of the objects that become gamma-ray bursts and what they tell us about star formation in the early Universe," says the study's lead author Jochen Greiner from the Max-Planck Institute for Extraterrestrial Physics in Garching bei München, Germany.
NASA launched the Swift satellite at the end of 2004. From its orbit above the Earth's atmosphere it can detect gamma-ray bursts and immediately relay their positions to other observatories so that the afterglows can be studied. In the new study, astronomers combined Swift data with new observations made using GROND -- a dedicated gamma-ray burst follow-up observation instrument attached to the 2.2-metre MPG/ESO telescope at La Silla in Chile. In doing so, astronomers have conclusively solved the puzzle of the missing optical afterglows.
What makes GROND exciting for the study of afterglows is its very fast response time -- it can observe a burst within minutes of an alert coming from Swift using a special system called the Rapid Response Mode -- and its ability to observe simultaneously through seven filters covering both the visible and near-infrared parts of the spectrum.
By combining GROND data taken through these seven filters with Swift observations, astronomers were able to accurately determine the amount of light emitted by the afterglow at widely differing wavelengths, all the way from high energy X-rays to the near-infrared. The astronomers used this information to directly measure the amount of obscuring dust that the light passed through en route to Earth. Previously, astronomers had to rely on rough estimates of the dust content.
The team used a range of data, including their own measurements from GROND, in addition to observations made by other large telescopes including the ESO Very Large Telescope, to estimate the distances to nearly all of the bursts in their sample. While they found that a significant proportion of bursts are dimmed to about 60-80 percent of their original intensity by obscuring dust, this effect is exaggerated for the very distant bursts, letting the observer see only 30-50 percent of the light. The astronomers conclude that most dark gamma-ray bursts are therefore simply those that have had their small amount of visible light completely stripped away before it reaches us.
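The dimming fractions quoted above can be translated into the magnitudes of extinction astronomers usually work with, via the standard relation A = -2.5 log10(f), where f is the surviving flux fraction. The sketch below applies that textbook formula to the fractions reported here; the conversion itself is generic astronomy, not something specific to this study:

```python
import math

def extinction_mag(observed_fraction):
    """Magnitudes of extinction for a given surviving flux fraction."""
    return -2.5 * math.log10(observed_fraction)

# Surviving fractions quoted in the article: 60-80% for typical bursts,
# 30-50% for the very distant ones
for f in (0.8, 0.6, 0.5, 0.3):
    print(f"{f:.0%} of light survives -> A = {extinction_mag(f):.2f} mag")
```

Even a magnitude or so of extinction is enough to push an already faint optical afterglow below a telescope's detection limit.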
"Compared to many instruments on large telescopes, GROND is a low cost and relatively simple instrument, yet it has been able to conclusively resolve the mystery surrounding dark gamma-ray bursts," says Greiner.
Gamma-ray bursts lasting longer than two seconds are referred to as long bursts and those with a shorter duration are known as short bursts. Long bursts, which were observed in this study, are associated with the supernova explosions of massive young stars in star-forming galaxies. Short bursts are not well understood, but are thought to originate from the merger of two compact objects such as neutron stars.
The Gamma-Ray burst Optical and Near-infrared Detector (GROND) was designed and built at the Max-Planck Institute for Extraterrestrial Physics in collaboration with the Tautenburg Observatory, and has been fully operational since August 2007.
Other studies relating to dark gamma-ray bursts have been released. Early this year, astronomers used the Subaru Telescope to observe a single gamma-ray burst, from which they hypothesised that dark gamma-ray bursts may indeed be a separate sub-class that forms through a different mechanism, such as the merger of binary stars. In another study published last year using the Keck Telescope, astronomers studied the host galaxies of 14 dark GRBs and, based on the derived low redshifts, inferred dust as the likely mechanism behind the dark bursts. In the new work reported here, 39 GRBs were studied, including nearly 20 dark bursts, and it is the only study in which no prior assumptions were made and the amount of dust was directly measured.
Because the afterglow light of very distant bursts is redshifted due to the expansion of the Universe, the light that left the object was originally bluer than the light we detect when it gets to Earth. Since the reduction of light intensity by dust is greater for blue and ultraviolet light than for red, this means that the overall dimming effect of dust is greater for the more distant gamma-ray bursts. This is why GROND's ability to observe near-infrared radiation makes such a difference.
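The wavelength stretch described above follows the standard cosmological relation λ_observed = λ_emitted × (1 + z). A small sketch with illustrative redshifts (the z values below are hypothetical examples, not bursts from the study) shows why light we receive in the visible left a distant burst deep in the dust-sensitive ultraviolet:

```python
def observed_wavelength(rest_nm, z):
    """Redshifted wavelength: lambda_obs = lambda_emit * (1 + z)."""
    return rest_nm * (1 + z)

# Light we detect at 600 nm (visible) left bursts at increasingly
# blue rest-frame wavelengths as redshift grows
for z in (1, 3, 6):
    rest = 600 / (1 + z)
    print(f"z={z}: light observed at 600 nm left the burst at {rest:.0f} nm")
```

At z = 1 the light was emitted at 300 nm, in the ultraviolet, where dust absorbs far more strongly than in the red; this is exactly why the more distant bursts are dimmed so much more.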
Monday, December 27, 2010
* According to medieval manuscripts, the great-grandmother of Jesus was St. Ismeria.
* The legend of St. Ismeria emphasizes sanctity earned by a life of penitence as opposed to blood martyrdom.
* St. Ismeria likely served as a role model for older women during the 14th and 15th centuries.
The legend of St. Ismeria marks a shift in belief, as sanctity was previously earned more often through blood martyrdom than through piety.
The great-grandmother of Jesus was a woman named Ismeria, according to Florentine medieval manuscripts analyzed by a historian.
The legend of St. Ismeria, presented in the current Journal of Medieval History, sheds light on both the Biblical Virgin Mary's family and also on religious and cultural values of 14th-century Florence.
"I don't think any other woman is mentioned" as Mary's grandmother in the Bible, Catherine Lawless, author of the paper, told Discovery News. "Mary's patrilineal lineage is the only one given."
"Mary herself is mentioned very little in the Bible," added Lawless, a lecturer in history at the University of Limerick. "The huge Marian cult that has evolved over centuries has very few scriptural sources."
Lawless studied the St. Ismeria story, which she said has been "ignored by scholars," in two manuscripts: the 14th century "MS Panciatichiano 40" of Florence's National Central Library and the 15th century "MS 1052" of the Riccardiana Library, also in Florence.
"According to the legend, Ismeria is the daughter of Nabon of the people of Judea, and of the tribe of King David," wrote Lawless. She married "Santo Liseo," who is described as "a patriarch of the people of God." The legend continues that the couple had a daughter named Anne who married Joachim. After 12 years, Liseo died. Relatives then left Ismeria penniless.
"I'm pretty sure one is supposed to believe that it was either her dead husband's relatives or, less likely, her natal family," Lawless said. "The family of the Virgin Mary would not have been cast in such a light."
Ismeria then goes to a hospital where she finds refuge. She is said to perform a miracle, filling a shell with fish to feed all of the hospital's patients. After this miracle she prays to be taken away from the "vainglory of this world."
After God called her to "Paradise," a rector at the hospital informed the Virgin Mary and Jesus of her passing. They departed for the hospital with the 12 Apostles, Mary Magdalene, Mary Salome and Mary Cleophas. There they paid honor to St. Ismeria.
The legend marks a shift in belief, as sanctity was previously earned more often through blood martyrdom than through piety. Lawless credits that, in part, to the rise of belief in Purgatory, an interim space between heaven and hell where sins could be purged.
"The more sins purged in one's lifetime (through penitence, good works, etc.) the less time needed in purgatory -- for either oneself or one's family," she said.
She also pointed out that "the great bulk of Christian martyrs of the west died under the Roman persecutions, which ended in the fourth century."
While the author of the Ismeria legend remains unknown, Lawless thinks it could have been a layperson from Tuscany. During the medieval period, "the story may have been used as a model for continent wifehood and active, charitable widowhood in one of the many hospitals of medieval Florence."
"The grandmother of the Virgin was no widow who threatened the patrimony of her children by demanding the return of her dowry, nor did she threaten the family unit by remarrying and starting another lineage," she added. "Instead, her life could be seen as an ideal model for Florentine penitential women."
George Ferzoco, a research fellow at the University of Bristol, commented that the new paper analyzing the legend is "brilliant" and "reveals an exciting trove of religious material from late medieval and renaissance Florence, where many manuscripts were written specifically for females."
"What is so striking about St. Ismeria," Carolyn Muessig of the University of Bristol's Department of Theology and Religious Studies told Discovery News, "is that she is a model for older matrons. Let's face it: Older female role models are hard to come by in any culture."
"But the fact that St. Ismeria came to the fore in late medieval Florence," Muessig concluded, "reveals some of the more positive attitudes that medieval culture had towards the place and the importance of women in society."
An Indian space rocket carrying an advanced communications satellite was destroyed by mission control Saturday following a malfunction after lift-off, officials said.
Live television pictures showed the rocket exploding in a plume of smoke and fire moments after taking off from the Sriharikota launch site, some 80 kilometres (50 miles) from the southern city of Chennai.
The Geosynchronous Satellite Launch Vehicle (GSLV) veered from its intended flight path and was intentionally blown up 47 seconds after lift-off, Indian Space Research Organisation (ISRO) Chairman K. Radhakrishnan told reporters.
The GSLV exploded "at an altitude of eight kilometres (4.9 miles) and the debris have fallen in deep sea," Radhakrishnan said, referring to the Bay of Bengal.
"Data indicates commands from onboard computers ceased to reach circuits of the first stage (engines) but what caused the interruption needs to be studied and we hope to get an assessment of what triggered this," Radhakrishnan said.
The Christmas Day launch had originally been scheduled for December 20 but was postponed after engineers discovered a leak in one of the Russian-designed engines of the GSLV, the United News of India agency said.
In July, an Indian rocket successfully put five satellites into orbit, three months after the country's space ambitions suffered a setback when a rocket crashed on lift-off.
India began its space programme in 1963 and has developed its own satellites and launch vehicles to reduce its dependence on other countries.
It first staked its claim for a share of the global commercial launch market by sending an Italian satellite into orbit in 2007. In January 2008, it launched an Israeli spy satellite.
India aims to launch its first manned lunar mission in 2016 and wants to grab a larger share of the multi-billion-dollar market for launching commercial satellites.
Government funding of around 2.8 billion dollars has been secured for the moon project.
India in 2008 launched an unmanned satellite and put a probe on the moon's surface in an event that the state-owned ISRO hoped would give the country international recognition in the space business.
The probe's lunar landing vaulted India up the league of space-faring nations led by the United States and regional competitors Russia, China and Japan, and was seen as a symbolic and proud moment in the country's development.
But India still has a long way to go to catch up with China, which together with the US, Russia and the European Space Agency is already well established in the commercial space sector.
Image: State-run Indian Space Research Organisation's (ISRO) rocket carrying the GSAT-5P satellite explodes in mid-air shortly after its launch in Sriharikota, India, Saturday, Dec. 25, 2010 (Associated Press)
By 2045 global population is projected to reach nine billion. Can the planet take the strain?
One day in Delft in the fall of 1677, Antoni van Leeuwenhoek, a cloth merchant who is said to have been the long-haired model for two paintings by Johannes Vermeer—“The Astronomer” and “The Geographer”—abruptly stopped what he was doing with his wife and rushed to his worktable. Cloth was Leeuwenhoek’s business but microscopy his passion. He’d had five children already by his first wife (though four had died in infancy), and fatherhood was not on his mind.

“Before six beats of the pulse had intervened,” as he later wrote to the Royal Society of London, Leeuwenhoek was examining his perishable sample through a tiny magnifying glass. Its lens, no bigger than a small raindrop, magnified objects hundreds of times. Leeuwenhoek had made it himself; nobody else had one so powerful. The learned men in London were still trying to verify Leeuwenhoek’s earlier claims that unseen “animalcules” lived by the millions in a single drop of lake water and even in French wine. Now he had something more delicate to report: Human semen contained animalcules too. “Sometimes more than a thousand,” he wrote, “in an amount of material the size of a grain of sand.” Pressing the glass to his eye like a jeweler, Leeuwenhoek watched his own animalcules swim about, lashing their long tails.

One imagines sunlight falling through leaded windows on a face lost in contemplation, as in the Vermeers. One feels for his wife.

Leeuwenhoek became a bit obsessed after that. Though his tiny peephole gave him privileged access to a never-before-seen microscopic universe, he spent an enormous amount of time looking at spermatozoa, as they’re now called. Oddly enough, it was the milt he squeezed from a cod one day that inspired him to estimate, almost casually, just how many people might live on Earth.
Nobody then really had any idea; there were few censuses. Leeuwenhoek started with an estimate that around a million people lived in Holland. Using maps and a little spherical geometry, he calculated that the inhabited land area of the planet was 13,385 times as large as Holland. It was hard to imagine the whole planet being as densely peopled as Holland, which seemed crowded even then. Thus, Leeuwenhoek concluded triumphantly, there couldn’t be more than 13.385 billion people on Earth—a small number indeed compared with the 150 billion sperm cells of a single codfish! This cheerful little calculation, writes population biologist Joel Cohen in his book How Many People Can the Earth Support?, may have been the first attempt to give a quantitative answer to a question that has become far more pressing now than it was in the 17th century. Most answers these days are far from cheerful.
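Leeuwenhoek's estimate is easy to retrace. The snippet below simply redoes his arithmetic using the figures quoted above; the inputs are his 17th-century guesses, not modern census data:

```python
# Leeuwenhoek's back-of-the-envelope ceiling on world population (his figures)
holland_population = 1_000_000      # his estimate for Holland
habitable_area_ratio = 13_385      # inhabited Earth area / area of Holland

max_world_population = holland_population * habitable_area_ratio
print(max_world_population)        # 13385000000, i.e. ~13.385 billion

# The cod comparison from the text
cod_sperm_cells = 150_000_000_000
print(cod_sperm_cells / max_world_population)  # roughly 11
```

The arithmetic bears out his punch line: a single codfish carries an order of magnitude more sperm cells than his upper bound on the number of people the planet could hold.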
Historians now estimate that in Leeuwenhoek’s day there were only half a billion or so humans on Earth. After rising very slowly for millennia, the number was just starting to take off. A century and a half later, when another scientist reported the discovery of human egg cells, the world’s population had doubled to more than a billion. A century after that, around 1930, it had doubled again to two billion. The acceleration since then has been astounding. Before the 20th century, no human had lived through a doubling of the human population, but there are people alive today who have seen it triple. Sometime in late 2011, according to the UN Population Division, there will be seven billion of us.
And the explosion, though it is slowing, is far from over. Not only are people living longer, but so many women across the world are now in their childbearing years—1.8 billion—that the global population will keep growing for another few decades at least, even though each woman is having fewer children than she would have had a generation ago. By 2050 the total number could reach 10.5 billion, or it could stop at eight billion—the difference is about one child per woman. UN demographers consider the middle road their best estimate: They now project that the population may reach nine billion before 2050—in 2045. The eventual tally will depend on the choices individual couples make when they engage in that most intimate of human acts, the one Leeuwenhoek interrupted so carelessly for the sake of science.
With the population still growing by about 80 million each year, it’s hard not to be alarmed. Right now on Earth, water tables are falling, soil is eroding, glaciers are melting, and fish stocks are vanishing. Close to a billion people go hungry each day. Decades from now, there will likely be two billion more mouths to feed, mostly in poor countries. There will be billions more people wanting and deserving to boost themselves out of poverty. If they follow the path blazed by wealthy countries—clearing forests, burning coal and oil, freely scattering fertilizers and pesticides—they too will be stepping hard on the planet’s natural resources. How exactly is this going to work?
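As a rough sanity check on the figures above, a constant-rate extrapolation from seven billion in 2011 at 80 million a year lands close to the UN's nine-billion-by-2045 projection. This is a toy calculation that deliberately ignores the slowdown the article describes:

```python
# Toy linear extrapolation from the article's own figures
population_2011 = 7_000_000_000   # "seven billion of us" in late 2011
annual_growth   = 80_000_000      # "about 80 million each year"

for year in (2030, 2045, 2050):
    projected = population_2011 + annual_growth * (year - 2011)
    print(year, f"{projected / 1e9:.1f} billion")
```

It gives roughly 9.7 billion for 2045, in line with the UN's middle projection quoted earlier; the 2050 value overshoots the plausible range precisely because growth is slowing rather than constant.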
THERE MAY BE SOME COMFORT in knowing that people have long been alarmed about population. From the beginning, says French demographer Hervé Le Bras, demography has been steeped in talk of the apocalypse. Some of the field’s founding papers were written just a few years after Leeuwenhoek’s discovery by Sir William Petty, a founder of the Royal Society. He estimated that world population would double six times by the Last Judgment, which was expected in about 2,000 years. At that point it would exceed 20 billion people—more, Petty thought, than the planet could feed. “And then, according to the prediction of the Scriptures, there must be wars, and great slaughter, &c.,” he wrote.
As religious forecasts of the world’s end receded, Le Bras argues, population growth itself provided an ersatz mechanism of apocalypse. “It crystallized the ancient fear, and perhaps the ancient hope, of the end of days,” he writes. In 1798 Thomas Malthus, an English priest and economist, enunciated his general law of population: that it necessarily grows faster than the food supply, until war, disease, and famine arrive to reduce the number of people. As it turned out, the last plagues great enough to put a dent in global population had already happened when Malthus wrote. World population hasn’t fallen, historians think, since the Black Death of the 14th century.
In the two centuries after Malthus declared that population couldn’t continue to soar, that’s exactly what it did. The process started in what we now call the developed countries, which were then still developing. The spread of New World crops like corn and the potato, along with the discovery of chemical fertilizers, helped banish starvation in Europe. Growing cities remained cesspools of disease at first, but from the mid-19th century on, sewers began to channel human waste away from drinking water, which was then filtered and chlorinated; that dramatically reduced the spread of cholera and typhus.
Moreover in 1798, the same year that Malthus published his dyspeptic tract, his compatriot Edward Jenner described a vaccine for smallpox—the first and most important in a series of vaccines and antibiotics that, along with better nutrition and sanitation, would double life expectancy in the industrializing countries, from 35 years to 77 today. It would take a cranky person to see that trend as gloomy: “The development of medical science was the straw that broke the camel’s back,” wrote Stanford population biologist Paul Ehrlich in 1968.
Ehrlich’s book, The Population Bomb, made him the most famous of modern Malthusians. In the 1970s, Ehrlich predicted, “hundreds of millions of people are going to starve to death,” and it was too late to do anything about it. “The cancer of population growth … must be cut out,” Ehrlich wrote, “by compulsion if voluntary methods fail.” The very future of the United States was at risk. In spite or perhaps because of such language, the book was a best seller, as Malthus’s had been. And this time too the bomb proved a dud. The green revolution—a combination of high-yield seeds, irrigation, pesticides, and fertilizers that enabled grain production to double—was already under way. Today many people are undernourished, but mass starvation is rare.
Ehrlich was right, though, that population would surge as medical science spared many lives. After World War II the developing countries got a sudden transfusion of preventive care, with the help of institutions like the World Health Organization and UNICEF. Penicillin, the smallpox vaccine, DDT (which, though later controversial, saved millions from dying of malaria)—all arrived at once. In India life expectancy went from 38 years in 1952 to 64 today; in China, from 41 to 73. Millions of people in developing countries who would have died in childhood survived to have children themselves. That’s why the population explosion spread around the planet: because a great many people were saved from dying.
And because, for a time, women kept giving birth at a high rate. In 18th-century Europe or early 20th-century Asia, when the average woman had six children, she was doing what it took to replace herself and her mate, because most of those children never reached adulthood. When child mortality declines, couples eventually have fewer children—but that transition usually takes a generation at the very least. Today in developed countries, an average of 2.1 births per woman would maintain a steady population; in the developing world, “replacement fertility” is somewhat higher. In the time it takes for the birthrate to settle into that new balance with the death rate, population explodes.
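The 2.1 figure can be motivated with a back-of-the-envelope calculation: on average each woman must produce one daughter who survives to childbearing age, and slightly more boys than girls are born. The numbers below are typical textbook values assumed for illustration, not figures from this article:

```python
# Why replacement fertility is ~2.1, not 2.0 (assumed, typical values)
boys_per_girl = 1.05     # sex ratio at birth: ~105 boys per 100 girls
girl_survival = 0.97     # share of girls reaching childbearing age (assumed)

# Children per woman needed so one surviving daughter replaces her
replacement = (1 + boys_per_girl) / girl_survival
print(f"{replacement:.2f} children per woman")
```

The same formula also explains why the article says replacement fertility is "somewhat higher" in the developing world: lower survival to childbearing age pushes the required number of births per woman up.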
Demographers call this evolution the demographic transition. All countries go through it in their own time. It’s a hallmark of human progress: In a country that has completed the transition, people have wrested from nature at least some control over death and birth. The global population explosion is an inevitable side effect, a huge one that some people are not sure our civilization can survive. But the growth rate was actually at its peak just as Ehrlich was sounding his alarm. By the early 1970s, fertility rates around the world had begun dropping faster than anyone had anticipated. Since then, the population growth rate has fallen by more than 40 percent.
THE FERTILITY DECLINE that is now sweeping the planet started at different times in different countries. France was one of the first. By the early 18th century, noblewomen at the French court were knowing carnal pleasures without bearing more than two children. They often relied on the same method Leeuwenhoek used for his studies: withdrawal, or coitus interruptus. Village parish records show the trend had spread to the peasantry by the late 18th century; by the end of the 19th, fertility in France had fallen to three children per woman—without the help of modern contraceptives. The key innovation was conceptual, not contraceptive, says Gilles Pison of the National Institute for Demographic Studies in Paris. Until the Enlightenment, “the number of children you had, it was God who decided. People couldn’t fathom that it might be up to them.”
Other countries in the West eventually followed France’s lead. By the onset of World War II, fertility had fallen close to the replacement level in parts of Europe and the U.S. Then, after the surprising blip known as the baby boom, came the bust, again catching demographers off guard. They assumed some instinct would lead women to keep having enough children to ensure the survival of the species. Instead, in country after developed country, the fertility rate fell below replacement level. In the late 1990s in Europe it fell to 1.4. “The evidence I’m familiar with, which is anecdotal, is that women couldn’t care less about replacing the species,” Joel Cohen says.
The end of a baby boom can have two big economic effects on a country. The first is the “demographic dividend”—a blissful few decades when the boomers swell the labor force and the number of young and old dependents is relatively small, and there is thus a lot of money for other things. Then the second effect kicks in: The boomers start to retire. What had been considered the enduring demographic order is revealed to be a party that has to end. The sharpening American debate over Social Security and last year’s strikes in France over increasing the retirement age are responses to a problem that exists throughout the developed world: how to support an aging population. “In 2050 will there be enough people working to pay for pensions?” asks Frans Willekens, director of the Netherlands Interdisciplinary Demographic Institute in The Hague. “The answer is no.”
In industrialized countries it took generations for fertility to fall to the replacement level or below. As that same transition takes place in the rest of the world, what has astonished demographers is how much faster it is happening there. Though its population continues to grow, China, home to a fifth of the world’s people, is already below replacement fertility and has been for nearly 20 years, thanks in part to the coercive one-child policy implemented in 1979; Chinese women, who were bearing an average of six children each as recently as 1965, are now having around 1.5. In Iran, with the support of the Islamic regime, fertility has fallen more than 70 percent since the early ’80s. In Catholic and democratic Brazil, women have reduced their fertility rate by half over the same quarter century. “We still don’t understand why fertility has gone down so fast in so many societies, so many cultures and religions. It’s just mind-boggling,” says Hania Zlotnik, director of the UN Population Division.
“At this moment, much as I want to say there’s still a problem of high fertility rates, it’s only about 16 percent of the world population, mostly in Africa,” says Zlotnik. South of the Sahara, fertility is still five children per woman; in Niger it is seven. But then, 17 of the countries in the region still have life expectancies of 50 or less; they have just begun the demographic transition. In most of the world, however, family size has shrunk dramatically. The UN projects that the world will reach replacement fertility by 2030. “The population as a whole is on a path toward nonexplosion—which is good news,” Zlotnik says.
The bad news is that 2030 is two decades away and that the largest generation of adolescents in history will then be entering their childbearing years. Even if each of those women has only two children, population will coast upward under its own momentum for another quarter century. Is a train wreck in the offing, or will people then be able to live humanely and in a way that doesn’t destroy their environment? One thing is certain: Close to one in six of them will live in India.
I have understood the population explosion intellectually for a long time. I came to understand it emotionally one stinking hot night in Delhi a couple of years ago… The temperature was well over 100, and the air was a haze of dust and smoke. The streets seemed alive with people. People eating, people washing, people sleeping. People visiting, arguing, and screaming. People thrusting their hands through the taxi window, begging. People defecating and urinating. People clinging to buses. People herding animals. People, people, people, people. —Paul Ehrlich
In 1966, when Ehrlich took that taxi ride, there were around half a billion Indians. There are 1.2 billion now. Delhi’s population has increased even faster, to around 22 million, as people have flooded in from small towns and villages and crowded into sprawling shantytowns. Early last June in the stinking hot city, the summer monsoon had not yet arrived to wash the dust from the innumerable construction sites, which only added to the dust that blows in from the deserts of Rajasthan. On the new divided highways that funnel people into the unplanned city, oxcarts were heading the wrong way in the fast lane. Families of four cruised on motorbikes, the women’s scarves flapping like vivid pennants, toddlers dangling from their arms. Families of a dozen or more sardined themselves into buzzing, bumblebee-colored auto rickshaws designed for two passengers. In the stalled traffic, amputees and wasted little children cried for alms. Delhi today is boomingly different from the city Ehrlich visited, and it is also very much the same.
At Lok Nayak Hospital, on the edge of the chaotic and densely peopled nest of lanes that is Old Delhi, a human tide flows through the entrance gate every morning and crowds inside on the lobby floor. “Who could see this and not be worried about the population of India?” a surgeon named Chandan Bortamuly asked one afternoon as he made his way toward his vasectomy clinic. “Population is our biggest problem.” Removing the padlock from the clinic door, Bortamuly stepped into a small operating room. Inside, two men lay stretched out on examination tables, their testicles poking up through holes in the green sheets. A ceiling fan pushed cool air from two window units around the room.
Bortamuly is on the front lines of a battle that has been going on in India for nearly 60 years. In 1952, just five years after it gained independence from Britain, India became the first country to establish a policy for population control. Since then the government has repeatedly set ambitious goals—and repeatedly missed them by a mile. A national policy adopted in 2000 called for the country to reach the replacement fertility of 2.1 by 2010. That won’t happen for at least another decade. In the UN’s medium projection, India’s population will rise to just over 1.6 billion people by 2050. “What’s inevitable is that India is going to exceed the population of China by 2030,” says A. R. Nanda, former head of the Population Foundation of India, an advocacy group. “Nothing less than a huge catastrophe, nuclear or otherwise, can change that.”
Sterilization is the dominant form of birth control in India today, and the vast majority of the procedures are performed on women. The government is trying to change that; a no-scalpel vasectomy costs far less and is easier on a man than a tubal ligation is on a woman. In the operating theater Bortamuly worked quickly. “They say the needle pricks like an ant bite,” he explained, when the first patient flinched at the local anesthetic. “After that it’s basically painless, bloodless surgery.” Using the pointed tip of a forceps, Bortamuly made a tiny hole in the skin of the scrotum and pulled out an oxbow of white, stringy vas deferens—the sperm conduit from the patient’s right testicle. He tied off both ends of the oxbow with fine black thread, snipped them, and pushed them back under the skin. In less than seven minutes—a nurse timed him—the patient was walking out without so much as a Band-Aid. The government will pay him an incentive fee of 1,100 rupees (around $25), a week’s wages for a laborer.
The Indian government tried once before to push vasectomies, in the 1970s, when anxiety about the population bomb was at its height. Prime Minister Indira Gandhi and her son Sanjay used state-of-emergency powers to force a dramatic increase in sterilizations. From 1976 to 1977 the number of operations tripled, to more than eight million. Over six million of those were vasectomies. Family planning workers were pressured to meet quotas; in a few states, sterilization became a condition for receiving new housing or other government benefits. In some cases the police simply rounded up poor people and hauled them to sterilization camps.
The excesses gave the whole concept of family planning a bad name. “Successive governments refused to touch the subject,” says Shailaja Chandra, former head of the National Population Stabilisation Fund (NPSF). Yet fertility in India has dropped anyway, though not as fast as in China, where it was nose-diving even before the draconian one-child policy took effect. The national average in India is now 2.6 children per woman, less than half what it was when Ehrlich visited. The southern half of the country and a few states in the northern half are already at replacement fertility or below.
In Kerala, on the southwest coast, investments in health and education helped fertility fall to 1.7. The key, demographers there say, is the female literacy rate: At around 90 percent, it’s easily the highest in India. Girls who go to school start having children later than ones who don’t. They are more open to contraception and more likely to understand their options.
SO FAR THIS APPROACH, held up as a model internationally, has not caught on in the poor states of northern India—in the “Hindi belt” that stretches across the country just south of Delhi. Nearly half of India’s population growth is occurring in Rajasthan, Madhya Pradesh, Bihar, and Uttar Pradesh, where fertility rates still hover between three and four children per woman. More than half the women in the Hindi belt are illiterate, and many marry well before reaching the legal age of 18. They gain social status by bearing children—and usually don’t stop until they have at least one son.
As an alternative to the Kerala model, some point to the southern state of Andhra Pradesh, where sterilization “camps”—temporary operating rooms often set up in schools—were introduced during the ’70s and where sterilization rates have remained high as improved hospitals have replaced the camps. In a single decade beginning in the early 1990s, the fertility rate fell from around three to less than two. Unlike in Kerala, half of all women in Andhra Pradesh remain illiterate.
Amarjit Singh, the current executive director of the NPSF, calculates that if the four biggest states of the Hindi belt had followed the Andhra Pradesh model, they would have avoided 40 million births—and considerable suffering. “Because 40 million were born, 2.5 million children died,” Singh says. He thinks if all India were to adopt high-quality programs to encourage sterilizations, in hospitals rather than camps, it could have 1.4 billion people in 2050 instead of 1.6 billion.
Critics of the Andhra Pradesh model, such as the Population Foundation’s Nanda, say Indians need better health care, particularly in rural areas. They are against numerical targets that pressure government workers to sterilize people or cash incentives that distort a couple’s choice of family size. “It’s a private decision,” Nanda says.
In Indian cities today, many couples are making the same choice as their counterparts in Europe or America. Sonalde Desai, a senior fellow at New Delhi’s National Council of Applied Economic Research, introduced me to five working women in Delhi who were spending most of their salaries on private-school fees and after-school tutors; each had one or two children and was not planning to have more. In a nationwide survey of 41,554 households, Desai’s team identified a small but growing vanguard of urban one-child families. “We were totally blown away at the emphasis parents were placing on their children,” she says. “It suddenly makes you understand—that is why fertility is going down.” Indian children on average are much better educated than their parents.
That’s less true in the countryside. With Desai’s team I went to Palanpur, a village in Uttar Pradesh—a Hindi-belt state with as many people as Brazil. Walking into the village we passed a cell phone tower but also rivulets of raw sewage running along the lanes of small brick houses. Under a mango tree, the keeper of the grove said he saw no reason to educate his three daughters. Under a neem tree in the center of the village, I asked a dozen farmers what would improve their lives most. “If we could get a little money, that would be wonderful,” one joked.
The goal in India should not be reducing fertility or population, Almas Ali of the Population Foundation told me when I spoke to him a few days later. “The goal should be to make the villages livable,” he said. “Whenever we talk of population in India, even today, what comes to our mind is the increasing numbers. And the numbers are looked at with fright. This phobia has penetrated the mind-set so much that all the focus is on reducing the number. The focus on people has been pushed to the background.”
It was a four-hour drive back to Delhi from Palanpur, through the gathering night of a Sunday. We sat in traffic in one market town after another, each one hopping with activity that sometimes engulfed the car. As we came down a viaduct into Moradabad, I saw a man pushing a cart up the steep hill, piled with a load so large it blocked his view. I thought of Ehrlich’s epiphany on his cab ride all those decades ago. People, people, people, people—yes. But also an overwhelming sense of energy, of striving, of aspiration.
THE ANNUAL meeting of the Population Association of America (PAA) is one of the premier gatherings of the world’s demographers. Last April the global population explosion was not on the agenda. “The problem has become a bit passé,” Hervé Le Bras says. Demographers are generally confident that by the second half of this century we will be ending one unique era in history—the population explosion—and entering another, in which population will level out or even fall.
But will there be too many of us? At the PAA meeting, in the Dallas Hyatt Regency, I learned that the current population of the planet could fit into the state of Texas, if Texas were settled as densely as New York City. The comparison made me start thinking like Leeuwenhoek. If in 2045 there are nine billion people living on the six habitable continents, the world population density will be a little more than half that of France today. France is not usually considered a hellish place. Will the world be hellish then?
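As a sanity check, the two density comparisons above can be reproduced with round 2010-era figures. The specific constants below (areas, populations) are assumptions of mine for illustration, not numbers taken from the article:

```python
# Back-of-envelope check of the two density comparisons.
# All constants are round 2010-era figures assumed for illustration.

TEXAS_AREA_KM2 = 695_662        # total area of Texas
NYC_POP = 8.2e6                 # population of New York City
NYC_LAND_KM2 = 784              # land area of New York City
WORLD_POP_2010 = 6.9e9          # world population around 2010

# Texas settled as densely as New York City:
nyc_density = NYC_POP / NYC_LAND_KM2           # roughly 10,500 people per km^2
texas_capacity = TEXAS_AREA_KM2 * nyc_density  # roughly 7.3 billion people
fits = texas_capacity > WORLD_POP_2010         # everyone fits, with room to spare

# Nine billion people on the habitable continents in 2045,
# compared with the density of France today:
HABITABLE_LAND_KM2 = 134e6        # Earth's land area minus Antarctica
FRANCE_DENSITY = 65e6 / 552_000   # metropolitan France, ~118 per km^2

world_density_2045 = 9e9 / HABITABLE_LAND_KM2   # ~67 per km^2
ratio = world_density_2045 / FRANCE_DENSITY     # "a little more than half"
```

With these inputs, a Texas settled at New York City density holds about 7.3 billion people, and the 2045 world comes out at roughly 57 percent of France's present density, matching both claims in spirit.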
Some parts of it may well be; some parts of it are hellish today. There are now 21 cities with populations larger than ten million, and by 2050 there will be many more. Delhi adds hundreds of thousands of migrants each year, and those people arrive to find that “no plans have been made for water, sewage, or habitation,” says Shailaja Chandra. Dhaka in Bangladesh and Kinshasa in the Democratic Republic of the Congo are 40 times larger today than they were in 1950. Their slums are filled with desperately poor people who have fled worse poverty in the countryside.
Whole countries today face population pressures that seem as insurmountable to us as India’s did to Ehrlich in 1966. Bangladesh is among the most densely populated countries in the world and one of the most immediately threatened by climate change; rising seas could displace tens of millions of Bangladeshis. Rwanda is an equally alarming case. In his book Collapse, Jared Diamond argued that the genocidal massacre of some 800,000 Rwandans in 1994 was the result of several factors, not only ethnic hatred but also overpopulation—too many farmers dividing the same amount of land into increasingly small pieces that became inadequate to support a farmer’s family. “Malthus’s worst-case scenario may sometimes be realized,” Diamond concluded.
Many people are justifiably worried that Malthus will finally be proved right on a global scale—that the planet won’t be able to feed nine billion people. Lester Brown, founder of Worldwatch Institute and now head of the Earth Policy Institute in Washington, believes food shortages could cause a collapse of global civilization. Human beings are living off natural capital, Brown argues, eroding soil and depleting groundwater faster than they can be replenished. All of that will soon be cramping food production. Brown’s Plan B to save civilization would put the whole world on a wartime footing, like the U.S. after Pearl Harbor, to stabilize climate and repair the ecological damage. “Filling the family planning gap may be the most urgent item on the global agenda,” he writes; if we don’t hold the world’s population to eight billion by reducing fertility, he warns, the death rate may increase instead.
Eight billion corresponds to the UN’s lowest projection for 2050. In that optimistic scenario, Bangladesh has a fertility rate of 1.35 in 2050, but it still has 25 million more people than it does today. Rwanda’s fertility rate also falls below the replacement level, but its population still rises to well over twice what it was before the genocide. If that’s the optimistic scenario, one might argue, the future is indeed bleak.
But one can also draw a different conclusion—that fixating on population numbers is not the best way to confront the future. People packed into slums need help, but the problem that needs solving is poverty and lack of infrastructure, not overpopulation. Giving every woman access to family planning services is a good idea—“the one strategy that can make the biggest difference to women’s lives,” Chandra calls it. But the most aggressive population control program imaginable will not save Bangladesh from sea level rise, Rwanda from another genocide, or all of us from our enormous environmental problems.
Global warming is a good example. Carbon emissions from fossil fuels are growing fastest in China, thanks to its prolonged economic boom, but fertility there is already below replacement; not much more can be done to control population. Where population is growing fastest, in sub-Saharan Africa, emissions per person are only a few percent of what they are in the U.S.—so population control would have little effect on climate. Brian O’Neill of the National Center for Atmospheric Research has calculated that if the population were to reach 7.4 billion in 2050 instead of 8.9 billion, it would reduce emissions by 15 percent. “Those who say the whole problem is population are wrong,” Joel Cohen says. “It’s not even the dominant factor.” To stop global warming we’ll have to switch from fossil fuels to alternative energy—regardless of how big the population gets.
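O'Neill's 15 percent figure is close to what simple proportionality would predict. The quick check below uses only the two population scenarios quoted above; holding per capita emissions fixed is my simplifying assumption, not O'Neill's actual model:

```python
# If per capita emissions were held fixed, global emissions would
# scale directly with population. The two figures are the 2050
# scenarios quoted in the text; the proportionality is an
# illustrative simplification, not O'Neill's model.
LOW_2050 = 7.4e9
HIGH_2050 = 8.9e9

reduction = 1 - LOW_2050 / HIGH_2050   # about 0.17, i.e. roughly 17 percent
```

Naive proportionality gives about 17 percent; O'Neill's modeled 15 percent is a little lower, plausibly because the avoided population growth is concentrated in regions with low per capita emissions.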
The number of people does matter, of course. But how people consume resources matters a lot more. Some of us leave much bigger footprints than others. The central challenge for the future of people and the planet is how to raise more of us out of poverty—the slum dwellers in Delhi, the subsistence farmers in Rwanda—while reducing the impact each of us has on the planet.
The World Bank has predicted that by 2030 more than a billion people in the developing world will belong to the “global middle class,” up from just 400 million in 2005. That’s a good thing. But it will be a hard thing for the planet if those people are eating meat and driving gasoline-powered cars at the same rate as Americans now do. It’s too late to keep the new middle class of 2030 from being born; it’s not too late to change how they and the rest of us will produce and consume food and energy. “Eating less meat seems more reasonable to me than saying, ‘Have fewer children!’ ” Le Bras says.
How many people can the Earth support? Cohen spent years reviewing all the research, from Leeuwenhoek on. “I wrote the book thinking I would answer the question,” he says. “I found out it’s unanswerable in the present state of knowledge.” What he found instead was an enormous range of “political numbers, intended to persuade people” one way or the other.
For centuries population pessimists have hurled apocalyptic warnings at the congenital optimists, who believe in their bones that humanity will find ways to cope and even improve its lot. History, on the whole, has so far favored the optimists, but history is no certain guide to the future. Neither is science. It cannot predict the outcome of People v. Planet, because all the facts of the case—how many of us there will be and how we will live—depend on choices we have yet to make and ideas we have yet to have. We may, for example, says Cohen, “see to it that all children are nourished well enough to learn in school and are educated well enough to solve the problems they will face as adults.” That would change the future significantly.
The debate was present at the creation of population alarmism, in the person of Rev. Thomas Malthus himself. Toward the end of the book in which he formulated the iron law by which unchecked population growth leads to famine, he declared that law a good thing: It gets us off our duffs. It leads us to conquer the world. Man, Malthus wrote, and he must have meant woman too, is “inert, sluggish, and averse from labour, unless compelled by necessity.” But necessity, he added, gives hope:
“The exertions that men find it necessary to make, in order to support themselves or families, frequently awaken faculties that might otherwise have lain for ever dormant, and it has been commonly remarked that new and extraordinary situations generally create minds adequate to grapple with the difficulties in which they are involved.”
Seven billion of us soon, nine billion in 2045. Let’s hope that Malthus was right about our ingenuity.
Earth isn’t the only planet graced with gorgeous eclipses. On Nov. 9, the Mars rover Opportunity watched the larger of Mars’s two moons, Phobos, slip quietly in front of the sun.
This movie combines 10 individual photos taken every four seconds through special solar filters on the rover’s panoramic cameras. The video was made from images that were calibrated and enhanced, plus extra frames to make the movie run smoothly through the entire 32-second-long eclipse.
Phobos is too small to completely cover the sun, so Martians never get to see total solar eclipses like the one visible from the South Pacific this summer. Instead, astronomers call Phobos’s journeys across the face of the sun transits or partial eclipses.
Images of these transits taken many years apart can help scientists track changes in the moons’ orbits, which in turn gives information about Mars’s interior.
But for some Mars explorers, like Panoramic camera principal investigator Jim Bell, the spectacle of seeing events on Mars as if we were there is just as exciting as the science the images reveal.
“It reminds me of a favorite quote from French author Marcel Proust,” Bell said in a press release. “‘The real voyage of discovery consists not in seeking new landscapes, but in having new eyes.’”
Video: NASA/JPL-Caltech/Cornell/Texas A&M
Outwardly, it looked like just another big space launch — and those happen about once a week, from spaceports all around the world. But Friday’s blast-off of a rocket carrying a Chinese GPS-style navigation satellite from the Xichang Satellite Launch Center was different. It set a record for successful Chinese launches in one year: 15.
The launch represented another important milestone. For the first time since the chilliest days of the Cold War, another country has matched the United States in sheer number of rocket launches.
To some observers, the rapid acceleration of the Chinese space program is perfectly reasonable, even expected. With nearly 20 percent of the world’s population and the planet’s second-biggest economy by some measures, it stands to reason that China would join other advanced, spacefaring nations — and on a grander scale.
But more cautious (or alarmist, depending on your point of view) China-watchers question Beijing’s motives, and warn of potentially dire consequences if China comes to dominate the heavens.
In an interview with Danger Room, space expert Brian Weeden from the Secure World Foundation took a measured view: Sure, China’s catching up fast, but the world’s most powerful Communist country still has a long way to go before it can go toe-to-toe with the United States in space.
Weeden’s argument boils down to an appreciation of quality versus quantity. “On a pure technology basis, I would put them [China] behind the established spacefaring states such as the United States, Russia, Europe, Canada and Japan. This is largely due to China’s deficiencies in advanced technology in general and not limited to just space. However, on a space-capability basis, I would put them ahead of everyone but the United States and Russia, and just behind those two leaders.”
In other words, China makes up for the generally lower quality of its spacecraft by building more of them — and a greater variety.
For instance, Beijing can’t match the high quality of Canada’s RADARSAT-2 radar-imaging satellite. “However, Canada does not have an indigenous human spaceflight program or indigenous space launch capability,” Weeden pointed out, and China does. Beijing is “in the process of building constellations of on-orbit satellites to provide a wide variety of capabilities, which will likely surpass Russia (whose satellite constellations are in decline) and end up second only to the U.S.”
But even with China matching U.S. launch rates, parity in orbiting satellites could take decades — or never happen at all, considering the huge demographic pressures Beijing faces. China’s 15 launches in 2010 boosted Beijing’s space arsenal to around 67 satellites, both military- and civilian-owned. Russia still has 99, but with its unreliable rockets and rickety finances it is struggling to maintain that number.
The United States, by contrast, owns 441 satellites that we know about, including unique spacecraft such as the Advanced Orion radio snoop (at a reported span of 300 feet, the biggest sat in the world) plus the soon-to-be-retired space shuttle and the shuttle’s smaller robotic replacement, the Air Force’s X-37B.
In many ways, China’s ascent in space reflects the country’s rapid military modernization on the ground, in the air and at sea — and raises some of the same concerns. After decades of dormancy, China is finally awakening to its full potential. That means big technical and professional leaps, fast. But Beijing started so far behind other world powers, that even big leaps can leave it a distant runner-up.
It was a year without parallel. Threat Level’s bread-and-butter themes of censorship, hacking, security, privacy, copyright and cyberwar were all represented in tug-of-war struggles with unprecedented outcomes.
Google defeated China’s censors, but caved to corporate censorship in the United States. The largest computer-crime case ever prosecuted ended in the nation’s longest prison term. A small-time Xbox modder who advertised his services online beat the federal rap. And a mysterious computer virus called Stuxnet finally put proof to decades of warnings that malware would eventually be used to kinetic effect in the real world.
Some court decisions seemed to be a boon for online rights, while others were clearly a step backward. The year 2010 saw the rise of the newspaper copyright troll, and judges pushed back on absurd jury verdicts for music file sharing and on outdated electronic spying rules.
And a secret-spilling website flirting with insolvency and dissolution suddenly burst onto the world stage. WikiLeaks was without a doubt the biggest 2010 development in Threat Level’s world.
WikiLeaks Takes On World Powers
As the year began, the project appeared to be on its last legs — just another cypherpunk fever dream destined for the same dustbin as digital cash and assassination politics. Site founder Julian Assange had abandoned the wiki portion of the concept, after crowds of volunteer analysts failed to congeal around WikiLeaks’ impressive, but not yet explosive, trove.
Bradley Manning as he appeared in his Facebook photo.
Assange also experimented with auctioning early access to leaks to news outlets, without immediate success. By January, the site had run out of money, and its homepage and archive were replaced by a public plea for donations.
Then came Bradley Manning, a disaffected 22-year-old Army intelligence analyst who wanted “people to see the truth.” One disturbing video and nearly a million leaked U.S. documents later, WikiLeaks had raised more than $1.2 million and ignited a battle over the meaning of journalism, national security and censorship.
The WikiLeaks saga began in earnest with the April release of the “Collateral Murder” video showing more than a dozen people in Iraq being killed in three U.S. Apache helicopter attacks.
Victims included two Reuters employees, one carrying a camera that was apparently mistaken for a weapon. The partial release of 92,000 reports from the war in Afghanistan followed in July. Then came 400,000 Iraq war reports in October, and finally the slow, steady disclosure of 250,000 U.S. diplomatic cables that kicked off just after Thanksgiving.
The "Collateral Murder" scene shortly after the 2007 Apache helicopter attack in Iraq was exposed by WikiLeaks.
Along the way, Manning was arrested and locked away in a Marine brig. A war broke out within WikiLeaks’ ranks. And Assange became the subject of a U.S. grand jury investigation that may have broad ramifications for the First Amendment.
The State Department said Assange’s publication of U.S. diplomatic cables was “illegal.” But Assange bills WikiLeaks as a media organization, and no media outlet has ever been prosecuted for publishing classified information in the United States.
WikiLeaks and the Future
Yet more is at stake than Assange’s freedom and the future of WikiLeaks. The site has shown us that the right to maintain a presence on the internet regularly runs up against the net’s gatekeepers, which are often motivated by profit.
As the New Year approached, WikiLeaks was caught scrambling to maintain its online presence and financial pipeline. Amazon cut off its web hosting, and PayPal, Visa, MasterCard and Bank of America blocked donations to the organization. Apple even banned an iPhone app designed to facilitate access to WikiLeaks’ cache of leaked U.S. diplomatic cables.
“A lot of really important stuff happened this year that forces us to begin to think about that there are so many people who depend on private companies to enjoy the fruits of technology,” said Cindy Cohn, the Electronic Frontier Foundation’s legal director. “If the private company stands up for us we have rights, and if it doesn’t, we don’t.”
Springing to WikiLeaks’ defense were the pranksters and activists known as Anonymous, who overwhelmed the websites of WikiLeaks’ enemies — real and perceived — with junk internet traffic in coordinated attacks dubbed Operation Payback. A more constructive protest grew from the grassroots, with supporters volunteering their own websites to host mirrors of WikiLeaks’ “Cablegate” page, ensuring it can never be removed from the web.
More than anything, the online protests exposed a generational struggle for the heart and soul of the net. It’s a high-stakes conflict between corporations that have grown fat and powerful off the web over nearly two decades and the first generation to grow up with the modern internet as a daily element in their lives.
Both sides believe the internet belongs to them. If history is a guide, it would be unwise to bet against the kids over the establishment.
An international team of researchers featuring Texas A&M University physicist Jairo Sinova has announced a breakthrough that gives a new spin to semiconductor nanoelectronics and the world of information technology.
The team has developed an electrically controllable device whose functionality is based on an electron's spin. Their results, the culmination of a 20-year scientific quest involving many international researchers and groups, are published in the current issue of Science.
The team, which also includes researchers from the Hitachi Cambridge Laboratory and the Universities of Cambridge and Nottingham in the United Kingdom, as well as the Academy of Sciences and Charles University in the Czech Republic, is the first to combine the spin-helix state and the anomalous Hall effect to create a realistic spin-field-effect transistor (FET) operable at high temperatures, complete with an AND-gate logic device. It is the first such realization of the type of transistor originally proposed by Purdue University's Supriyo Datta and Biswajit Das in 1989.
"One of the major stumbling blocks was that to manipulate spin, one may also destroy it," Sinova explains. "It has only recently been realized that one could manipulate it without destroying it by choosing a particular set-up for the device and manipulating the material. One also has to detect it without destroying it, which we were able to do by exploiting our findings from our study of the spin Hall effect six years ago. It is the combination of these basic physics research projects that has given rise to the first spin-FET."
Sixty years after the transistor's discovery, its operation is still based on the same physical principles of electrical manipulation and detection of electronic charges in a semiconductor, says Hitachi's Dr. Jorg Wunderlich, senior researcher on the team. Subsequent technology, he says, has focused on scaling down the device size, succeeding to the point where we are approaching the ultimate limit. The focus has therefore shifted to establishing new physical principles of operation to overcome those limits -- specifically, using the electron's elementary magnetic moment, or so-called "spin," as the logic variable instead of its charge.
This new approach constitutes the field of "spintronics," which promises potential advances in low-power electronics, hybrid electronic-magnetic systems and completely new functionalities.
Wunderlich says the 20-year-old theory of electrical manipulation and detection of the electron's spin in semiconductors -- the cornerstone of which is the "holy grail" known as the spin transistor -- has proven unexpectedly difficult to realize experimentally.
"We used recently discovered quantum-relativistic phenomena for both spin manipulation and detection to realize and confirm all the principal phenomena of the spin transistor concept," Wunderlich explains.
To observe the electrical manipulation and detection of spins, the team placed a specially designed planar photodiode (as opposed to the typically used circularly polarized light source) next to the transistor channel. By shining light on the diode, they injected photo-excited electrons, rather than the customary spin-polarized electrons, into the transistor channel. Voltages applied to input-gate electrodes controlled the precession of the spins via quantum-relativistic effects. The same effects are responsible for the onset of transverse electrical voltages in the device, which represent the output signal and depend on the local orientation of the precessing electron spins in the transistor channel.
The new device can have a broad range of applications in spintronics research as an efficient tool for manipulating and detecting spins in semiconductors without disturbing the spin-polarized current or using magnetic elements.
Wunderlich notes the observed output electrical signals remain large at high temperatures and are linearly dependent on the degree of circular polarization of the incident light. The device therefore represents a realization of an electrically controllable solid-state polarimeter which directly converts polarization of light into electric voltage signals. He says future applications may exploit the device to detect the content of chiral molecules in solutions, for example, to measure the blood-sugar levels of patients or the sugar content of wine.
This work forms part of wider spintronics activity within Hitachi worldwide, which expects to develop new functionalities for use in fields as diverse as energy transfer, high-speed secure communications and various forms of sensor.
While Wunderlich acknowledges it is yet to be determined whether or not spin-based devices will become a viable alternative to or complement of their standard electron-charge-based counterparts in current information-processing devices, he says his team's discovery has shifted the focus from the theoretical academic speculation to prototype microelectronic device development.
"For spintronics to revolutionize information technology, one needs a further step of creating a spin amplifier," Sinova says. "For now, the device aspect -- the ability to inject, manipulate and create a logic step with spin alone -- has been achieved, and I am happy that Texas A&M University is a part of that accomplishment."