Category Archives: society and culture

climate policy and climate science inhabit parallel worlds

The mask slips : Nature : Nature Publishing Group.

It says a lot about the outcome of the UN climate talks in South Africa at the weekend that most of the immediate reports focused on the wrangling that led to an agreement of sorts, rather than the contents and implications of the agreement itself. Late-night talks, later-night arguments and early-morning pacts between battling negotiators with the apparent fate of the world resting on their shoulders give the process a melodrama that is hard to resist, particularly for those who experienced it first hand in the chaos of the Durban meeting (see page 299).

Such late finishes are becoming the norm at these summits. Only as nations abandon their original negotiating positions and reveal their true demands — throwing international differences into stark relief — does a sense of urgency develop and serious negotiation take place. Combined with the consensus nature of the talks, which demands that everyone agrees to everything, the result is usually a cobbled-together compromise that allows as many countries as possible to claim victory and, most importantly, provides them with a mandate to reconvene in 12 months’ time.

So it was this time. In the search for a successor to the Kyoto Protocol, we now have the Durban Platform, which comes on the heels of the Bali Road Map and the Copenhagen Accord.

It takes a certain kind of optimism — or an outbreak of collective Stockholm syndrome — to see the Durban outcome as a significant breakthrough on global warming, as many are claiming. Outside Europe — which has set itself binding emissions goals over the short and long term beyond what it will inherit under its stated plan to carry on with unilateral cuts under an extended Kyoto — there will be no obligation for any nation to reduce soaring greenhouse-gas emissions much before the end of the decade. And that is assuming that all flows smoothly in future UN talks, and that a global deal with binding commitments proves easier to find in talks due to start in 2015 than it has so far.

The Durban deal may mark a success in the political process to tackle climate change, but for the climate itself, it is an unqualified disaster. It is clear that the science of climate change and the politics of climate change, which claims to represent it, now inhabit parallel worlds.

This has always been true up to a point, but surely the mask of political rhetoric has now slipped so far, to reveal the ugly political reality underneath, that it can never be replaced. How can politicians talk now with a straight face of limiting global warming to 2 °C? How will campaigners frame this result as leaving yet another ‘last chance’ to save the planet?

That does not make the political process redundant — far from it. Introducing policies to curb emissions was never about saving the planet or not, or stopping global warming or not. It is about damage limitation — the 3 °C or 4 °C of average warming the planet could experience in the long term, according to some analyses of the Durban outcome doing the rounds, is clearly much worse than the 2 °C used as shorthand for dangerous at present. But it is preferable to the 5 °C or 6 °C that science suggests is possible if emissions continue to rise unabated.

To prevent that outcome will be just as difficult politically as was the now abandoned attempt to find a global successor in time to follow Kyoto. But it remains possible — and there were at least encouraging signs in Durban that previously obstinate countries recognize that it is necessary, even if it is delayed. Those, including this journal, who have long argued the scientific case for the need to control greenhouse-gas emissions should back this new political mood to the hilt. But as the Durban Platform crowds with politicians, the climate train they wait for has left the station.

Comments

  1. 2011-12-14 02:05 AM

    Jeffrey Thaler said:
    Well written editorial, and unfortunately too accurate. There is a theme arising out of Durban on the limits of legal-political processes, as well as the growing gap between scientific and political “realities”. How to bridge that gap, so we are not just mitigating significant harms to the world our children inherit, is the still-to-be-resolved challenge that requires work outside of the big conference halls. Time and growing GHG emissions are not waiting for any of us.

  2. 2011-12-14 03:13 AM

    Fred Singer said:
    The Nature editorial (Dec 15; The Mask Slips) talks about science and policy in parallel universes. Quite correct, if you mean “separate” and “disconnected.” COP 17 was never about climate, let alone science. It was all about money: (1) How to assure continuing government careers for 200 delegations, with annual vacations paid by taxpayers. (2) How to transfer $100 billion a year from industrialized nations to LDCs (or more precisely, to their kleptocratic rulers), using “climate justice” or “climate guilt” (depending on who is doing the talking). (3) How to gain a national advantage by setting differential emission limits.

    By now it should be obvious that (1) the enshrined temperature limit of +2degC is based on fiction and has no scientific basis. As an annual global average, climate models tell us, it will mean warmer winter nights in Siberia and Canada; perhaps -35deg instead of -40; and little warming in the tropics. (2) It should also be obvious that even strenuous and economy-killing efforts at mitigation will have little effect on atmospheric levels of carbon dioxide, let alone on climate. If a demonstration is needed, just look at the lack of warming since 1998, in spite of rapidly rising levels of greenhouse gases.

    So, yes, I would agree with the editorial, if properly expanded.

  3. 2011-12-14 05:18 AM

    Kevin Matthews said:
    Yes, great editorial. Coming from the world’s leading scientific journal (which of course would prefer not to have to say such things) one would hope that authorities and media around the world take significant notice.

    Thinking about the whole UN climate negotiation process, and how complex and cumbersome it is to seek unanimous agreement from 194 countries….

    Then comparing what has come out of the COP17 cycle – significant and landmark progress, even if still sharply insufficient to the urgency of need – to what has come out of the U.S. Congress over the last several months or more, with its supposedly streamlined and results-oriented binary democracy approach – practically nothing.

    And suddenly – surprise! – consensus (in this entirely limited comparison) looks pretty darn effective – just from a simple results-accomplished perspective.

    For which differential, there is, in turn, good scientific reason.

  4. 2011-12-15 05:14 AM

    John Wheelahan said:
    No, there are no parallel worlds – the science and politics of AGW share the same scam. Spare us the crap about a 6 °C temperature rise, when you know that this is a lie. No temperature rise for a decade!
    The science and politics are about money – the greatest swindle since the South Sea Bubble. Hundreds of billions of dollars are to be given to African despots, conmen, swindlers and bankers for a scientific fantasy. These beneficiaries will live in luxury in their Mediterranean villas while the poor of the third world countries and developed countries will be the sufferers, and pay the price. Please get real, Nature Editor.

  5. 2011-12-15 07:21 AM

    Patrik D’haeseleer said:
    I think it is very clear that the “global consensus” approach to dealing with climate change has failed.

    It may be time for those countries who are willing to do something about it to band together and go it alone. And then start charging tariffs on any goods imported from countries not part of the coalition, proportional to the amount of CO2 pollution caused by those countries.

    If we can get Europe, Africa and the island nations on board, I don’t think it would take too long for China and India to follow suit.

  6. 2011-12-15 11:35 AM

    Michael Lerman said:
    I do not subscribe to the concept of global warming induced by human activities. About 1,000 years ago Greenland was green and cows brought by the Vikings polluted the clean Arctic air. Instead of global warming Greenland got frozen till today. I often go to the Canadian Arctic and indeed can testify that the mean temperatures in July are higher than previously (~10 years ago), and though my Inuit friends blame the US government, I argue and try to persuade them their view is wrong. Michael Lerman, Ph.D., M.D.

  7. 2011-12-18 06:28 AM

    Karin Green said:
    I find this comment in the article troubling: “Those, including this journal, who have long argued the scientific case for the need to control greenhouse-gas emissions should back this new political mood to the hilt”, especially when you say something like “there were at least encouraging signs in Durban that previously obstinate countries recognize that it is necessary, even if it is delayed”.

    To me, this bodes ill for an open minded and unbiased editorial policy!

  8. 2011-12-19 06:47 AM

    Jeffrey Eric Grant said:
    The COP people have been at it for a long time! I would think that if the science is solid, then the arguments would have moved forward, at least a little. Instead, we are still talking about the evidence of global warming, and how to mitigate against it.
    AGW is all based on the atmospheric rise in CO2 that was put there by human activity. So now we have closed the talks in Durban, still with no agreement on the cause of the increased CO2 that will, someday, maybe, eventually, turn the world temperatures a little warmer. Not in my lifetime; maybe not even in yours!
    I challenge anyone on this thread to answer either of the following two questions:
    1) direct me to a recent empirical scientific study that concludes that increased atmospheric CO2 caused an increase in atmospheric temperatures of more than about 2 °C per 100 years, or
    2) Since water retains less CO2 when it is heated, how can the world’s oceans be both warmer and more acidic at the same time?

Can disaster aid win hearts and minds?

A Friend in Need – By Charles Kenny | Foreign Policy.

BY CHARLES KENNY | OCTOBER 31, 2011

On Tuesday last week, Turkey reversed its previous stand and decided to accept aid from Israel to help deal with the tragic earthquake that had struck the country’s east. Shipments of portable housing units began the next day. Turkey’s Foreign Minister Ahmet Davutoglu was quick to emphasize that accepting aid did not signal an improvement in diplomatic relations between the two countries, strained ever since Israel’s raid of a Turkish aid flotilla bound for Gaza in 2010 — likely a response to the perception that aid can buy off recipient governments, even if it can’t change popular attitudes. The irony is that the humanitarian assistance that responds to disasters — unlike the majority of aid that goes to long-term development projects — might be the one case where that logic is sometimes reversed.

At a time when the United States’ aid budget is confronted by an army of hatchet-wielding deficit hawks among the Republican Party’s congressional majority and presidential candidates, some aid proponents are making the case that development and humanitarian assistance are powerful tools to buy friends and influence people. And it is true that aid has long been used to grease the often-rusty wheels of diplomacy. The Camp David Accords between Egypt and Israel were cemented with the help of an aid package worth an average of $2 billion a year to Egypt. Since 1985, U.S. law has mandated that the U.S. Agency for International Development (USAID) take account of would-be aid recipients’ voting patterns at the United Nations — rewarding those who vote with America with larger aid packages. Political scientists David Carter at Pennsylvania State and Randall Stone at the University of Rochester note that this kind of carrot-minded approach has been successful, influencing countries’ votes on decisions that the U.S. State Department declares politically important.

Twisting politicians’ arms is one thing, but changing popular attitudes is another matter entirely. Look again at Egypt: Despite being one of the largest recipients of USAID financing over the past 30 years, Pew surveys suggest only 20 percent of Egyptians have a favorable view of the United States — considerably less than half of the U.S. favorability rating in former Cold War foe Russia. Popular opinion in Egypt is driven by other factors, not least broader U.S. foreign policy in the region. (A propensity to invade neighboring countries doesn’t help.) And development assistance just isn’t a major factor in the financial fortunes of the average citizen. Maybe that was true back in 1990, when net overseas development assistance to the country equaled 36 percent of government expenditures. But by 2008, that figure was just 3 percent — only a little more than one-tenth the value of tourism and one-seventh that of manufacturing exports.

Aid’s limited impact on public opinion usually applies even when the aid is specifically focused on winning converts. A study by consultant Michael Kleinman and Mark Bradbury, a director at the Rift Valley Institute, looked at U.S. military aid for small projects in Kenya designed to improve popular support for the U.S. military presence there, and found that it didn’t. Attitudes were shaped by faith, the relationship between target populations and the Kenyan state, U.S. foreign policy, and events in Somalia — not by a U.S.-financed well or asphalt road. A German aid agency-financed 2010 study, using repeated surveys in Afghanistan’s Takhar and Kunduz provinces, found that in a comparatively peaceful period between 2005 and 2007, development aid did have a small, short-lived positive impact on the general attitudes of Afghan respondents towards foreign peace-building operations in their backyard. But this impact disappeared as threat perceptions rose between 2007 and 2009. Not surprisingly, other factors — in this case, how many people were getting shot — were just more important than who was cutting the checks.

But there is evidence of an exception to the rule that money can’t buy love, and it involves disaster assistance. Four years after a 2005 earthquake in northern Pakistan, economists Tahir Andrabi of Pomona College and Jishnu Das of the World Bank surveyed attitudes towards foreigners in the region. They found trust in foreigners was significantly higher in areas where humanitarian aid had been concentrated than in other areas — dropping off by six percentage points for each 10 kilometers of distance from the fault line.

Why might recipients react differently and more positively to disaster relief assistance than they do to other forms of aid? In part it is surely related to the simple gratitude felt by people who have just lost much of what they had in a flood or earthquake. But it is also because such aid is more plausibly given without a broader political motive. Although U.S. food aid flows according to the size of the surplus domestic crop as much as recipient need, using humanitarian relief to reward or punish countries for U.N. voting records or other diplomatic policies presents a practical challenge — you can’t schedule a disaster. Recipients appear to understand that, and are more likely to view such aid as given in good faith. In the Pakistan case, for example, Andrabi and Das note that the positive impact on attitudes was related to a significant on-the-ground presence of foreigners who were assumed to have purely humanitarian motivations — aid distribution was not perceived to be (and wasn’t) linked to war-fighting efforts.

Aid is likely to be a more effective foreign policy tool when it comes to persuading governments to do things that lack popular support. Creating that popular support in the first place is much harder. Perhaps Turkey’s Davutoglu is right to say that even government relations won’t improve in the case of Israeli disaster aid — after all, U.S. humanitarian support in the aftermath of Iran’s Bam earthquake only temporarily thawed diplomatic tensions. On the other hand, maybe the assistance can play a small role in improving popular opinion towards Israel in Turkey. For good or ill, that’s one more reason for governments to respond with open hearts and open checkbooks whenever disaster strikes worldwide.

Top 10 Doomsday Prophecies

HowStuffWorks “Top 10 Doomsday Prophecies”.

It seems like every few years, someone comes out with a new doomsday prophecy. The latest apocalyptic craze places Earth’s final day on Dec. 21, 2012 — the end of the Great Cycle in the Mayan calendar. But whether the supposed agent of doom is aliens, asteroids, floods or earthquakes, the outcome is always the same — the Earth manages to endure. Such predictions are nothing new. After Jesus’ rumored ascension to heaven in the first century A.D., early Christians believed he would soon return, bringing an end to life as they knew it, as described in Mark 13:24-26: “But in those days, after that tribulation, the sun shall be darkened, and the moon shall not give her light, and the stars of heaven shall fall, and the powers that are in heaven shall be shaken. And they shall see the Son of man coming in the clouds with great power and glory.”

Since then, there has been no shortage of apocalyptic forecasts. But why? Why do people continue to predict the end of the world, and why do others insist on believing them? Perhaps some zealots feel the need to justify their preconceived worldviews through revelations about the latest celestial event or natural disaster. And maybe those who trust such doomsayers are simply hopeful for an escape from a world that seems cruel or chaotic. Whatever the case, you’re sure to enjoy our list of 10 doomsday prophecies.

10: The Seekers, Dec. 24, 1955

In December 1954, a headline in the Chicago Tribune read, “Doctor Warns of Disasters in World Tuesday — Worst to Come in 1955 He Declares.” The doctor, Charles Laughead, was a follower of Dorothy Martin, a 54-year-old housewife from Oak Park, Ill. Martin believed that aliens from the planet Clarion had beamed down messages informing her that a massive flood would soon destroy the planet. Her wild prophecies attracted a small group of followers known as the “Seekers,” many of whom had quit their jobs and sold their belongings in anticipation of the end. They gathered at Martin’s home on Christmas Eve, 1955, singing Christmas carols while they waited to be saved by the aliens in their flying saucers. As the night wore on, Martin’s followers became increasingly impatient. Finally, at 4:45 a.m. on Christmas Day, Martin announced that God had been so impressed by their actions that he would no longer destroy the Earth.

This story has a side note that is almost as interesting as the prophecy itself. A small group of psychologists and students organized by University of Minnesota social psychologist Leon Festinger infiltrated the Seekers in an effort to study and better understand apocalyptic cults. Festinger revealed his findings in the 1956 book, “When Prophecy Fails: A Social and Psychological Study of a Modern Group that Predicted the Destruction of the World.” This work was an early exploration of the psychologist’s now-famous theory of “cognitive dissonance,” a term that refers to the human tendency to rationalize when one’s thoughts and actions are in disagreement.

9: Mayan Calendar, 2012

The 2009 movie, “2012,” is a 158-minute showcase of apocalyptic eye candy, with enough death and destruction to bring up the question, “What’s so bad about 2012?” It depends on who you ask. The fear is based on the way some people interpret the Mayan Long Count calendar, which is divided into Great Cycles lasting approximately 5,125 years. One of these cycles ends on Dec. 21, 2012, giving some doomsdayers the ammunition they need to declare the impending apocalypse. They also have numerous theories about how exactly the world will end. Some claim that a mysterious planet known as Nibiru, Planet X or Eris, or a large meteor, will collide with Earth. Another popular theory is that the Earth’s magnetic poles will reverse, causing the planet’s rotation to reverse as well.

Scientists have already dismissed these theories as laughable. They contend that if a celestial body were on a crash course with Earth, they would have already noticed it. And while astronomers recognize that the magnetic poles do reverse every 400,000 years or so, they insist that this event does not affect the Earth’s rotation and will not harm life on Earth. Perhaps the most interesting part of this whole apocalyptic fad is that the Mayans themselves don’t expect the world to end in 2012; rather, they expect it to be a time of great celebration and luck when the planet completes the current Great Cycle.

8: Harold Camping, May 21, 2011

The Bible is pretty clear about doomsday prophecies: “But of that day and that hour knoweth no man, no, not the angels which are in heaven, neither the Son, but the Father,” reads Mark 13:32. But that hasn’t stopped some believers from trying to make predictions anyway. One such man is Harold Camping, a retired engineer who believes that the Bible is a numerical code book that can be deciphered to reveal clues about the end times. Camping, the founder of the independent ministry Family Radio International, first predicted that the world would end in September 1994. But when the apocalypse failed to materialize, he attributed the error to incomplete research.

Camping recently gained additional attention for his latest doomsday prediction: May 21, 2011. In an interview with New York Magazine on May 11, 2011, the 89-year-old was brimming with confidence, saying, “God has given sooo much information in the Bible about this, and so many proofs, and so many signs, that we know it is absolutely going to happen without any question at all.” Camping was so certain that his ministry spent millions of dollars plastering the Judgement Day message on more than 5,000 billboards and 20 recreational vehicles as a warning to the general public. When May 21 came and went without interruption, Camping did what any good doomsayer would — he blamed the mistake on a mathematical error and moved the date back to October 21.

7: William Miller, 1843-1844

William Miller and the Millerites may sound like a good name for a 1960s pop act, but in the 1840s, they were a fairly successful doomsday cult. That is, if you measure success by the number of followers, not the eventual occurrence of the predicted apocalypse.

Miller was a product of the Second Great Awakening, a period of intense religious revival from which several modern denominations were born, including the Mormons and the Seventh Day Adventists. A farmer-turned-preacher, Miller crested this wave of spiritual fervor with his prediction that Jesus would return to Earth in March 1843. He derived his prophecy from a complex system of mathematical calculations and promoted it by giving sermons and passing out pamphlets during the 1830s and early 1840s. Scholars estimate that of the some 1 million people who heard his message, about 100,000 actually chose to follow him. As March 1843 neared, many of these believers sold all of their possessions, donned white robes, and climbed to the tops of mountains and hills to await their rapture into heaven. When nothing happened, Miller moved the date to October 1844, which also proved to be a bust, leading some to label the non-event “The Great Disappointment.” Most of the preacher’s followers then abandoned him, and some went on to form the Adventist Church.

6: Halley’s Comet, May 1910

A unique astronomical event is a surefire way to inspire a doomsday prophecy. Enter Halley’s Comet, a ball of icy dust that is visible from Earth every 76 years. When this celestial body was scheduled to make a pass in 1910, the claims of impassioned astronomers at Chicago’s Yerkes Observatory inspired fear in a surprising number of people. They insisted that the comet’s tail was made of poisonous cyanogen gas, and when Earth passed through it on May 18, the toxic fumes would cause widespread death. Some opportunists tried to profit from the hysteria, selling “comet pills,” masks and bottled oxygen intended to help people survive the noxious Armageddon.

As the deadly date approached, some concerned citizens stuffed towels under their doors and covered their keyholes with paper to protect themselves from the gas cloud. Others refused to go to work, choosing instead to stay at home with their families or seek refuge in their churches. Conversely, those not taken by the apocalyptic predictions watched the night pass without incident at rooftop “comet parties” held across the United States.

5: Large Hadron Collider, 2009-2012

To anyone without a particle physics degree, the Large Hadron Collider (LHC) may seem like a scary piece of advanced machinery. The massive particle accelerator’s circular tunnel, located just outside of Geneva, Switzerland, measures 17 miles (27 kilometers) in total circumference. It can send hydrogen protons crashing into one another at nearly the speed of light, allowing scientists to discover new elements and particles that may shed light on the creation of the universe. That is, if everything goes as planned.

Some theorists suggest that the massive energies created during such collisions could potentially form black holes capable of engulfing the entire planet. These fears came to a head in March 2008 when Walter L. Wagner and Luis Sancho filed a lawsuit in a U.S. court to stop the LHC from beginning operation until scientists produced a safety report and environmental assessment. While most scholars acknowledge the possibility of black holes, they dismiss the danger, insisting that any such anomaly would only last a matter of seconds — hardly long enough to swallow the Earth. Despite the controversy, researchers fired up the LHC in 2009 and have accomplished some remarkable feats, including the creation of a soupy mass of matter thought to resemble the conditions of the universe just after the Big Bang. By the end of 2010, no black holes had been detected in the LHC, but according to doomsayers, that doesn’t mean we’re in the clear. Something could always happen before scientists conclude the project in 2012.

4: Shoko Asahara, 1997-2000

Why wait for the apocalypse if you can make it happen yourself? This was the mindset of the Japanese doomsday prophet Shoko Asahara. Born Chizuo Matsumoto in 1955, Asahara was completely blind in one eye and partially sightless in the other. His rise as a cult leader began after he was arrested in 1982 for selling fake cures from his traditional Chinese apothecary business. The would-be prophet was reportedly crushed by the incident, which left him embarrassed and bankrupt.

In 1984, Asahara opened a yoga studio, boasting that he had achieved satori, a Japanese term for enlightenment, and claiming that he could levitate. He established the Aum Shinrikyo religion in 1987, a name derived from a sacred Hindu symbol and a Japanese word that translates as “supreme truth.” He soon gained more than 10,000 followers in Japan and 30,000 to 40,000 in Russia, and even produced several candidates to run in the 1990 Japanese legislative elections [source: Onishi]. As Asahara’s success increased, his behavior became increasingly peculiar. He began encouraging his followers to drink his bathwater and blood, and claimed that he could save them from the apocalypse, which he believed would occur after a poison gas attack sometime between 1997 and 2000. Perhaps in an effort to speed along this process, Aum members boarded five trains on March 20, 1995, releasing toxic sarin into three subway lines. The attack killed 12 people and injured another 5,500 [source: Onishi]. Asahara was soon arrested by Japanese authorities and sentenced to death in February 2004.

3: Heaven’s Gate, 1997

Marshall Applewhite, with his piercing, wide-eyed stare, looks like a man who was destined to lead a doomsday sect. He was the leader of Heaven’s Gate, a cult founded in Texas during the early 1970s. The group soon moved to the American southwest where Applewhite began to preach about a spaceship that would spare true believers from the apocalypse and take them to the heavenly “Level Above Human.” After two decades proselytizing in the desert, Heaven’s Gate moved to California where they started a Web consulting business called “Higher Source” to fund their activities. There they lived in a sprawling Spanish-style house and reportedly watched episodes of “X-Files” and “Star Trek” religiously.

Heaven’s Gate took a grim turn in 1997, the year that the comet Hale-Bopp shined brightly in the night sky. It all started on Nov. 14, 1996, when Applewhite and his followers were listening to Art Bell’s “Coast to Coast,” a radio show dedicated to UFO topics. During the program, an amateur astronomer called in and claimed to have photographed a mysterious object hiding in Hale-Bopp’s tail. This was all the evidence that Applewhite needed to confirm his spaceship prophecy from the 1970s. He and his group soon began preparations to board the UFO through the execution of a mass suicide. When police entered the California compound on March 26, 1997, they found 39 bodies dressed in black tunics with a cloth draped over their heads. They had killed themselves with a cocktail of vodka and barbiturates, or by smothering themselves with plastic bags.

2: Y2K, 2000

The year 2000 sparked a number of doomsday scares, but none was more prominent than the supposed Y2K computer glitch. The problem was this: When computer codes were first written, dates were abbreviated to two digits in order to save memory; for example, “1998” would simply be written as “98.” This system worked just fine until 2000, when the date code “00” threatened to cause inaccurate calculations. A 1998 feature story from Microsoft offers an excellent example to illustrate the perceived problem:

“For example, say you buy a new refrigerator in 1999 with a credit card. The bank will run into problems in 2000 when it tries to calculate the interest owed and subtracts the transaction date (99) from the current date (00). The computer is going to come up with the number -99” [source: Crawford].
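
That two-digit arithmetic is easy to reproduce. Here is a minimal Python sketch of the kind of date handling the example describes; the function names are illustrative, not taken from any real banking system:

    # Hypothetical illustration of the Y2K two-digit-year bug.
    # Years are stored as their last two digits, so 1999 -> 99 and 2000 -> 00.

    def years_elapsed_two_digit(purchase_year, current_year):
        """Subtract years the way legacy two-digit code did."""
        return (current_year % 100) - (purchase_year % 100)

    def years_elapsed_four_digit(purchase_year, current_year):
        """The correct calculation using full four-digit years."""
        return current_year - purchase_year

    # Refrigerator bought in 1999, interest computed in 2000.
    print(years_elapsed_two_digit(1999, 2000))   # -99, the nonsense value in the example
    print(years_elapsed_four_digit(1999, 2000))  # 1, the intended result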

Some people believed that this glitch would cause apocalyptic consequences. According to these gloomy predictions, at the stroke of midnight on Jan. 1, 2000, airplanes would drop from the sky, elevators would plummet from the tops of skyscrapers, and the world economy would come to a screeching halt. In response to these fears, the U.S. government and American corporations spent a total of $108.8 billion on Y2K computer fixes [source: Karl]. In the end, nothing fell from the sky, but the world’s computers did manage to disrupt some credit card terminals in Britain and send out some bills supposedly due in 1900. To the relief of billions, civilization survived almost completely unscathed.

1: The Sun Becomes a Red Giant, 7.6 billion years from now

Not all apocalyptic predictions are steeped in religious fervor or science fiction; some are based solidly upon respected science. Most scholars agree that 7.6 billion years from now, the sun will enter its red giant phase when it has converted all of its hydrogen into helium. This will cause the sun to expand to a size 20 percent greater than that of Earth’s orbit and shine 3,000 times brighter [source: Appell]. Once this stage is complete, the sun will then collapse into a white dwarf.

Whether this process will actually destroy the planet is a topic of debate in the scientific community. If Earth were to stay in its current orbit, it would undoubtedly be engulfed and vaporized by the expanding sun. However, as the sun swells it will also lose mass, meaning that Earth will drift further away from it and perhaps escape total destruction. Either way, this process would destroy life as we know it, that is, if there were any life left to destroy.

20 Ways to Build a Cleaner, Healthier, Smarter World

World Changing Ideas: 20 Ways to Build a Cleaner, Healthier, Smarter World: Scientific American.

What would happen if solar panels were free? What if it were possible to know everything about the world—not the Internet, but the living, physical world—in real time? What if doctors could forecast a disease years before it strikes? This is the promise of the World Changing Idea: a vision so simple yet so ambitious that its full impact is impossible to predict. Scientific American’s editorial and advisory boards have chosen projects in five general categories—Energy, Transportation, Environment, Electronics and Robotics, and Health and Medicine—that highlight the power of science and technology to improve the world. Some are in use now; others are emerging from the lab. But all of them show that innovation is the most promising elixir for what ails us.  —The Editors

The No-Money-Down Solar Plan
A new wave of start-ups wants to install rooftop solar panels on your house. Upfront cost: nothing
By Christopher Mims

The biggest thing stopping the sun is money. Installing a rooftop array of solar panels large enough to produce all of the energy required by a building is the equivalent of prepaying its electricity bill for the next seven to 10 years—and that’s after federal and state incentives. A new innovation in financing, however, has opened up an additional possibility for homeowners who want to reduce their carbon footprint and lower their electric bills: get the panels for free, then pay for the power as you go.

The system works something like a home mortgage. Organizations and individuals looking for a steady return on their investment, typically banks or municipal bond holders, use a pool of cash to pay for the solar panels. Directly or indirectly, homeowners buy the electricity produced by their own rooftop at a rate that is less, per kilowatt-hour, than they would pay for electricity from the grid. Investors get a safe investment—the latest generation of solar-panel technology works dependably for years—and homeowners get a break on their monthly bills, not to mention the satisfaction of significantly reducing their carbon footprint. “This is a way to get solar without putting any money down and to start saving money from day one. That’s a first,” says SolarCity co-founder Peter Rive.

SolarCity is the largest installer of household solar panels to have adopted this strategy. Founded in 2006 by two brothers who are also Silicon Valley–based serial entrepreneurs, SolarCity leases its panels to homeowners but gives the electricity away for free. The net effect is a much reduced utility bill (customers still need utility-delivered power when the sun isn’t out) plus a monthly SolarCity bill. The total for both comes out to less than the old bill. SunRun in San Francisco offers consumers a similar package, except that the company sells customers the electricity instead of leasing them the panels.

Cities such as Berkeley and Boulder are pioneering their own version of solar-panel financing by loaning individuals the entire amount required to pay for solar panels and installation. The project is paid for by municipal bonds, and the homeowner pays back the loan over 20 years as a part of the property tax bill. The effect is the same whichever route a consumer takes: the new obligation, in the form of taxes, a lease or a long-term contract for electricity, ends up costing less than the existing utility bill.
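
To get a rough sense of how the homeowner’s side of such a deal pencils out, the sketch below amortizes a hypothetical solar loan over the 20-year property-tax repayment term described above; the $25,000 system cost and 6 percent bond rate are illustrative assumptions, not figures from the article.

    # Sketch: annual payment on a hypothetical municipal solar loan repaid
    # through the property tax bill. All inputs are assumptions for illustration.

    def annual_payment(principal, rate, years):
        """Standard fixed-payment amortization formula."""
        if rate == 0:
            return principal / years
        return principal * rate / (1 - (1 + rate) ** -years)

    principal = 25_000.0   # assumed installed cost of the rooftop system
    rate = 0.06            # assumed municipal bond interest rate
    years = 20             # repayment term described in the article

    payment = annual_payment(principal, rate, years)
    print(f"Added to the property tax bill each year: ${payment:,.0f}")
    # The arrangement works for the homeowner when this figure is smaller
    # than the year's savings on the utility bill.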

“What we’re really seeing is a transition in how we think about buying energy goods and services,” says Daniel M. Kammen, director of the Renewable and Appropriate Energy Laboratory at the University of California, Berkeley. Kammen, who did the initial analysis on Berkeley’s financing model, believes that by turning to financing, consumers can overcome the inherent disadvantage renewables have when compared with existing energy sources: the infrastructure for power from the grid has already been paid for and, in many cases, has been subsidized for decades.

All three approaches are rapidly expanding across the country. Despite the Berkeley program being less than two years old, 10 different states have passed legislation allowing their cities to set up a Berkeley-style bond-financed loan program. If the Waxman-Markey climate bill passes, the option for cities to set up these programs would become federal law. SunEdison in Maryland is currently active in nine states. SolarCity, which has more than 4,000 customers, is active in California, Arizona and Oregon and has promised to announce additional states after the new year.

Right now it is not possible to lower the overall cost of rooftop solar to “grid parity,” that is, to the same price as electricity from local utility companies, without federal subsidies such as the investment tax credit, which lowers the tax bill of banks financing these projects. Those subsidies, which amount to 30 percent of the cost of a solar installation, are guaranteed for at least eight years. By then, SolarCity and its competitors claim they won’t need them.

“Grid parity is driven by multiple factors,” says Attila Toth, vice president of marketing at SunEdison, including the cost of capital, the cost of panels and their installation, and the intensity of sunlight in a given region. “It will occur in different states at different times, but, for example, we expect that California will be one of the first states in the U.S. to get to grid parity, sometime between three and five years from now.”

While the cost of electricity from fossil fuels has increased 3 to 5 percent a year for the past decade, the cost of solar panels has fallen on average 20 percent for every doubling of the installed base. Grid parity is where these trend lines cross—after that, solar has the potential to power more than just homes. It’s hardly a coincidence that Elon Musk, head of electric car company Tesla Motors, sits on SolarCity’s board of directors.
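
The “crossing trend lines” idea can be sketched numerically. The projection below is purely illustrative: the starting prices, the 4 percent annual rise in grid electricity, and the assumption that the installed solar base doubles every two years are stand-in figures, not data from the article.

    # Illustrative projection of grid parity: rising grid prices vs. falling
    # solar costs. Starting prices and growth rates are assumptions.

    def year_of_grid_parity(grid_price, solar_cost,
                            grid_growth=0.04,        # within the 3-5%/yr rise cited
                            learning_rate=0.20,      # 20% cost drop per doubling
                            years_per_doubling=2.0,  # assumed market growth
                            horizon=30):
        """Return the first year (counting from now) when solar undercuts the grid."""
        for year in range(horizon + 1):
            if solar_cost <= grid_price:
                return year
            grid_price *= 1 + grid_growth
            # Apply the fraction of a doubling completed in one year.
            solar_cost *= (1 - learning_rate) ** (1 / years_per_doubling)
        return None

    # Assumed starting prices in dollars per kilowatt-hour.
    print(year_of_grid_parity(grid_price=0.15, solar_cost=0.25))

Under these made-up inputs parity arrives in about four years, roughly in line with the three-to-five-year window Toth describes for California; different assumptions shift the crossing point accordingly.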

More Ideas to watch
by Christopher Mims

The Gasoline Garden
It is the next step for biofuels: genetically engineered plant life that produces hydrocarbons as a by-product of its normal metabolism. The result will be fuel—common gasoline, even—using nothing but sunlight and CO2. In July, Exxon Mobil announced plans to spend more than $600 million in pursuit of algae that can accomplish the task. Joule Biotechnologies claims to have already succeeded, although the company has yet to reveal any details of its proprietary system.

Hot Nukes
Uranium and plutonium are not the only fuels that can power a nuclear reactor. With an initial kick from more traditional fissile materials, thorium can set up a self-sustaining “breeder” reaction that produces uranium 233, which is well suited to nuclear power generation. The process has the added benefit of being resistant to nuclear proliferation, because its end products emit enough gamma rays to make the fuel dangerous to handle and easy to track.

Save Energy with Information
Studies show that simply making customers aware of their energy use lowers it by 5 to 15 percent. Smart meters allow customers to track their energy consumption minute by minute and appliance by appliance. Countless start-ups are offering the devices, and Google and Microsoft are independently partnering with local utilities to allow individuals to monitor their power usage over the Web.

Wind Power from the Stratosphere
According to a Stanford University study released in July, the high-altitude winds that constantly blow tens of thousands of feet above the earth hold enough energy to supply all of human civilization 100 times over. California’s Sky WindPower has proposed harvesting this energy by building fleets of giant, airborne, ground-tethered windmills, while Italy’s Kite Gen proposes to accomplish the same feat using kites.

Delivering the U.S. from Oil
Plug-in hybrid trucks are improving the long view of the short haul
By Amanda Schupak

Cargo trucks gulp about 40 percent of the fuel pumped in the U.S. While most consumer attention focuses on improving the fuel economy of consumer vehicles, a major opportunity goes rumbling by. “Folks do not realize that the fuel use of even a small truck is equal to many, many cars,” says Bill Van Amburg, senior vice president of Calstart, a clean transportation technology nonprofit, and director of the Hybrid Truck Users Forum. “A utility truck as a hybrid would reduce more petroleum than nine Priuses.”

Some 1,300 commercial hybrids on the road today get up to twice the fuel efficiency of their conventional counterparts. But these traditional hybrids are inherently limited. They make more efficient use of petroleum-based fuel by capturing some of the energy lost during braking.

Plug-in hybrids, on the other hand, draw energy from the grid. They can drive for miles—in many cases, an entire day’s route—without using any fossil fuel at all. This shifts energy demand away from petroleum and toward grid-based sources. (Last year zero-carbon renewables and nuclear supplied 30 percent of all electric power in the U.S.)

In many ways, plug-in hybrid technology makes more sense for delivery trucks than for consumer sedans. A cargo truck runs a short daily route that includes many stops to aid in regenerative braking. Most of the U.S. Postal Service’s 200,000-plus mail trucks, for example, travel fewer than 20 miles a day. In addition, fleet vehicles return nightly to storage lots that have ready access to the 120- or 240-volt outlets required to charge them.

The Department of Energy recently launched the nation’s largest commercial plug-in hybrid program, a $45.4-million project to get 378 medium-duty vehicles on the road in early 2011. The trucks, which will go to 50 municipal and utility fleets, will feature a power system from Eaton, a large manufacturer of electrical components, on a Ford F-550 chassis. (For its part, Ford will wait for the market to prove itself before designing its own commercial plug-ins.) “These are going to start breaking free in 2011,” says Paul Scott, president of the Electric Vehicle Association of Southern California.

Start-up company Bright Automotive has a more ambitious plan. It aims to replace at least 50,000 trucks with plug-in hybrids by 2014. Bright’s IDEA prototype travels 40 miles on battery power before switching to a four-cylinder engine that gets 40 miles to the gallon. The streamlined aluminum body has the payload of a postal truck yet is far more aerodynamic. The truck weighs as much as a midsize sedan.

John E. Waters, Bright Automotive’s founder and the former developer of the battery system for General Motors’ groundbreaking EV1 electric car, says that each IDEA would save 1,500 gallons of fuel and 16 tons of carbon dioxide emissions a year over a standard utility truck. Waters says he is ready to begin assembly in his U.S. plant once a pending $450-million federal loan comes through.

Despite the appeal of the carbon savings, the fleet owners who are the trucks’ primary customers have more practical considerations. Bright’s executives are coy about the IDEA’s eventual price tag but assert that a customer with 2,000 trucks driving 80 miles a day five days a week could save $7.2 million a year. Right now that is probably not enough to justify large-scale purchases without additional rebates—or a price on carbon. Van Amburg estimates that going hybrid currently adds $30,000 to $50,000 in upfront costs per vehicle, although that figure should come down as production volumes increase.
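
Those fleet numbers are easy to check against the per-truck fuel savings quoted above. In the sketch below the $2.40-per-gallon fuel price is an assumption chosen for illustration, not a figure from the article:

    # Rough check of the fleet-savings claim using the 1,500-gallon-per-truck
    # figure quoted above. The fuel price is an assumed input.

    TRUCKS = 2_000                   # fleet size in Bright's example
    GALLONS_SAVED_PER_TRUCK = 1_500  # annual fuel savings per IDEA vs. a standard truck
    FUEL_PRICE = 2.40                # assumed dollars per gallon; not from the article

    fleet_savings = TRUCKS * GALLONS_SAVED_PER_TRUCK * FUEL_PRICE
    print(f"Fleet-wide fuel savings: ${fleet_savings:,.0f} per year")  # $7,200,000

    # Simple payback against the quoted $30,000-$50,000 hybrid premium per truck.
    per_truck = GALLONS_SAVED_PER_TRUCK * FUEL_PRICE
    for premium in (30_000, 50_000):
        print(f"${premium:,} premium: about {premium / per_truck:.0f} years to break even")

A simple payback of roughly eight to fourteen years under these assumptions is consistent with the article’s point that fuel savings alone probably won’t drive purchases without rebates or a price on carbon.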

Improved battery technology will also help. Today the IDEA’s 13-kilowatt-hour lithium-ion battery pack accounts for nearly a quarter of the vehicle’s total cost. Much of the research being done for the batteries going into the Chevy Volt and other consumer plug-ins should also be applicable to commercial batteries. “For all the good we all want to do,” says David Lauzun, Bright’s vice president of product development, “these vehicles will not take over the world until it becomes the economic choice—‘I have to have them because it saves me money.’”

Bus Rapid Transit
Subwaylike bus lines mobilize the urban future
By Michael Moyer

For the first time in human civilization, more people now live in urban areas than in the countryside. This shift creates a number of dilemmas, not least of which is how to move people within the world’s rapidly growing metropolises. Pollution and traffic point away from car-based options, while light-rail systems are slow to construct and prohibitively expensive. One disarmingly simple—and cheap—possibility is Bus Rapid Transit, which is engineered to operate like a subway on wheels. In these systems, concrete dividers on existing roads separate high-capacity buses from the rest of traffic. Riders pay before boarding, then wait in enclosed stations. When a bus arrives, sliding partitions open to allow riders to board from a platform that is level with the bus floor. The traffic-free thoroughfares, quick boarding times, and modern, comfortable stations resemble light-rail systems more than the chaos of typical bus travel. In Bogotá, Colombia, which has had seven Bus Rapid Transit lines in operation since 2001, the buses handle 1.6 million trips a day. The system’s success has allowed the city to remove 7,000 private buses from its streets, reducing consumption of bus fuel and its associated pollution by more than 59 percent.

Ocean Overhaul
Marine zoning is a bold remedy for sick seas
By Sarah Simpson

These days not even many politicians deny that the oceans are ill. Protecting the health of coastal waters is now a matter of national policy in dozens of countries, including the U.S., and world leaders are beginning to prescribe a revolutionary remedy that conservationists have been promoting for years: marine planning and zoning.

The idea is a natural extension of management policies that have guided the development of cities and landscapes for nearly a century. Porn shops aren’t next to preschools, after all, and drilling rigs aren’t the centerpieces of national parks. Similarly, zoning advocates envision a mosaic of regional maps in which every watery space on the planet is designated for a particular purpose. Drilling and mining would be allowed only in certain parts of the ocean; fishing in others. The most critically threatened areas would be virtually off-limits.

Whereas people can easily find maps telling them what they can do where on land, the marine realm is a hodgepodge of rules emanating from an army of agencies, each one managing a single use or symptom. In the U.S., for example, one body regulates commercial fishing, usually a single species at a time. Another group manages toxic substances, still another seabed mining, and so on—some 20 federal agencies in all. They tend to make decisions without regard to what the others are doing, explains Duke University marine ecologist Larry B. Crowder. “Imagine all of the medical specialists visiting a patient in intensive care one at a time and never talking to one another,” he says. “It’s a wonder that the oceans aren’t in worse shape than they are now.”

Ocean advocates such as Crowder eagerly await the final recommendations of a special task force President Barack Obama charged with presenting a plan for overhauling management of U.S. waters, which extend 200 nautical miles offshore. The scope of such an undertaking is huge: the U.S. controls 4.4 million square miles of seascape, making the country’s underwater real estate 25 percent larger than its landmass. The committee’s preliminary report, released in September, suggests that the best way to minimize harmful human impacts on the oceans is to manage regions rather than symptoms.

Many environmentalists are hopeful that such plans will be implemented through the marine equivalent of municipal zoning, which would give them some influence in areas where they now have none. In zones where conservation is designated as the dominant activity, fishing and industrial activities such as mining would no longer have free rein. Under current rules, about the only way a conservation group can block a project it deems harmful—say, a new site for offshore drilling—is through expensive litigation.

So far, though, the president’s task force has been careful not to suggest that ocean zoning will be the only treatment plan, in great part because any effort to restrict commercial interests is bound to meet stiff opposition. “Zoning isn’t anybody’s favorite exercise,” notes John C. Ogden, director of the Florida Institute of Oceanography at the University of South Florida at Tampa. “Someone’s ox is always getting gored.” Most resistant to such change will most likely be the traditional users of the open ocean—namely, commercial fisheries and the petroleum industry. “They’ve had the place to themselves for a long time,” Ogden says.

Ogden and others are quick to point out, however, that zoning practices can benefit commerce as much as conservation. By giving up access to certain areas, industries gain the security of knowing their activities would be licensed in a more predictable and less costly manner than they are today, explains Josh Eagle, associate professor at the University of South Carolina School of Law. Now an oil company can apply for permits to drill virtually anywhere, but it takes on a significant financial risk each time. The business may dump millions of dollars into researching a new facility only to have a lawsuit derail it at the last moment. When opposing parties have more or less equal voices early in the planning process, Eagle says, they are less inclined to block one another’s activities once zones are drawn on a map.

Whether the final report of the president’s task force will promote ocean zoning explicitly is uncertain. But the group has already promised to overhaul the structure of ocean governance by proposing the creation of a National Ocean Council, whose job it will be to coordinate efforts of the myriad federal agencies now in charge.

The move comes just in time. Just as society is beginning to appreciate the enormous efforts it will take to preserve the health of the oceans, it must ask more of them—more energy, more food, and better resilience to coastal development and climate change. The reason the oceans are in trouble is not what people put in and take out. It is a failure of governments to manage these activities properly. Says Crowder: “We have to treat the oceans holistically, not one symptom at a time.”

The Power of Garbage
Trapped lightning could help zap trash and generate electricity
By John Pavlus

Trash is loaded with the energy trapped in its chemical bonds. Plasma gasification, a technology that has been in development for decades, could finally be ready to extract it.

In theory, the process is simple. Torches pass an electric current through a gas (often ordinary air) in a chamber to create a superheated plasma—an ionized gas with a temperature upward of 7,000 degrees Celsius, hotter than the surface of the sun. When this occurs naturally we call it lightning, and plasma gasification is literally lightning in a bottle: the plasma’s tremendous heat dissociates the molecular bonds of any garbage placed inside the chamber, converting organic compounds into syngas (a combination of carbon monoxide and hydrogen) and trapping everything else in an inert vitreous solid called slag. The syngas can be used as fuel in a turbine to generate electricity. It can also be used to create ethanol, methanol and biodiesel. The slag can be processed into materials suitable for use in construction.

In practice, the gasification idea has been unable to compete economically with traditional municipal waste processing. But the maturing technology has been coming down in cost, while energy prices have been on the rise. Now “the curves are finally crossing—it’s becoming cheaper to take the trash to a plasma plant than it is to dump it in a landfill,” says Louis Circeo, director of Plasma Research at the Georgia Tech Research Institute. Earlier this summer garbage-disposal giant Waste Management partnered with InEnTec, an Oregon-based start-up, to begin commercializing the latter’s plasma-gasification processes. And major pilot plants capable of processing 1,000 daily tons of trash or more are under development in Florida, Louisiana and California.

Plasma isn’t perfect. The toxic heavy metals sequestered in slag pass the Environmental Protection Agency’s leachability standards (and have been used in construction for years in Japan and France) but still give pause to communities considering building the plants. And although syngas-generated electricity has an undeniably smaller carbon footprint than coal—“For every ton of trash you process with plasma, you reduce the amount of CO2 going into the atmosphere by about two tons,” Circeo says—it is still a net contributor of greenhouse gases.

“It is too good to be true,” Circeo admits, “but the EPA has estimated that if all the municipal solid waste in the U.S. were processed with plasma to make electricity, we could produce between 5 and 8 percent of our total electrical needs—equivalent to about 25 nuclear power plants or all of our current hydropower output.” With the U.S. expected to generate a million tons of garbage every day by 2020, using plasma to reclaim some of that energy could be too important to pass up.
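
To see the scale behind Circeo’s estimate, the sketch below runs the arithmetic with assumed values for the net electricity yield per ton and for total U.S. demand; neither number comes from the article.

    # Back-of-the-envelope check of the plasma-to-electricity estimate.
    # Net yield per ton and total U.S. demand are assumptions for illustration.

    TONS_PER_DAY = 1_000_000   # projected U.S. garbage by 2020 (from the article)
    NET_KWH_PER_TON = 600      # assumed net electricity per ton after plant losses
    US_DEMAND_KWH = 4.0e12     # assumed annual U.S. electricity use (~4,000 TWh)

    annual_kwh = TONS_PER_DAY * 365 * NET_KWH_PER_TON
    share = annual_kwh / US_DEMAND_KWH

    print(f"Electricity from trash: {annual_kwh / 1e9:,.0f} billion kWh per year")
    print(f"Share of U.S. demand: {share:.1%}")  # about 5.5%, inside the quoted 5-8% range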

More Ideas to watch
By John Pavlus

Cement as a Carbon Sponge
Traditional cement production creates at least 5 percent of global carbon dioxide emissions, but new materials could create carbon-neutral cement. Start-up Novacem, supported by Imperial College London, uses magnesium oxide to make cement that naturally absorbs CO2 as it hardens. California-based Calera uses seawater to sequester carbon emissions from a nearby power plant in cement.

The New Honeybee
Colony collapse disorder (CCD) has killed more than a third of honeybee colonies since 2006. Farmers who depend on bees to pollinate such crops as almonds, peaches and apples are looking to the blue orchard bee to pick up the slack.

One efficient Osmia lignaria can pollinate as much territory as 50 honeybees, but the bees are harder to cultivate because of their solitary nature. These pinch hitters won’t completely replace honeybees, but as scientists continue to grapple with CCD, they could act as an agricultural safety net.

Saltwater Crops
As the world’s freshwater supply becomes scarcer and food production needs balloon, salt-tolerant crops could ease the burden. Researchers at Australia’s University of Adelaide used genetic engineering to enhance a model crop’s natural ability to prevent saline buildup in its leaves, allowing the plant to thrive in conditions that would typically wither it. If the same gene tweak works in cereal crops such as rice and wheat—the researchers are testing them now—fallow lands destroyed by drought or overirrigation could become new breadbaskets.

The Omnipotence Machines
Tiny, ubiquitous sensors will allow us to index the physical world the way the Web maps cyberspace
By Gregory Mone

Earlier this year Hewlett-Packard announced the launch of its Central Nervous System for the Earth (CeNSE) project, a 10-year effort to embed up to a trillion pushpin-size sensors across the planet. Technologists say that the information gathered by this kind of ubiquitous sensing network could change our knowledge of the world as profoundly as the Internet has changed business. “People had no idea the Web was coming,” says technology forecaster Paul Saffo. “We are at that moment now with ubiquitous sensing. There is quite an astonishing revolution just around the corner.”

The spread of versatile sensors, or “motes,” and the ability of computers to analyze and either recommend or initiate responses to the data they generate, will not merely enhance our understanding of nature. It could lead to buildings that manage their own energy use, bridges that flag engineers when in need of repair, cars that track traffic patterns and detect potholes, and home security systems that distinguish between the footfalls of an intruder and the dog, to name a few.

CeNSE is the boldest project yet announced, but HP is not the only organization developing the technology to make ubiquitous sensing possible. Intel is also designing novel sensor packages, as are numerous university labs.

For all the momentum in the field, though, this sensor-filled future is by no means inevitable. These devices will need to generate rich, reliable data and be rugged enough to survive tough environments. The sensor packages themselves will be small, but the computing effort required will be enormous. All the information they gather will have to be transmitted, hosted on server farms, and analyzed. Finally, someone is going to have to pay for it all. “There is the fundamental question of economics,” notes computer scientist Deborah Estrin of the University of California, Los Angeles. “Every sensor is a nonzero cost. There is maintenance, power, keeping them calibrated. You don’t just strew them around.”

In fact, HP senior researcher Peter Hartwell acknowledges that for CeNSE to hit its goals, the sensors will need to be nearly free. That is one of the reasons why HP is designing a single, do-everything, pushpin-size package stacked with a variety of gauges—light, temperature, humidity, vibration and strain, among others—instead of a series of devices for different tasks. Hartwell says that focusing on one versatile device will drive up volume, reducing the cost for each unit, but it could also allow HP to serve several clients at once with the same sensors.

Consider his chief engineering project, an ultrasensitive accelerometer. Housed inside a chip, the sensor tracks the motion of a tiny, internal movable platform relative to the rest of the chip. It can measure changes in acceleration 1,000 times as accurately as the technology in the Nintendo Wii.

Hartwell imagines situating one of these pins every 16 feet along a highway. Thanks to the temperature, humidity and light sensors, the motes could serve as mini weather stations. But the accelerometers’ vibration data could also be analyzed to determine traffic conditions—roughly how many cars are moving past and how quickly. The local highway department would be interested in this information, he guesses, but there are potential consumer applications, too. “Your wireless company might want to take that information and tell you how to get to the airport the fastest,” Hartwell says.
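
To make the traffic idea concrete, here is a minimal, hypothetical sketch of how vibration readings from a single roadside mote might be reduced to a vehicle count. The data, sampling rate and detection threshold are all invented for illustration and do not reflect HP's actual CeNSE processing.

```python
import numpy as np

# Hypothetical sketch: estimate vehicle passes from one mote's vibration trace.
# Neither the data format nor the threshold reflects HP's actual pipeline.
rng = np.random.default_rng(0)
sample_rate_hz = 100
trace = rng.normal(0, 0.02, 60 * sample_rate_hz)      # one minute of background noise (g)
for t in (5, 18, 41):                                  # three simulated vehicle passes
    i = t * sample_rate_hz
    trace[i:i + 50] += 0.3 * np.hanning(50)            # brief vibration burst

threshold = 0.15                                       # assumed detection threshold (g)
above = trace > threshold
passes = np.count_nonzero(above[1:] & ~above[:-1])     # rising edges = distinct bursts
print(f"Estimated vehicle passes this minute: {passes}")
```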

All of this gathering and transmission of data requires power, of course, and to guarantee an extended life, the HP pushpin will not rely solely on batteries. “It is going to have some sort of energy-scavenging ability,” Hartwell says. “Maybe a solar panel or a thermoelectric device to help keep the battery charged.”

With the power hurdle in mind, other groups are forgoing batteries altogether. At Intel Labs in Seattle, engineer Josh Smith has developed a sensor package that runs on wireless power. Like the HP pushpin, Intel’s WISP, or Wireless Identification and Sensing Platform, will include a variety of gauges, but it will also draw energy from the radio waves emitted by long-range radio-frequency ID chip readers. Smith says a single reader, plugged into a wall outlet, can already power and communicate with a network of prototype WISPs five to 10 feet away—a distance that should increase.

Smith cites many of the same infrastructure-related possibilities as Hartwell, along with a number of other uses. If WISPs were placed on standard household items such as cups, these tags could inform doctors about the rehabilitation progress of stroke victims. If the cups the patient normally uses remain stationary, Smith explains, then the individual probably is not up and moving around.

The potential applications for ubiquitous sensing are so broad—a physicist recently contacted him about using WISPs to monitor the temperature outside a proposed neutrino detector—that, as with the Internet, Smith says it is impossible to foresee them all. “In terms of the impact it is going to have on our lives,” Hartwell adds, “you haven’t seen anything yet.”

The Do-Anything Robot
Your PC can accomplish any computing task you ask of it. Why isn’t the same true for robots?
By Gregory Mone

Robots have proved to be valuable tools for soldiers, surgeons and homeowners hoping to keep the carpet clean. But in each case, they are designed and built specifically for the job. Now there is a movement under way to build multipurpose machines—robots that can navigate changing environments such as offices or living rooms and work with their hands.

All-purpose robots are not, of course, a new vision. “It’s been five or 10 years from happening for about 50 years,” says Eric Berger, co-director of the Personal Robotics Program at Willow Garage, a Silicon Valley start-up. The delay is in part because even simple tasks require a huge set of capabilities. For a robot to fetch a mug, for example, it needs to make sense of data gathered by a variety of sensors—laser scanners identifying potential obstacles, cameras searching for the target, force feedback in the fingers that grasp the mug, and more. Yet Berger and other experts are confident that real progress could be made in the next decade.

The problem, according to Willow Garage, is the lack of a common platform for all that computational effort. Instead of building on the capabilities of a single machine, everyone is designing robots, and the software to control them, from the ground up. To help change this, Willow Garage is currently producing 25 copies of its model PR2 (for “Personal Robot 2”), a two-armed, wheeled machine that can unplug an appliance, open doors and move through a room. Ten of the robots will stay in-house, but 10 more will go to outside research groups, and everyone will pool their advances. This way, Berger says, if you want to build the robotic equivalent of a Twitter, you won’t start by constructing a computer: “you build the thing that’s new.”

Pocket Translator
The military, short on linguists, is building smart phone–based devices to do the job
By Gregory Mone

Sakhr Software, a company that builds automatic language translators, recently unveiled a prototype smart phone application that transforms spoken English phrases into spoken Arabic, and vice versa, in near real time. The technology isn’t quite ready for your next trip to Cairo, but thanks to recent advances in machine-translation techniques, plus the advent of higher-fidelity microphones and increasing processing power in smart phones, this mobile technology could soon allow two people speaking different languages to have basic conversations.

Before the 1990s automatic translation meant programming in an endless list of linguistic rules, a technique that proved too labor-intensive and insufficiently accurate. Today’s leading programs—developed by BBN Technologies, IBM, Sakhr and others as part of a Defense Advanced Research Projects Agency effort to eliminate the military’s need for human translators—rely on machine-learning techniques instead. The software works from a database of parallel texts—for example, War and Peace in two different languages, translated United Nations speeches, and documents pulled off the Web. Algorithms identify short matching phrases across sources, and the software uses them to build statistical models that link English phrases to Arabic ones.
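
The core statistical idea can be illustrated with a toy example: count how often words in one language co-occur with words in the other across aligned sentence pairs, and treat the most frequent pairing as a candidate translation. This is a deliberately simplified sketch; the phrases and transliterations are invented, and real systems from BBN, IBM and Sakhr use far more sophisticated alignment, phrase extraction and language models.

```python
from collections import Counter

# Toy illustration of learning translations from parallel text by co-occurrence counting.
parallel = [
    ("good morning", "sabah el kheir"),
    ("good evening", "masa el kheir"),
    ("good morning friend", "sabah el kheir sadiqi"),
]

pair_counts = Counter()
for en, ar in parallel:
    for en_word in en.split():
        for ar_word in ar.split():
            pair_counts[(en_word, ar_word)] += 1

def best_translation(en_word):
    # Pick the target word that most often appears alongside the source word.
    candidates = {ar: c for (e, ar), c in pair_counts.items() if e == en_word}
    return max(candidates, key=candidates.get) if candidates else None

print(best_translation("morning"))   # prints the word that co-occurs most often with "morning"
```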

John Makhoul, BBN’s chief scientist, says the current technology is at its best when confined to subject areas with specific phrases and terminology—translating a weather report from English into French, for example, or helping soldiers gather basic biographical information from people in the field. Makhoul envisions the first consumer applications, five years from now, being similarly constrained. A tourism-related translation app on a smart phone could help an American in Florence get directions from a non-English-speaking local, but they won’t chat about Renaissance art. “It is not going to work perfectly,” he says, “but it will do a pretty good job.”

Know if Disease Grows Inside You
Complex diseases have complex causes. Luckily, they also leave a multitude of traces
By Melinda Wenner

With the exception of certain infectious diseases, few of humanity’s ailments have cures. More than 560,000 Americans will die of cancer this year, and despite the 250,000 coronary bypass surgeries doctors perform annually, heart disease is still the country’s number-one killer.

The hardest diseases to cure are the ones that take the longest to develop. They are the end result of decades of complex molecular interactions inside your body. Yet this complexity also presents an opportunity. Scientists have discovered that these interactions leave discernible fingerprints on the body. By unweaving the complex tapestry of molecular clues—changes in the body’s proteins, nucleic acids and metabolites, collectively called biomarkers—doctors hope they will soon be able to not only detect disease but predict a coming illness in time to take action.

Biomarkers are not new. Since 1986 doctors have monitored prostate cancer by measuring blood levels of the protein known as prostate-specific antigen (PSA). But tests that rely on a single biomarker to detect disease are rare, because most disorders involve intricate changes in a collection of biomarkers.

Take schizophrenia: in January 2010 scientists will release a biomarker test that distinguishes schizophrenia from other psychiatric conditions. The test, which is being commercialized by Rules-Based Medicine, a laboratory in Austin, Tex., is based on the characteristics of about 40 blood-based proteins.

To find potentially useful biomarkers, researchers collect blood samples from thousands of healthy people and analyze them. Biomarker levels in these samples provide a baseline reading. Then they do the same for people with a specific condition such as diabetes or breast cancer. If reproducible differences emerge between the groups, scientists can use the patterns in the disease group to diagnose the same condition in others. By collecting samples over time, researchers can also go back and analyze early samples from individuals who later become ill to identify patterns indicative of early disease or high disease risk.
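
In machine-learning terms, that workflow amounts to training a classifier on labeled biomarker panels and checking whether it generalizes to held-out samples. The sketch below uses simulated data and scikit-learn purely for illustration; it is not any company's actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical sketch: learn a disease "signature" from healthy vs. patient panels.
rng = np.random.default_rng(42)
n, n_markers = 1000, 40
healthy = rng.normal(0.0, 1.0, (n, n_markers))          # baseline biomarker levels
diseased = rng.normal(0.0, 1.0, (n, n_markers))
diseased[:, :5] += 0.8                                   # a few markers shift with disease

X = np.vstack([healthy, diseased])
y = np.array([0] * n + [1] * n)                          # 0 = healthy, 1 = diseased
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```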

Biophysical Corporation, a sister company to Rules-Based Medicine, is one of several companies that have developed blood-based biomarker tests and marketed them to the public [see “The Ultimate Blood Test,” by Philip Yam; Scientific American, June 2006]. The company searches for up to 250 biomarkers suggestive of cancer, inflammatory conditions, heart disease and other illnesses. Mark Chandler, Biophysical’s chair and CEO, says that the real value of the tests lies in long-term monitoring. A person could “get a test monthly, just a finger stick, that would be able to say, we have had a serious change here that is indicative of an early-stage cancer,” he explains.

Yet not all experts are convinced that the age of biomarkers is at hand. Cheryl Barton, an independent U.K.-based pharmaceutical consultant who authored a Business Insights market analysis report on biomarkers in 2006, says she remains “a little bit skeptical about how clinically useful they are.” A study of 5,000 subjects published in the Journal of the American Medical Association in July 2009 found that six cardiovascular biomarkers were only marginally better at predicting heart disease than were standard cardiovascular risk factors, such as whether the subjects smoked or had diabetes.

Adding to the overall difficulty, a person might suffer from two or more diseases—prostate cancer and heart disease, for example. No one knows how multiple diseases might affect overall biomarker signatures or how profiles will change as other diseases develop. “When you get to be 65 or 70, almost everybody has other conditions,” Chandler says. “We don’t know how to deal with that right now.” And scientists still need to discern which biomarkers are truly relevant to disease—a difficult task when working with blood, which contains tens of thousands of proteins at concentrations spanning more than 10 orders of magnitude.

Some companies have simplified the problem by avoiding blood altogether. LabCorp recently commercialized a biomarker test that analyzes colon cells in stool for the chemical signatures indicative of colorectal cancer. “The stool is in intimate contact with the lining of the colon, so it becomes much more highly populated with these rare molecules than would get into the bloodstream from colon cancer,” says Barry Berger, chief medical officer of Exact Sciences, a Madison, Wis.–based biotechnology company that developed the test technology.

Scientists are confident that they will eventually crack the more difficult problem of finding distinct disease signatures in the noisy data. “The evolutionary process, being complex and unknown, does not always give us an easy route,” Berger notes, “but it definitely gives us lots of opportunities.”

Satellites Diagnose Disease Outbreaks
Space-based data are helping to track and predict the spread of deadly diseases
By Katherine Harmon

Many contagious diseases spread through carriers such as birds and mosquitoes. These vectors in turn move with heat and rainfall. With this in mind, researchers have begun to use satellite data to monitor the environmental conditions that lead to disease. “Ideally, we could predict conditions that would result in some of these major outbreaks of cholera, malaria, even avian flu,” says Tim Ford of the University of New England at Biddeford and co-author of a paper on the subject published this past September in Emerging Infectious Diseases.

Satellite data have already been used to map the advance of the H5N1 avian influenza in Asia. The domestic duck, a common inhabitant of Southeast Asia’s rice paddies, is one of the main carriers of the disease. Xiangming Xiao, associate director of the University of Oklahoma’s Center for Spatial Analysis, uses satellite images to map agricultural patterns in the region. These maps show where the ducks are most likely to live and thus where the avian influenza is most likely to spread.

Migratory birds also carry the virus, but their travel patterns are more difficult to predict. Xiao and his colleagues combine the satellite imagery with satellite-gathered surface-temperature data to estimate the birds’—and thereby the virus’s—trajectory. Computer models then link these environmental drivers to the spread of the flu in human populations.
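
As a rough illustration of how such a model might combine layers, the sketch below multiplies a hypothetical duck-density map by a temperature-suitability map to produce a simple risk index. The grids, thresholds and weighting are invented for illustration; this is not Xiao's actual model.

```python
import numpy as np

# Hypothetical sketch: combine two satellite-derived layers into a crude risk index.
rng = np.random.default_rng(1)
duck_density = rng.uniform(0, 1, (5, 5))       # e.g. derived from rice-paddy maps (assumed)
surface_temp_c = rng.uniform(10, 35, (5, 5))   # satellite surface temperature (assumed)

# Assume, for illustration only, that transmission is favoured between 15 and 30 C.
temp_suitability = np.clip(1 - np.abs(surface_temp_c - 22.5) / 7.5, 0, 1)

risk_index = duck_density * temp_suitability   # 0 = low risk, 1 = high risk
hotspots = np.argwhere(risk_index > 0.6)
print(f"Grid cells flagged as potential hotspots: {len(hotspots)}")
```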

Of course, not all of the work can be outsourced to orbiting observatories. Xiao says that judging the severity of avian flu’s spread from satellite imaging required knowing details about the human populations as well—for instance, how likely certain communities were to raise ducks for poultry consumption. “Satellite monitoring has a capacity to provide consistent observation,” Xiao says. “On the other hand, the in situ observations are still very, very important, so the key is to combine those together. That is a real challenge.”

More Ideas to watch
By Melinda Wenner

Quick Clots
Emergency technicians could prevent up to 35 percent of prehospital trauma deaths if they had better and cheaper ways to prevent blood loss. Now a University of Maryland–affiliated start-up called Trauma Solutions has developed a synthetic hydrogel that can clot blood by prompting the body to make fibrin, a protein that seals wounds and stops bleeding. Future iterations could simultaneously release such medicines as antibiotics and painkillers. Each application will cost about $5, compared with some natural blood-clotting substances that cost upward of $500.

Lab-on-a-Stamp
Liver damage is a major side effect of HIV/AIDS and tuberculosis drugs, yet few developing countries have enough trained scientists or equipment to monitor it. Nonprofit Cambridge, Mass.–based Diagnostics For All has developed an inexpensive fingernail-size device made almost entirely of paper that monitors liver damage using a single drop of blood. Channels in the paper guide blood to regions that change color depending on the levels of two damage-related liver enzymes.

Bacterial Toothpaste
Streptococcus mutans bacteria in the mouth decay teeth by converting sugars into enamel-eroding lactic acid. Florida-based Oragenics has genetically engineered a new strain of bacteria that converts sugars to trace amounts of alcohol instead. Because the new strain permanently displaces natural S. mutans, the therapy, which is currently in clinical trials, will be available as a one-time prescription that will protect teeth for life.


 

MIT economist: Wall Street created worst recession since WWII

MIT economist: Wall Street created worst recession since WWII | The Raw Story – Digg.

rawstory.com — MIT economics professor Simon Johnson said on MSNBC’s The Rachel Maddow Show on Wednesday night that Wall Street “blew itself up,” which led to the “most severe recession since World War II.” The former chief economist of the International Monetary Fund added that the enormous economic damage was “a direct consequence of what the biggest banks did and were allowed to get away with.”


Two Plus Two Equals Five – A 2nd look at disaster death tolls

Two Plus Two Equals Five – By Philip Walker | Foreign Policy.

The death toll and level of destruction immediately following a disaster are always difficult to determine, but over time a consensus usually emerges between governments and aid organizations. But, as David Rieff points out, “Sadly, over the course of the past few decades, exaggeration seems to have become the rule in the world of humanitarian relief.… These days, only the most extreme, most apocalyptic situations are likely to move donors in the rich world.” And with donor fatigue an ever-present possibility, it is no surprise then that later studies that contradict the original, inflated estimates are criticized — or worse, ignored — for seemingly undermining the humanitarian cause.

Arriving at these estimates is no easy endeavor, as government agencies and relief organizations are rarely able to survey entire populations. Instead, emergency management experts rely on sound statistical and epidemiological techniques. But debating and questioning the numbers behind man-made and natural disasters is not just an academic exercise: the implications are huge. For example, relief agencies were restricted from operating in Darfur, partly because of Sudan’s anger that the U.S.-based Save Darfur Coalition had estimated that 400,000 people were killed in the region. Moreover, the U.N. Security Council used the International Rescue Committee’s death toll of 5.4 million in the Congo to put together its largest peacekeeping operation ever. Similarly, government aid pledges increase or decrease depending upon the extent of the disaster. Numbers do matter, and much depends upon their validity and credibility. What follows is a look at some recent disasters where the numbers just don’t match up.

Above, a view of some of the destruction in Banda Aceh, Indonesia, a week after the devastating earthquake and tsunami struck on Dec. 26, 2004. According to the U.S. Geological Survey, 227,898 people died and about 1.7 million people were displaced in 14 countries in Southeast Asia, South Asia, and East Africa. Indonesia, the country hardest hit by the disaster, initially claimed that 220,000 people had died or gone missing but later revised that number down to around 170,000.

THE DEADLIEST WAR IN THE WORLD

Discrepancy: 5.4 million vs. 900,000 dead in the Democratic Republic of the Congo between 1998 and 2008

The Democratic Republic of the Congo (DRC) has seen more than its fair share of conflict over the past 15 years. The war in the DRC officially broke out in 1998 and although the conflict technically ended in 2003 when the transitional government took over, fighting has continued in many of the country’s provinces. The conflict has been dubbed “Africa’s World War,” both due to the magnitude of the devastation and the number of African countries that have, at different times, been involved in the conflict. According to a widely cited 2008 report by the New York-based International Rescue Committee (IRC), “an estimated 5.4 million people have died as a consequence of the war and its lingering effects since 1998,” making it the world’s deadliest crisis since World War II. The organization is one of the largest providers of humanitarian aid in the Congo and is therefore deemed one of the few reliable sources on the conflict.

However, Andrew Mack, director of the Human Security Report Project at Simon Fraser University in Canada, said the IRC study did not employ appropriate scientific methodologies and that in reality far fewer people have died in the Congo. “When we used an alternative measure of the pre-war mortality rate, we found that, for the IRC estimates from their final three surveys, the figure dropped from 2.83 million to under 900,000,” Mack argued. (He also argued that international relief agencies — such as the International Rescue Committee — are facing a potential conflict of interest because they depend on donations that, in turn, are stimulated by their studies of death tolls. Those studies should be done by independent experts, not by relief agencies that depend on donations, he says.)
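
The dispute turns on simple excess-mortality arithmetic: surveys estimate a crude mortality rate during the crisis, subtract an assumed baseline rate, and multiply by population and time, so the choice of baseline drives the result. The numbers below are purely illustrative and are not the IRC's or Mack's actual figures.

```python
# Illustrative excess-mortality arithmetic; none of these values are the
# IRC's or the Human Security Report Project's actual survey numbers.
population = 60_000_000       # assumed population covered by the surveys
months = 36                   # assumed survey period
observed_cmr = 2.2            # assumed observed deaths per 1,000 per month
baselines = {"low pre-war baseline": 1.5, "higher pre-war baseline": 2.0}

for label, baseline_cmr in baselines.items():
    excess = (observed_cmr - baseline_cmr) / 1000 * population * months
    print(f"{label}: ~{excess / 1e6:.2f} million excess deaths")
# The same survey data can imply totals differing by a factor of three or more,
# depending solely on the assumed baseline mortality rate.
```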

Above, the body of a young man lying on the central market avenue of Ninzi, about 25 miles north of Bunia, where on June 20, 2003, Lendu militias launched an attack, killing and mutilating at least 22 civilians.

Discrepancy: 400,000 vs. 15,000 women raped in the Democratic Republic of the Congo between 2006 and 2007

A June 2011 study in the American Journal of Public Health found that 400,000 women aged 15-49 were raped in the DRC over a 12-month period in 2006 and 2007. The shockingly high number is equivalent to four women being raped every five minutes. Perhaps even more alarming, the new number is 26 times higher than the 15,000 rapes that the United Nations reported during the same period.

Maria Eriksson Baaz, a Swedish academic from the University of Gothenburg, has called the study into question by arguing that it is based on out-of-date and questionable figures. As a long-time researcher on women’s rights in the DRC, Baaz claims that extrapolations made from these figures cannot be backed up scientifically. In a recent interview with the BBC, she said it was difficult to collect reliable data in the Congo and that women sometimes claim to be victims in order to get free health care. “Women who have been raped can receive free medical care while women who have other conflict-related injuries or other problems related to childbirth have to pay,” she said. “In a country like the DRC, with [its] extreme poverty where most people can simply not afford health care, it’s very natural this happens.”

Above, Suzanne Yalaka breastfeeds her baby Barunsan on Dec. 11, 2003, in Kalundja, South Kivu province. Her son was born after she was raped by ten rebels from neighboring Burundi; she was subsequently abandoned by her husband and her husband’s family.

NORTH KOREAN FAMINE

Discrepancy: 2.4 million vs. 220,000 dead in North Korea between 1995 and 1998

Due to the regime’s secretive nature, reliable statistics on the 1990s famine in North Korea are hard to come by. Yet, surprisingly, on May 15, 2001, at a UNICEF conference in Beijing, Choe Su-hon, one of Pyongyang’s nine deputy foreign ministers at the time, stated that between 1995 and 1998, 220,000 North Koreans died in the famine. Compared with outside estimates, these figures were on the low end — presumably because it was in the regime’s interest to minimize the death toll.

A 1998 report by U.S. congressional staffers, who had visited the country, found that from 1995 to 1998 between 900,000 and 2.4 million people had died as a result of food shortages. It noted that other estimates by exile groups were substantially higher but that these numbers were problematic because they were often based on interactions with refugees from the northeastern province of North Hamgyong, which was disproportionately affected by the famine.

Above, North Koreans rebuilding a dike in Mundok county, South Pyongan province, in September 1997, following an August tidal wave after typhoon Winnie. The rebuilding effort was part of an emergency food-for-work project organized by the World Food Program. According to a former North Korean government official, during the famine — from 1993 to 1999 — life expectancy fell from 73.2 to 66.8 and infant mortality almost doubled from 27 to 48 per 1,000 people.

GENOCIDE IN DARFUR

Discrepancy: 400,000 vs. 60,000 dead in Darfur between 2003 and 2005

In 2006, three years after the conflict in Darfur began, Sudanese President Omar al-Bashir publicly criticized the United Nations for exaggerating the extent of the fighting in Darfur. “The figure of 200,000 dead is false and the number of dead is not even 9,000,” he proclaimed. At the same time, outside groups like the Save Darfur Coalition and various governments, including the United States, were having a difficult time producing concrete numbers as well. Their only consensus was that the real death toll was exponentially higher than those numbers provided by Bashir.

In 2005, a year after U.S. Secretary of State Colin Powell told a U.S. congressional committee that the ethnic violence in Darfur amounted to “genocide,” Deputy Secretary of State Robert Zoellick estimated the death toll at between 60,000 and 160,000. Zoellick was widely criticized for understating the numbers. The World Health Organization estimated that 70,000 people had died over a seven-month period alone. At the same time, researchers for the Coalition for International Justice contended that 396,563 people had died in Darfur. Today, the Sudanese authorities claim that since the conflict began in 2003, 10,000 people have died, while the U.N. estimates that over 300,000 have been killed and another 2.7 million have been displaced.

Above, an armed Sudanese rebel arrives on Sept. 7, 2004, at the abandoned village of Chero Kasi less than an hour after Janjaweed militiamen set it ablaze in the violence-plagued Darfur region.

CYCLONE NARGIS 

Discrepancy: 138,000 vs. unknown death toll in Burma in 2008

Tropical cyclone Nargis made landfall in southern Burma on May 2, 2008, leaving a trail of death and destruction before petering out the next day. It devastated much of the fertile Irrawaddy delta and Yangon, the nation’s main city. Nargis brought about the worst natural disaster in the country’s history — with a death toll that may have exceeded 138,000, according to a study by the Georgia Institute of Technology. But, with a vast number of people still unaccounted for three years later, the death toll might even be higher. The Burmese authorities allegedly stopped counting for fear of political fallout.

It’s more common for countries hit by a devastating disaster to share their plight with the world and plead for a robust relief effort, but in the aftermath of cyclone Nargis the Burmese military regime sought to maintain control over news of the disaster — restricting access to journalists and censoring the release of information and images. Moreover, the United Nations and other relief agencies were initially banned from setting up operations. At the time, with over 700,000 homes blown away, the U.N. and the Red Cross estimated that over 2.5 million people were in desperate need of aid.

Above, school teacher Hlaing Thein stands on the wreckage of a school destroyed by cyclone Nargis in Mawin village in the Irrawaddy delta region on June 9, 2008.

 

Two Plus Two Equals Five

What numbers can we trust? A second look at the death toll from some of the world’s worst disasters.

BY PHILIP WALKER | AUGUST 17, 2011

EARTHQUAKE IN HAITI

Discrepancy: 318,000 vs. 46,000-85,000 dead in Haiti in 2010

The devastating earthquake of Jan. 12, 2010, killed over 318,000 people and left over 1.5 million people homeless, according to the Haitian government. International relief organizations generally estimate anywhere between 200,000 and 300,000 casualties.

However, a recently leaked report compiled for USAID by a private consulting firm claims that the death toll is likely between 46,000 and 85,000, and that roughly 900,000 people were displaced by the earthquake. The report has not yet been published, but its alleged findings have already been disputed by both Haitian authorities and the United Nations. Even the U.S. State Department, for now, is reluctant to endorse it, saying “internal inconsistencies” in some of the statistical analysis are currently being investigated prior to publication.

PAKISTAN FLOODS

Discrepancy: Large numbers affected vs. small death toll in Pakistan in 2010

A young girl washes the mud from her toy at a water pump in the middle of collapsed buildings at a refugee camp near Nowshera in northwest Pakistan on Sept. 23, 2010.

Figures provided by the United Nations and Pakistan’s government estimate that 20 million people were affected by the 2010 summer floods — the worst in the country’s history. Almost 2,000 people died, 3,000 were injured, 2 million homes were damaged or destroyed, and over 12 million people were left in need of emergency food aid, according to Pakistan’s National and Provincial Disaster Management Authority. Flood waters wiped out entire villages and vast stretches of farmland, affecting an area roughly the size of England. After surveying 15 key sectors across the country, in Oct. 2010 the World Bank and Asian Development Bank announced estimated damage of $9.7 billion — more than twice the damage from Pakistan’s 2005 earthquake, which killed approximately 86,000 people. U.N. Secretary-General Ban Ki-moon characterized the destruction as more dire than that caused by the 2004 Indian Ocean tsunami and the Pakistani earthquake combined. “In the past I have visited the scenes of many natural disasters around the world, but nothing like this,” he stated.

David Rieff warns that “By continually upping the rhetorical ante, relief agencies, whatever their intentions, are sowing the seeds of future cynicism, raising the bar of compassion to the point where any disaster in which the death toll cannot be counted in the hundreds of thousands, that cannot be described as the worst since World War II or as being of biblical proportions, is almost certainly condemned to seem not all that bad by comparison.” This was the case in Pakistan, where the number affected by the flooding was gigantic but the death toll was relatively low — especially compared to the Haiti earthquake a few months earlier. As a result, the United Nations and other aid organizations were unable to raise large sums for the relief effort compared to previous disasters. “Right now, our level of needs in terms of funding is huge compared to what we’ve been receiving, even though this is the largest, by far, humanitarian crisis we’ve seen in decades,” said Louis-George Arsenault, director of emergency operations for UNICEF, in an interview with the BBC in Aug. 2010.

As David Meltzer, senior vice president of international services for the American Red Cross, discerningly put it, “Fortunately, the death toll [in Pakistan] is low compared to the tsunami and the quake in Haiti. … The irony is, our assistance is focused on the living — and the number of those in need is far greater than in Haiti.”

 

Post-9/11 U.S. intelligence reforms take root but problems remain

Post-9/11 U.S. intelligence reforms take root, problems remain | Reuters.

(Reuters) – U.S. intelligence agencies will forever be scarred by their failure to connect the dots and detect the September 11 plot, but a decade later efforts to break down barriers to information-sharing are taking root.

Changing a culture of “need-to-know” to “need-to-share” does not come easily in spy circles. Some officials say they worry, a decade later, about a future attack in which it turns out that U.S. spy agencies had clues in their vast vaults of data but did not put them together, or even know they existed.

Yet significant changes, both big and small, have broken down barriers between agencies, smoothed information-sharing and improved coordination, U.S. intelligence experts say.

From issuing a blue badge to everyone working in the sprawling intelligence community to symbolize a common identity, to larger moves of mixing employees from different agencies, the goal is singular — to prevent another attack.

“We’re much further ahead,” David Shedd, Defense Intelligence Agency deputy director, said of the ability to connect the dots compared with 10 years ago. Still, signs of a plot to attack the United States could be missed again.

“My worst fear, and I suspect probably one that would come true, is that in any future would-be or actual attack, God forbid, we will be able to find the dots again somewhere because of simply how much data is collected,” Shedd said.

The political response to the failure to stop the attack was the 2002 creation of the Department of Homeland Security, pulling together 22 agencies to form the third largest U.S. Cabinet department behind the Pentagon and Veterans Affairs.

That was followed by the creation in late 2004 of the Director of National Intelligence to oversee all the spy agencies, as recommended by the bipartisan 9/11 commission.

Previously, the CIA director held a dual role of also overseeing the multitude of intelligence agencies. But in the aftermath of the 2001 attacks, policymakers decided that was too big of a job for one person to do effectively.

‘THERE ARE PROBLEMS’

Critics argued then and now that the reforms were the government’s usual response to crises — create more bureaucracy. But others see much-needed change.

“It has been a tremendous improvement,” said Lee Hamilton, who was the 9/11 commission vice chair. “It’s not seamless, there are problems, and we’ve still got a ways to go.”

The 2001 attacks involving airliners hijacked by al Qaeda operatives killed nearly 3,000 people in New York, Pennsylvania and the Pentagon. Various U.S. intelligence and law enforcement agencies had come across bits of information suggesting an impending attack but failed to put the pieces together.

The CIA had information about three of the 19 hijackers at least 20 months before the attacks; the National Security Agency had information linking one of the hijackers with al Qaeda leader Osama bin Laden’s network; the CIA knew one hijacker had entered the United States but did not tell the FBI; and an FBI agent warned of suspicious Middle Eastern men taking flying lessons.

Have the reforms made America safer? Officials say yes, and point to the U.S. operation that killed bin Laden in Pakistan in May that demanded coordination among intelligence agencies and the military. But there is an inevitable caveat: no one can guarantee there will never be another attack on U.S. soil.

On Christmas Day 2009, a Nigerian man linked to an al Qaeda off-shoot tried unsuccessfully to light explosives sewn into his underwear on a flight to Detroit from Amsterdam. It turned out U.S. authorities had pockets of information about him.

President Barack Obama used a familiar September 11 phrase to describe the 2009 incident as “a failure to connect the dots of intelligence that existed across our intelligence community.”

Roger Cressey, a former White House National Security Council counterterrorism official, resurrected another September 11 phrase: “It was a failure of imagination.”

The intelligence community had not seen al Qaeda in the Arabian Peninsula, a Yemen-based al Qaeda off-shoot, as capable of striking the U.S. homeland. If the “underwear bomber” threat had originated in Pakistan “they would have gone to battle stations immediately,” Cressey said.

Some proposed changes in how authorities would respond to another successful attack still are pending. For example, creation of a common communication system for police, firefighters and other emergency personnel remains tangled up in political wrangling in Congress over how to implement it.

“This is a no-brainer,” Hamilton said. “The first responders at the scene of a disaster ought to be able to talk with one another. They cannot do it today in most jurisdictions.”

Former leaders of the 9/11 commission issued a report card saying nine of its 41 recommendations remain unfinished.

WHERE’S THE POWER?

The Office of the Director of National Intelligence has experienced growing pains as overseer of the 17 spy agencies, churning through four chiefs in six years.

Tensions over turf, confusion about the DNI’s role, and problems herding agencies with very powerful chiefs of their own all came to a crescendo when retired Admiral Dennis Blair, the third DNI, tried to assert authority over CIA station chiefs, who represent the agency in different countries.

“The position of chief of station is one of the crown jewels of the CIA, and they don’t want anyone playing with their crown jewels,” said Mark Lowenthal, a former senior U.S. intelligence official.

After a dust-up with CIA Director Leon Panetta, who now is defense secretary, it was Blair who was sent packing.

“I think the mistake that some have made is to have viewed the DNI and the Director of CIA as an either/or proposition rather than the power of the two working together,” the DIA’s Shedd said in an interview in his office.

“There is a history of where that hasn’t worked so well, I believe it is working much better today,” said Shedd, who has worked at the DNI, CIA and National Security Council.

Intelligence experts say in the current administration, Obama’s top homeland security and counterterrorism adviser John Brennan arguably has more power than any of them because he has the president’s ear. It’s a reminder that, bureaucratic reform or no, personalities count in making national security policy.

The improved sharing of secret data has led to yet another set of problems. The deluge of bits and bytes has subjected intelligence analysts to information overload as they try to sift through it all for relevant pieces.

“Our analysts still are spending way too much time on finding the information rather than on the analysis of the information,” Shedd said. “There is just too much data to go find it all.”

The intelligence community wants a system developed that would automatically process information from multiple agencies and then make the connections for the analysts.

But greater inroads into sharing data across agencies does not guarantee that another attack will be averted.

The threat has evolved and officials now are increasingly concerned about a “lone wolf” plot by an individual, not tied to any militant group, that may be more difficult to uncover.

“Those threats will not come to our attention because of an intelligence community intercept,” said John Cohen, a senior Department of Homeland Security counterterrorism official.

“They will come to our attention because of an alert police officer, an alert deputy sheriff, an alert store owner, an alert member of the public sees something that is suspicious and reports it,” Cohen said.

One measure of the success of post-9/11 reforms is that a decade later the United States has not had a similar attack.

“Now that could be luck, that could be skill, we don’t really know,” Hamilton said. “But in all likelihood what we have done, including the establishment of the Department of Homeland Security and the transformation in intelligence and FBI, has certainly been helpful.”

(Editing by Warren Strobel and Will Dunham)

Can we predict earthquakes?

BBC News – Can we predict when and where quakes will strike?.

l'Aquila earthquake Seismologists try to manage the risk of building damage and loss of life


This week, six seismologists go on trial for the manslaughter of 309 people, who died as a result of the 2009 earthquake in l’Aquila, Italy.

The prosecution holds that the scientists should have advised the population of l’Aquila of the impending earthquake risk.

But is it possible to pinpoint the time and location of an earthquake with enough accuracy to guide an effective evacuation?

There are continuing calls for seismologists to predict where and when a large earthquake will occur, to allow complete evacuation of threatened areas.

What causes an earthquake?

An earthquake is caused when rocks in the Earth’s crust fracture suddenly, releasing energy in the form of shaking and rolling, radiating out from the epicentre.

The rocks are put under stress mostly by friction during the slow, 1-10 cm per year shuffling of tectonic plates.

The release of this friction can happen at any time, either through small frequent fractures, or rarer breaks that release a lot more energy, causing larger earthquakes.

It is these large earthquakes that have devastating consequences when they strike in heavily populated areas.

Attempts to limit the destruction of buildings and the loss of life mostly focus on preventative measures and well-communicated emergency plans.

Predicting an earthquake with this level of precision is extremely difficult, because of the variation in geology and other factors that are unique to each location.

Attempts have been made, however, to look for signals that indicate a large earthquake is about to happen, with variable success.

Historically, animals have been thought to be able to sense impending earthquakes.

Noticeably erratic behaviour of pets and mass movements of wild animals such as rats, snakes and toads have been observed prior to several large earthquakes in the past.

Following the l’Aquila quake, researchers published a study in the Journal of Zoology documenting the unusual movement of toads away from their breeding colony.

But scientists have been unable to use this anecdotal evidence to predict events.

The behaviour of animals is affected by too many factors, including hunger, territory and weather, and so their erratic movements can only be attributed to earthquakes in hindsight.

Precursor events

When a large amount of stress is built up in the Earth’s crust, it will mostly be released in a single large earthquake, but some smaller-scale cracking in the build-up to the break will result in precursor earthquakes.


These small quakes precede around half of all large earthquakes, and can continue for days to months before the big break.

Some scientists have even gone so far as to try to predict the location of the large earthquake by mapping the small tremors.

The “Mogi Doughnut Hypothesis” suggests that a circular pattern of small precursor quakes will precede a large earthquake emanating from the centre of that circle.

While half of the large earthquakes have precursor tremors, only around 5% of small earthquakes are associated with a large quake.

So even if small tremors are felt, this cannot be a reliable prediction that a large, devastating earthquake will follow.
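
Putting the article's two figures side by side makes the problem plain, as in this small arithmetic check:

```python
# Arithmetic using only the two figures quoted above.
p_precursor_given_large = 0.5   # about half of large quakes are preceded by small tremors
p_large_given_tremor = 0.05     # only ~5% of small quakes are tied to a coming large quake

false_alarm_rate = 1 - p_large_given_tremor
missed_large_quakes = 1 - p_precursor_given_large
print(f"Alarms raised on every tremor would be false about {false_alarm_rate:.0%} of the time,")
print(f"and about {missed_large_quakes:.0%} of large quakes would still arrive with no tremor warning.")
```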

“There is no scientific basis for making a prediction”, said Dr Richard Walker of the University of Oxford.

In several cases, increased levels of radon gas have been observed in association with rock cracking that causes earthquakes.

Leaning building: small ground movements sometimes precede a large quake

Radon is a natural and relatively harmless gas in the Earth’s crust that is released and dissolves into groundwater when the rock breaks.

Similarly, when rock cracks, it can create new spaces in the crust, into which groundwater can flow.

Measurements of groundwater levels in earthquake-prone areas show sudden changes in the level of the water table as a result of this invisible cracking.

Unfortunately for earthquake prediction, both the radon emissions and water level changes can occur before, during, or after an earthquake, or not at all, depending on the particular stresses a rock is put under.

Advance warning systems

The minute changes in ground movement and tilt, and in the water, gas and chemical content of the ground, that are associated with earthquake activity can be monitored over the long term.

Measuring devices have been integrated into early warning systems that can trigger an alarm when a certain amount of activity is recorded.


Such early warning systems have been installed in Japan, Mexico and Taiwan, where the population density and high earthquake risk pose a huge threat to people’s lives.

But because of the nature of all of these precursor reactions, the systems may only be able to provide up to 30 seconds’ advance warning.

“In the history of earthquake study, only one prediction has been successful”, explains Dr Walker.

The magnitude 7.3 earthquake in 1975 in Haicheng, northern China, was predicted one day before it struck, allowing authorities to order the evacuation of the city and saving many lives.

But the pattern of seismic activity that this prediction was based on has not resulted in a large earthquake since, and just a year later, in 1976, a completely unanticipated magnitude 7.8 earthquake struck nearby Tangshan, causing the deaths of more than a quarter of a million people.

The “prediction” of the Haicheng quake was therefore just a lucky unrepeatable coincidence.

A major problem in the prediction of earthquake events that will require evacuation is the threat of issuing false alarms.

Scientists could warn of a large earthquake every time a potential precursor event is observed, but this would result in huge numbers of false alarms, which would put a strain on public resources and might ultimately reduce the public’s trust in scientists.

“Earthquakes are complex natural processes with thousands of interacting factors, which makes accurate prediction of them virtually impossible,” said Dr Walker.

Seismologists agree that the best way to limit the damage and loss of life resulting from a large earthquake is to predict and manage the longer-term risks in an earthquake-prone area. These include assessing the likelihood of buildings collapsing and implementing emergency plans.

“Detailed scientific research has told us that each earthquake displays almost unique characteristics; some are preceded by foreshocks or small tremors, whereas others occur without warning. There simply are no rules to utilise in order to predict earthquakes,” said Dr Dan Faulkner, senior lecturer in rock mechanics at the University of Liverpool.

“Earthquake prediction will only become possible with a detailed knowledge of the earthquake process. Even then, it may still be impossible.”


Surprise! House disaster vote has become political football

House disaster vote sets up showdown with Senate – BusinessWeek.

The GOP-controlled House remains on track to pass $3.7 billion in disaster relief as part of a bill to avert a government shutdown at the end of the month, the No. 2 House Republican said Wednesday. But first the party must overcome opposition from Democrats and some tea party Republicans.

Democratic leaders, including some who said last week they would back the stopgap measure, came out solidly against it Wednesday morning because it contains $1.5 billion in cuts from a government loan program to help car companies build more fuel-efficient vehicles.

That money would pay for the most urgently needed portion of the disaster aid that’s required to avoid a cutoff next week of Federal Emergency Management Agency relief to victims of Hurricane Irene, recent Texas wildfires and Tropical Storm Lee.

GOP leaders are also encountering opposition from tea party Republicans like Rep. Jeff Landry of Louisiana, who opposes the stopgap measure because it permits a higher spending rate than Republicans proposed last spring. The measure instead follows a hard-fought spending pact endorsed by GOP leaders and President Barack Obama.

Majority Leader Eric Cantor, R-Va., predicted Wednesday that the stopgap measure, commonly called a continuing resolution, or CR, will pass.

In the Senate, the measure awaits a battle with Democrats. That fight involves how much disaster aid to provide and whether any of it should be paid for with offsetting spending cuts.

FEMA has only a few days’ worth of aid remaining in its disaster relief fund. The agency has already held up thousands of longer-term rebuilding projects — repairs to sewer systems, parks, roads and bridges, for example — to conserve money to provide emergency relief to victims of recent disasters.

The House measure contains $1 billion in immediate aid for the 2011 budget year that’s about to end and another $2.7 billion for the 2012 budget year beginning Oct. 1. The Senate measure totals $6.9 billion, with $804 million proposed for the last few days of fiscal 2011.

Senate Majority Leader Harry Reid, D-Nev., said that once the stopgap measure passes the House, he’ll move to substitute the Senate’s more generous aid package for the House’s version. It will take at least seven Republicans to join with majority Democrats to win the 60 votes likely required to defeat GOP blocking tactics.

Ten Republicans voted with Reid last week to pass the stand-alone disaster aid measure, but their votes can’t be taken for granted now. Tea party favorites like Sens. Marco Rubio, R-Fla., and Pat Toomey, R-Pa., were among those who voted with Reid last week, but they told reporters Wednesday that they’ll instead support the partially paid-for House version.

If two more Senate Republicans switch, Reid would no longer have the 60 votes he needs.

In the House, Democrats are rallying against the measure because of accompanying cuts to an Energy Department program that subsidizes low-interest loans to help car companies and parts manufacturers retool factories to build vehicles that will meet new, tougher fuel economy standards. These lawmakers include House Democratic Whip Steny Hoyer of Maryland and top Appropriations Committee Democrat Norm Dicks of Washington. Both had previously said they would support the measure.

Democrats say cutting the loan program could cost up to 10,000 jobs because there wouldn’t be enough money for all pending applications.

“While the government has a responsibility to fund disaster response in places that were devastated by Hurricane Irene or other natural disasters, it is unconscionable to use funds designed to create jobs in manufacturing states to pay for it,” Reps. Gary Peters, D-Mich., and Anna Eshoo, D-Calif., said in a letter to House Speaker John Boehner, R-Ohio.

They credited $3.5 billion of loan subsidies with supporting loans totaling $9.2 billion that created or saved 41,000 jobs in Tennessee, California, Indiana, Michigan, Delaware, Illinois, Kentucky, Missouri and Ohio. Ford Motor Co. and Nissan Motor Co. have already received loans; Chrysler Group LLC is awaiting final approval of a loan.

A protracted showdown could ultimately lead to a partial shutdown of the government when the budget year ends Sept. 30. That’s unlikely, however.

Senate Minority Leader Mitch McConnell, R-Ky., predicted the conflict could be worked out in time for the Senate to make a Thursday night getaway to a weeklong recess. Such a scenario probably depends on Republicans prevailing.

“Congress always responds appropriately to disasters,” McConnell said. “We’re having a discussion about the appropriate way to do that, and I’m confident it will be resolved.”

Reid, however, is spoiling for the battle. “We’re not going to cave in on this,” he said.

The underlying stopgap funding measure would finance the government through Nov. 18 to give lawmakers more time to try to reach agreement on the 12 unfinished spending bills needed to run government agencies on a day-to-day basis for the 2012 budget year.