Tag Archives: disaster prevention

climate policy and climate science inhabit parallel worlds

The mask slips: Nature: Nature Publishing Group.

It says a lot about the outcome of the UN climate talks in South Africa at the weekend that most of the immediate reports focused on the wrangling that led to an agreement of sorts, rather than the contents and implications of the agreement itself. Late-night talks, later-night arguments and early-morning pacts between battling negotiators with the apparent fate of the world resting on their shoulders give the process a melodrama that is hard to resist, particularly for those who experienced it first hand in the chaos of the Durban meeting (see page 299).

Such late finishes are becoming the norm at these summits. Only as nations abandon their original negotiating positions and reveal their true demands — throwing international differences into stark relief — does a sense of urgency develop and serious negotiation take place. Combined with the consensus nature of the talks, which demands that everyone agrees to everything, the result is usually a cobbled-together compromise that allows as many countries as possible to claim victory and, most importantly, provides them with a mandate to reconvene in 12 months’ time.

So it was this time. In the search for a successor to the Kyoto Protocol, we now have the Durban Platform, which comes on the heels of the Bali Road Map and the Copenhagen Accord.

It takes a certain kind of optimism — or an outbreak of collective Stockholm syndrome — to see the Durban outcome as a significant breakthrough on global warming, as many are claiming. Outside Europe — which has set itself binding emissions goals over the short and long term, beyond what it will inherit under its stated plan to carry on with unilateral cuts under an extended Kyoto — no nation will be obliged to reduce its soaring greenhouse-gas emissions much before the end of the decade. And that assumes that all goes smoothly in future UN talks, and that a global deal with binding commitments proves easier to reach in the negotiations due to start in 2015 than it has so far.

The Durban deal may mark a success in the political process to tackle climate change, but for the climate itself, it is an unqualified disaster. It is clear that the science of climate change and the politics of climate change, which claims to represent it, now inhabit parallel worlds.

This has always been true up to a point, but surely the mask of political rhetoric has now slipped so far, to reveal the ugly political reality underneath, that it can never be replaced. How can politicians talk now with a straight face of limiting global warming to 2 °C? How will campaigners frame this result as leaving yet another ‘last chance’ to save the planet?

That does not make the political process redundant — far from it. Introducing policies to curb emissions was never about saving the planet or not, or stopping global warming or not. It is about damage limitation — the 3 °C or 4 °C of average warming the planet could experience in the long term, according to some analyses of the Durban outcome doing the rounds, is clearly much worse than the 2 °C used as shorthand for dangerous at present. But it is preferable to the 5 °C or 6 °C that science suggests is possible if emissions continue to rise unabated.

To prevent that outcome will be just as difficult politically as was the now abandoned attempt to find a global successor in time to follow Kyoto. But it remains possible — and there were at least encouraging signs in Durban that previously obstinate countries recognize that it is necessary, even if it is delayed. Those, including this journal, who have long argued the scientific case for the need to control greenhouse-gas emissions should back this new political mood to the hilt. But as the Durban Platform crowds with politicians, the climate train they wait for has left the station.

Comments

  1. 2011-12-14 02:05 AM

    Report this comment #34028

    Jeffrey Thaler said:
    Well written editorial, and unfortunately too accurate. There is a theme arising out of Durban on the limits of legal-political processes, as well as the growing gap between scientific and political “realities”. How to bridge that gap, so we are not just mitigating significant harms to the world our children inherit, is the still-to-be-resolved challenge that requires work outside of the big conference halls. Time and growing GHG emissions are not waiting for any of us.

  2. 2011-12-14 03:13 AM

    Report this comment #34039

    Fred Singer said:
The Nature editorial (Dec 15; The Mask Slips) talks about science and policy in parallel universes. Quite correct, if you mean “separate” and “disconnected.” COP 17 was never about climate, let alone science. It was all about money: (1) How to assure continuing government careers for 200 delegations, with annual vacations paid by taxpayers. (2) How to transfer $100 billion a year from industrialized nations to LDCs (or more precisely, to their kleptocratic rulers), using “climate justice” or “climate guilt” (depending on who is doing the talking). (3) How to gain a national advantage by setting differential emission limits.

By now it should be obvious that (1) the enshrined temperature limit of +2 °C is based on fiction and has no scientific basis. As an annual global average, climate models tell us, it will mean warmer winter nights in Siberia and Canada; perhaps -35 °C instead of -40 °C; and little warming in the tropics. (2) It should also be obvious that even strenuous and economy-killing efforts at mitigation will have little effect on atmospheric levels of carbon dioxide, let alone on climate. If a demonstration is needed, just look at the lack of warming since 1998, in spite of rapidly rising levels of greenhouse gases.

    So, yes, I would agree with the editorial, if properly expanded.

  3. 2011-12-14 05:18 AM

    Report this comment #34049

    Kevin Matthews said:
    Yes, great editorial. Coming from the world’s leading scientific journal (which of course would prefer not to have to say such things) one would hope that authorities and media around the world take significant notice.

    Thinking about the whole UN climate negotiation process, and how complex and cumbersome it is to seek unanimous agreement from 194 countries….

    Then comparing what has come out of the COP17 cycle – significant and landmark progress, even if still sharply insufficient to the urgency of need – to what has come out of the U.S. Congress over the last several months or more, with its supposedly streamlined and results-oriented binary democracy approach – practically nothing.

    And suddenly – surprise! – consensus (in this entirely limited comparison) looks pretty darn effective – just from a simple results-accomplished perspective.

    For which differential, there is, in turn, good scientific reason.

  4. 2011-12-15 05:14 AM

    Report this comment #34107

    John Wheelahan said:
No, there are no parallel worlds – the science and politics of AGW share the same scam. Spare us the crap about a 6 °C temperature rise, when you know that this is a lie. No temperature rise for a decade!
The science and politics are about money – the greatest swindle since the South Sea Bubble. Hundreds of billions of dollars are to be given to African despots, conmen, swindlers and bankers for a scientific fantasy. These beneficiaries will live in luxury in their Mediterranean villas while the poor of third-world countries and developed countries will be the sufferers, and pay the price. Please get real, Nature Editor.

  5. 2011-12-15 07:21 AM

    Report this comment #34146

    Patrik D’haeseleer said:
    I think it is very clear that the “global consensus” approach to dealing with climate change has failed.

It may be time for those countries that are willing to do something about it to band together and go it alone. And then start charging tariffs on any goods imported from countries not part of the coalition, proportional to the amount of CO2 pollution caused by those countries.

    If we can get Europe, Africa and the island nations on board, I don’t think it would take too long for China and India to follow suit.

  6. 2011-12-15 11:35 AM

    Report this comment #34154

    Michael Lerman said:
I do not subscribe to the concept of global warming induced by human activities. About 1,000 years ago Greenland was green, and cows brought by the Vikings polluted the clean Arctic air. Instead of global warming, Greenland has remained frozen to this day. I often go to the Canadian Arctic and can indeed testify that the mean temperatures in July are higher than previously (~10 years ago), and though my Inuit friends blame the US government, I argue and try to persuade them that their view is wrong. Michael Lerman, Ph.D., M.D.

  7. 2011-12-18 06:28 AM

    Report this comment #34314

    Karin Green said:
I find this comment in the article troubling: “Those, including this journal, who have long argued the scientific case for the need to control greenhouse-gas emissions should back this new political mood to the hilt”, especially when you say something like “there were at least encouraging signs in Durban that previously obstinate countries recognize that it is necessary, even if it is delayed”.

    To me, this bodes ill for an open minded and unbiased editorial policy!

  8. 2011-12-19 06:47 AM

    Report this comment #34516

    Jeffrey Eric Grant said:
The COP people have been at it for a long time! I would think that if the science is solid, then the arguments would have moved forward, at least a little. Instead, we are still talking about the evidence of global warming, and how to mitigate against it.
AGW is all based on the atmospheric rise in CO2 that was put there by human activity. So, now we have closed the talks in Durban, still with no agreement on the cause of the increased CO2 that will, someday, maybe, eventually, turn the world temperatures a little warmer. Not in my lifetime; maybe not even in yours!
I challenge anyone on this thread to answer either of the following two questions:
1) Direct me to a recent empirical scientific study that concludes that increased atmospheric CO2 caused an increase in atmospheric temperatures of more than about 2 °C per century; or
2) Since water retains less CO2 when it is heated, how can the world’s oceans be both warmer and more acidic at the same time?

How to make green steel

CultureLab: How to make steel go green – with songs!

Michael Marshall, environment reporter


This is something you don’t see every day: a substantial, carefully researched book on how to reform our manufacturing industries, paired with an album of songs on the same theme.

Let’s start with the book. Sustainable Materials: With Both Eyes Open tackles a particularly thorny question: how can we cut our greenhouse gas emissions to a safe level, without shutting down essential industries? It focuses on steel and aluminium, which between them account for 28 per cent of all industrial emissions, although later chapters briefly consider cement, paper and plastics as well.

This is a follow-up book to David MacKay’s much-vaunted Sustainable Energy – Without the Hot Air. Both feature academics from the University of Cambridge carefully working out how we can transform an emissions-heavy sector of the economy.

The eight authors, led by Julian Allwood and Jonathan Cullen, first take a close look at how steel and aluminium are produced from their respective ores, asking “how much can the metals industry do to clean up its act?” The answer they come up with: “plenty, but nowhere near enough”.

So they take a second approach, asking whether we can redesign the things we make to use less metal, use them for longer, and recycle their components when they wear out. This also offers plenty of options. Reassuringly, when the two approaches are combined, the total emissions cuts are substantial.


Some of the ideas they come up with are so simple, I wondered why no one thought of them before. For instance, the average fridge lasts about 10 years and gets thrown out when the compressor fails. This is a small part, but it takes a lot of work to replace, so it’s cheaper to buy a new fridge. If fridges were redesigned so that the compressor was easy to replace, they would last far longer. “You shouldn’t have to buy two fridges in your lifetime,” they say.

Of course, this is another example of a solution for climate change that involves huge numbers of people taking concerted action. The problem is people’s disinclination to get off their backsides.

It’s quite a technical book, so it may not have much popular appeal, despite its nicely chatty style. But for policy-makers trying to cut emissions, and anyone in manufacturing, it should be required reading.

And so to the album, a collaboration between Allwood and soprano Adey Grummet, which is much better than it has any right to be. Worthy music on eco-conscious themes can sound like Spinal Tap’s Listen to the Flower People, but With Both Eyes Open actually contains a couple of good tunes.

The strongest songs get away from the details of materials science and become universal. The opening track, You Gotta Start, is an up-tempo number extolling the virtues of having a go, even when you don’t know exactly what you need to do. It’s not just about sustainability.

Similarly, the title track is a passionate call to arms, urging people to move away from blind consumerism. The closing line – “the stuff of life is life and not just stuff” – is better and more relevant than anything Coldplay will write next year.

Given how specialist the subject matter is, I’m not sure how many people the album will really appeal to. Of the 12 songs, I only expect to keep the two I’ve highlighted on my MP3 player. Unfortunately, the rest just restate ideas from the book in a slightly less clear way.

I worry that the album will give people, particularly policy-makers, the impression that the book is somehow flaky and not worth paying attention to. That would be a crying shame, because the book’s lessons are clear, well-supported, and vital.

Book information
Sustainable Materials: With Both Eyes Open
by Julian Allwood and Jonathan Cullen
UIT Cambridge
Free online or £31.82

Battery Fires Reveal Risks of Storing Large Amounts of Energy

Battery Fires Reveal Risks of Storing Large Amounts of Energy: Scientific American.

STORAGE RISK: Storing large amounts of energy, in batteries or other devices, inherently poses risks — but also offers benefits. Image: Mariordo/Wikimedia Commons

People still need electricity when the wind isn’t blowing and the sun isn’t shining, which is why renewable energy developers are increasingly investing in energy storage systems that can sop up excess juice and release it when needed.

However, storing large amounts of energy, whether it’s in big batteries for electric cars or water reservoirs for the electrical grid, is still a young field. It presents challenges, especially with safety.

The most recent challenge first appeared in May, three weeks after a safety crash test on the Chevrolet Volt, General Motors Co.’s plug-in hybrid. The wrecked vehicle caught fire on its own in a storage facility, raising questions about its lithium-ion battery.

Last week, after a series of additional side-impact crash tests on the Volt battery, the National Highway Traffic Safety Administration (NHTSA) launched what it called a “safety defect investigation” into the risk of fire in a Chevy Volt that has been involved in a serious accident.

Problems have also afflicted spinning flywheels, which allow power plants and other large energy users to store and release powerful surges of energy. In Stephentown, N.Y., Beacon Power’s 20-megawatt flywheel energy storage facility suffered two flywheel explosions, one on July 27 — just two weeks after it opened — and one on Oct. 13. The company declared bankruptcy earlier this month.

In Japan, sodium-sulfur batteries at Mitsubishi Materials Corp.’s Tsukuba plant in Ibaraki prefecture caught on fire on Sept. 21. It took firefighters more than eight hours to control the blaze, and authorities declared it extinguished on Oct. 5.

NGK Insulators Ltd., the company that manufactured the energy storage system, said it is still investigating the incident’s cause and has halted production of its sodium-sulfur cells, which are installed in 174 locations across six countries.

“Clearly, storing large amounts of energy is difficult from a physics standpoint; [the energy] would rather be somewhere else,” said Paul Denholm, a senior energy analyst at the National Renewable Energy Laboratory.

He explained that energy naturally wants to spread out, so packing it into a small space like a battery or a fuel cell creates the risk of an uncontrolled energy release like a fire or explosion. Similar issues come up with mechanical storage, whether it’s water behind a dam, compressed air underground or spinning flywheels.

Some storage risks are ‘grandfathered’
However, these risks are not unique to storing electricity. Fossil fuels, which are technically forms of stored energy, pose plenty of problems in their extraction, refining, distribution and delivery.

“We basically have grandfathered these risk factors. Gasoline catches on fire all the time,” said Denholm. Electrical energy storage systems aren’t inherently riskier than petroleum or natural gas, according to Denholm, but their risks are different.

The NHTSA shares Denholm’s assessment when it comes to cars. “Let us be clear: NHTSA does not believe electric vehicles are at a greater risk of fire than other vehicles,” said the agency in a press release earlier this month responding to the Volt fire. “It is common sense that the different designs of electric vehicles will require different safety standards and precautions.”

For batteries, the main issue is how they control the heat they generate. “What you really want to avoid is cascading failure,” said Denholm. “A failure of any one of those batteries is not a huge event, but if you don’t have proper thermal management, a failure in one battery can cause failure in another.”

This condition, known as a thermal runaway, happens when a cell fails and releases its energy as heat. This heat can cause adjacent cells to fail and generate heat, as well, leading to melting materials and fires.
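
To make the cascade mechanism concrete, here is a minimal toy model of heat hopping between neighboring cells in a pack. It is my own illustrative sketch, not anything from the article, and every number in it is an assumption:

# Toy model of cascading thermal failure in a 1-D row of battery cells.
# Every number is invented for illustration; real packs involve far more
# complex thermodynamics, plus active cooling and electrical isolation.
FAIL_TEMP_C = 150.0          # temperature at which a cell fails (assumed)
HEAT_TO_NEIGHBOR_C = 130.0   # heating a failing cell imposes on each neighbor (assumed)
def cascade(temps, cooling_per_step=5.0, steps=20):
    """Propagate heat along the row; return the indices of failed cells."""
    temps, failed = list(temps), set()
    for _ in range(steps):
        new = [i for i, t in enumerate(temps) if t >= FAIL_TEMP_C and i not in failed]
        for i in new:
            failed.add(i)
            for j in (i - 1, i + 1):                  # dump heat into adjacent cells
                if 0 <= j < len(temps):
                    temps[j] += HEAT_TO_NEIGHBOR_C
        temps = [max(25.0, t - cooling_per_step) for t in temps]  # thermal management
    return sorted(failed)
pack = [30.0] * 10
pack[4] = 160.0                                       # a single cell overheats
print("weak cooling:  ", cascade(pack, cooling_per_step=5.0))   # whole pack fails
print("strong cooling:", cascade(pack, cooling_per_step=60.0))  # contained to cell 4

With weak cooling the single hot cell takes the whole row down; with aggressive cooling the failure stays contained, which is exactly the job of the thermal management Denholm describes.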

Controlling temperatures is relatively simple when the batteries are in a fixed location, say, next to a wind farm, but it becomes harder when they are placed in a car or bus.

“The biggest thing that people become concerned about [for batteries in cars] is the ability to be able to tolerate abuse,” said Joe Redfield, principal engineer at the Southwest Research Institute, a nonprofit engineering research and development group.

In a car, a battery is exposed to a wide range of humidities, temperatures and electrical loads. All of these factors influence the battery’s reliability, and if they get too extreme, they can cause a thermal runaway condition.

New problem for firefighters
The problem is compounded by the fact that newer lithium-ion batteries store more electricity than other electrochemical storage systems. “The lead-acid battery has been around a long time” and is a mature technology, said Redfield. “The energy levels of lithium-ion batteries are much, much, much greater than that of lead-acid storage.”
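
Some ballpark textbook figures put that comparison in numbers. These are my values, not Redfield’s, and real-world chemistries vary widely:

# Ballpark specific energy in watt-hours per kilogram (typical published figures).
specific_energy = {
    "lead-acid battery": 35,
    "lithium-ion battery": 150,
    "gasoline (chemical energy)": 12_200,
}
baseline = specific_energy["lead-acid battery"]
for name, e in specific_energy.items():
    print(f"{name:>28}: {e:>6} Wh/kg  ({e / baseline:.0f}x lead-acid)")

The gasoline line is also a reminder of Denholm’s “grandfathered” point: the fuel tank a battery pack replaces holds nearly two orders of magnitude more energy per kilogram.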

This becomes a major problem for firefighters and first responders in the event of an accident involving lithium-ion batteries. Water can’t always be used to extinguish an electrical fire, since water can conduct electricity.

In addition, in the case of a thermal runaway, it’s usually not the batteries that catch fire but their fumes, though lithium itself is flammable. Even after the fire is extinguished, the batteries can still generate tremendous amounts of heat and reignite fumes, hampering rescue efforts.

One solution is to separate batteries into modules, making it easier to isolate a failed battery from the rest. Another trick is to have a master kill switch, a mechanism that quickly disables the electrical system and discharges the batteries.

The Department of Energy and the National Fire Protection Association are working together to train firefighters and rescue workers to identify these switches in vehicles and grid storage systems, and to respond to battery fires, according to the NHTSA.

Redfield said that the best way to prevent such incidents is with a battery management system that evenly distributes electrical loads and controls temperatures. “It’s not just for safety; it’s primarily there to provide performance and battery life,” he said.

“As the operating temperature increases, the lifetime diminishes dramatically. You want to ensure the longest battery life, and if you achieve that, then you’re clearly in the safety limits of the operating environment,” he added.

Overall, Redfield expects that energy storage systems will help increase renewable energy use and curb fossil fuel dependence in the United States. The bumps along the road are significant, but they do not result from an inherent flaw in the idea.

“Failures in new technology have almost always been the result of design shortcuts that were made in putting the new technology into progress. Every now and then, you have some uncharted territory — things we haven’t seen before — but typically, they are few and far between,” said Redfield.

“It really is going down the same path we’ve gone down many times before. We don’t need to make the same mistakes we’ve made with liquid fuels.”

Electrics get high marks in crash tests

After the earlier testing, NHTSA gave the Volt a five-star crash test rating — the agency’s highest — and it did the same for Nissan’s all-electric Leaf.

Meanwhile, a second testing agency, the Insurance Institute for Highway Safety, has given the Chevrolet Volt a “G,” the highest safety score possible, after front, side, rear and rollover crash tests.

Research by an affiliate of the insurance group, the Highway Loss Data Institute, estimates that overall chances of being injured in a crash are 25 percent lower in hybrids because their large batteries make them heavier than similar gasoline-powered cars.

Reprinted from Climatewire with permission from Environment & Energy Publishing, LLC. www.eenews.net, 202-628-6500

Thailand: Super-canal may prevent floods

Thailand: Super-canal may prevent floods – CNN.

Thai authorities are considering the construction of a super-express waterway through Bangkok to prevent future floods similar to the one that has crippled the Thai capital and brought manufacturing in other parts of the country to a standstill.

A team of disaster experts from Chulalongkorn University in Bangkok is now investigating permanent solutions to the disaster that has left hundreds dead.

“One of the urgent solutions is a super-express floodway,” Thanawat Jarupongsakul, from the university’s Unit for Disaster and Land Information Studies, told the Bangkok Post.


Under the plan, existing natural canals — some of them more than 100 kilometers (62 miles) long — would be linked in a 200-km “super-highway” that would divert the course of floodwaters from the north.

The super-canal would hold 1.6 billion cubic meters of water and drain run-off at a rate of 6,000 cubic meters per second — the equivalent of two and a half Olympic-sized swimming pools a second.
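
Those figures hang together arithmetically. A quick sanity check, assuming the standard 2,500-cubic-meter Olympic pool:

# Sanity check on the canal figures quoted above.
capacity_m3 = 1.6e9        # stated holding capacity
drain_rate_m3_s = 6_000    # stated drainage rate
olympic_pool_m3 = 2_500    # 50 m x 25 m x 2 m
print(f"pools per second: {drain_rate_m3_s / olympic_pool_m3:.1f}")                 # ~2.4
print(f"days to empty a full canal: {capacity_m3 / drain_rate_m3_s / 86_400:.1f}")  # ~3.1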

“This idea is much cheaper than digging a new river as a floodway,” Thanawat said.

He said the proposed scheme would involve the construction of a kilometer-wide exclusion zone next to the floodway to prevent properties from being inundated, and a raised highway on both sides of the canal.

The super-express floodway would then drain upstream run-off directly into the sea.

The university team is also looking at other flood-prevention measures such as a better early-warning system, improved water resource management, a flood tax, the use of a flood-risk map for urban development and groundwater-use controls.

“Now, the government must stop [trying to] solve flood problems with political methods,” Thanawat told the Bangkok Post. He said poor water management rather than excess rain had caused this year’s severe flooding, adding that natural swamps in the west of Thailand’s Central Plains, which once absorbed water flow, had been developed into industrial and residential areas, blocking the natural floodway.

While giant flood tunnels in the Bangkok metropolitan area could drain floodwater from the city, they could not cope with a massive inundation from the north.

“If there is no step forward, foreign investors will eventually disappear from the country and the next generation will be still worried whether flooding will happen or not,” he said.

Deflecting Killer Asteroids Away From Earth

Deflecting Killer Asteroids Away From Earth: How We Could Do It | Space.com.

An illustration of how solar sails might help deflect the asteroid Apophis.
CREDIT: Olivier Boisard

A huge asteroid’s close approach to Earth tomorrow (Nov. 8) is a reminder that we live in a cosmic shooting gallery, and we can’t just sit around waiting to get hit again, experts say.

Asteroid 2005 YU55, which is the size of an aircraft carrier, will zip within the moon’s orbit tomorrow, but it poses no danger of hitting us for the foreseeable future. Eventually, however, one of its big space rock cousins will barrel straight toward Earth, as asteroids have done millions of times throughout our planet’s history.

If we want to avoid going the way of the dinosaurs, which were wiped out by an asteroid strike 65 million years ago, we’re going to have to deflect a killer space rock someday, researchers say. Fortunately, we know how to do it.

“We have the capability — physically, technically — to protect the Earth from asteroid impacts,” said former astronaut Rusty Schweickart, chairman of the B612 Foundation, a group dedicated to predicting and preventing catastrophic asteroid strikes. “We are now able to very slightly and subtly reshape the solar system in order to enhance human survival.”


In fact, we have several different techniques at our disposal to nudge killer asteroids away from Earth. Here’s a brief rundown of the possible arrows in our planetary defense quiver.


The gravity tractor

If researchers detect a potentially dangerous space rock in plenty of time, the best option may be to send a robotic probe out to rendezvous and ride along with it.

The spacecraft’s modest gravity would exert a tug on the asteroid as the two cruise through space together. Over months or years, this “gravity tractor” method would pull the asteroid into a different, more benign orbit.

“You can get a very precise change in the orbit for the final part of the deflection using a technology of this kind,” Schweickart said in late September, during a presentation at Caltech in Pasadena, Calif., called “Moving an Asteroid.”

Humanity has already demonstrated the know-how to pull off such a mission. Multiple probes have met up with faraway asteroids in deep space, including NASA’s Dawn spacecraft, which is currently orbiting the huge space rock Vesta.

And in 2005, the Japanese Hayabusa probe even plucked some pieces off the asteroid Itokawa, sending them back to Earth for analysis.
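
The physics behind the gravity tractor fits in a few lines. The inputs below are my own illustrative assumptions (a one-tonne craft holding station 200 meters from the asteroid’s center for a decade), not mission parameters; note that the asteroid’s mass never enters, since its acceleration depends only on the spacecraft’s mass and distance:

# Back-of-envelope gravity-tractor estimate (all inputs are assumptions).
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
m_craft = 1_000.0    # spacecraft mass, kg (assumed)
r = 200.0            # hover distance from the asteroid's center, m (assumed)
years = 10.0         # how long the craft rides along
accel = G * m_craft / r**2              # asteroid's acceleration toward the craft
dv = accel * years * 365.25 * 86_400    # accumulated velocity change
print(f"acceleration: {accel:.1e} m/s^2")                        # ~1.7e-12 m/s^2
print(f"delta-v after {years:.0f} years: {dv * 1000:.2f} mm/s")  # ~0.5 mm/s

Half a millimeter per second sounds like nothing, but applied a decade before a predicted impact it shifts the asteroid’s arrival point by on the order of a hundred thousand kilometers, many times Earth’s radius.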

Smash ’em up

We could also be more aggressive with our asteroid rendezvous craft, relying on brute force rather than a gentle gravitational tug. That is, we could simply slam a robotic probe into the threatening space rock to change its orbit.

We know how to do this, too. In 2005, for example, NASA sent an impactor barreling into the comet Tempel 1 to determine the icy object’s composition.

The impactor approach would not be as precise as the gravity tractor technique, Schweickart said, but it could still do the job.
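
The impactor’s effect is straight momentum conservation, and just as easy to sketch. Again, every input is an assumption of mine, sized very roughly to a body like 2005 YU55:

# Momentum-transfer sketch for a kinetic impactor (illustrative inputs only).
m_impactor = 500.0    # kg (assumed)
v_impact = 10_000.0   # m/s closing speed (assumed)
M_asteroid = 4e10     # kg, roughly a 350-400 m rocky body (assumed)
beta = 1.5            # momentum boost from ejecta thrown backward (assumed)
dv = beta * m_impactor * v_impact / M_asteroid
print(f"delta-v: {dv * 1000:.2f} mm/s")  # ~0.19 mm/s, delivered in an instant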

There’s also the possibility of blowing the asteroid to smithereens with a nuclear weapon. The nuclear option could come into play if the dangerous space rock is too big to knock around with a kinetic impactor, but it would likely be a weapon of last resort.

For one thing, blasting an asteroid to bits might end up doing more harm than good, said fellow presentation panelist Bill Nye, executive director of the Planetary Society.

“Momentum is conserved,” Nye said. “If you blow it up, then the whole giant spray of rocks is coming at the Earth instead of one.”

The politics involved in mobilizing use of a nuke could also be a cause for concern, Schweickart said. It will likely be hard enough to convince the world to mount any sort of asteroid-deflection mission in time, and adding nuclear missiles to the equation would make things much stickier.

“The potential use of nuclear explosives for deflection cannot currently be ruled out,” Schweickart said. “But it is an extremely low probability that they will be needed.”

This image of Comet Tempel 1 was taken by NASA’s Deep Impact spacecraft on July 4, 2005, 67 seconds after a probe crashed into the comet.
CREDIT: NASA/JPL-Caltech/UMD

‘Mirror bees’ and foil wrap

While we’re pretty sure that gravity tractors and kinetic impactor probes would work, researchers are also looking into several other ideas.

There’s the “mirror bee” concept, for example, which would launch a swarm of small, mirror-bearing spacecraft to a dangerous asteroid. These mini-probes would aim reflected sunlight at one spot on the space rock, heating it up so much that rock is vaporized, creating propulsive jets.

“The reaction of that gas or material being ejected from the asteroid would nudge it off-course,” Nye said.

The Planetary Society is helping fund research into mirror bees, Nye said. And while he said the concept isn’t yet ready for deployment or demonstration, he stressed that it’s not too far off, either.

“Maybe five years,” Nye told SPACE.com. “It’s not 30 years.”

Nye also floated another, more speculative idea. It might be possible to move an asteroid, he said, by wrapping it in reflective foil, like a giant baked potato. Photons from the sun might then nudge the space rock away from Earth, in much the same way they propel spacecraft equipped with solar sails.

“This might work, even if the thing is rotating,” Nye said. “OK, make no promises. But it’s something to invest in.”
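
The push from sunlight is feeble but easy to bound. A minimal order-of-magnitude sketch, with the asteroid’s size and mass assumed and rotation, shape and orbital geometry all ignored:

# Photon-pressure estimate for a foil-wrapped asteroid (order of magnitude only).
SOLAR_FLUX = 1361.0   # W/m^2 at Earth's distance from the sun
C = 3.0e8             # speed of light, m/s
radius = 150.0        # asteroid radius, m (assumed)
mass = 2e10           # asteroid mass, kg (assumed)
area = 3.14159 * radius**2           # sunlit cross-section
force = 2 * SOLAR_FLUX / C * area    # perfect reflection, normal incidence
dv_10yr = force / mass * 10 * 365.25 * 86_400
print(f"photon force: {force:.2f} N")                       # ~0.6 N
print(f"delta-v over 10 years: {dv_10yr * 1000:.0f} mm/s")  # ~10 mm/s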

Passing the intelligent life test

The biggest key to deflecting dangerous asteroids, researchers say, is detecting them with plenty of lead time to take appropriate action. We’d like to have at least a decade of notice, NASA scientists have said.

It’ll take a while, after all, to mobilize and launch a deflection mission, and for that mission to do its job, especially if we go the gravity tractor route.

We need to make sure we can rise to the challenge when a big, threatening asteroid shows up on our radar, Schweickart and Nye said. Civilization’s very survival depends on it.

“If there is a community of intelligent life out in the universe … those intelligent beings will have already conquered this challenge,” Schweickart said. “Our entrance exam to that community of intelligent life is to pass this test.”


9 Environmental Boundaries We Don’t Want to Cross

9 Environmental Boundaries We Don’t Want to Cross | Wired Science | Wired.com.



Climate change threatens to turn the planet into a stormy, overheated mess: That much we know. But according to 28 leading scientists, greenhouse gas pollution is but one of nine environmental factors critical to humanity’s future. If their boundaries are stretched too far, Earth’s environment could be catastrophically altered — and three have already been broken, with several others soon to follow.


This grim diagnosis, published Wednesday in Nature, is the most ambitious assessment of planetary health to date. It’s a first-draft users’ manual for an era that scientists dub the “anthropocene,” in which nearly seven billion resource-hungry humans have come to dominate ecological change on Earth. The scientists’ quantifications are open to argument, but not the necessity of their perspective.

“It’s a crude attempt to map the environmental space in which we can operate,” said Jon Foley, director of the University of Minnesota’s Institute on the Environment and one of the paper’s lead authors. “We need to keep our activities in a certain range, or the planet could tip into a state we haven’t seen in the history of our civilization.”

Thresholds for atmospheric carbon dioxide and ozone have already been described, and are widely known to the public. But the scientists say five other factors are just as important: ocean acidification, nitrogen and phosphorus pollution, land use, freshwater use and biodiversity. They say chemical pollution and atmospheric aerosols may also be essential, but can’t yet be quantified.

Values for the proposed boundaries are still just estimates, and don’t account for how pushing one could affect another — how, for example, acidification that kills plankton could make it harder for the ocean to absorb CO2 and rebound from nitrogen pollution. Ecological models still can’t capture the entirety of Earth’s biological, geological and chemical processes, and it’s impossible to run whole-Earth experiments — except, arguably, for the experiment that’s going on now.


Despite those uncertainties, one aspect of Earth’s behavior is becoming clear. Records of global transitions between geological ages, and of regional changes between environmental stages, suggest that planet-wide change could happen relatively quickly. It might not take thousands or millions of years for Earth’s environment to be altered. It could happen in centuries, perhaps even decades.

Exactly what Earth would look like is difficult to predict in detail, but it could be radically different from the mild environment that has prevailed for the last 10,000 years. It was temperate stability that nurtured the rise of civilization, and it should continue for thousands of years to come, unless humanity keeps pushing the limits.

“The Earth of the last 10,000 years has been more recognizable than the Earth we may have 100 years from now. It won’t be Mars, but it won’t be the Earth that you and I know,” said Foley. “This is the single most defining problem of our time. Will we have the wisdom to be stewards of a world we’ve come to dominate?”


Foley’s team put the atmospheric carbon dioxide threshold at 350 parts per million, a level the Intergovernmental Panel on Climate Change says should keep Earth’s average temperature from rising by more than four degrees Fahrenheit. Current atmospheric CO2 levels are already approaching 400 parts per million.

Also exceeded are limits for species loss, which the scientists set at 10 per year per million species, and nitrogen use, pegged at 35 million tons per year. The current extinction rate is ten times higher than advised, ostensibly compromising the ability of ecosystems to process nutrients. The use of nitrogen — which is needed for fertilizer, but causes oxygen-choking algae blooms — is nearly four times higher than recommended.

On the positive side, atmospheric levels of ultraviolet radiation-blocking ozone are safe, thanks to a 1987 ban on ozone-destroying chemicals. Total rates of ocean acidification, freshwater consumption and land use are also acceptable, but those thresholds are expected to be exceeded in coming decades.

The seven boundary points are certain to be controversial, and Nature commissioned seven separate critiques by leading experts in each field.

William Schlesinger, president of the Cary Institute of Ecosystem Studies, said the recommended nitrogen limit “seems arbitrary.” Echoing his words was Steve Bass of the International Institute for Environment and Development, who said the 15 percent cap on land devoted to agriculture could as easily be 10 or 20 percent.

International Water Management Institute researcher David Molden said the 4,000 cubic kilometer ceiling on freshwater use — roughly one-third of all freshwater — “may be too high.” Myles Allen, an Oxford University climatologist, argued that CO2 emissions should be counted in a different way. Cristian Samper, director of the U.S. National Museum of Natural History, said that taxonomic family loss is a more relevant measure than species loss.

According to Foley, who called his team’s threshold values a “cave painting” version of the true limits, the paper is less important for its details than its approach. And though the critics argued over the numbers, all agreed that exceeding them will be disastrous.

“Planetary boundaries are a welcome new approach,” wrote Molden. “It is imperative that we act now on several fronts to avert a calamity far greater than what we envision from climate change.”

Peter Brewer, an ocean chemist at the Monterey Bay Aquarium Research Institute, criticized the paper’s lack of proposed solutions. Given the ongoing failure of governments and citizens to follow their scientists’ advice on climate change, something more than dire warnings is clearly needed.

“Is it truly useful to create a list of environmental limits without serious plans for how they may be achieved?” Brewer wrote. “Without recognition of what would be needed economically and politically to enforce such limits, they may become just another stick to beat citizens with.”

“It’s unsatisfactory, I agree. We don’t answer the question of how to keep humanity from crossing the boundaries,” said Johan Rockstrom, director of the Stockholm Environment Institute and a lead author of the Nature paper. “That’s the next challenge. To stay within planetary boundaries, we need tremendous social transformation.”


Note: The Nature paper is an edited version of the full article, which is available from the Stockholm Resilience Centre.

Citations: “A safe operating space for humanity.” By Johan Rockström, Will Steffen, Kevin Noone, Åsa Persson, F. Stuart Chapin, III, Eric F. Lambin, Timothy M. Lenton, Marten Scheffer, Carl Folke, Hans Joachim Schellnhuber, Björn Nykvist, Cynthia A. de Wit, Terry Hughes, Sander van der Leeuw, Henning Rodhe, Sverker Sörlin, Peter K. Snyder, Robert Costanza, Uno Svedin, Malin Falkenmark, Louise Karlberg, Robert W. Corell, Victoria J. Fabry, James Hansen, Brian Walker, Diana Liverman, Katherine Richardson, Paul Crutzen, Jonathan A. Foley. Nature, Vol. 461 No. 7263, September 24, 2009.

“Thresholds risk prolonged degradation.” By William Schlesinger. Nature, Vol. 461 No. 7263, September 24, 2009.

“Keep off the grass.” By Steve Bass. Nature, Vol. 461 No. 7263, September 24, 2009.

“Tangible targets are critical.” By Myles Allen. Nature, Vol. 461 No. 7263, September 24, 2009.

“Identifying abrupt change.” By Mario J. Molina. Nature, Vol. 461 No. 7263, September 24, 2009.

“The devil is in the detail.” By David Molden. Nature, Vol. 461 No. 7263, September 24, 2009.

“Consider all consequences.” By Peter Brewer. Nature, Vol. 461 No. 7263, September 24, 2009.

“Rethinking biodiversity.” By Cristian Samper. Nature, Vol. 461 No. 7263, September 24, 2009.

20 Ways to Build a Cleaner, Healthier, Smarter World

World Changing Ideas: 20 Ways to Build a Cleaner, Healthier, Smarter World: Scientific American.

What would happen if solar panels were free? What if it were possible to know everything about the world—not the Internet, but the living, physical world—in real time? What if doctors could forecast a disease years before it strikes? This is the promise of the World Changing Idea: a vision so simple yet so ambitious that its full impact is impossible to predict. Scientific American’s editorial and advisory boards have chosen projects in five general categories—Energy, Transportation, Environment, Electronics and Robotics, and Health and Medicine—that highlight the power of science and technology to improve the world. Some are in use now; others are emerging from the lab. But all of them show that innovation is the most promising elixir for what ails us.  —The Editors

The No-Money-Down Solar Plan
A new wave of start-ups wants to install rooftop solar panels on your house. Upfront cost: nothing
By Christopher Mims

The biggest thing stopping the sun is money. Installing a rooftop array of solar panels large enough to produce all of the energy required by a building is the equivalent of prepaying its electricity bill for the next seven to 10 years—and that’s after federal and state incentives. A new innovation in financing, however, has opened up an additional possibility for homeowners who want to reduce their carbon footprint and lower their electric bills: get the panels for free, then pay for the power as you go.

The system works something like a home mortgage. Organizations and individuals looking for a steady return on their investment, typically banks or municipal bond holders, use a pool of cash to pay for the solar panels. Directly or indirectly, homeowners buy the electricity produced by their own rooftop at a rate that is less, per kilowatt-hour, than they would pay for electricity from the grid. Investors get a safe investment—the latest generation of solar-panel technology works dependably for years—and homeowners get a break on their monthly bills, not to mention the satisfaction of significantly reducing their carbon footprint. “This is a way to get solar without putting any money down and to start saving money from day one. That’s a first,” says SolarCity co-founder Peter Rive.

SolarCity is the largest installer of household solar panels to have adopted this strategy. Founded in 2006 by two brothers who are also Silicon Valley–based serial entrepreneurs, SolarCity leases its panels to homeowners but gives the electricity away for free. The net effect is a much reduced utility bill (customers still need utility-delivered power when the sun isn’t out) plus a monthly SolarCity bill. The total for both comes out to less than the old bill. SunRun in San Francisco offers consumers a similar package, except that the company sells customers the electricity instead of leasing them the panels.

Cities such as Berkeley and Boulder are pioneering their own version of solar-panel financing by loaning individuals the entire amount required to pay for solar panels and installation. The project is paid for by municipal bonds, and the homeowner pays back the loan over 20 years as a part of the property tax bill. The effect is the same whichever route a consumer takes: the new obligation, in the form of taxes, a lease or a long-term contract for electricity, ends up costing less than the existing utility bill.
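
Under all three models the arithmetic is ordinary loan amortization. Here is a minimal sketch; the system cost, interest rate and bill savings are my illustrative assumptions, not figures from SolarCity, SunRun or Berkeley:

# Minimal amortization sketch for a Berkeley-style 20-year solar loan.
def monthly_payment(principal, annual_rate, years):
    """Standard fixed-rate amortization formula."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)
cost = 20_000.0       # installed system cost after incentives (assumed)
rate = 0.05           # bond-backed interest rate (assumed)
payment = monthly_payment(cost, rate, 20)
bill_savings = 150.0  # monthly electric-bill reduction from the panels (assumed)
print(f"loan payment:  ${payment:,.0f}/month")                    # ~$132
print(f"net cash flow: ${bill_savings - payment:+,.0f}/month")    # positive

Whenever the payment comes in below the old utility bill, the homeowner is cash-positive from day one, which is the entire pitch.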

“What we’re really seeing is a transition in how we think about buying energy goods and services,” says Daniel M. Kammen, director of the Renewable and Appropriate Energy Laboratory at the University of California, Berkeley. Kammen, who did the initial analysis on Berkeley’s financing model, believes that by turning to financing, consumers can overcome the inherent disadvantage renewables have when compared with existing energy sources: the infrastructure for power from the grid has already been paid for and, in many cases, has been subsidized for decades.

All three approaches are rapidly expanding across the country. Although the Berkeley program is less than two years old, 10 different states have passed legislation allowing their cities to set up similar bond-financed loan programs. If the Waxman-Markey climate bill becomes law, cities nationwide will have the option to set up these programs. SunEdison in Maryland is currently active in nine states. SolarCity, which has more than 4,000 customers, is active in California, Arizona and Oregon and has promised to announce additional states after the new year.

Right now it is not possible to lower the overall cost of rooftop solar to “grid parity,” that is, to the same price as electricity from local utility companies, without federal subsidies such as the investment tax credit, which lowers the tax bill of banks financing these projects. Those subsidies, which amount to 30 percent of the cost of a solar installation, are guaranteed for at least eight years. By then, SolarCity and its competitors claim they won’t need them.

“Grid parity is driven by multiple factors,” says Attila Toth, vice president of marketing at SunEdison, including the cost of capital, the cost of panels and their installation, and the intensity of sunlight in a given region. “It will occur in different states at different times, but, for example, we expect that California will be one of the first states in the U.S. to get to grid parity, sometime between three and five years from now.”

While the cost of electricity from fossil fuels has increased 3 to 5 percent a year for the past decade, the cost of solar panels has fallen on average 20 percent for every doubling of its installed base. Grid parity is where these trend lines cross—after that, solar has the potential to power more than just homes. It’s hardly a coincidence that Elon Musk, head of electric car company Tesla Motors, sits on SolarCity’s board of directors.
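
Those two trend lines can be projected with a few lines of code. The starting prices and the doubling time of the installed base below are my assumptions; the 3 to 5 percent and 20 percent figures come from the paragraph above:

# Toy grid-parity projection; starting prices and doubling time are assumptions.
grid = 0.12                  # $/kWh from the utility today (assumed)
solar = 0.20                 # $/kWh levelized rooftop-solar cost today (assumed)
GRID_GROWTH = 1.04           # within the 3-5%/year range quoted above
SOLAR_PER_YEAR = 0.8 ** 0.5  # 20% drop per doubling, one doubling every 2 years (assumed)
year = 0
while solar > grid:
    year += 1
    grid *= GRID_GROWTH
    solar *= SOLAR_PER_YEAR
print(f"parity in ~{year} years (solar ${solar:.3f} vs grid ${grid:.3f} per kWh)")

With these inputs parity arrives in about four years, inside Toth’s three-to-five-year window for California.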

More Ideas to watch
by Christopher Mims

The Gasoline Garden
It is the next step for biofuels: genetically engineered plant life that produces hydrocarbons as a by-product of its normal metabolism. The result will be fuel—common gasoline, even—using nothing but sunlight and CO2. In July, Exxon Mobil announced plans to spend more than $600 million in pursuit of algae that can accomplish the task. Joule Biotechnologies claims to have already succeeded, although the company has yet to reveal any details of its proprietary system.

Hot Nukes
Uranium and plutonium are not the only fuels that can power a nuclear reactor. With an initial kick from more traditional fissile materials, thorium can set up a self-sustaining “breeder” reaction that produces uranium 233, which is well suited to nuclear power generation. The process has the added benefit of being resistant to nuclear proliferation, because its end products emit enough gamma rays to make the fuel dangerous to handle and easy to track.

Save Energy with Information
Studies show that simply making customers aware of their energy use lowers it by 5 to 15 percent. Smart meters allow customers to track their energy consumption minute by minute and appliance by appliance. Countless start-ups are offering the devices, and Google and Microsoft are independently partnering with local utilities to allow individuals to monitor their power usage over the Web.

Wind Power from the Stratosphere
According to a Stanford University study released in July, the high-altitude winds that constantly blow tens of thousands of feet above the earth hold enough energy to supply all of human civilization 100 times over. California’s Sky WindPower has proposed harvesting this energy by building fleets of giant, airborne, ground-tethered windmills, while Italy’s Kite Gen proposes to accomplish the same feat using kites.

Delivering the U.S. from Oil
Plug-in hybrid trucks are improving the long view of the short haul
By Amanda Schupak

Cargo trucks gulp about 40 percent of the fuel pumped in the U.S. While most consumer attention focuses on improving the fuel economy of consumer vehicles, a major opportunity goes rumbling by. “Folks do not realize that the fuel use of even a small truck is equal to many, many cars,” says Bill Van Amburg, senior vice president of Calstart, a clean transportation technology nonprofit, and director of the Hybrid Truck Users Forum. “A utility truck as a hybrid would reduce more petroleum than nine Priuses.”

Some 1,300 commercial hybrids on the road today get up to twice the fuel efficiency of their conventional counterparts. But these traditional hybrids are inherently limited. They make more efficient use of petroleum-based fuel by capturing some of the energy lost during braking.

Plug-in hybrids, on the other hand, draw energy from the grid. They can drive for miles—in many cases, an entire day’s route—without using any fossil fuel at all. This shifts energy demand away from petroleum and toward grid-based sources. (Last year zero-carbon renewables and nuclear supplied 30 percent of all electric power in the U.S.)

In many ways, plug-in hybrid technology makes more sense for delivery trucks than for consumer sedans. A cargo truck runs a short daily route that includes many stops to aid in regenerative braking. Most of the U.S. Postal Service’s 200,000-plus mail trucks, for example, travel fewer than 20 miles a day. In addition, fleet vehicles return nightly to storage lots that have ready access to the 120- or 240-volt outlets required to charge them.

The Department of Energy recently launched the nation’s largest commercial plug-in hybrid program, a $45.4-million project to get 378 medium-duty vehicles on the road in early 2011. The trucks, which will go to 50 municipal and utility fleets, will feature a power system from Eaton, a large manufacturer of electrical components, on a Ford F-550 chassis. (For its part, Ford will wait for the market to prove itself before designing its own commercial plug-ins.) “These are going to start breaking free in 2011,” says Paul Scott, president of the Electric Vehicle Association of Southern California.

Start-up company Bright Automotive has a more ambitious plan. It aims to replace at least 50,000 trucks with plug-in hybrids by 2014. Bright’s IDEA prototype travels 40 miles on battery power before switching to a four-cylinder engine that gets 40 miles to the gallon. The streamlined aluminum body has the payload of a postal truck yet is far more aerodynamic. The truck weighs as much as a midsize sedan.

John E. Waters, Bright Automotive’s founder and the former developer of the battery system for General Motors’ groundbreaking EV1 electric car, says that each IDEA would save 1,500 gallons of fuel and 16 tons of carbon dioxide emissions a year over a standard utility truck. Waters says he is ready to begin assembly in his U.S. plant once a pending $450-million federal loan comes through.

Despite the appeal of the carbon savings, the fleet owners who are the trucks’ primary customers have more practical considerations. Bright’s executives are coy about the IDEA’s eventual price tag but assert that a customer with 2,000 trucks driving 80 miles a day five days a week could save $7.2 million a year. Right now that is probably not enough to justify large-scale purchases without additional rebates—or a price on carbon. Van Amburg estimates that going hybrid currently adds $30,000 to $50,000 in upfront costs per vehicle, although that figure should come down as production volumes increase.
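
Bright’s numbers can be sanity-checked with simple fleet arithmetic. The fuel economy and prices below are my assumptions; the battery range and engine economy come from the article. The result lands in the same ballpark as both the 1,500-gallon and $7.2-million figures quoted above:

# Rough check of the fleet-savings claim above (baseline mpg and prices assumed).
trucks = 2_000
miles_per_day = 80.0
work_days = 5 * 52                     # five days a week
baseline_mpg = 10.0                    # conventional utility truck (assumed)
electric_miles = 40.0                  # IDEA's battery-only range (from the article)
engine_mpg = 40.0                      # IDEA engine economy (from the article)
gas_price = 2.50                       # $/gallon (assumed)
elec_cost = electric_miles * 0.5 * 0.10  # 0.5 kWh/mile at $0.10/kWh (assumed)
gal_saved = miles_per_day / baseline_mpg - (miles_per_day - electric_miles) / engine_mpg
savings = (gal_saved * gas_price - elec_cost) * work_days * trucks
print(f"fuel saved per truck: {gal_saved * work_days:,.0f} gal/year")  # ~1,800
print(f"fleet savings: ${savings / 1e6:.1f} million/year")             # ~$8M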

Improved battery technology will also help. Today the IDEA’s 13-kilowatt-hour lithium-ion battery pack accounts for nearly a quarter of the vehicle’s total cost. Much of the research being done for the batteries going into the Chevy Volt and other consumer plug-ins should also be applicable to commercial batteries. “For all the good we all want to do,” says David Lauzun, Bright’s vice president of product development, “these vehicles will not take over the world until it becomes the economic choice—‘I have to have them because it saves me money.’”

Bus Rapid Transit
Subwaylike bus lines mobilize the urban future
By Michael Moyer

For the first time in human civilization, more people now live in urban areas than in the countryside. This shift creates a number of dilemmas, not least of which is how to move people within the world’s rapidly growing metropolises. Pollution and traffic point away from car-based options, while light-rail systems are slow to construct and prohibitively expensive. One disarmingly simple—and cheap—possibility is Bus Rapid Transit, which is engineered to operate like a subway on wheels. In these systems, concrete dividers on existing roads separate high-capacity buses from the rest of traffic. Riders pay before boarding, then wait in enclosed stations. When a bus arrives, sliding partitions open to allow riders to board from a platform that is level with the bus floor. The traffic-free thoroughfares, quick boarding times, and modern, comfortable stations resemble light-rail systems more than the chaos of typical bus travel. In Bogotá, Colombia, which has had seven Bus Rapid Transit lines in operation since 2001, the buses handle 1.6 million trips a day. Its success has allowed the city to remove 7,000 private buses from its streets, reducing consumption of bus fuel and its associated pollution by more than 59 percent.

Ocean Overhaul
Marine zoning is a bold remedy for sick seas
By Sarah Simpson

These days not even many politicians deny that the oceans are ill. Protecting the health of coastal waters is now a matter of national policy in dozens of countries, including the U.S., and world leaders are beginning to prescribe a revolutionary remedy that conservationists have been promoting for years: marine planning and zoning.

The idea is a natural extension of management policies that have guided the development of cities and landscapes for nearly a century. Porn shops aren’t next to preschools, after all, and drilling rigs aren’t the centerpieces of national parks. Similarly, zoning advocates envision a mosaic of regional maps in which every watery space on the planet is designated for a particular purpose. Drilling and mining would be allowed only in certain parts of the ocean; fishing in others. The most critically threatened areas would be virtually off-limits.

Whereas people can easily find maps telling them what they can do where on land, the marine realm is a hodgepodge of rules emanating from an army of agencies, each one managing a single use or symptom. In the U.S., for example, one body regulates commercial fishing, usually a single species at a time. Another group manages toxic substances, still another seabed mining, and so on—some 20 federal agencies in all. They tend to make decisions without regard to what the others are doing, explains Duke University marine ecologist Larry B. Crowder. “Imagine all of the medical specialists visiting a patient in intensive care one at a time and never talking to one another,” he says. “It’s a wonder that the oceans aren’t in worse shape than they are now.”

Ocean advocates such as Crowder eagerly await the final recommendations of a special task force President Barack Obama charged with presenting a plan for overhauling management of U.S. waters, which extend 200 nautical miles offshore. The scope of such an undertaking is huge: the U.S. controls 4.4 million square miles of seascape, making the country’s underwater real estate 25 percent larger than its landmass. The committee’s preliminary report, released in September, suggests that the best way to minimize harmful human impacts on the oceans is to manage regions rather than symptoms.

Many environmentalists are hopeful that such plans will be implemented through the marine equivalent of municipal zoning, which would give them some influence in areas where they now have none. In zones where conservation is designated as the dominant activity, fishing and industrial activities such as mining would no longer have free rein. Under current rules, about the only way a conservation group can block a project it deems harmful—say, a new site for offshore drilling—is through expensive litigation.

So far, though, the president’s task force has been careful not to suggest that ocean zoning will be the only treatment plan, in great part because any effort to restrict commercial interests is bound to meet stiff opposition. “Zoning isn’t anybody’s favorite exercise,” notes John C. Ogden, director of the Florida Institute of Oceanography at the University of South Florida at Tampa. “Someone’s ox is always getting gored.” Most resistant to such change will most likely be the traditional users of the open ocean—namely, commercial fisheries and the petroleum industry. “They’ve had the place to themselves for a long time,” Ogden says.

Ogden and others are quick to point out, however, that zoning practices can benefit commerce as much as conservation. By giving up access to certain areas, industries gain the security of knowing their activities would be licensed in a more predictable and less costly manner than they are today, explains Josh Eagle, associate professor at the University of South Carolina School of Law. Now an oil company can apply for permits to drill virtually anywhere, but it takes on a significant financial risk each time. The business may dump millions of dollars into researching a new facility only to have a lawsuit derail it at the last moment. When opposing parties have more or less equal voices early in the planning process, Eagle says, they are less inclined to block one another’s activities once zones are drawn on a map.

Whether the final report of the president’s task force will promote ocean zoning explicitly is uncertain. But the group has already promised to overhaul the structure of ocean governance by proposing the creation of a National Ocean Council, whose job it will be to coordinate efforts of the myriad federal agencies now in charge.

The move comes none too soon. Just as society is beginning to appreciate the enormous effort it will take to preserve the health of the oceans, it must ask more of them—more energy, more food, and more resilience to coastal development and climate change. The oceans are in trouble not so much because of what people put in and take out as because governments have failed to manage those activities properly. Says Crowder: “We have to treat the oceans holistically, not one symptom at a time.”

The Power of Garbage
Trapped lightning could help zap trash and generate electricity
By John Pavlus

Trash is loaded with the energy trapped in its chemical bonds. Plasma gasification, a technology that has been in development for decades, could finally be ready to extract it.

In theory, the process is simple. Torches pass an electric current through a gas (often ordinary air) in a chamber to create a superheated plasma—an ionized gas with a temperature upward of 7,000 degrees Celsius, hotter than the surface of the sun. When this occurs naturally we call it lightning, and plasma gasification is literally lightning in a bottle: the plasma’s tremendous heat dissociates the molecular bonds of any garbage placed inside the chamber, converting organic compounds into syngas (a combination of carbon monoxide and hydrogen) and trapping everything else in an inert vitreous solid called slag. The syngas can be used as fuel in a turbine to generate electricity. It can also be used to create ethanol, methanol and biodiesel. The slag can be processed into materials suitable for use in construction.
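To get a feel for the scale involved, here is a back-of-envelope sketch in Python. Every figure in it (syngas energy yield, torch overhead, turbine efficiency) is an illustrative assumption, not vendor data, so treat the output as an order-of-magnitude estimate only.

```python
# Rough energy balance for a plasma-gasification plant.
# All parameters are illustrative assumptions, not vendor figures.

def net_electric_mw(tons_per_day,
                    syngas_mj_per_kg=8.0,   # assumed recoverable syngas energy
                    torch_overhead=0.30,    # assumed share of output that runs the torches
                    turbine_eff=0.35):      # assumed turbine conversion efficiency
    """Estimate net electrical output in megawatts."""
    kg_per_second = tons_per_day * 1000.0 / 86400.0
    gross_thermal_mw = kg_per_second * syngas_mj_per_kg  # MJ/s is the same as MW
    return gross_thermal_mw * turbine_eff * (1.0 - torch_overhead)

# At the 1,000-ton-per-day scale of the pilot plants mentioned below:
print(f"{net_electric_mw(1000):.0f} MW net")  # ~23 MW under these assumptions
```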

In practice, the gasification idea has been unable to compete economically with traditional municipal waste processing. But the maturing technology has been coming down in cost, while energy prices have been on the rise. Now “the curves are finally crossing—it’s becoming cheaper to take the trash to a plasma plant than it is to dump it in a landfill,” says Louis Circeo, director of Plasma Research at the Georgia Tech Research Institute. Earlier this summer garbage-disposal giant Waste Management partnered with InEnTec, an Oregon-based start-up, to begin commercializing the latter’s plasma-gasification processes. And major pilot plants capable of processing 1,000 or more tons of trash per day are under development in Florida, Louisiana and California.

Plasma isn’t perfect. The toxic heavy metals sequestered in slag pass the Environmental Protection Agency’s leachability standards (and slag has been used in construction for years in Japan and France) but still give pause to communities considering building the plants. And although syngas-generated electricity has an undeniably smaller carbon footprint than coal—“For every ton of trash you process with plasma, you reduce the amount of CO2 going into the atmosphere by about two tons,” Circeo says—it is still a net contributor of greenhouse gases.

“It is too good to be true,” Circeo admits, “but the EPA has estimated that if all the municipal solid waste in the U.S. were processed with plasma to make electricity, we could produce between 5 and 8 percent of our total electrical needs—equivalent to about 25 nuclear power plants or all of our current hydropower output.” With the U.S. expected to generate a million tons of garbage every day by 2020, using plasma to reclaim some of that energy could be too important to pass up.
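As a sanity check on those numbers: using round public figures (roughly 4,000 TWh of annual U.S. electricity consumption, and a nuclear plant averaging about 0.9 GW of output once its capacity factor is applied; both are our assumptions, not EPA figures), the arithmetic lands in the same range.

```python
# Sanity-check the 5-8 percent claim against round public figures.
us_twh_per_year = 4000.0      # assumed annual U.S. electricity use
avg_plant_gw = 0.9            # assumed average output of one nuclear plant

for share in (0.05, 0.08):
    twh = us_twh_per_year * share
    avg_gw = twh * 1000.0 / 8760.0   # TWh/year -> average gigawatts
    print(f"{share:.0%}: {twh:.0f} TWh/yr ~ {avg_gw:.0f} GW ~ "
          f"{avg_gw / avg_plant_gw:.0f} nuclear plants")

# 5%: 200 TWh/yr ~ 23 GW ~ 25 plants
# 8%: 320 TWh/yr ~ 37 GW ~ 41 plants
# The low end matches the quoted "about 25 nuclear power plants."
```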

More Ideas to watch
By John Pavlus

Cement as a Carbon Sponge
Traditional cement production creates at least 5 percent of global carbon dioxide emissions, but new materials could create carbon-neutral cement. Start-up Novacem, supported by Imperial College London, uses magnesium oxide to make cement that naturally absorbs CO2 as it hardens. California-based Calera uses seawater to lock carbon emissions from a nearby power plant into cement.

The New Honeybee
Colony collapse disorder (CCD) has killed more than a third of honeybee colonies since 2006. Farmers who depend on bees to pollinate such crops as almonds, peaches and apples are looking to the blue orchard bee to pick up the slack.

One efficient Osmia lignaria can pollinate as much territory as 50 honeybees, but the bees are harder to cultivate because of their solitary nature. These pinch hitters won’t completely replace honeybees, but as scientists continue to grapple with CCD, they could act as an agricultural safety net.

Saltwater Crops
As the world’s freshwater supply becomes scarcer and food production needs balloon, salt-tolerant crops could ease the burden. Researchers at Australia’s University of Adelaide used genetic engineering to enhance a model crop’s natural ability to prevent saline buildup in its leaves, allowing the plant to thrive in conditions that would typically wither it. If the same gene tweak works in cereal crops such as rice and wheat—the researchers are testing them now—fallow lands destroyed by drought or overirrigation could become new breadbaskets.

The Omnipotence Machines
Tiny, ubiquitous sensors will allow us to index the physical world the way the Web maps cyberspace
By Gregory Mone

Earlier this year Hewlett-Packard announced the launch of its Central Nervous System for the Earth (CeNSE) project, a 10-year effort to embed up to a trillion pushpin-size sensors across the planet. Technologists say that the information gathered by this kind of ubiquitous sensing network could change our knowledge of the world as profoundly as the Internet has changed business. “People had no idea the Web was coming,” says technology forecaster Paul Saffo. “We are at that moment now with ubiquitous sensing. There is quite an astonishing revolution just around the corner.”

The spread of versatile sensors, or “motes,” and the ability of computers to analyze and either recommend or initiate responses to the data they generate, will not merely enhance our understanding of nature. It could lead to buildings that manage their own energy use, bridges that flag engineers when in need of repair, cars that track traffic patterns and detect potholes, and home security systems that distinguish between the footfalls of an intruder and the dog, to name a few.

CeNSE is the boldest project yet announced, but HP is not the only organization developing the technology to make ubiquitous sensing possible. Intel is also designing novel sensor packages, as are numerous university labs.

For all the momentum in the field, though, this sensor-filled future is by no means inevitable. These devices will need to generate rich, reliable data and be rugged enough to survive tough environments. The sensor packages themselves will be small, but the computing effort required will be enormous. All the information they gather will have to be transmitted, hosted on server farms, and analyzed. Finally, someone is going to have to pay for it all. “There is the fundamental question of economics,” notes computer scientist Deborah Estrin of the University of California, Los Angeles. “Every sensor is a nonzero cost. There is maintenance, power, keeping them calibrated. You don’t just strew them around.”

In fact, HP senior researcher Peter Hartwell acknowledges that for CeNSE to hit its goals, the sensors will need to be nearly free. That is one of the reasons why HP is designing a single, do-everything, pushpin-size package stacked with a variety of gauges—light, temperature, humidity, vibration and strain, among others—instead of a series of devices for different tasks. Hartwell says that focusing on one versatile device will drive up volume, reducing the cost of each unit; it could also allow HP to serve several clients at once with the same sensors.

Consider his chief engineering project, an ultrasensitive accelerometer. Housed inside a chip, the sensor tracks the motion of a tiny, internal movable platform relative to the rest of the chip. It can measure changes in acceleration 1,000 times as accurately as the technology in the Nintendo Wii.

Hartwell imagines situating one of these pins every 16 feet along a highway. Thanks to the temperature, humidity and light sensors, the motes could serve as mini weather stations. But the accelerometers’ vibration data could also be analyzed to determine traffic conditions—roughly how many cars are moving past and how quickly. The local highway department would be interested in this information, he guesses, but there are potential consumer applications, too. “Your wireless company might want to take that information and tell you how to get to the airport the fastest,” Hartwell says.
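Some rough arithmetic shows why Hartwell's "nearly free" target matters for a scenario like this one; the interstate mileage and per-mote prices below are hypothetical stand-ins, not HP figures.

```python
# Scale of the every-16-feet highway scenario, with assumed figures.
FEET_PER_MILE = 5280
SPACING_FT = 16                  # one pushpin every 16 feet, as quoted
INTERSTATE_MILES = 47_000        # assumed U.S. Interstate mileage

sensors_per_mile = FEET_PER_MILE // SPACING_FT        # 330
total_sensors = sensors_per_mile * INTERSTATE_MILES   # ~15.5 million

for unit_cost in (10.0, 1.0, 0.10):  # hypothetical cost per installed mote
    millions = total_sensors * unit_cost / 1e6
    print(f"${unit_cost:.2f} per mote -> ${millions:,.0f}M in hardware")

# Even at $1 apiece the hardware alone runs about $16 million, before
# power, networking and maintenance - hence the push for nearly free sensors.
```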

All of this gathering and transmission of data requires power, of course, and to guarantee an extended life, the HP pushpin will not rely solely on batteries. “It is going to have some sort of energy-scavenging ability,” Hartwell says. “Maybe a solar panel or a thermoelectric device to help keep the battery charged.”

With the power hurdle in mind, other groups are forgoing batteries altogether. At Intel Labs in Seattle, engineer Josh Smith has developed a sensor package that runs on wireless power. Like the HP pushpin, Intel’s WISP, or Wireless Identification and Sensing Platform, will include a variety of gauges, but it will also draw energy from the radio waves emitted by long-range radio-frequency ID chip readers. Smith says a single reader, plugged into a wall outlet, can already power and communicate with a network of prototype WISPs five to 10 feet away—a distance that should increase.

Smith cites many of the same infrastructure-related possibilities as Hartwell, along with a number of other uses. If WISPs were placed on standard household items such as cups, these tags could inform doctors about the rehabilitation progress of stroke victims. If the cups the patient normally uses remain stationary, Smith explains, then the individual probably is not up and moving around.

The potential applications for ubiquitous sensing are so broad—a physicist recently contacted him about using WISPs to monitor the temperature outside a proposed neutrino detector—that, as with the Internet, Smith says it is impossible to foresee them all. “In terms of the impact it is going to have on our lives,” Hartwell adds, “you haven’t seen anything yet.”

The Do-Anything Robot
Your PC can accomplish any computing task you ask of it. Why isn’t the same true for robots?
By Gregory Mone

Robots have proved to be valuable tools for soldiers, surgeons and homeowners hoping to keep the carpet clean. But in each case, they are designed and built specifically for the job. Now there is a movement under way to build multipurpose machines—robots that can navigate changing environments such as offices or living rooms and work with their hands.

All-purpose robots are not, of course, a new vision. “It’s been five or 10 years from happening for about 50 years,” says Eric Berger, co-director of the Personal Robotics Program at Willow Garage, a Silicon Valley start-up. The delay is in part because even simple tasks require a huge set of capabilities. For a robot to fetch a mug, for example, it needs to make sense of data gathered by a variety of sensors—laser scanners identifying potential obstacles, cameras searching for the target, force feedback in the fingers that grasp the mug, and more. Yet Berger and other experts are confident that real progress could be made in the next decade.

The problem, according to Willow Garage, is the lack of a common platform for all that computational effort. Instead of building on the capabilities of a single machine, everyone is designing robots, and the software to control them, from the ground up. To help change this, Willow Garage is currently producing 25 copies of its model PR2 (for “Personal Robot 2”), a two-armed, wheeled machine that can unplug an appliance, open doors and move through a room. Ten of the robots will stay in-house, but 10 more will go to outside research groups, and everyone will pool their advances. This way, Berger says, if you want to build the robotic equivalent of a Twitter, you won’t start by constructing a computer: “you build the thing that’s new.”
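As a concrete illustration of what a common platform buys you, here is a minimal publish/subscribe message bus in plain Python. This is the pattern at the core of shared robot frameworks such as Willow Garage's open-source ROS (which the article does not name); the topic and message below are invented for the example.

```python
# A toy publish/subscribe bus: a sensor driver publishes data once, and
# any number of independent components reuse it without custom wiring.
from collections import defaultdict

class Bus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

bus = Bus()
bus.subscribe("laser_scan", lambda scan: print("obstacle check:", scan))
bus.subscribe("laser_scan", lambda scan: print("map update:", scan))
bus.publish("laser_scan", {"ranges_m": [2.1, 0.4, 3.3]})  # one scan, two consumers
```

On a real platform the bus also crosses process and network boundaries, so a navigation module written in one lab can consume a laser driver written in another.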

Pocket Translator
The military, short on linguists, is building smart phone–based devices to do the job
By Gregory Mone

Sakhr Software, a company that builds automatic language translators, recently unveiled a prototype smart phone application that transforms spoken English phrases into spoken Arabic, and vice versa, in near real time. The technology isn’t quite ready for your next trip to Cairo, but thanks to recent advances in machine-translation techniques, plus the advent of higher-fidelity microphones and increasing processing power in smart phones, this mobile technology could soon allow two people speaking different languages to have basic conversations.

Before the 1990s automatic translation meant programming in an endless list of linguistic rules, a technique that proved too labor-intensive and insufficiently accurate. Today’s leading programs—developed by BBN Technologies, IBM, Sakhr and others as part of a Defense Advanced Research Projects Agency effort to eliminate the military’s need for human translators—rely on machine-learning techniques instead. The software works from a database of parallel texts—for example, War and Peace in two different languages, translated United Nations speeches, and documents pulled off the Web. Algorithms identify short matching phrases across sources, and the software uses them to build statistical models that link English phrases to Arabic ones.
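The core statistical idea fits in a few lines of Python. In this toy sketch the aligned phrase pairs are invented stand-ins for alignments mined from a parallel corpus, and "translation" is simply picking the target phrase seen most often with a given source phrase; production systems weigh millions of such pairs probabilistically.

```python
from collections import Counter, defaultdict

# Hypothetical aligned phrase pairs, standing in for pairs mined
# automatically from parallel texts such as translated U.N. speeches.
aligned_pairs = [
    ("good morning", "sabah al-khair"),
    ("good morning", "sabah al-khair"),
    ("good evening", "masa al-khair"),
    ("thank you", "shukran"),
]

# Count how often each target phrase co-occurs with each source phrase.
counts = defaultdict(Counter)
for source, target in aligned_pairs:
    counts[source][target] += 1

def translate(source):
    """Return the target phrase most frequently aligned with `source`."""
    table = counts.get(source)
    return table.most_common(1)[0][0] if table else None

print(translate("good morning"))  # sabah al-khair
```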

John Makhoul, BBN’s chief scientist, says the current technology is at its best when confined to subject areas with specific phrases and terminology—translating a weather report from English into French, for example, or helping soldiers gather basic biographical information from people in the field. Makhoul envisions the first consumer applications, five years from now, being similarly constrained. A tourism-related translation app on a smart phone could help an American in Florence get directions from a non-English-speaking local, but they won’t chat about Renaissance art. “It is not going to work perfectly,” he says, “but it will do a pretty good job.”

Know if Disease Grows Inside You
Complex diseases have complex causes. Luckily, they also leave a multitude of traces
By Melinda Wenner

With the exception of certain infectious diseases, few of humanity’s ailments have cures. More than 560,000 Americans will die of cancer this year, and despite the 250,000 coronary bypass surgeries doctors perform annually, heart disease is still the country’s number-one killer.

The hardest diseases to cure are the ones that take the longest to develop. They are the end result of decades of complex molecular interactions inside your body. Yet this complexity also presents an opportunity. Scientists have discovered that these interactions leave discernible fingerprints on the body. By unweaving the complex tapestry of molecular clues—changes in the body’s proteins, nucleic acids and metabolites, collectively called biomarkers—doctors hope they will soon be able to not only detect disease but predict a coming illness in time to take action.

Biomarkers are not new. Since 1986 doctors have monitored prostate cancer by measuring blood levels of the protein known as prostate-specific antigen (PSA). But tests that rely on a single biomarker to detect disease are rare, because most disorders involve intricate changes in a collection of biomarkers.

Take schizophrenia: in January 2010 scientists will release a biomarker test that distinguishes schizophrenia from other psychiatric conditions. The test, which is being commercialized by Rules-Based Medicine, a laboratory in Austin, Tex., is based on the characteristics of about 40 blood-based proteins.

To find potentially useful biomarkers, researchers collect blood samples from thousands of healthy people and analyze them. Biomarker levels in these samples provide a baseline reading. Then they do the same for people with a specific condition such as diabetes or breast cancer. If reproducible differences emerge between the groups, scientists can use the patterns in the disease group to diagnose the same condition in others. By collecting samples over time, researchers can also go back and analyze early samples from individuals who later become ill to identify patterns indicative of early disease or high disease risk.
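In miniature, that baseline comparison might look like the sketch below, which flags any marker lying more than two standard deviations from a healthy reference panel. The values are synthetic and the two-sigma cutoff is an arbitrary illustrative choice; real panels track dozens to hundreds of markers and use far more careful statistics.

```python
import statistics

# Synthetic baseline readings from a (tiny) healthy reference group.
healthy = {
    "marker_a": [1.0, 1.2, 0.9, 1.1, 1.0],
    "marker_b": [3.0, 2.8, 3.2, 3.1, 2.9],
}

def z_scores(sample):
    """Standard deviations separating each reading from the healthy mean."""
    scores = {}
    for name, value in sample.items():
        mu = statistics.mean(healthy[name])
        sd = statistics.stdev(healthy[name])
        scores[name] = (value - mu) / sd
    return scores

patient = {"marker_a": 1.9, "marker_b": 2.2}
flags = {m: round(z, 1) for m, z in z_scores(patient).items() if abs(z) > 2}
print(flags)  # markers deviating enough to warrant a closer look
```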

Biophysical Corporation, a sister company to Rules-Based Medicine, is one of several companies that have developed blood-based biomarker tests and marketed them to the public [see “The Ultimate Blood Test,” by Philip Yam; Scientific American, June 2006]. The company searches for up to 250 biomarkers suggestive of cancer, inflammatory conditions, heart disease and other illnesses. Mark Chandler, Biophysical’s chair and CEO, says that the real value of the tests lies in long-term monitoring. A person could “get a test monthly, just a finger stick, that would be able to say, we have had a serious change here that is indicative of an early-stage cancer,” he explains.

Yet not all experts are convinced that the age of biomarkers is at hand. Cheryl Barton, an independent U.K.-based pharmaceutical consultant who authored a Business Insights market analysis report on biomarkers in 2006, says she remains “a little bit skeptical about how clinically useful they are.” A study of 5,000 subjects published in the Journal of the American Medical Association in July 2009 found that six cardiovascular biomarkers were only marginally better at predicting heart disease than were standard cardiovascular risk factors, such as whether the subjects smoked or had diabetes.

Adding to the overall difficulty, a person might suffer from two or more diseases—prostate cancer and heart disease, for example. No one knows how multiple diseases might affect overall biomarker signatures or how profiles will change as other diseases develop. “When you get to be 65 or 70, almost everybody has other conditions,” Chandler says. “We don’t know how to deal with that right now.” And scientists still need to discern which biomarkers are truly relevant to disease—a difficult task when working with blood, which contains tens of thousands of proteins at concentrations spanning more than 10 orders of magnitude.

Some companies have simplified the problem by avoiding blood altogether. LabCorp recently commercialized a biomarker test that analyzes colon cells in stool for the chemical signatures indicative of colorectal cancer. “The stool is in intimate contact with the lining of the colon, so it becomes much more highly populated with these rare molecules than would get into the bloodstream from colon cancer,” says Barry Berger, chief medical officer of Exact Sciences, a Madison, Wis.–based biotechnology company that developed the test technology.

Scientists are confident that they will eventually crack the more difficult problem of finding distinct disease signatures in the noisy data. “The evolutionary process, being complex and unknown, does not always give us an easy route,” Berger notes, “but it definitely gives us lots of opportunities.”

Satellites Diagnose Disease Outbreaks
Space-based data are helping to track and predict the spread of deadly diseases
By Katherine Harmon

Many contagious diseases spread through carriers such as birds and mosquitoes. These vectors in turn move with heat and rainfall. With this in mind, researchers have begun to use satellite data to monitor the environmental conditions that lead to disease. “Ideally, we could predict conditions that would result in some of these major outbreaks of cholera, malaria, even avian flu,” says Tim Ford of the University of New England at Biddeford and co-author of a paper on the subject published this past September in Emerging Infectious Diseases.

Satellite data have already been used to map the advance of the H5N1 avian influenza in Asia. The domestic duck, a common inhabitant of Southeast Asia’s rice paddies, is one of the main carriers of the disease. Xiangming Xiao, associate director of the University of Oklahoma’s Center for Spatial Analysis, uses satellite images to map agricultural patterns in the region. These maps show where the ducks are most likely to live and thus where the avian influenza is most likely to spread.

Migratory birds also carry the virus, but their travel patterns are more difficult to predict. Xiao and his colleagues combine the satellite imagery with satellite-gathered surface-temperature data to estimate the birds’—and thereby the virus’s—trajectory. Computer models then link these environmental drivers to the spread of the flu in human populations.
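A toy version of that pipeline might combine a satellite-derived land-cover layer with a temperature layer into a per-cell risk score, as below. The weights and the "favourable temperature window" are invented for illustration; the real models are far richer.

```python
# Two hypothetical satellite-derived layers over a 2x2 grid of cells.
paddy_fraction = [[0.8, 0.1],    # share of each cell that is rice paddy
                  [0.4, 0.0]]
surface_temp_c = [[24.0, 31.0],  # satellite-derived surface temperature
                  [27.0, 35.0]]

def cell_risk(paddy, temp):
    """Score one cell: duck habitat weighted by an assumed favourable temperature window."""
    season = 1.0 if 20.0 <= temp <= 30.0 else 0.3
    return round(paddy * season, 2)

risk_map = [[cell_risk(p, t) for p, t in zip(p_row, t_row)]
            for p_row, t_row in zip(paddy_fraction, surface_temp_c)]
print(risk_map)  # [[0.8, 0.03], [0.4, 0.0]]
```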

Of course, not all of the work can be outsourced to orbiting observatories. Xiao says that judging the severity of avian flu’s spread from satellite imaging required knowing details about the human populations as well—for instance, how likely certain communities were to raise ducks for poultry consumption. “Satellite monitoring has a capacity to provide consistent observation,” Xiao says. “On the other hand, the in situ observations are still very, very important, so the key is to combine those together. That is a real challenge.”

More Ideas to watch
By Melinda Wenner

Quick Clots
Emergency technicians could prevent up to 35 percent of prehospital trauma deaths if they had better and cheaper ways to stanch blood loss. Now a University of Maryland–affiliated start-up called Trauma Solutions has developed a synthetic hydrogel that can clot blood by prompting the body to make fibrin, a protein that seals wounds and stops bleeding. Future iterations could simultaneously release such medicines as antibiotics and painkillers. Each application will cost about $5, compared with some natural blood-clotting substances that cost upward of $500.

Lab-on-a-Stamp
Liver damage is a major side effect of HIV/AIDS and tuberculosis drugs, yet few developing countries have enough trained scientists or equipment to monitor it. Nonprofit Cambridge, Mass.–based Diagnostics For All has developed an inexpensive fingernail-size device made almost entirely of paper that monitors liver damage using a single drop of blood. Channels in the paper guide blood to regions that change color depending on the levels of two damage-related liver enzymes.

Bacterial Toothpaste
Streptococcus mutans bacteria in the mouth decay teeth by converting sugars into enamel-eroding lactic acid. Florida-based Oragenics has genetically engineered a new strain of bacteria that converts sugars to trace amounts of alcohol instead. Because the new strain permanently displaces natural S. mutans, the therapy, which is currently in clinical trials, will be available as a one-time prescription that will protect teeth for life.

World population hits 7 billion

World population hits 7 billion on Oct. 31, or thereabouts – latimes.com.

It took only a dozen years for humanity to add another billion people to the planet, reaching the milestone of 7 billion Monday — give or take a few months.

Demographers at the United Nations Population Division set Oct. 31, 2011, as the “symbolic” date for hitting 7 billion, while acknowledging that it’s impossible to know for sure the specific time or day. Using slightly different calculations, the U.S. Census Bureau estimates the 7-billion threshold will not be reached until March.

Under any methodology, demographers agree that humanity remains on a steep growth curve, which is likely to keep climbing through the rest of this century. The U.N.’s best estimate is that population will march past 9.3 billion by 2050 and exceed 10.1 billion by the end of the century. It could be far more, if birthrates do not continue to drop as they have in the last half-century.

Nearly all the projected growth this century is expected to occur in developing countries in Asia, Africa and Latin America, while the combined populations in Europe, North America and other wealthy industrialized nations will remain relatively flat. Some countries, such as Germany, Russia and Japan, are poised to edge downward, their loss made up mostly by ongoing growth in the United States, which is bolstered by waves of immigrants.

The buildup to Monday’s milestone has briefly turned up the flame on long-simmering debates about growth on a finite planet: whether a growing population or growing consumption is the bigger environmental challenge, how best to help lift a billion people out of poverty and misery, and whether governments should provide contraception for those who cannot afford it.

The new leader of the United Nations Population Fund, Dr. Babatunde Osotimehin, a Nigerian obstetrician-gynecologist, stepped gingerly into the fray. His agency remains a favorite punching bag of antiabortion activists in the United States for its role in supporting family planning clinics in developing countries.

“Instead of asking questions like, ‘Are we too many?’ we should instead be asking, ‘What can I do to make our world better?’ ” wrote Osotimehin in the annual State of the World Population report. The report chronicles disparities between rich nations and poor ones. Poor countries continue to have low education levels and startlingly high rates of teenage pregnancy and maternal and child deaths due to complications from childbirth.

“In many parts of the developing world, where population growth is outpacing economic growth, the need for reproductive health services, especially family planning, remains great,” Osotimehin concluded.

Some have used the occasion to celebrate the unrivaled success of the human species. Population grows when births exceed deaths. The 7-billion mark was reached because people are living longer and the number of infant deaths has dropped, because of a more secure food supply and because of advances in sanitation and medicine.

U.N. Secretary-General Ban Ki-moon will hold a news conference Monday to mark the date and talk about challenges ahead, particularly how to reduce poverty, invest in the world’s 1.8 billion youth and help countries develop in a sustainable way.

In 1999, his predecessor, Kofi Annan, designated a boy born to refugee parents in Sarajevo, Bosnia-Herzegovina, as Baby 6 Billion. He had been plucked from the hundreds of thousands of babies born that day to put a face on global population growth. Adnan Mevic, now 12, has become something of a celebrity.

None of the estimated 382,000 babies born Monday will have such an honor.

There is no word yet on how the United Nations will handle the next milestone, when the globe’s population hits 8 billion — about 14 years from now.

Malaria deaths fall 20% worldwide in last decade

BBC News – Malaria deaths fall over 20% worldwide in last decade.

Deaths from malaria worldwide have fallen by just over 20% in the past decade, the World Health Organization says.

A new report said that one-third of the 108 countries where malaria was endemic were on course to eradicate the disease within 10 years.

Experts said if targets continued to be met, a further three million lives could be saved by 2015.

Malaria is one of the deadliest global diseases, particularly in Africa.

In 2009, 781,000 people died from malaria. The mosquito-borne disease is most prevalent in sub-Saharan Africa, where 85% of deaths occurred, most of them children under five.


Malaria has been eliminated in three countries since 2007 – Morocco, Turkmenistan and Armenia.

The Roll Back Malaria Partnership aims to eliminate malaria in another eight to 10 countries by the end of 2015, including the entire WHO European Region.

Malaria Factfile

  • 2000: 233 million cases, 985,000 deaths
  • 2009: 225 million cases, 781,000 deaths
  • Malaria present in 108 countries and territories
  • 1.3% GDP reduction in heavily-infected countries
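The headline figure follows directly from the factfile numbers:

```python
deaths_2000, deaths_2009 = 985_000, 781_000
drop = (deaths_2000 - deaths_2009) / deaths_2000
print(f"{drop:.1%}")  # 20.7%, i.e. "just over 20%" across roughly a decade
```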

Robert Newman, director of the WHO’s Global Malaria Programme, said “remarkable progress” had been made.

“Better diagnostic testing and surveillance has provided a clearer picture of where we are on the ground – and has shown that there are countries eliminating malaria in all endemic regions of the world,” he told an international Malaria Forum conference in Seattle.

“We know that we can save lives with today’s tools.”

Global eradication

[Graphic: global malaria deaths, 2000-09]

A global malaria eradication campaign, launched by WHO in 1955, succeeded in eliminating the disease in 16 countries and territories.

But after less than two decades, the WHO decided to concentrate instead on the less ambitious goal of malaria control.

However, another eight nations were declared malaria-free up until 1987, when certification was abandoned for 20 years.

In recent years, interest in malaria eradication as a long-term goal has re-emerged.

The WHO estimates that malaria causes significant economic losses, and can decrease gross domestic product (GDP) by as much as 1.3% in countries with high levels of transmission.

In the worst-affected countries, the disease accounts for:

  • up to 40% of public health expenditures
  • 30% to 50% of inpatient hospital admissions
  • up to 60% of outpatient health clinic visits

geoengineers forced to proceed slowly

Would-be geoengineers must listen to the public – opinion – 03 October 2011 – New Scientist.

It is no surprise that a proposed test of a climate engineering technology has raised hackles despite being environmentally benign

The world’s first field test of a technology that might, one day, cool global temperatures has been put on hold for at least six months, amid disquiet.

In October, the skies above Norfolk in the UK were scheduled to host a giant balloon attached to a 1-kilometre pipe that would harmlessly spray water droplets into the air. The work could be the first practical step in combating the effects of climate change.

The test is part of a research programme that aims to evaluate the feasibility of Stratospheric Particle Injection for Climate Engineering (SPICE). This is just one of many proposals for technologies that could manipulate the climate through “solar radiation management” – if ever deployed, it would involve spraying reflective particles into the stratosphere to deflect a proportion of sunlight.

It is by no means clear that these ideas will ever become reality. The SPICE project has become a lightning rod for discussion and debate, however, because it is the first solar radiation management technology to be officially tested outside the lab.

Now the body overseeing all this, the UK’s Engineering and Physical Sciences Research Council, has decided it needs further consultation with “stakeholders”, such as environmental organisations, before going ahead.

Atmospheric liposuction

Writing in The Guardian, activist and writer George Monbiot described the project as a “complete waste of time” and geoengineering as “atmospheric liposuction”. The ETC Group, an international technology watchdog, denounced the trial as an “unhelpful provocation” likely to undermine international climate negotiations.

The SPICE team are quick to point out that the test itself does not constitute geoengineering – they are simply testing a delivery mechanism for spraying particles, and observing the movements of the balloon. But although the process may be environmentally benign, it is proving somewhat toxic socially.

It may be tempting to dismiss opposition to SPICE as anti-scientific, but there are good reasons to scratch beneath the surface of these views. Experience with previous science-society controversies such as the genetically modified crops debate or the continuing concerns over nanotechnologies shows that public mistrust is only partly based on scientific grounds. When members of the public are given the opportunity to discuss emerging areas of science, they often bring perspectives to the table that the scientific community may have missed.

Beyond questions about the safety or unintended side effects of geoengineering, there are deeper issues outside the remit of a purely scientific investigation. Is the intentional manipulation of the climate acceptable in principle? Will geoengineering technologies be likely to cause international conflict? Whose voices will be represented in decision-making about research and deployment? These are questions that should not be restricted to scientific or political elites, and opposition to seemingly benign scientific tests must be seen against this backdrop. What the SPICE test represents is just as important as its physical effect.

Most people are not experts on the science of geoengineering, but we can all claim to have some expertise in making moral and social judgements. Opinions that may appear irrational from a scientific perspective often seem less so when considered through a social, political or ethical lens. Incorporating these broader public views into decision-making about research and development is essential when the stakes are so high and uncertainty so great.

Ideology and technology

Research by Dan Kahan and colleagues at Yale University has repeatedly demonstrated that public attitudes about science are coloured by an ideological filter. Risk perceptions are not simply a matter of weighing up the pros and cons of a particular technology: they are also judgements about the role of science in society. If you are generally predisposed – for entirely legitimate reasons – to be sceptical about the value of grand, high-tech solutions to societal problems, and dubious about the capacity of governments and industry to regulate them, then the SPICE test is likely to set alarm bells ringing.

This position is not irrational in any meaningful sense of the word. Whether or not to deploy geoengineering is clearly a value judgement. But so is the decision about whether to conduct research into it at all. Moreover, there are good reasons for taking these broader perspectives seriously. The lesson for scientists from the controversy over GM crop trials is that ignoring seemingly unscientific opposition is counterproductive. Public engagement should not be considered as an opportunity to “sell” new technologies. When the public perceives this to be the case, opposition is likely to harden.

Incorporating public perspectives is also unlikely to deliver easy answers. In research conducted recently by the Understanding Risk group at Cardiff University, UK, members of the public expressed a range of views about SPICE. Most were willing to entertain the notion that the test should be pursued as a research opportunity. But very few were unconditionally positive about either the idea of solar radiation management or the test itself.

When such a critical issue is at stake it is perhaps inevitable that debates will be characterised by hyperbole and inflammatory rhetoric. But it is vital to remember that public attitudes towards science are not simply read off from scientific risk assessments. The SPICE test is a deeply symbolic development, and opposition towards it must be understood in this context.

Adam Corner is a research associate in the School of Psychology at Cardiff University, UK, focusing on the communication of climate change and public engagement with emerging technologies such as geoengineering.