Category Archives: FUTURISM

World Bank issues SOS for oceans, backs alliance

NewsDaily: World Bank issues SOS for oceans, backs alliance.

 

By David Fogarty | Posted 2012/02/24 at 12:41 am EST

SINGAPORE, Feb. 24, 2012 (Reuters) — The World Bank announced on Friday a global alliance to better manage and protect the world’s oceans, which are under threat from over-fishing, pollution and climate change.




Oceans are the lifeblood of the planet and the global economy, World Bank President Robert Zoellick told a conference on ocean conservation in Singapore. Yet the seas have become overexploited, coastlines badly degraded and reefs under threat from pollution and rising temperatures.

“We need a new SOS: Save Our Seas,” Zoellick said in announcing the alliance.

The partnership would bring together countries, scientific centers, non-governmental groups, international organizations, foundations and the private sector, he said.

The World Bank could help guide the effort by bringing together existing global ocean conservation programs and by supporting efforts to mobilize finance and develop market mechanisms that place a value on the benefits oceans provide.

Millions of people rely on oceans for jobs and food, and that dependence will grow as the world’s population heads toward 9 billion, underscoring the need to manage the seas better.

Zoellick said the alliance was initially committed to mobilizing at least $300 million in finance.

“Working with governments, the scientific community, civil society organizations, and the private sector, we aim to leverage as much as $1.2 billion to support healthy and sustainable oceans.”

FISH STOCKS

A key focus was understanding the full value of the oceans’ wealth and ecosystem services. Oceans are the top source of oxygen and help regulate the climate, while mangroves, reefs and wetlands are critical to protecting increasingly populous coastal areas against hazards such as storms — benefits that are largely taken for granted.

“Whatever the resource, it is impossible to evolve a plan to manage and grow the resource without knowing its value,” he said.

Another aim was to rebuild at least half the world’s fish stocks identified as depleted. About 85 percent of ocean fisheries are fully exploited, over-exploited or depleted.

“We should increase the annual net benefits of fisheries to between $20 billion and $30 billion. We estimate that global fisheries currently run a net economic loss of about $5 billion per year,” he said.

Participants at the conference spoke of the long-term dividends from ocean conservation and better management of marine resources. But realizing them, they said, requires economists, bankers and boardrooms to place a value on the oceans’ “natural capital”.

“The key to the success of this partnership will be new market mechanisms that value natural capital and can attract private finance,” Abyd Karmali, global head of carbon markets at Bank of America Merrill Lynch, told Reuters.

He pointed to the value in preserving carbon-rich mangrove forests and seagrass beds and the possibility of earning carbon offsets for projects that conserve these areas.

“The oceans’ stock is in trouble. We have diminished its asset value to a huge degree and poor asset management is poor economics,” Stephen Palumbi, director of the Hopkins Marine Station, Stanford University, told the conference.

(Editing by Robert Birsel)

How to make green steel

CultureLab: How to make steel go green – with songs!

Michael Marshall, environment reporter


This is something you don’t see every day: a substantial, carefully-researched book on how to reform our manufacturing industries, paired with an album of songs on the same theme.

Let’s start with the book. Sustainable Materials: With Both Eyes Open tackles a particularly thorny question: how can we cut our greenhouse gas emissions to a safe level, without shutting down essential industries? It focuses on steel and aluminium, which between them account for 28 per cent of all industrial emissions, although later chapters briefly consider cement, paper and plastics as well.

This is a follow-up book to David MacKay’s much-vaunted Sustainable Energy – Without the Hot Air. Both feature academics from the University of Cambridge carefully working out how we can transform an emissions-heavy sector of the economy.

The eight authors, led by Julian Allwood and Jonathan Cullen, first take a close look at how steel and aluminium are produced from their respective ores, asking “how much can the metals industry do to clean up its act?” The answer they come up with: “plenty, but nowhere near enough”.

So they take a second approach, asking whether we can redesign the things we make to use less metal, use them for longer, and recycle their components when they wear out. This also offers plenty of options. Reassuringly, when the two approaches are combined the total emissions cuts are substantial.

 

Some of the ideas they come up with are so simple, I wondered why no one thought of them before. For instance, the average fridge lasts about 10 years, and gets thrown out when the compressor fails. This is a small part, but it takes so much work to replace that it’s usually cheaper to buy a new fridge. If fridges were redesigned so that the compressor was easy to replace, they would last far longer. “You shouldn’t have to buy two fridges in your lifetime,” they say.

Of course, this is another example of a solution for climate change that involves huge numbers of people taking concerted action. The problem is people’s disinclination to get off their backsides.

It’s quite a technical book, so it may not have much popular appeal, despite its nicely chatty style. But for policy-makers trying to cut emissions, and anyone in manufacturing, it should be required reading.

And so to the album, a collaboration between Allwood and soprano Adey Grummet, which is much better than it has any right to be. Worthy music on eco-conscious themes can sound like Spinal Tap’s Listen to the Flower People, but With Both Eyes Open actually contains a couple of good tunes.

The strongest songs get away from the details of materials science and become universal. The opening track, You Gotta Start, is an up-tempo number extolling the virtues of having a go, even when you don’t know exactly what you need to do. It’s not just about sustainability.

Similarly, the title track is a passionate call to arms, urging people to move away from blind consumerism. The closing line – “the stuff of life is life and not just stuff” – is better and more relevant than anything Coldplay will write next year.

Given how specialist the subject matter is, I’m not sure how many people the album will really appeal to. Of the 12 songs, I only expect to keep the two I’ve highlighted on my MP3 player. Unfortunately, the rest just restate ideas from the book in a slightly less clear way.

I worry that the album will give people, particularly policy-makers, the impression that the book is somehow flaky and not worth paying attention to. That would be a crying shame, because the book’s lessons are clear, well-supported, and vital.

Book information
Sustainable Materials: With Both Eyes Open
by Julian Allwood and Jonathan Cullen
UIT Cambridge
Free online or £31.82

High-Tech Hydroponic Farm Transforms Abandoned Bowling Alley

High-Tech Hydroponic Farm Transforms Abandoned Bowling Alley | Wired Science | Wired.com.

Gotham Greens

NEW YORK CITY — On top of an old bowling alley in industrial northern Brooklyn sits an expansive translucent greenhouse. Inside, a bounty of produce thrives under the supervision of a computer-controlled network of sensors, motors and plumbing.

The 15,000-square-foot hydroponic greenhouse facility, called Gotham Greens, is reputedly the first commercial-scale urban operation of its kind in the United States. Thousands of lettuce and basil seedlings were plopped into a soil-less farming system in May. Since then, three local entrepreneurs say their operation is on track to deliver 100 tons of produce by the one-year mark.

While that pales in comparison to the roughly 1.5 million tons of produce trucked into the city each year, and is far less than the output of nearby soil-based rooftop farms, the $2 million startup can’t keep up with demand from the city’s top chefs and upscale grocery stores.

“On the first harvest day we had so much lettuce we almost didn’t know what to do with it all, but now we can’t grow it fast enough,” said greenhouse director Jennifer Nelkin.

Gotham Greens is already eyeing some of the city’s more than 940 million square feet of rooftop space to expand its high-tech operation.

The hardest task, said co-founder and CEO Viraj Puri, is convincing landlords to entertain the idea of putting a watery business on their rooftops. After that, it’s a matter of navigating zoning restrictions and building codes, and figuring out how to engineer the plumbing.

“You can’t bury anything on a roof,” Puri said. “It requires some clever technology.”

Images: 1) Inside Gotham Greens at 810 Humboldt St. in Greenpoint, Brooklyn. (Dave Mosher/Wired.com) 2) The outside of the greenhouse. (Copyright of Gotham Greens)

 

seedlings in basalt rock plugs

 

Plant Plugs

The greenhouse begins its work by germinating seeds of four lettuce types and one basil variety in plastic bins. Fibrous plugs, spun from a volcanic rock called basalt, draw water to the fledgling roots and provide a medium for them to grow in.

“Not just hydroponic growers use the plugs,” Nelkin said. “A lot of farmers try and get a head-start at their soil farms, usually about six weeks before they can plant, by germinating seedlings in them.”

Images: Dave Mosher/Wired.com

 

irrigation system

 

Irrigating Gutters

Between 10 and 14 days after planting in the basalt plugs (above), the seeds sprout into seedlings and are ushered into hydroponic gutters (below). A series of pumps and drains constantly moves nutrient-rich water through the gutters. Gotham Greens uses the nutrient film technique, which circulates a very shallow layer of water to supply roots with ample oxygen. Tight management of nutrients and climate gives growers extreme control over their products.

“Many [hydroponic] tomatoes in the store, for example, taste like a swimming pool — and it’s too bad they’re giving hydroponics a bad name. It’s indicative of the grower, not the method,” Nelkin said. “You can really manipulate produce with hydroponics. You can choose to grow a tasteless tomato full of water, or grow the best, sweetest, juiciest tomato you’ve ever had.”

Images: Dave Mosher/Wired.com

 

calcium iron micronutrient bucket

 

Produce Juice

Plants primarily need water, carbon dioxide, oxygen, nitrogen, phosphorus and sulfur to grow. Most of these materials come from air and tap water, but some trace nutrients need to be supplemented.

Gotham Greens stores nutrient mixes in giant buckets (above). When low levels are detected by sensors or in human-collected samples, computer-controlled pumps move the fluid into a nearby lagoon. (For proprietary reasons, Wired.com was only allowed to photograph this part of the system.) From there the solution is delivered to the gutters, and runoff returns to the lagoon for recycling.

The irrigation system is less complex than those of other hydroponic greenhouses, but Nelkin said anything more would be cost-prohibitive. “For our scale, it doesn’t make practical sense to micromanage every nutrient,” she said. “We’re not that large.” Their water comes from the tap.
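
For readers curious how such a dosing loop might work, here is a minimal sketch in Python. It is purely illustrative: the nutrient names, targets and threshold are invented, and it is not Gotham Greens’ actual control code.

```python
# Hypothetical sketch of a sensor-driven dosing loop like the one described
# above. Nutrient names, targets and the threshold are invented; this is
# not Gotham Greens' actual control code.

NUTRIENT_TARGETS_PPM = {"nitrogen": 150, "phosphorus": 50, "iron": 3.0}
LOW_FRACTION = 0.85  # dose when a reading falls below 85% of its target

class LoggingPump:
    """Stand-in for a computer-controlled dosing pump."""
    def run(self, nutrient: str, amount_ppm: float) -> None:
        print(f"dosing {nutrient}: +{amount_ppm:.1f} ppm into the lagoon")

def dose_if_low(readings_ppm: dict, pump: LoggingPump) -> None:
    """Compare each sensor reading to its target and top up what is low."""
    for nutrient, target in NUTRIENT_TARGETS_PPM.items():
        level = readings_ppm.get(nutrient)
        if level is not None and level < target * LOW_FRACTION:
            pump.run(nutrient, amount_ppm=target - level)

# One pass of the loop with made-up sensor readings:
dose_if_low({"nitrogen": 120, "phosphorus": 55, "iron": 2.4}, LoggingPump())
```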

Image: Dave Mosher/Wired.com

 

weather station

 

Climate Control

Important to Gotham Greens’ farming efficiency is a computer controller that monitors environmental conditions, keeping the greenhouse climate as ideal as possible for each type of veggie.

A weather station (above) monitors outdoor conditions while a photometer (below) and other sensors help keep tabs inside.

“When it hits a certain climate, [the controller] can turn on the fans, draw the sun shades, open vents, turn on the lights, turn on the heaters and so on,” Nelkin said.
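
A toy version of the kind of threshold logic Nelkin describes might look like the sketch below. The setpoints are invented for illustration; the real controller is a commercial greenhouse system, not this code.

```python
# Hypothetical threshold rules in the spirit of the controller described
# above: monitor conditions, then switch fans, shades, vents, lights and
# heaters. All setpoints are invented.

def control_step(temp_c: float, light_lux: float, crop: str) -> dict:
    """Decide which actuators to switch on for the current conditions."""
    low, high = {"lettuce": (16, 21), "basil": (20, 27)}[crop]  # assumed °C bands
    return {
        "fans": temp_c > high,
        "vents": temp_c > high,
        "sun_shades": light_lux > 60_000,  # shade on very bright days
        "lights": light_lux < 10_000,      # supplement on dim days
        "heaters": temp_c < low,
    }

# A hot, bright afternoon over the lettuce gutters:
print(control_step(temp_c=29.0, light_lux=72_000, crop="lettuce"))
# fans, vents and sun shades switch on; lights and heaters stay off
```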

Images: Dave Mosher/Wired.com

 

fly paper

 

Shoo, Fly

But what of the pests found in abundance near any unprotected plant? Gotham Greens doesn’t use pesticides. They fight fire with bug-eating fire.

Colored plastic cards covered with sticky goo attract the pests, which Nelkin and others check each day. When a bothersome bug is identified, Nelkin shops online for its predator, orders a batch and releases hordes of them in the greenhouse.

For aphids (above), a tray of ladybugs (below) usually does the trick.

“About 1,000 ladybugs costs probably $20,” Nelkin said. “It’s more expensive than pesticides, but it works.”

Controlling other pests requires the introduction of predatory wasps.

Images: Dave Mosher/Wired.com

 

solar panels

 

Green Power

Intelligent, organic greenhouses require electricity for lamps, pumps, computers and more. Solar panels installed by Gotham Greens satisfy about half of the facility’s needs, roughly enough to power 12 New York City households.

“Over the summer, we generated a good amount of power,” Puri said. “We’re going to track that data so we’ll be able to say exactly how much they help per year.”

Image: Dave Mosher/Wired.com

 

Jennifer Nelkin and Viraj Puri

 

Co-founders

Jennifer Nelkin (above, left) and Viraj Puri dreamed up Gotham Greens in 2008 after collaborating on a greenhouse that floated on the Hudson River. They now spend all of their time developing Gotham Greens along with co-founder Eric Haley.

Their local food business may cut carbon emissions more than most farms do by minimizing transportation and relying on solar energy, but Puri said their main focus is delivering good produce. “The green aspects are a great bonus, but we want to be known for the quality of our products,” he said.

After Gotham Greens’ trial year ends in 2012, the company hopes to expand its line of crops to include tomatoes, cucumbers, peppers, squash, strawberries and even eggplant.

Images: Dave Mosher/Wired.com

20 Ways to Build a Cleaner, Healthier, Smarter World

World Changing Ideas: 20 Ways to Build a Cleaner, Healthier, Smarter World: Scientific American.

What would happen if solar panels were free? What if it were possible to know everything about the world—not the Internet, but the living, physical world—in real time? What if doctors could forecast a disease years before it strikes? This is the promise of the World Changing Idea: a vision so simple yet so ambitious that its full impact is impossible to predict. Scientific American’s editorial and advisory boards have chosen projects in five general categories—Energy, Transportation, Environment, Electronics and Robotics, and Health and Medicine—that highlight the power of science and technology to improve the world. Some are in use now; others are emerging from the lab. But all of them show that innovation is the most promising elixir for what ails us.  —The Editors

The No-Money-Down Solar Plan
A new wave of start-ups wants to install rooftop solar panels on your house. Upfront cost: nothing
By Christopher Mims

The biggest thing stopping the sun is money. Installing a rooftop array of solar panels large enough to produce all of the energy required by a building is the equivalent of prepaying its electricity bill for the next seven to 10 years—and that’s after federal and state incentives. A new innovation in financing, however, has opened up an additional possibility for homeowners who want to reduce their carbon footprint and lower their electric bills: get the panels for free, then pay for the power as you go.

The system works something like a home mortgage. Organizations and individuals looking for a steady return on their investment, typically banks or municipal bond holders, use a pool of cash to pay for the solar panels. Directly or indirectly, homeowners buy the electricity produced by their own rooftop at a rate that is less, per kilowatt-hour, than they would pay for electricity from the grid. Investors get a safe investment—the latest generation of solar-panel technology works dependably for years—and homeowners get a break on their monthly bills, not to mention the satisfaction of significantly reducing their carbon footprint. “This is a way to get solar without putting any money down and to start saving money from day one. That’s a first,” says SolarCity co-founder Peter Rive.

SolarCity is the largest installer of household solar panels to have adopted this strategy. Founded in 2006 by two brothers who are also Silicon Valley–based serial entrepreneurs, SolarCity leases its panels to homeowners but gives the electricity away for free. The net effect is a much reduced utility bill (customers still need utility-delivered power when the sun isn’t out) plus a monthly SolarCity bill. The total for both comes out to less than the old bill. SunRun in San Francisco offers consumers a similar package, except that the company sells customers the electricity instead of leasing them the panels.

Cities such as Berkeley and Boulder are pioneering their own version of solar-panel financing by loaning individuals the entire amount required to pay for solar panels and installation. The project is paid for by municipal bonds, and the homeowner pays back the loan over 20 years as a part of the property tax bill. The effect is the same whichever route a consumer takes: the new obligation, in the form of taxes, a lease or a long-term contract for electricity, ends up costing less than the existing utility bill.
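
A back-of-envelope comparison shows why the arithmetic can work. All the inputs below are assumptions chosen for illustration (system cost, bond rate, bills), not figures from Berkeley’s actual program:

```python
# Back-of-envelope version of the Berkeley-style deal: a 20-year loan for
# the panels, repaid through property taxes, versus the old utility bill.
# System cost, rate and bills are assumptions, not program figures.

def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard amortized loan payment: P * r / (1 - (1 + r)^-n)."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

system_cost = 15_000   # assumed installed cost after incentives, USD
loan = monthly_payment(system_cost, annual_rate=0.05, years=20)
old_bill = 160         # assumed monthly utility bill before solar, USD
residual_bill = 30     # assumed grid power still needed after dark, USD

print(f"loan via property taxes: ${loan:.0f}/mo")
print(f"new total: ${loan + residual_bill:.0f}/mo versus old ${old_bill}/mo")
```

Under these assumptions the homeowner pays roughly $99 a month on the loan plus about $30 for residual grid power, against the old $160 bill.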

“What we’re really seeing is a transition in how we think about buying energy goods and services,” says Daniel M. Kammen, director of the Renewable and Appropriate Energy Laboratory at the University of California, Berkeley. Kammen, who did the initial analysis on Berkeley’s financing model, believes that by turning to financing, consumers can overcome the inherent disadvantage renewables have when compared with existing energy sources: the infrastructure for power from the grid has already been paid for and, in many cases, has been subsidized for decades.

All three approaches are rapidly expanding across the country. Although the Berkeley program is less than two years old, 10 states have already passed legislation allowing their cities to set up Berkeley-style bond-financed loan programs. If the Waxman-Markey climate bill passes, the option for cities to set up such programs would become federal law. SunEdison in Maryland is currently active in nine states. SolarCity, which has more than 4,000 customers, is active in California, Arizona and Oregon and has promised to announce additional states after the new year.

Right now it is not possible to lower the overall cost of rooftop solar to “grid parity,” that is, to the same price as electricity from local utility companies, without federal subsidies such as the investment tax credit, which lowers the tax bill of banks financing these projects. Those subsidies, which amount to 30 percent of the cost of a solar installation, are guaranteed for at least eight years. By then, SolarCity and its competitors claim they won’t need them.

“Grid parity is driven by multiple factors,” says Attila Toth, vice president of marketing at SunEdison, including the cost of capital, the cost of panels and their installation, and the intensity of sunlight in a given region. “It will occur in different states at different times, but, for example, we expect that California will be one of the first states in the U.S. to get to grid parity, sometime between three and five years from now.”

While the cost of electricity from fossil fuels has increased 3 to 5 percent a year for the past decade, the cost of solar panels has fallen on average 20 percent for every doubling of their installed base. Grid parity is where these trend lines cross—after that, solar has the potential to power more than just homes. It’s hardly a coincidence that Elon Musk, head of electric car company Tesla Motors, sits on SolarCity’s board of directors.
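
Those trend lines can be sketched numerically. The starting prices and the two-year doubling time below are assumptions, chosen only to show how the crossover emerges:

```python
# Rough projection of the two trend lines above. Starting prices and the
# doubling time for installed capacity are assumptions for illustration.

grid_price = 0.12       # assumed $/kWh from the utility today
solar_cost = 0.20       # assumed effective $/kWh from new rooftop solar
doubling_years = 2.0    # assumed time for the installed base to double

years = 0
while solar_cost > grid_price:
    years += 1
    grid_price *= 1.04                          # midpoint of 3-5%/yr rise
    solar_cost *= 0.80 ** (1 / doubling_years)  # 20% drop per doubling
print(f"grid parity in about {years} years under these assumptions")
```

With these inputs the lines cross in about four years, consistent with the three-to-five-year estimate for California quoted above.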

More Ideas to watch
by Christopher Mims

The Gasoline Garden
It is the next step for biofuels: genetically engineered plant life that produces hydrocarbons as a by-product of its normal metabolism. The result will be fuel—common gasoline, even—using nothing but sunlight and CO2. In July, Exxon Mobil announced plans to spend more than $600 million in pursuit of algae that can accomplish the task. Joule Biotechnologies claims to have already succeeded, although the company has yet to reveal any details of its proprietary system.

Hot Nukes
Uranium and plutonium are not the only fuels that can power a nuclear reactor. With an initial kick from more traditional fissile materials, thorium can set up a self-sustaining “breeder” reaction that produces uranium 233, which is well suited to nuclear power generation. The process has the added benefit of being resistant to nuclear proliferation, because its end products emit enough gamma rays to make the fuel dangerous to handle and easy to track.

Save Energy with Information
Studies show that simply making customers aware of their energy use lowers it by 5 to 15 percent. Smart meters allow customers to track their energy consumption minute by minute and appliance by appliance. Countless start-ups are offering the devices, and Google and Microsoft are independently partnering with local utilities to allow individuals to monitor their power usage over the Web.

Wind Power from the Stratosphere
According to a Stanford University study released in July, the high-altitude winds that constantly blow tens of thousands of feet above the earth hold enough energy to supply all of human civilization 100 times over. California’s Sky WindPower has proposed harvesting this energy by building fleets of giant, airborne, ground-tethered windmills, while Italy’s Kite Gen proposes to accomplish the same feat using kites.

Delivering the U.S. from Oil
Plug-in hybrid trucks are improving the long view of the short haul
By Amanda Schupak

Cargo trucks gulp about 40 percent of the fuel pumped in the U.S. While most attention focuses on improving the fuel economy of passenger cars, a major opportunity goes rumbling by. “Folks do not realize that the fuel use of even a small truck is equal to many, many cars,” says Bill Van Amburg, senior vice president of Calstart, a clean transportation technology nonprofit, and director of the Hybrid Truck Users Forum. “A utility truck as a hybrid would reduce more petroleum than nine Priuses.”

Some 1,300 commercial hybrids on the road today get up to twice the fuel efficiency of their conventional counterparts. But these traditional hybrids are inherently limited. They make more efficient use of petroleum-based fuel by capturing some of the energy lost during braking.

Plug-in hybrids, on the other hand, draw energy from the grid. They can drive for miles—in many cases, an entire day’s route—without using any fossil fuel at all. This shifts energy demand away from petroleum and toward grid-based sources. (Last year zero-carbon renewables and nuclear supplied 30 percent of all electric power in the U.S.)

In many ways, plug-in hybrid technology makes more sense for delivery trucks than for consumer sedans. A cargo truck runs a short daily route that includes many stops to aid in regenerative braking. Most of the U.S. Postal Service’s 200,000-plus mail trucks, for example, travel fewer than 20 miles a day. In addition, fleet vehicles return nightly to storage lots that have ready access to the 120- or 240-volt outlets required to charge them.

The Department of Energy recently launched the nation’s largest commercial plug-in hybrid program, a $45.4-million project to get 378 medium-duty vehicles on the road in early 2011. The trucks, which will go to 50 municipal and utility fleets, will feature a power system from Eaton, a large manufacturer of electrical components, on a Ford F-550 chassis. (For its part, Ford will wait for the market to prove itself before designing its own commercial plug-ins.) “These are going to start breaking free in 2011,” says Paul Scott, president of the Electric Vehicle Association of Southern California.

Start-up company Bright Automotive has a more ambitious plan. It aims to replace at least 50,000 trucks with plug-in hybrids by 2014. Bright’s IDEA prototype travels 40 miles on battery power before switching to a four-cylinder engine that gets 40 miles to the gallon. The streamlined aluminum body has the payload of a postal truck yet is far more aerodynamic. The truck weighs as much as a midsize sedan.

John E. Waters, Bright Automotive’s founder and the former developer of the battery system for General Motors’ groundbreaking EV1 electric car, says that each IDEA would save 1,500 gallons of fuel and 16 tons of carbon dioxide emissions a year over a standard utility truck. Waters says he is ready to begin assembly in his U.S. plant once a pending $450-million federal loan comes through.

Despite the appeal of the carbon savings, the fleet owners who are the trucks’ primary customers have more practical considerations. Bright’s executives are coy about the IDEA’s eventual price tag but assert that a customer with 2,000 trucks driving 80 miles a day five days a week could save $7.2 million a year. Right now that is probably not enough to justify large-scale purchases without additional rebates—or a price on carbon. Van Amburg estimates that going hybrid currently adds $30,000 to $50,000 in upfront costs per vehicle, although that figure should come down as production volumes increase.
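
Those numbers can be checked with simple arithmetic. The IDEA’s 40 electric miles and 40 mpg come from the article; the conventional truck’s fuel economy and the fuel price below are assumptions:

```python
# Checking the fleet claim: 2,000 trucks, 80 miles a day, five days a week.
# Conventional mpg and fuel price are assumed; IDEA figures are from above.

trucks = 2_000
annual_miles = trucks * 80 * 5 * 52     # 41.6 million fleet miles a year

conventional_mpg = 10.0                 # assumed for a standard utility truck
fuel_price = 2.00                       # assumed $/gallon

gas_miles = annual_miles / 2            # first 40 of 80 daily miles on battery
idea_gallons = gas_miles / 40.0         # engine mode gets 40 mpg
conventional_gallons = annual_miles / conventional_mpg

saved = conventional_gallons - idea_gallons
print(f"fuel saved: {saved:,.0f} gal, about ${saved * fuel_price / 1e6:.1f}M a year")
```

At those assumed figures the savings land near the $7.2 million a year the company quotes; the conventional truck’s 10 mpg is the big unknown.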

Improved battery technology will also help. Today the IDEA’s 13-kilowatt-hour lithium-ion battery pack accounts for nearly a quarter of the vehicle’s total cost. Much of the research being done for the batteries going into the Chevy Volt and other consumer plug-ins should also be applicable to commercial batteries. “For all the good we all want to do,” says David Lauzun, Bright’s vice president of product development, “these vehicles will not take over the world until it becomes the economic choice—‘I have to have them because it saves me money.’”
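
The pack size also suggests why the ordinary depot outlets mentioned earlier suffice. A rough charge-time estimate, with circuit currents and charger efficiency that are assumptions only:

```python
# Rough charge-time check for the 13 kWh pack on 120- and 240-volt depot
# outlets. Circuit currents and charger efficiency are assumptions.

pack_kwh = 13.0
efficiency = 0.88  # assumed charger efficiency

for volts, amps in ((120, 12), (240, 30)):
    kw = volts * amps / 1000
    hours = pack_kwh / (kw * efficiency)
    print(f"{volts} V at {amps} A: about {hours:.1f} h to full")
# 120 V gives a roughly 10-hour (overnight) charge; 240 V, about 2 hours.
```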

Bus Rapid Transit
Subwaylike bus lines mobilize the urban future
By Michael Moyer

For the first time in human civilization, more people now live in urban areas than in the countryside. This shift creates a number of dilemmas, not least of which is how to move people within the world’s rapidly growing metropolises. Pollution and traffic point away from car-based options, while light-rail systems are slow to construct and prohibitively expensive. One disarmingly simple—and cheap—possibility is Bus Rapid Transit, which is engineered to operate like a subway on wheels. In these systems, concrete dividers on existing roads separate high-capacity buses from the rest of traffic. Riders pay before boarding, then wait in enclosed stations. When a bus arrives, sliding partitions open to allow riders to board from a platform that is level with the bus floor. The traffic-free thoroughfares, quick boarding times, and modern, comfortable stations resemble light-rail systems more than the chaos of typical bus travel. In Bogotá, Colombia, which has had seven Bus Rapid Transit lines in operation since 2001, the buses handle 1.6 million trips a day. Its success has allowed the city to remove 7,000 private buses from its streets, reducing consumption of bus fuel and its associated pollution by more than 59 percent.

Ocean Overhaul
Marine zoning is a bold remedy for sick seas
By Sarah Simpson

These days not even many politicians deny that the oceans are ill. Protecting the health of coastal waters is now a matter of national policy in dozens of countries, including the U.S., and world leaders are beginning to prescribe a revolutionary remedy that conservationists have been promoting for years: marine planning and zoning.

The idea is a natural extension of management policies that have guided the development of cities and landscapes for nearly a century. Porn shops aren’t next to preschools, after all, and drilling rigs aren’t the centerpieces of national parks. Similarly, zoning advocates envision a mosaic of regional maps in which every watery space on the planet is designated for a particular purpose. Drilling and mining would be allowed only in certain parts of the ocean; fishing in others. The most critically threatened areas would be virtually off-limits.

Whereas people can easily find maps telling them what they can do where on land, the marine realm is a hodgepodge of rules emanating from an army of agencies, each one managing a single use or symptom. In the U.S., for example, one body regulates commercial fishing, usually a single species at a time. Another group manages toxic substances, still another seabed mining, and so on—some 20 federal agencies in all. They tend to make decisions without regard to what the others are doing, explains Duke University marine ecologist Larry B. Crowder. “Imagine all of the medical specialists visiting a patient in intensive care one at a time and never talking to one another,” he says. “It’s a wonder that the oceans aren’t in worse shape than they are now.”

Ocean advocates such as Crowder eagerly await the final recommendations of a special task force President Barack Obama charged with presenting a plan for overhauling management of U.S. waters, which extend 200 nautical miles offshore. The scope of such an undertaking is huge: the U.S. controls 4.4 million square miles of seascape, making the country’s underwater real estate 25 percent larger than its landmass. The committee’s preliminary report, released in September, suggests that the best way to minimize harmful human impacts on the oceans is to manage regions rather than symptoms.

Many environmentalists are hopeful that such plans will be implemented through the marine equivalent of municipal zoning, which would give them some influence in areas where they now have none. In zones where conservation is designated as the dominant activity, fishing and industrial activities such as mining would no longer have free rein. Under current rules, about the only way a conservation group can block a project it deems harmful—say, a new site for offshore drilling—is through expensive litigation.

So far, though, the president’s task force has been careful not to suggest that ocean zoning will be the only treatment plan, in great part because any effort to restrict commercial interests is bound to meet stiff opposition. “Zoning isn’t anybody’s favorite exercise,” notes John C. Ogden, director of the Florida Institute of Oceanography at the University of South Florida at Tampa. “Someone’s ox is always getting gored.” Most resistant to such change will most likely be the traditional users of the open ocean—namely, commercial fisheries and the petroleum industry. “They’ve had the place to themselves for a long time,” Ogden says.

Ogden and others are quick to point out, however, that zoning practices can benefit commerce as much as conservation. By giving up access to certain areas, industries gain the security of knowing their activities would be licensed in a more predictable and less costly manner than they are today, explains Josh Eagle, associate professor at the University of South Carolina School of Law. Now an oil company can apply for permits to drill virtually anywhere, but it takes on a significant financial risk each time. The business may dump millions of dollars into researching a new facility only to have a lawsuit derail it at the last moment. When opposing parties have more or less equal voices early in the planning process, Eagle says, they are less inclined to block one another’s activities once zones are drawn on a map.

Whether the final report of the president’s task force will promote ocean zoning explicitly is uncertain. But the group has already promised to overhaul the structure of ocean governance by proposing the creation of a National Ocean Council, whose job it will be to coordinate efforts of the myriad federal agencies now in charge.

The move comes just in time. Just as society is beginning to appreciate the enormous efforts it will take to preserve the health of the oceans, it must ask more of them—more energy, more food, and better resilience to coastal development and climate change. The reason the oceans are in trouble is not what people put in and take out. It is a failure of governments to manage these activities properly. Says Crowder: “We have to treat the oceans holistically, not one symptom at a time.”

The Power of Garbage
Trapped lightning could help zap trash and generate electricity
By John Pavlus

Trash is loaded with the energy trapped in its chemical bonds. Plasma gasification, a technology that has been in development for decades, could finally be ready to extract it.

In theory, the process is simple. Torches pass an electric current through a gas (often ordinary air) in a chamber to create a superheated plasma—an ionized gas with a temperature upward of 7,000 degrees Celsius, hotter than the surface of the sun. When this occurs naturally we call it lightning, and plasma gasification is literally lightning in a bottle: the plasma’s tremendous heat dissociates the molecular bonds of any garbage placed inside the chamber, converting organic compounds into syngas (a combination of carbon monoxide and hydrogen) and trapping everything else in an inert vitreous solid called slag. The syngas can be used as fuel in a turbine to generate electricity. It can also be used to create ethanol, methanol and biodiesel. The slag can be processed into materials suitable for use in construction.

In practice, the gasification idea has been unable to compete economically with traditional municipal waste processing. But the maturing technology has been coming down in cost, while energy prices have been on the rise. Now “the curves are finally crossing—it’s becoming cheaper to take the trash to a plasma plant than it is to dump it in a landfill,” says Louis Circeo, director of Plasma Research at the Georgia Tech Research Institute. Earlier this summer garbage-disposal giant Waste Management partnered with InEnTec, an Oregon-based start-up, to begin commercializing the latter’s plasma-gasification processes. And major pilot plants capable of processing 1,000 daily tons of trash or more are under development in Florida, Louisiana and California.

Plasma isn’t perfect. The toxic heavy metals sequestered in slag pass the Environmental Protection Agency’s leachability standards (and have been used in construction for years in Japan and France) but still give pause to communities considering building the plants. And although syngas-generated electricity has an undeniably smaller carbon footprint than coal—“For every ton of trash you process with plasma, you reduce the amount of CO2 going into the atmosphere by about two tons,” Circeo says—it is still a net contributor of greenhouse gases.

“It is too good to be true,” Circeo admits, “but the EPA has estimated that if all the municipal solid waste in the U.S. were processed with plasma to make electricity, we could produce between 5 and 8 percent of our total electrical needs—equivalent to about 25 nuclear power plants or all of our current hydropower output.” With the U.S. expected to generate a million tons of garbage every day by 2020, using plasma to reclaim some of that energy could be too important to pass up.
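
That 5 to 8 percent figure survives a quick sanity check. The net electric output per ton of waste is the key assumption here; the 0.6 MWh per ton below is used purely for illustration, not an EPA number:

```python
# Sanity check on the 5-8 percent estimate, under stated assumptions.

tons_per_day = 1_000_000   # projected U.S. garbage by 2020, per the article
mwh_per_ton = 0.6          # assumed net output after the plant's own loads
us_demand_twh = 4_000      # rough annual U.S. electricity consumption

annual_twh = tons_per_day * 365 * mwh_per_ton / 1e6
share = 100 * annual_twh / us_demand_twh
print(f"{annual_twh:.0f} TWh/yr, about {share:.1f}% of U.S. demand")
# ~219 TWh, about 5.5%: inside the quoted 5-8 percent range
```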

More Ideas to watch
By John Pavlus

Cement as a Carbon Sponge
Traditional cement production creates at least 5 percent of global carbon dioxide emissions, but new materials could create carbon-neutral cement. Start-up Novacem, supported by Imperial College London, uses magnesium oxide to make cement that naturally absorbs CO2 as it hardens. California-based Calera uses seawater to sequester carbon emissions from a nearby power plant in cement.

The New Honeybee
Colony collapse disorder (CCD) has killed more than a third of honeybee colonies since 2006. Farmers who depend on bees to pollinate such crops as almonds, peaches and apples are looking to the blue orchard bee to pick up the slack.

One efficient Osmia lignaria can pollinate as much territory as 50 honeybees, but the bees are harder to cultivate because of their solitary nature. These pinch hitters won’t completely replace honeybees, but as scientists continue to grapple with CCD, they could act as an agricultural safety net.

Saltwater Crops
As the world’s freshwater supply becomes scarcer and food production needs balloon, salt-tolerant crops could ease the burden. Researchers at Australia’s University of Adelaide used genetic engineering to enhance a model crop’s natural ability to prevent saline buildup in its leaves, allowing the plant to thrive in conditions that would typically wither it. If the same gene tweak works in cereal crops such as rice and wheat—the researchers are testing them now—fallow lands destroyed by drought or overirrigation could become new breadbaskets.

The Omnipotence Machines
Tiny, ubiquitous sensors will allow us to index the physical world the way the Web maps cyberspace
By Gregory Mone

Earlier this year Hewlett-Packard announced the launch of its Central Nervous System for the Earth (CeNSE) project, a 10-year effort to embed up to a trillion pushpin-size sensors across the planet. Technologists say that the information gathered by this kind of ubiquitous sensing network could change our knowledge of the world as profoundly as the Internet has changed business. “People had no idea the Web was coming,” says technology forecaster Paul Saffo. “We are at that moment now with ubiquitous sensing. There is quite an astonishing revolution just around the corner.”

The spread of versatile sensors, or “motes,” and the ability of computers to analyze and either recommend or initiate responses to the data they generate, will not merely enhance our understanding of nature. It could lead to buildings that manage their own energy use, bridges that flag engineers when in need of repair, cars that track traffic patterns and detect potholes, and home security systems that distinguish between the footfalls of an intruder and the dog, to name a few.

CeNSE is the boldest project yet announced, but HP is not the only organization developing the technology to make ubiquitous sensing possible. Intel is also designing novel sensor packages, as are numerous university labs.

For all the momentum in the field, though, this sensor-filled future is by no means inevitable. These devices will need to generate rich, reliable data and be rugged enough to survive tough environments. The sensor packages themselves will be small, but the computing effort required will be enormous. All the information they gather will have to be transmitted, hosted on server farms, and analyzed. Finally, someone is going to have to pay for it all. “There is the fundamental question of economics,” notes computer scientist Deborah Estrin of the University of California, Los Angeles. “Every sensor is a nonzero cost. There is maintenance, power, keeping them calibrated. You don’t just strew them around.”

In fact, HP senior researcher Peter Hartwell acknowledges that for CeNSE to hit its goals, the sensors will need to be nearly free. That is one of the reasons why HP is designing a single, do-everything, pushpin-size package stacked with a variety of gauges—light, temperature, humidity, vibration and strain, among others—instead of a series of devices for different tasks. Hartwell says that focusing on one versatile device will drive up volume, reducing the cost for each unit, but it could also allow HP to serve several clients at once with the same sensors.

Consider his chief engineering project, an ultrasensitive accelerometer. Housed inside a chip, the sensor tracks the motion of a tiny, internal movable platform relative to the rest of the chip. It can measure changes in acceleration 1,000 times as accurately as the technology in the Nintendo Wii.

Hartwell imagines situating one of these pins every 16 feet along a highway. Thanks to the temperature, humidity and light sensors, the motes could serve as mini weather stations. But the accelerometers’ vibration data could also be analyzed to determine traffic conditions—roughly how many cars are moving past and how quickly. The local highway department would be interested in this information, he guesses, but there are potential consumer applications, too. “Your wireless company might want to take that information and tell you how to get to the airport the fastest,” Hartwell says.
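
A toy calculation shows how vibration timestamps from a line of motes could yield a speed estimate. The 16-foot spacing comes from Hartwell’s example; the data and the method are invented, not a published HP algorithm:

```python
# Toy illustration of the highway idea: motes 16 feet apart timestamp a
# passing vibration peak, and the gaps give a speed estimate.

SPACING_FT = 16  # mote spacing from Hartwell's example

def estimate_speed_mph(timestamps_s: list) -> float:
    """Speed from the time a vibration peak takes to travel mote to mote."""
    gaps = [b - a for a, b in zip(timestamps_s, timestamps_s[1:])]
    avg_gap = sum(gaps) / len(gaps)
    return (SPACING_FT / avg_gap) * 3600 / 5280  # ft/s converted to mph

# One car's vibration peak as seen by five consecutive motes:
print(f"{estimate_speed_mph([0.00, 0.18, 0.36, 0.55, 0.73]):.0f} mph")
```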

All of this gathering and transmission of data requires power, of course, and to guarantee an extended life, the HP pushpin will not rely solely on batteries. “It is going to have some sort of energy-scavenging ability,” Hartwell says. “Maybe a solar panel or a thermoelectric device to help keep the battery charged.”

With the power hurdle in mind, other groups are forgoing batteries altogether. At Intel Labs in Seattle, engineer Josh Smith has developed a sensor package that runs on wireless power. Like the HP pushpin, Intel’s WISP, or Wireless Identification and Sensing Platform, will include a variety of gauges, but it will also draw energy from the radio waves emitted by long-range radio-frequency ID chip readers. Smith says a single reader, plugged into a wall outlet, can already power and communicate with a network of prototype WISPs five to 10 feet away—a distance that should increase.

Smith cites many of the same infrastructure-related possibilities as Hartwell, along with a number of other uses. If WISPs were placed on standard household items such as cups, these tags could inform doctors about the rehabilitation progress of stroke victims. If the cups the patient normally uses remain stationary, Smith explains, then the individual probably is not up and moving around.

The potential applications for ubiquitous sensing are so broad—a physicist recently contacted him about using WISPs to monitor the temperature outside a proposed neutrino detector—that, as with the Internet, Smith says it is impossible to foresee them all. “In terms of the impact it is going to have on our lives,” Hartwell adds, “you haven’t seen anything yet.”

The Do-Anything Robot
Your PC can accomplish any computing task you ask of it. Why isn’t the same true for robots?
By Gregory Mone

Robots have proved to be valuable tools for soldiers, surgeons and homeowners hoping to keep the carpet clean. But in each case, they are designed and built specifically for the job. Now there is a movement under way to build multipurpose machines—robots that can navigate changing environments such as offices or living rooms and work with their hands.

All-purpose robots are not, of course, a new vision. “It’s been five or 10 years from happening for about 50 years,” says Eric Berger, co-director of the Personal Robotics Program at Willow Garage, a Silicon Valley start-up. The delay is in part because even simple tasks require a huge set of capabilities. For a robot to fetch a mug, for example, it needs to make sense of data gathered by a variety of sensors—laser scanners identifying potential obstacles, cameras searching for the target, force feedback in the fingers that grasp the mug, and more. Yet Berger and other experts are confident that real progress could be made in the next decade.

The problem, according to Willow Garage, is the lack of a common platform for all that computational effort. Instead of building on the capabilities of a single machine, everyone is designing robots, and the software to control them, from the ground up. To help change this, Willow Garage is currently producing 25 copies of its model PR2 (for “Personal Robot 2”), a two-armed, wheeled machine that can unplug an appliance, open doors and move through a room. Ten of the robots will stay in-house, but 10 more will go to outside research groups, and everyone will pool their advances. This way, Berger says, if you want to build the robotic equivalent of a Twitter, you won’t start by constructing a computer: “you build the thing that’s new.”

Pocket Translator
The military, short on linguists, is building smart phone–based devices to do the job
By Gregory Mone

Sakhr Software, a company that builds automatic language translators, recently unveiled a prototype smart phone application that transforms spoken English phrases into spoken Arabic, and vice versa, in near real time. The technology isn’t quite ready for your next trip to Cairo, but thanks to recent advances in machine-translation techniques, plus the advent of higher-fidelity microphones and increasing processing power in smart phones, this mobile technology could soon allow two people speaking different languages to have basic conversations.

Before the 1990s automatic translation meant programming in an endless list of linguistic rules, a technique that proved too labor-intensive and insufficiently accurate. Today’s leading programs—developed by BBN Technologies, IBM, Sakhr and others as part of a Defense Advanced Research Projects Agency effort to eliminate the military’s need for human translators—rely on machine-learning techniques instead. The software works from a database of parallel texts—for example, War and Peace in two different languages, translated United Nations speeches, and documents pulled off the Web. Algorithms identify short matching phrases across sources, and the software uses them to build statistical models that link English phrases to Arabic ones.
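
In miniature, the approach looks something like the sketch below: a table of phrase pairs mined from parallel text, matched greedily, longest phrase first. Real systems add reordering and language models; this tiny phrase table is invented for illustration:

```python
# A toy phrase-based translator in the spirit of the statistical approach
# described above. The phrase table is invented.

phrase_table = {  # English phrase -> Arabic phrase (romanized)
    "where is": "ayna",
    "the hospital": "al-mustashfa",
    "thank you": "shukran",
}

def translate(sentence: str) -> str:
    words, out, i = sentence.lower().split(), [], 0
    while i < len(words):
        for span in (3, 2, 1):  # prefer the longest match at position i
            chunk_words = words[i:i + span]
            chunk = " ".join(chunk_words)
            if len(chunk_words) == span and chunk in phrase_table:
                out.append(phrase_table[chunk])
                i += span
                break
        else:
            out.append(words[i])  # pass unknown words through untranslated
            i += 1
    return " ".join(out)

print(translate("Where is the hospital"))  # -> "ayna al-mustashfa"
```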

John Makhoul, BBN’s chief scientist, says the current technology is at its best when confined to subject areas with specific phrases and terminology—translating a weather report from English into French, for example, or helping soldiers gather basic biographical information from people in the field. Makhoul envisions the first consumer applications, five years from now, being similarly constrained. A tourism-related translation app on a smart phone could help an American in Florence get directions from a non-English-speaking local, but they won’t chat about Renaissance art. “It is not going to work perfectly,” he says, “but it will do a pretty good job.”

Know if Disease Grows Inside You
Complex diseases have complex causes. Luckily, they also leave a multitude of traces
By Melinda Wenner

With the exception of certain infectious diseases, few of humanity’s ailments have cures. More than 560,000 Americans will die of cancer this year, and despite the 250,000 coronary bypass surgeries doctors perform annually, heart disease is still the country’s number-one killer.

The hardest diseases to cure are the ones that take the longest to develop. They are the end result of decades of complex molecular interactions inside your body. Yet this complexity also presents an opportunity. Scientists have discovered that these interactions leave discernible fingerprints on the body. By unweaving the complex tapestry of molecular clues—changes in the body’s proteins, nucleic acids and metabolites, collectively called biomarkers—doctors hope they will soon be able to not only detect disease but predict a coming illness in time to take action.

Biomarkers are not new. Since 1986 doctors have monitored prostate cancer by measuring blood levels of the protein known as prostate-specific antigen (PSA). But tests that rely on a single biomarker to detect disease are rare, because most disorders involve intricate changes in a collection of biomarkers.

Take schizophrenia: in January 2010 scientists will release a biomarker test that distinguishes schizophrenia from other psychiatric conditions. The test, which is being commercialized by Rules-Based Medicine, a laboratory in Austin, Tex., is based on the characteristics of about 40 blood-based proteins.

To find potentially useful biomarkers, researchers collect blood samples from thousands of healthy people and analyze them. Biomarker levels in these samples provide a baseline reading. Then they do the same for people with a specific condition such as diabetes or breast cancer. If reproducible differences emerge between the groups, scientists can use the patterns in the disease group to diagnose the same condition in others. By collecting samples over time, researchers can also go back and analyze early samples from individuals who later become ill to identify patterns indicative of early disease or high disease risk.
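
A minimal sketch of that baseline-versus-patient comparison appears below. The markers, values and cutoff are all invented for illustration:

```python
# Learn each biomarker's healthy mean and spread, then flag markers in a
# new sample that drift far from baseline. Invented data throughout.

from statistics import mean, stdev

def fit_baseline(healthy: dict) -> dict:
    """Map each biomarker to (mean, standard deviation) in healthy blood."""
    return {m: (mean(v), stdev(v)) for m, v in healthy.items()}

def flag(sample: dict, baseline: dict, z_cut: float = 3.0) -> list:
    """Return biomarkers whose z-score against baseline exceeds the cutoff."""
    return [m for m, x in sample.items()
            if abs(x - baseline[m][0]) / baseline[m][1] > z_cut]

base = fit_baseline({"psa": [1.0, 1.2, 0.9, 1.1, 1.0],
                     "crp": [2.0, 1.5, 2.5, 1.8, 2.2]})
print(flag({"psa": 6.5, "crp": 2.1}, base))  # -> ['psa']
```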

Biophysical Corporation, a sister company to Rules-Based Medicine, is one of several companies that have developed blood-based biomarker tests and marketed them to the public [see “The Ultimate Blood Test,” by Philip Yam; Scientific American, June 2006]. The company searches for up to 250 biomarkers suggestive of cancer, inflammatory conditions, heart disease and other illnesses. Mark Chandler, Biophysical’s chair and CEO, says that the real value of the tests lies in long-term monitoring. A person could “get a test monthly, just a finger stick, that would be able to say, we have had a serious change here that is indicative of an early-stage cancer,” he explains.

Yet not all experts are convinced that the age of biomarkers is at hand. Cheryl Barton, an independent U.K.-based pharmaceutical consultant who authored a Business Insights market analysis report on biomarkers in 2006, says she remains “a little bit skeptical about how clinically useful they are.” A study of 5,000 subjects published in the Journal of the American Medical Association in July 2009 found that six cardiovascular biomarkers were only marginally better at predicting heart disease than were standard cardiovascular risk factors, such as whether the subjects smoked or had diabetes.

Adding to the overall difficulty, a person might suffer from two or more diseases—prostate cancer and heart disease, for example. No one knows how multiple diseases might affect overall biomarker signatures or how profiles will change as other diseases develop. “When you get to be 65 or 70, almost everybody has other conditions,” Chandler says. “We don’t know how to deal with that right now.” And scientists still need to discern which biomarkers are truly relevant to disease—a difficult task when working with blood, which contains tens of thousands of proteins at concentrations spanning more than 10 orders of magnitude.

Some companies have simplified the problem by avoiding blood altogether. LabCorp recently commercialized a biomarker test that analyzes colon cells in stool for the chemical signatures indicative of colorectal cancer. “The stool is in intimate contact with the lining of the colon, so it becomes much more highly populated with these rare molecules than would get into the bloodstream from colon cancer,” says Barry Berger, chief medical officer of Exact Sciences, a Madison, Wis.–based biotechnology company that developed the test technology.

Scientists are confident that they will eventually crack the more difficult problem of finding distinct disease signatures in the noisy data. “The evolutionary process, being complex and unknown, does not always give us an easy route,” Berger notes, “but it definitely gives us lots of opportunities.”

Satellites Diagnose Disease Outbreaks
Space-based data are helping to track and predict the spread of deadly diseases
By Katherine Harmon

Many contagious diseases spread through carriers such as birds and mosquitoes. These vectors in turn move with heat and rainfall. With this in mind, researchers have begun to use satellite data to monitor the environmental conditions that lead to disease. “Ideally, we could predict conditions that would result in some of these major outbreaks of cholera, malaria, even avian flu,” says Tim Ford of the University of New England at Biddeford and co-author of a paper on the subject published this past September in Emerging Infectious Diseases.

Satellite data have already been used to map the advance of the H5N1 avian influenza in Asia. The domestic duck, a common inhabitant of Southeast Asia’s rice paddies, is one of the main carriers of the disease. Xiangming Xiao, associate director of the University of Oklahoma’s Center for Spatial Analysis, uses satellite images to map agricultural patterns in the region. These maps show where the ducks are most likely to live and thus where the avian influenza is most likely to spread.

Migratory birds also carry the virus, but their travel patterns are more difficult to predict. Xiao and his colleagues combine the satellite imagery with satellite-gathered surface-temperature data to estimate the birds’—and thereby the virus’s—trajectory. Computer models then link these environmental drivers to the spread of the flu in human populations.

Of course, not all of the work can be outsourced to orbiting observatories. Xiao says that judging the severity of avian flu’s spread from satellite imaging required knowing details about the human populations as well—for instance, how likely certain communities were to raise ducks for poultry consumption. “Satellite monitoring has a capacity to provide consistent observation,” Xiao says. “On the other hand, the in situ observations are still very, very important, so the key is to combine those together. That is a real challenge.”

More Ideas to watch
By Melinda Wenner

Quick Clots
Emergency technicians could prevent up to 35 percent of prehospital trauma deaths if they had better and cheaper ways to prevent blood loss. Now a University of Maryland–affiliated start-up called Trauma Solutions has developed a synthetic hydrogel that can clot blood by prompting the body to make fibrin, a protein that seals wounds and stops bleeding. Future iterations could simultaneously release such medicines as antibiotics and painkillers. Each application will cost about $5, compared with some natural blood-clotting substances that cost upward of $500.

Lab-on-a-Stamp
Liver damage is a major side effect of HIV/AIDS and tuberculosis drugs, yet few developing countries have enough trained scientists or equipment to monitor it. Nonprofit Cambridge, Mass.–based Diagnostics For All has developed an inexpensive fingernail-size device made almost entirely of paper that monitors liver damage using a single drop of blood. Channels in the paper guide blood to regions that change color depending on the levels of two damage-related liver enzymes.

Bacterial Toothpaste
Streptococcus mutans bacteria in the mouth decay teeth by converting sugars into enamel-eroding lactic acid. Florida-based Oragenics has genetically engineered a new strain of bacteria that converts sugars to trace amounts of alcohol instead. Because the new strain permanently displaces natural S. mutans, the therapy, which is currently in clinical trials, will be available as a one-time prescription that will protect teeth for life.


 

Mitsubishi victim of Chinese cyber attack

BBC News – Japan defence firm Mitsubishi Heavy in cyber attack.

Japan’s top weapons maker has confirmed it was the victim of a cyber attack reportedly targeting data on missiles, submarines and nuclear power plants.

Mitsubishi Heavy Industries (MHI) said viruses were found on more than 80 of its servers and computers last month.

The government said it was not aware of any leak of sensitive information.

But the defence ministry has demanded MHI carry out a full investigation. Officials were angered after learning of the breach from local media reports.

Speaking at a news conference on Tuesday, Japan’s defence minister Yasuo Ichikawa said the cyber attackers had not succeeded in accessing any important information but MHI would be instructed “to undertake a review of their information control systems”.

“The ministry will continue to monitor the problem and conduct investigations if necessary,” Mr Ichikawa added.

All government contractors are obliged to inform ministers promptly of any breach of sensitive or classified information.

Analysis

The Ministry of Defence has said the delay in Mitsubishi Heavy Industries informing it of the cyber attack is “regrettable” – a bland term regularly deployed by Japanese bureaucrats to describe everything from near indifference to utter outrage.

But it is clear there is concern in Japan about security at the country’s biggest defence contractor.

Mitsubishi Heavy makes everything from warships to missiles. The giant company says it discovered the breach in mid-August, and informed the Japanese police at the end of the month.

But the defence ministry was not told until Monday afternoon, after reports had appeared in local media.

The key issue is just how serious the attack was – and whether any of Japan’s defence secrets have leaked.

Mitsubishi Heavy says the virus was confined to just 45 servers and 38 computer terminals – out of the many thousands it operates.

An ongoing internal investigation has found only network information, such as IP addresses, has been compromised.

“It’s up to the defence ministry to decide whether or not the information is important. That is not for Mitsubishi Heavy to decide. A report should have been made,” a defence ministry spokesman was earlier quoted by Reuters as saying.

Better protection

The online attacks – which are believed to be the first of their kind against Japan’s defence industry – originated outside the company’s computer network, MHI said.

They have been described as spear phishing attacks – when hackers send highly customised and specifically targeted messages aimed at tricking people into visiting a fake webpage and giving away login details.

Neither the Japanese government nor MHI have said who may be responsible. A report in one Japanese newspaper said Chinese language script was detected in the attack against MHI.

But China rebuffed suggestions it could be behind the attacks.

“China is one of the main victims of hacking… Criticising China as being the source of hacking attacks not only is baseless, it is also not beneficial for promoting international co-operation for internet security,” foreign ministry spokesman Hong Lei said.

China has in the past been accused of carrying out online attacks on foreign government agencies and firms.

Beijing routinely denies that it is behind this kind of hacking but, says the BBC’s Defence Correspondent Jonathan Marcus, the US military is more and more concerned about China’s abilities in this field.

Fear of the “cyber-dragon” is driving a fundamental rethink of US policy, which increasingly regards computer hacking as a potential act of war, our correspondent adds.

MHI confirmed that 45 of its servers and 38 computers were infected by at least eight viruses.

The viruses targeted a shipyard in Nagasaki, where destroyers are built, and a facility in Kobe that manufactures submarines and parts for nuclear power stations, public broadcaster NHK reported.

A plant in Nagoya, where the company designs and builds guidance and propulsion systems for rockets and missiles, was also reportedly compromised.

MHI said it had consulted the Tokyo police department and was carrying out an investigation alongside security experts, which should be concluded by the end of the month.

Lockheed case

A second defence contractor, IHI, which supplies engine parts for military aircraft, said it had also been targeted.

IHI said it had been receiving emails containing viruses for months, but its security systems had prevented infection.

There are also reports that Japanese government websites, including the cabinet office and a video distribution service, have been hit by distributed denial-of-service attacks.

A typical DDoS attack involves hundreds or thousands of computers, under the control of hackers, bombarding an organisation’s website with so many hits that it collapses.
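On the defensive side, the detection logic is simple in principle: count requests per source over a short window and flag abnormal bursts. The sketch below is a generic illustration of that idea, not anything described in the article; the window and threshold values are invented.

```python
from collections import defaultdict, deque
import time

WINDOW_SECONDS = 10            # invented window
MAX_REQUESTS_PER_WINDOW = 100  # invented threshold

hits = defaultdict(deque)      # source IP -> timestamps of recent requests

def record_request(ip, now=None):
    """Record one request; return True if this source looks like part of a flood."""
    now = time.time() if now is None else now
    q = hits[ip]
    q.append(now)
    while q and now - q[0] > WINDOW_SECONDS:  # age out old timestamps
        q.popleft()
    return len(q) > MAX_REQUESTS_PER_WINDOW
```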

Last month, a Japanese defence white paper urged better protection against cyber attacks after US defence contractors were hit by a spate of assaults.

One of the most high-profile cases involved Lockheed Martin – the world’s biggest aerospace company, which makes F-16, F-22 and F-35 fighter jets as well as warships.

Although the firm said none of its programmes were compromised in the attack in May, it prompted other defence contractors to assess their own security measures.


Could the World Situation Get Any Worse? You Betcha!

From Bad to Worse – By David J. Rothkopf | Foreign Policy.

It is hard to deny. Things are looking bleak. But are they as bad as they could get?

The answer, of course, is no.

Here are 10 things that could happen between now and the end of next year that would make things much worse, and why President Obama should consider not running for reelection.

Europe’s debt crisis could deepen
The European Central Bank’s interventions to prop up Spain and Italy could prove inadequate. EU leaders will continue to avoid real structural reform. European banks, now showing a reluctance to lend (akin to their mood immediately after the collapse of Lehman Brothers), could themselves teeter, burdened by the prospects of sovereign debt defaults and a global slowdown. Spain and Italy could take a turn for the worse. The rest of the world, preoccupied with its own problems, might prove as distracted as the northern Europeans, who have grown frustrated with bailing out their feckless southern neighbors.

Tensions tighten
Europe’s economic problems could beget deepening social tensions. Unrest like that seen in the United Kingdom could become more commonplace. With jobs drying up, anti-immigrant violence could grow. Nationalism could feed off these tensions and fuel more steps like Denmark’s move away from the EU’s commitment to open borders among its states.

U.S. recession regression
The United States could officially enter recession. Reduced tax revenues will be one painful consequence of the slowdown. Politicians will struggle to reduce debt but find it hard to do so in the near term. The problem will burgeon. Small- and medium-sized communities will default. Several large cities and perhaps one or two significant states will be at risk of being unable to pay their bills. Draconian cutbacks in police and social services will blend with high unemployment and growing inequality to produce social unrest in the United States. Stock markets will continue their slide.

Global contagion
We could enter a global recession. Downturns in the United States and the European Union could feed off of one another and the fragile Japanese economy would almost certainly sink as a result. Credit tightness and political indecisiveness will deepen the gloom.

Inflation hits the BRICs
While emerging markets like China and Brazil might see inflation worries ebb due to the global recession and falling demand for high-priced commodities … they might not. Their currencies could strengthen as established ones falter, making exports more costly at just the wrong moment. Secular growth in demand for commodities may slow price declines, somewhat reducing the “benefits” of falling demand. Alternatively, or additionally, real estate and financial bubbles might burst in each of these countries as investor doubts grow. Note that Brazil took a particular beating during the recent downward market spike.

Middle East meltdown
Tensions in the Middle East could grow. Palestine’s push for statehood might be followed by massive displays of civic unrest. An Israeli government burdened with economic problems of its own and a little arthritic when it comes to its willingness to show flexibility with its near-neighbors will move too slowly. States elsewhere in the region grappling with their own problems — a more anti-Israel Egypt, Syria, Iran, and others — will fan the flames. Meanwhile, the problems in those states will put the entire region on the edge of an unprecedented meltdown. Thus, even with falling global demand and the recent downturn in oil prices, you could see upward pressure on petroleum as well. Then, Iran announces it has successfully tested a nuclear weapon.

Sub-continental showdown
The government in Pakistan could totter or be decapitated thus heightening fears of even more pronounced Islamist influence and of growing tension with India. Indian markets fall. The Indian government is unable to pursue needed economic reforms. Social unrest might be seen throughout the sub-continent.

Another Eyjafjallajokull
One or more exogenous events of the type that regularly occur without warning — a terror attack, an earthquake, a tsunami, a devastating hurricane or typhoon, the eruption of an Icelandic volcano — could slam a major economy weakening the global situation further.

Expect the unexpected
An unexpected or unexpectedly intense conflict could erupt in the Russian near abroad, in Central Asia, in Turkey, in Africa, or in the Middle East, creating even more uncertainty. With economically unsteady and politically hesitant leadership in the world’s most important powers, growing instability fueled by rogue opportunists seems increasingly likely.

Some combination of the above could then turn the global recession plus related banking, derivatives and stock-market crises into a depression.

It is undeniable that many of the above developments are not highly likely. But what is striking is just how plausible most of them are. These are the kind of medium-to-low probability outcomes with significant consequences that planners must take into account.  It is also easy to see how further inaction, half-steps, and wrong steps by leaders could make these and other grim turns much more likely.

Such possibilities should not trigger panic. They should, however, focus the minds of politicians, bankers, and electorates everywhere. The problem with the leadership failures of the recent past is not just that they have slammed the world economy yet again, it is that they have made the future more dangerous than it was.

My fantasy is that recognizing this, President Obama would do as he once promised he would do: set personal ambition aside and announce he is not running for re-election. Instead, he would say that he wanted to shrug off the straitjacket of political considerations and focus exclusively on finding bipartisan solutions to America’s problems. Perhaps he would make a bold gesture, like appointing Erskine Bowles and Alan Simpson co-secretaries of the Treasury or, at least, giving both economic leadership roles on his team. Others in the Democratic Party can focus on 2012 and beyond. There are many qualified to lead. (Who knows, perhaps the next candidate we find can actually have experience with markets and with the rest of the world.) There are certainly plenty of Democrats who stand head and shoulders above the current, feeble array the Republican Party has rolled out, which will only grow more feeble with the likely addition of Rick Perry this weekend. And then Barack Obama, a decent, talented, and gifted man, can fashion a unique legacy for himself, as a public servant who actually thought his first duty was to serve the public.

But, I’ll admit it, that fantasy is less likely to occur than any of the other events I listed above. And so I will continue to hope for the next best thing: that the president and his fellow heads of state and government worldwide begin to govern as though they didn’t care whether they won re-election or not, but instead as though their top priority was to solve the problems in front of them.

The Democratization of Destruction

Get Ready for the Democratization of Destruction – By Andrew Krepinevich | Foreign Policy.

As Niels Bohr famously observed, “Prediction is very difficult, especially if it’s about the future.” But we need not be caught entirely unaware by future events. The rapid pace of technological progression, as well as its ongoing diffusion, offer clues as to some of the likely next big things in warfare. Indeed, important military shifts have already been set in motion that will be difficult if not impossible to reverse. Sadly, these developments, combined with others in the economic, geopolitical, and demographic realms, seem likely to make the world a less stable and more dangerous place.

Consider, to start, the U.S. military’s loss of its near monopoly in precision-guided munitions warfare, which it has enjoyed since the Gulf War two decades ago. Today China is fielding precision-guided ballistic and cruise missiles, as well as other “smart” munitions, in ever greater numbers. They can be used to threaten the few major U.S. bases remaining in the Western Pacific and, increasingly, to target American warships. Like Beijing, Iran is buying into the precision-guided weapons revolution, but at the low end, producing a poor man’s version of China’s capabilities, to include anti-ship cruise missiles and smart anti-ship mines. As these trends play out, we could find that by the beginning of the next decade, major parts of the Western Pacific, as well as the Persian Gulf, become no-go zones for the U.S. military: areas where the risks of operating are prohibitively high.

Even nonstate groups are getting into the game. During its war with Israel in 2006, Hezbollah fired more than 4,000 relatively inaccurate RAMM projectiles — rockets, artillery, mortars, and missiles — into Israel, leading to the evacuation of at least 300,000 Israelis from their homes and causing significant disruption to that country’s economy. Out of these thousands of munitions, only a few drones and anti-ship cruise missiles were guided. But as the proliferation of guided munitions — G-RAMM weapons — continues, irregular warfare will be transformed to the point that the roadside bomb threats that the United States has spent tens of billions of dollars defending against in Iraq and Afghanistan may seem trivial by comparison.

The spread of nuclear weapons to the developing world is equally alarming. If Iran becomes a nuclear power, the pressure on the leading Arab states as well as Turkey to follow suit is likely to prove irresistible. With ballistic-missile flight times between states in the region measured in single-digit minutes, the stability of the global economy’s energy core would be exceedingly fragile.

But the greatest danger of a catastrophic attack on the U.S. homeland will likely come not from nuclear-armed missiles, but from cyberattacks conducted at the speed of light. The United States, which has an advanced civilian cyberinfrastructure but prohibits its military from defending it, will prove a highly attractive target, particularly given that the processes for attributing attacks to their perpetrators are neither swift nor foolproof. Foreign powers may already have prepositioned “logic bombs” — computer code inserted surreptitiously to trigger a future malicious effect — in the U.S. power grid, potentially enabling them to trigger a prolonged and massive future blackout.

As in the cyber realm, the very advances in biotechnology that appear to offer such promise for improving the human condition have the potential to inflict incalculable suffering. For example, “designer” pathogens targeting specific human subgroups or designed to overcome conventional antibiotics and antiviral countermeasures now appear increasingly plausible, giving scientists a power once thought to be the province of science fiction. As in the cyber realm, such advances will rapidly increase the potential destructive power of small groups, a phenomenon that might be characterized as the “democratization of destruction.”

International stability is also increasingly at risk owing to structural weaknesses in the global economic system. Commercial man-made satellites, for instance, offer little, if any, protection against the growing threat of anti-satellite systems, whether ground-based lasers or direct-ascent kinetic-kill vehicles. The Internet was similarly constructed with a benign environment in mind, and the progression toward potential sources of single-point system failure, in the forms of both common software and data repositories like the “cloud,” cannot be discounted.

Then there is the undersea economic infrastructure, primarily located on the world’s continental shelves. It provides a substantial portion of the world’s oil and natural gas, while also hosting a web of cables connecting the global fiber-optic grid. The value of the capital assets on the U.S. continental shelves alone runs into the trillions of dollars. These assets — wellheads, pumping stations, cables, floating platforms — are effectively undefended.

As challenges to the global order increase in scale and shift in form, the means for addressing them are actually declining. The age of austerity is upon us, and it seems likely if not certain that the U.S. military will confront these growing challenges with relatively diminished resources. The Pentagon’s budget is scheduled for $400 billion or more in cuts over the next decade. Europe certainly cannot be counted on to pick up the slack. Nor is it clear whether rising great powers such as Brazil and India will try to fill the void.

With technology advancing so rapidly, might the United States attempt to preserve its military dominance, and international stability, by developing new sources of military advantage? Recently, there have been dramatic innovations in directed energy — lasers and particle beams — that could enable major advances in key mission areas. But there are indications that competitors, China in particular, are keeping pace and may even enjoy an advantage.

The United States has the lead in robotics — for now. While many are aware of the Predator drones used in the war against radical Islamist groups, robots are also appearing in the form of undersea craft and terrestrial mechanical “mules” used to move equipment. But the Pentagon will need to prove better than its rivals at exploiting advances in artificial intelligence to enhance the performance of its unmanned systems. The U.S. military will also need to make its robot craft stealthier, reduce their vulnerability to more sophisticated rivals than the Taliban, and make their data links more robust in order to fend off efforts to disable them.

The bottom line is that the United States and its allies risk losing their military edge, and new threats to global security are arising faster than they can counter them. Think the current world order is fragile? In the words of the great Al Jolson, “You ain’t seen nothin’ yet.”

A Plan to Power 100 Percent of the Planet with Renewables

A Plan to Power 100 Percent of the Planet with Renewables: Scientific American.

In December leaders from around the world will meet in Copenhagen to try to agree on cutting back greenhouse gas emissions for decades to come. The most effective step to implement that goal would be a massive shift away from fossil fuels to clean, renewable energy sources. If leaders can have confidence that such a transformation is possible, they might commit to an historic agreement. We think they can.

A year ago former vice president Al Gore threw down a gauntlet: to repower America with 100 percent carbon-free electricity within 10 years. As the two of us started to evaluate the feasibility of such a change, we took on an even larger challenge: to determine how 100 percent of the world’s energy, for all purposes, could be supplied by wind, water and solar resources, by as early as 2030. Our plan is presented here.

Scientists have been building to this moment for at least a decade, analyzing various pieces of the challenge. Most recently, a 2009 Stanford University study ranked energy systems according to their impacts on global warming, pollution, water supply, land use, wildlife and other concerns. The very best options were wind, solar, geothermal, tidal and hydroelectric power—all of which are driven by wind, water or sunlight (referred to as WWS). Nuclear power, coal with carbon capture, and ethanol were all poorer options, as were oil and natural gas. The study also found that battery-electric vehicles and hydrogen fuel-cell vehicles recharged by WWS options would largely eliminate pollution from the transportation sector.

Our plan calls for millions of wind turbines, water machines and solar installations. The numbers are large, but the scale is not an insurmountable hurdle; society has achieved massive transformations before. During World War II, the U.S. retooled automobile factories to produce 300,000 aircraft, and other countries produced 486,000 more. In 1956 the U.S. began building the Interstate Highway System, which after 35 years extended for 47,000 miles, changing commerce and society.

Is it feasible to transform the world’s energy systems? Could it be accomplished in two decades? The answers depend on the technologies chosen, the availability of critical materials, and economic and political factors.

Clean Technologies Only
Renewable energy comes from enticing sources: wind, which also produces waves; water, which includes hydroelectric, tidal and geothermal energy (water heated by hot underground rock); and sun, which includes photovoltaics and solar power plants that focus sunlight to heat a fluid that drives a turbine to generate electricity. Our plan includes only technologies that work or are close to working today on a large scale, rather than those that may exist 20 or 30 years from now.

To ensure that our system remains clean, we consider only technologies that have near-zero emissions of greenhouse gases and air pollutants over their entire life cycle, including construction, operation and decommissioning. For example, when burned in vehicles, even the most ecologically acceptable sources of ethanol create air pollution that will cause the same mortality level as when gasoline is burned. Nuclear power results in up to 25 times more carbon emissions than wind energy, when reactor construction and uranium refining and transport are considered. Carbon capture and sequestration technology can reduce carbon dioxide emissions from coal-fired power plants but will increase air pollutants and will extend all the other deleterious effects of coal mining, transport and processing, because more coal must be burned to power the capture and storage steps. Similarly, we consider only technologies that do not present significant waste disposal or terrorism risks.

In our plan, WWS will supply electric power for heating and transportation—industries that will have to revamp if the world has any hope of slowing climate change. We have assumed that most fossil-fuel heating (as well as ovens and stoves) can be replaced by electric systems and that most fossil-fuel transportation can be replaced by battery and fuel-cell vehicles. Hydrogen, produced by using WWS electricity to split water (electrolysis), would power fuel cells and be burned in airplanes and by industry.

Plenty of Supply
Today the maximum power consumed worldwide at any given moment is about 12.5 trillion watts (terawatts, or TW), according to the U.S. Energy Information Administration. The agency projects that in 2030 the world will require 16.9 TW of power as global population and living standards rise, with about 2.8 TW in the U.S. The mix of sources is similar to today’s, heavily dependent on fossil fuels. If, however, the planet were powered entirely by WWS, with no fossil-fuel or biomass combustion, an intriguing savings would occur. Global power demand would be only 11.5 TW, and U.S. demand would be 1.8 TW. That decline occurs because, in most cases, electrification is a more efficient way to use energy. For example, only 17 to 20 percent of the energy in gasoline is used to move a vehicle (the rest is wasted as heat), whereas 75 to 86 percent of the electricity delivered to an electric vehicle goes into motion.
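The efficiency argument is easy to check with rough numbers. The sketch below redoes the arithmetic for transportation only, using the efficiency ranges quoted above; the fossil transport share is an invented placeholder, not a figure from the authors.

```python
# Illustrative arithmetic only: why electrification shrinks demand.
gasoline_efficiency = 0.18   # midpoint of the quoted 17-20%
electric_efficiency = 0.80   # midpoint of the quoted 75-86%

transport_power_fossil_tw = 3.0   # assumed share of the 16.9 TW; invented
useful_motion_tw = transport_power_fossil_tw * gasoline_efficiency
transport_power_electric_tw = useful_motion_tw / electric_efficiency

print(f"fossil transport: {transport_power_fossil_tw:.2f} TW")
print(f"electric equivalent: {transport_power_electric_tw:.2f} TW")  # ~0.68 TW
```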

Even if demand did rise to 16.9 TW, WWS sources could provide far more power. Detailed studies by us and others indicate that energy from the wind, worldwide, is about 1,700 TW. Solar, alone, offers 6,500 TW. Of course, wind and sun out in the open seas, over high mountains and across protected regions would not be available. If we subtract these and low-wind areas not likely to be developed, we are still left with 40 to 85 TW for wind and 580 TW for solar, each far beyond future human demand. Yet currently we generate only 0.02 TW of wind power and 0.008 TW of solar. These sources hold an incredible amount of untapped potential.

The other WWS technologies will help create a flexible range of options. Although all the sources can expand greatly, for practical reasons, wave power can be extracted only near coastal areas. Many geothermal sources are too deep to be tapped economically. And even though hydroelectric power now exceeds all other WWS sources, most of the suitable large reservoirs are already in use.

The Plan: Power Plants Required
Clearly, enough renewable energy exists. How, then, would we transition to a new infrastructure to provide the world with 11.5 TW? We have chosen a mix of technologies emphasizing wind and solar, with about 9 percent of demand met by mature water-related methods. (Other combinations of wind and solar could be as successful.)

Wind supplies 51 percent of the demand, provided by 3.8 million large wind turbines (each rated at five megawatts) worldwide. Although that quantity may sound enormous, it is interesting to note that the world manufactures 73 million cars and light trucks every year. Another 40 percent of the power comes from photovoltaics and concentrated solar plants, with about 30 percent of the photovoltaic output from rooftop panels on homes and commercial buildings. About 89,000 photovoltaic and concentrated solar power plants, averaging 300 megawatts apiece, would be needed. Our mix also includes 900 hydroelectric stations worldwide, 70 percent of which are already in place.
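Those mix numbers imply plausible capacity factors, which is a quick way to sanity-check them. The sketch below derives installed (nameplate) capacity and implied average utilization from the figures quoted above; it is a consistency check, not the authors' own calculation.

```python
# Consistency check of the quoted mix.
demand_tw = 11.5
wind_share = 0.51

wind_avg_tw = demand_tw * wind_share                # ~5.9 TW delivered on average
wind_nameplate_tw = 3.8e6 * 5.0e6 / 1e12            # 3.8M turbines x 5 MW = 19 TW
print(f"implied wind capacity factor: {wind_avg_tw / wind_nameplate_tw:.0%}")  # ~31%

solar_nameplate_tw = 89_000 * 300e6 / 1e12          # ~26.7 TW of solar plants
print(f"solar plant nameplate: {solar_nameplate_tw:.1f} TW")
```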

Only about 0.8 percent of the wind base is installed today. The worldwide footprint of the 3.8 million turbines would be less than 50 square kilometers (smaller than Manhattan). When the needed spacing between them is figured, they would occupy about 1 percent of the earth’s land, but the empty space among turbines could be used for agriculture or ranching or as open land or ocean. The nonrooftop photovoltaics and concentrated solar plants would occupy about 0.33 percent of the planet’s land. Building such an extensive infrastructure will take time. But so did the current power plant network. And remember that if we stick with fossil fuels, demand by 2030 will rise to 16.9 TW, requiring about 13,000 large new coal plants, which themselves would occupy a lot more land, as would the mining to supply them.
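The land-use claims likewise reduce to simple division, shown below with the figures quoted above plus an approximate value for Earth's land area (an assumption on our part).

```python
# Approximate arithmetic behind the land-use figures quoted above.
turbines = 3.8e6
tower_footprint_km2 = 50.0
print(f"{tower_footprint_km2 * 1e6 / turbines:.0f} m^2 of ground per tower")  # ~13 m^2

land_area_km2 = 1.49e8               # Earth's land area, roughly (assumption)
spacing_km2 = 0.01 * land_area_km2   # "about 1 percent of the earth's land"
print(f"{spacing_km2 / turbines:.2f} km^2 of spacing per turbine")  # ~0.39 km^2
```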

The Materials Hurdle
The scale of the WWS infrastructure is not a barrier. But a few materials needed to build it could be scarce or subject to price manipulation.

Enough concrete and steel exist for the millions of wind turbines, and both those commodities are fully recyclable. The most problematic materials may be rare-earth metals such as neodymium used in turbine gearboxes. Although the metals are not in short supply, the low-cost sources are concentrated in China, so countries such as the U.S. could be trading dependence on Middle Eastern oil for dependence on Far Eastern metals. Manufacturers are moving toward gearless turbines, however, so that limitation may become moot.

Photovoltaic cells rely on amorphous or crystalline silicon, cadmium telluride, or copper indium selenide and sulfide. Limited supplies of tellurium and indium could reduce the prospects for some types of thin-film solar cells, though not for all; the other types might be able to take up the slack. Large-scale production could be restricted by the silver that cells require, but finding ways to reduce the silver content could tackle that hurdle. Recycling parts from old cells could ameliorate material difficulties as well.

Three components could pose challenges for building millions of electric vehicles: rare-earth metals for electric motors, lithium for lithium-ion batteries and platinum for fuel cells. More than half the world’s lithium reserves lie in Bolivia and Chile. That concentration, combined with rapidly growing demand, could raise prices significantly. More problematic is the claim by Meridian International Research that not enough economically recoverable lithium exists to build anywhere near the number of batteries needed in a global electric-vehicle economy. Recycling could change the equation, but the economics of recycling depend in part on whether batteries are made with easy recyclability in mind, an issue the industry is aware of. The long-term use of platinum also depends on recycling; current available reserves would sustain annual production of 20 million fuel-cell vehicles, along with existing industrial uses, for fewer than 100 years.

Smart Mix for Reliability
A new infrastructure must provide energy on demand at least as reliably as the existing infrastructure. WWS technologies generally suffer less downtime than traditional sources. The average U.S. coal plant is offline 12.5 percent of the year for scheduled and unscheduled maintenance. Modern wind turbines have a down time of less than 2 percent on land and less than 5 percent at sea. Photovoltaic systems are also at less than 2 percent. Moreover, when an individual wind, solar or wave device is down, only a small fraction of production is affected; when a coal, nuclear or natural gas plant goes offline, a large chunk of generation is lost.

The main WWS challenge is that the wind does not always blow and the sun does not always shine in a given location. Intermittency problems can be mitigated by a smart balance of sources, such as generating a base supply from steady geothermal or tidal power, relying on wind at night when it is often plentiful, using solar by day and turning to a reliable source such as hydroelectric that can be turned on and off quickly to smooth out supply or meet peak demand. For example, interconnecting wind farms that are only 100 to 200 miles apart can compensate for hours of zero power at any one farm should the wind not be blowing there. Also helpful is interconnecting geographically dispersed sources so they can back up one another, installing smart electric meters in homes that automatically recharge electric vehicles when demand is low and building facilities that store power for later use.
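The geographic-smoothing effect is easy to demonstrate with a toy simulation: pooling several farms whose outputs are not perfectly synchronized sharply cuts the hours of near-zero supply. The wind statistics below are invented, and the farms are treated as independent, which overstates the smoothing for closely spaced sites.

```python
import random

random.seed(1)
HOURS, FARMS = 10_000, 5

def farm_output():
    # Toy model: each farm is becalmed ~20% of hours; otherwise output is 0..1.
    return 0.0 if random.random() < 0.2 else random.random()

single_dead = pooled_dead = 0
for _ in range(HOURS):
    outputs = [farm_output() for _ in range(FARMS)]
    single_dead += outputs[0] < 0.05
    pooled_dead += sum(outputs) / FARMS < 0.05

print(f"single farm near-zero: {single_dead / HOURS:.1%}")    # ~24% of hours
print(f"five-farm pool near-zero: {pooled_dead / HOURS:.1%}")  # far rarer
```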

Because the wind often blows during stormy conditions when the sun does not shine and the sun often shines on calm days with little wind, combining wind and solar can go a long way toward meeting demand, especially when geothermal provides a steady base and hydroelectric can be called on to fill in the gaps.

As Cheap as Coal
The mix of WWS sources in our plan can reliably supply the residential, commercial, industrial and transportation sectors. The logical next question is whether the power would be affordable. For each technology, we calculated how much it would cost a producer to generate power and transmit it across the grid. We included the annualized cost of capital, land, operations, maintenance, energy storage to help offset intermittent supply, and transmission. Today the costs of wind, geothermal and hydroelectric are all less than seven cents a kilowatt-hour (¢/kWh); wave and solar are higher. But by 2020 and beyond wind, wave and hydro are expected to be 4¢/kWh or less.
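Cost figures of this kind come from levelized-cost arithmetic: annualize the up-front capital with a capital recovery factor, add operations and maintenance, and divide by annual energy delivered. Here is a generic sketch of that calculation with invented inputs, not the authors' actual model.

```python
def lcoe_cents_per_kwh(capital_cost, annual_om, discount_rate,
                       lifetime_yr, capacity_mw, capacity_factor):
    """Levelized cost of energy, in cents per kWh."""
    r, n = discount_rate, lifetime_yr
    crf = r * (1 + r) ** n / ((1 + r) ** n - 1)    # capital recovery factor
    annual_cost = capital_cost * crf + annual_om   # $/year
    annual_kwh = capacity_mw * 1_000 * capacity_factor * 8_760
    return 100 * annual_cost / annual_kwh

# Invented example: a 5 MW turbine costing $7.5M, $150k/yr O&M,
# 7% discount rate, 20-year life, 35% capacity factor.
print(f"{lcoe_cents_per_kwh(7.5e6, 1.5e5, 0.07, 20, 5, 0.35):.1f} cents/kWh")  # ~5.6
```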

For comparison, the average cost in the U.S. in 2007 of conventional power generation and transmission was about 7¢/kWh, and it is projected to be 8¢/kWh in 2020. Power from wind turbines, for example, already costs about the same or less than it does from a new coal or natural gas plant, and in the future wind power is expected to be the least costly of all options. The competitive cost of wind has made it the second-largest source of new electric power generation in the U.S. for the past three years, behind natural gas and ahead of coal.

Solar power is relatively expensive now but should be competitive as early as 2020. A careful analysis by Vasilis Fthenakis of Brookhaven National Laboratory indicates that within 10 years, photovoltaic system costs could drop to about 10¢/kWh, including long-distance transmission and the cost of compressed-air storage of power for use at night. The same analysis estimates that concentrated solar power systems with enough thermal storage to generate electricity 24 hours a day in spring, summer and fall could deliver electricity at 10¢/kWh or less.

Transportation in a WWS world will be driven by batteries or fuel cells, so we should compare the economics of these electric vehicles with that of internal-combustion-engine vehicles. Detailed analyses by one of us (Delucchi) and Tim Lipman of the University of California, Berkeley, have indicated that mass-produced electric vehicles with advanced lithium-ion or nickel metal-hydride batteries could have a full lifetime cost per mile (including battery replacements) that is comparable with that of a gasoline vehicle, when gasoline sells for more than $2 a gallon.
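A stripped-down version of that per-mile comparison looks like the sketch below; every input is an assumption chosen for illustration, not a number from Delucchi and Lipman's analysis.

```python
# All inputs are invented assumptions for illustration.
gas_price = 3.00            # $/gallon
mpg = 30.0
gas_cost_per_mile = gas_price / mpg

electricity_price = 0.10    # $/kWh
ev_kwh_per_mile = 0.30
battery_cost = 10_000.0     # replacement cost, amortized over battery life
battery_life_miles = 150_000.0
ev_cost_per_mile = (electricity_price * ev_kwh_per_mile
                    + battery_cost / battery_life_miles)

print(f"gasoline: ${gas_cost_per_mile:.3f}/mile")  # $0.100
print(f"electric: ${ev_cost_per_mile:.3f}/mile")   # $0.097
```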

When the so-called externality costs (the monetary value of damages to human health, the environment and climate) of fossil-fuel generation are taken into account, WWS technologies become even more cost-competitive.

Overall construction cost for a WWS system might be on the order of $100 trillion worldwide, over 20 years, not including transmission. But this is not money handed out by governments or consumers. It is investment that is paid back through the sale of electricity and energy. And again, relying on traditional sources would raise output from 12.5 to 16.9 TW, requiring thousands more of those plants, costing roughly $10 trillion, not to mention tens of trillions of dollars more in health, environmental and security costs. The WWS plan gives the world a new, clean, efficient energy system rather than an old, dirty, inefficient one.

Political Will
Our analyses strongly suggest that the costs of WWS will become competitive with traditional sources. In the interim, however, certain forms of WWS power will be significantly more costly than fossil power. Some combination of WWS subsidies and carbon taxes would thus be needed for a time. A feed-in tariff (FIT) program to cover the difference between generation cost and wholesale electricity prices is especially effective at scaling up new technologies. Combining FITs with a so-called declining clock auction, in which the right to sell power to the grid goes to the lowest bidders, provides continuing incentive for WWS developers to lower costs. As that happens, FITs can be phased out. FITs have been implemented in a number of European countries and a few U.S. states and have been quite successful in stimulating solar power in Germany.
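A declining clock auction is straightforward to express in code: the offered tariff ticks downward until one more tick would leave demand unmet, and the remaining lowest-cost bidders win. The sketch below is a generic toy version of that mechanism, with invented prices.

```python
def declining_clock(bidder_floors, demand, start_price, step=0.1):
    """Tick the offered price down until one more tick would leave
    fewer than `demand` bidders willing to supply."""
    price = start_price
    while True:
        next_price = round(price - step, 6)  # rounding avoids float drift
        still_in = [f for f in bidder_floors if f <= next_price]
        if next_price < 0 or len(still_in) < demand:
            break
        price = next_price
    winners = sorted(f for f in bidder_floors if f <= price)[:demand]
    return price, winners

# Five developers with different minimum acceptable tariffs; three slots.
price, winners = declining_clock([3.2, 4.0, 4.5, 5.1, 6.0], demand=3, start_price=7.0)
print(price, winners)  # clears at 4.5 with the three cheapest developers
```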

Taxing fossil fuels or their use to reflect their environmental damages also makes sense. But at a minimum, existing subsidies for fossil energy, such as tax benefits for exploration and extraction, should be eliminated to level the playing field. Misguided promotion of alternatives that are less desirable than WWS power, such as farm and production subsidies for biofuels, should also be ended, because it delays deployment of cleaner systems. For their part, legislators crafting policy must find ways to resist lobbying by the entrenched energy industries.

Finally, each nation needs to be willing to invest in a robust, long-distance transmission system that can carry large quantities of WWS power from remote regions where it is often greatest—such as the Great Plains for wind and the desert Southwest for solar in the U.S.—to centers of consumption, typically cities. Reducing consumer demand during peak usage periods also requires a smart grid that gives generators and consumers much more control over electricity usage hour by hour.

A large-scale wind, water and solar energy system can reliably supply the world’s needs, significantly benefiting climate, air quality, water quality, ecology and energy security. As we have shown, the obstacles are primarily political, not technical. A combination of feed-in tariffs plus incentives for providers to reduce costs, elimination of fossil subsidies and an intelligently expanded grid could be enough to ensure rapid deployment. Of course, changes in the real-world power and transportation industries will have to overcome sunk investments in existing infrastructure. But with sensible policies, nations could set a goal of generating 25 percent of their new energy supply with WWS sources in 10 to 15 years and almost 100 percent of new supply in 20 to 30 years. With extremely aggressive policies, all existing fossil-fuel capacity could theoretically be retired and replaced in the same period, but with more modest and likely policies full replacement may take 40 to 50 years. Either way, clear leadership is needed, or else nations will keep trying technologies promoted by industries rather than vetted by scientists.

A decade ago it was not clear that a global WWS system would be technically or economically feasible. Having shown that it is, we hope global leaders can figure out how to make WWS power politically feasible as well. They can start by committing to meaningful climate and renewable energy goals now.

Note: This article was originally printed with the title, “A Path to Sustainable Energy by 2030.”


 

DARPA at Phase 3 on solar powered surveillance strato-ship

DARPA at Phase 3 on solar powered surveillance strato-ship • The Register.

This technology has HUGE non-military potential, from portable self-powered comm towers to situation monitoring and aerial recon, to portable power generation:

The famed Pentagon Q-branch boffinry hothouse, DARPA, has unveiled another ambitious plan to further US military-technical dominance. It has given $400m to American weapons globocorp Lockheed to develop a solar-powered robot radar airship, able to lurk in the stratosphere for a year at a time, potentially tracking individual people walking about on the ground across areas 1200km wide.

DARPA concept of the ISIS radar airship. The government spooks didn’t need numberplate tracking any more.

Yesterday’s contract announcement was for Phase 3 of DARPA’s Integrated Sensor Is Structure (ISIS) project, in which a flying sub-scale demonstrator will be built to prove that the concept can work as planned. Phases 1 and 2 consisted mostly of design studies and materials work.

The idea of ISIS is to hugely improve on what a normal airship can do, by using the ship itself as a radar antenna rather than carrying a separate piece of machinery – hence the name. DARPA believe this will hugely increase the size of radar antenna a stratospheric airship can carry, which in turn means the radar would deliver much better sensor resolution for much less power.

The lowered power requirements of the ISIS radar-ship, DARPA believes, will mean it can run on solar power. Excess energy generated during the day will be stored by cracking water into hydrogen: at night, this will be burned in fuel cells to keep the ship flying and its radar shining even in darkness.
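That day/night balance implies the ship must bank substantially more than one kilowatt-hour of daytime surplus for each kilowatt-hour drawn at night, because the electrolysis/fuel-cell round trip is lossy. A rough budget, with typical efficiency values used here as assumptions:

```python
# Efficiencies are typical published ranges, used here as assumptions.
electrolysis_eff = 0.70    # electricity -> hydrogen (assumed)
fuel_cell_eff = 0.50       # hydrogen -> electricity (assumed)
round_trip = electrolysis_eff * fuel_cell_eff   # ~0.35

night_load_kwh = 1_000.0   # invented nightly load for propulsion + radar
daytime_surplus_needed = night_load_kwh / round_trip
print(f"{daytime_surplus_needed:.0f} kWh of daytime surplus per "
      f"{night_load_kwh:.0f} kWh of night load")   # ~2857 kWh
```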

DARPA calculate that the ship should be able to cruise at 60 knots or sprint at 100, which will let it deploy from the US to a global troublespot in 10 days. It will then be able to hold station easily in the stratospheric “wind bucket” found at 65,000 to 70,000 feet, scanning the ground beneath it with its all-seeing radar mega-eye.

The performance of the massive scanner, according to DARPA, should be such that it can track unobscured “dismounts [people walking] across the entire line of sight” – in other words out to the horizon, which at operational height will be 600km away.
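The 600km figure is consistent with standard radar-horizon arithmetic: the horizon from altitude h is roughly sqrt(2*R*h), with the Earth's radius inflated by 4/3 to account for atmospheric refraction. A quick check:

```python
import math

EARTH_RADIUS_M = 6.371e6
altitude_m = 70_000 * 0.3048          # 70,000 ft in metres (~21.3 km)

geometric = math.sqrt(2 * EARTH_RADIUS_M * altitude_m)
radar = math.sqrt(2 * (4 / 3) * EARTH_RADIUS_M * altitude_m)  # 4/3-Earth model
print(f"geometric horizon: {geometric / 1000:.0f} km")  # ~521 km
print(f"radar horizon:     {radar / 1000:.0f} km")      # ~602 km
```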

That said, the contract announcement suggests a slight bit of neck-winding, referring to an ability to track “all ground targets” to 300km. Closer in, the Pentagon boffins think, it will be capable of tracking such small objects even through overhanging foliage. Performance against easier airborne targets – planes, missiles etc. – would definitely be right out to the horizon at 600km.

If the ISIS can do all that DARPA suggest, it will handily trump most of the other aerial scanners in use by the US forces, including AWACS sky-scanner planes, the smaller E-2 Hawkeye AWACS that flies from US carriers, Joint STARS ground-sweeping tank sniffers, and the JLENS moored-balloon radar plan. The potential would be there perhaps to do without all these things, simply assigning a single ISIS ship in place of the several AWACS or whatever you formerly needed so as to keep one up on patrol.

An ISIS airship would potentially be vulnerable to enemy action, but at 70,000 feet only quite serious enemies – the sort who could also threaten AWACS or JSTARS aircraft – would have any chance of hitting it. And those planes carry large crews, whereas the ISIS is unmanned.

So this is potentially big news for the US military, the more so in that ISIS has now made it to Phase 3 – we’re no longer talking just about design studies here. The privacy/surveillance issues – the chance that ISIS spy-ships might lurk one day above US or allied territory, tracking every vehicle or even every person walking about – could be even more significant. Forget about numberplate cameras or face tracking; you’d have to live underground to avoid this sort of thing.

For those who’d like to know more, there’s a pdf on ISIS from DARPA here. ®