Category Archives: Alternative ways!

20 Ways to Build a Cleaner, Healthier, Smarter World

World Changing Ideas: 20 Ways to Build a Cleaner, Healthier, Smarter World: Scientific American.

What would happen if solar panels were free? What if it were possible to know everything about the world—not the Internet, but the living, physical world—in real time? What if doctors could forecast a disease years before it strikes? This is the promise of the World Changing Idea: a vision so simple yet so ambitious that its full impact is impossible to predict. Scientific American’s editorial and advisory boards have chosen projects in five general categories—Energy, Transportation, Environment, Electronics and Robotics, and Health and Medicine—that highlight the power of science and technology to improve the world. Some are in use now; others are emerging from the lab. But all of them show that innovation is the most promising elixir for what ails us.  —The Editors

The No-Money-Down Solar Plan
A new wave of start-ups wants to install rooftop solar panels on your house. Upfront cost: nothing
By Christopher Mims

The biggest thing stopping the sun is money. Installing a rooftop array of solar panels large enough to produce all of the energy required by a building is the equivalent of prepaying its electricity bill for the next seven to 10 years—and that’s after federal and state incentives. A new innovation in financing, however, has opened up an additional possibility for homeowners who want to reduce their carbon footprint and lower their electric bills: get the panels for free, then pay for the power as you go.

The system works something like a home mortgage. Organizations and individuals looking for a steady return on their investment, typically banks or municipal bond holders, use a pool of cash to pay for the solar panels. Directly or indirectly, homeowners buy the electricity produced by their own rooftop at a rate that is less, per kilowatt-hour, than they would pay for electricity from the grid. Investors get a safe investment—the latest generation of solar-panel technology works dependably for years—and homeowners get a break on their monthly bills, not to mention the satisfaction of significantly reducing their carbon footprint. “This is a way to get solar without putting any money down and to start saving money from day one. That’s a first,” says SolarCity co-founder Peter Rive.
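
The arithmetic behind these deals is simple enough to sketch. In the Python sketch below, every figure is an illustrative assumption (not an actual SolarCity or SunRun rate); the point is only that when the solar rate per kilowatt-hour undercuts the grid rate, the combined bill comes out lower:

```python
# Back-of-envelope comparison of a conventional utility bill with a
# no-money-down solar lease. Every number here is an assumed,
# illustrative figure -- not an actual SolarCity or SunRun rate.

monthly_usage_kwh = 900     # typical household consumption (assumed)
grid_rate = 0.15            # utility price, $/kWh (assumed)
solar_fraction = 0.70       # share of usage the rooftop array covers (assumed)
solar_rate = 0.12           # effective $/kWh paid to the solar company (assumed)

old_bill = monthly_usage_kwh * grid_rate
solar_bill = monthly_usage_kwh * solar_fraction * solar_rate
residual_utility = monthly_usage_kwh * (1 - solar_fraction) * grid_rate
new_total = solar_bill + residual_utility

print(f"old utility bill:      ${old_bill:6.2f}")
print(f"solar lease + utility: ${new_total:6.2f}")
print(f"monthly savings:       ${old_bill - new_total:6.2f}")
```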

SolarCity is the largest installer of household solar panels to have adopted this strategy. Founded in 2006 by two brothers who are also Silicon Valley–based serial entrepreneurs, SolarCity leases its panels to homeowners but gives the electricity away for free. The net effect is a much reduced utility bill (customers still need utility-delivered power when the sun isn’t out) plus a monthly SolarCity bill. The total for both comes out to less than the old bill. SunRun in San Francisco offers consumers a similar package, except that the company sells customers the electricity instead of leasing them the panels.

Cities such as Berkeley and Boulder are pioneering their own version of solar-panel financing by loaning individuals the entire amount required to pay for solar panels and installation. The project is paid for by municipal bonds, and the homeowner pays back the loan over 20 years as a part of the property tax bill. The effect is the same whichever route a consumer takes: the new obligation, in the form of taxes, a lease or a long-term contract for electricity, ends up costing less than the existing utility bill.

“What we’re really seeing is a transition in how we think about buying energy goods and services,” says Daniel M. Kammen, director of the Renewable and Appropriate Energy Laboratory at the University of California, Berkeley. Kammen, who did the initial analysis on Berkeley’s financing model, believes that by turning to financing, consumers can overcome the inherent disadvantage renewables have when compared with existing energy sources: the infrastructure for power from the grid has already been paid for and, in many cases, has been subsidized for decades.

All three approaches are rapidly expanding across the country. Although the Berkeley program is less than two years old, 10 states have passed legislation allowing their cities to set up Berkeley-style bond-financed loan programs. If the Waxman-Markey climate bill becomes law, the option for cities to set up these programs will be enshrined in federal law. SunEdison in Maryland is currently active in nine states. SolarCity, which has more than 4,000 customers, is active in California, Arizona and Oregon and has promised to announce additional states after the new year.

Right now it is not possible to lower the overall cost of rooftop solar to “grid parity,” that is, to the same price as electricity from local utility companies, without federal subsidies such as the investment tax credit, which lowers the tax bill of banks financing these projects. Those subsidies, which amount to 30 percent of the cost of a solar installation, are guaranteed for at least eight years. By then, SolarCity and its competitors claim they won’t need them.

“Grid parity is driven by multiple factors,” says Attila Toth, vice president of marketing at SunEdison, including the cost of capital, the cost of panels and their installation, and the intensity of sunlight in a given region. “It will occur in different states at different times, but, for example, we expect that California will be one of the first states in the U.S. to get to grid parity, sometime between three and five years from now.”

While the cost of electricity from fossil fuels has increased 3 to 5 percent a year for the past decade, the cost of solar panels has fallen on average 20 percent for every doubling of their installed base. Grid parity is where these trend lines cross—after that, solar has the potential to power more than just homes. It’s hardly a coincidence that Elon Musk, head of electric car company Tesla Motors, sits on SolarCity’s board of directors.
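
Those two trend lines can be projected in a few lines of code. In this rough sketch, the growth and learning rates come from the article, while the starting prices and the pace of deployment (one doubling every two years) are my assumptions:

```python
# Project the article's two trends to find a grid-parity crossover.
# Starting prices and the doubling period are illustrative assumptions;
# the growth and learning rates come from the article.

grid_price = 0.12       # $/kWh from the utility today (assumed)
solar_price = 0.20      # $/kWh from rooftop solar today (assumed)
grid_growth = 0.04      # 3-5%/yr fossil price increase (article), midpoint
learning_rate = 0.20    # cost drop per doubling of installed base (article)
years_per_doubling = 2  # assumed pace of solar deployment

year = 0
while solar_price > grid_price:
    year += 1
    grid_price *= 1 + grid_growth
    if year % years_per_doubling == 0:
        solar_price *= 1 - learning_rate

print(f"grid parity after ~{year} years")
```

With these inputs the crossover lands about four years out, in line with SunEdison’s three-to-five-year forecast for California.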

More Ideas to watch
by Christopher Mims

The Gasoline Garden
It is the next step for biofuels: genetically engineered plant life that produces hydrocarbons as a by-product of its normal metabolism. The result will be fuel—common gasoline, even—made from nothing but sunlight and CO2. In July, Exxon Mobil announced plans to spend more than $600 million in pursuit of algae that can accomplish the task. Joule Biotechnologies claims to have already succeeded, although the company has yet to reveal any details of its proprietary system.

Hot Nukes
Uranium and plutonium are not the only fuels that can power a nuclear reactor. With an initial kick from more traditional fissile materials, thorium can set up a self-sustaining “breeder” reaction that produces uranium 233, which is well suited to nuclear power generation. The process has the added benefit of being resistant to nuclear proliferation, because its end products emit enough gamma rays to make the fuel dangerous to handle and easy to track.

Save Energy with Information
Studies show that simply making customers aware of their energy use lowers it
by 5 to 15 percent. Smart meters allow customers to track their energy consumption minute by minute and appliance by appliance. Countless start-ups are offering the devices, and Google and Microsoft are independently partnering with local utilities to allow individuals to monitor their power usage over the Web.

Wind Power from the Stratosphere
According to a Stanford University study released in July, the high-altitude winds that constantly blow tens of thousands of feet above the earth hold enough energy to supply all of human civilization 100 times over. California’s Sky WindPower has proposed harvesting this energy by building fleets of giant, airborne, ground-tethered windmills, while Italy’s Kite Gen proposes to accomplish the same feat using kites.

Delivering the U.S. from Oil
Plug-in hybrid trucks are improving the long view of the short haul
By Amanda Schupak

Cargo trucks gulp about 40 percent of the fuel pumped in the U.S. While most attention focuses on improving the fuel economy of passenger cars, a major opportunity goes rumbling by. “Folks do not realize that the fuel use of even a small truck is equal to many, many cars,” says Bill Van Amburg, senior vice president of Calstart, a clean transportation technology nonprofit, and director of the Hybrid Truck Users Forum. “A utility truck as a hybrid would reduce more petroleum than nine Priuses.”

Some 1,300 commercial hybrids on the road today get up to twice the fuel efficiency of their conventional counterparts. But these traditional hybrids are inherently limited. They make more efficient use of petroleum-based fuel by capturing some of the energy lost during braking.

Plug-in hybrids, on the other hand, draw energy from the grid. They can drive for miles—in many cases, an entire day’s route—without using any fossil fuel at all. This shifts energy demand away from petroleum and toward grid-based sources. (Last year zero-carbon renewables and nuclear supplied 30 percent of all electric power in the U.S.)

In many ways, plug-in hybrid technology makes more sense for delivery trucks than for consumer sedans. A cargo truck runs a short daily route that includes many stops to aid in regenerative braking. Most of the U.S. Postal Service’s 200,000-plus mail trucks, for example, travel fewer than 20 miles a day. In addition, fleet vehicles return nightly to storage lots that have ready access to the 120- or 240-volt outlets required to charge them.

The Department of Energy recently launched the nation’s largest commercial plug-in hybrid program, a $45.4-million project to get 378 medium-duty vehicles on the road in early 2011. The trucks, which will go to 50 municipal and utility fleets, will feature a power system from Eaton, a large manufacturer of electrical components, on a Ford F-550 chassis. (For its part, Ford will wait for the market to prove itself before designing its own commercial plug-ins.) “These are going to start breaking free in 2011,” says Paul Scott, president of the Electric Vehicle Association of Southern California.

Start-up company Bright Automotive has a more ambitious plan. It aims to replace at least 50,000 trucks with plug-in hybrids by 2014. Bright’s IDEA prototype travels 40 miles on battery power before switching to a four-cylinder engine that gets 40 miles to the gallon. The streamlined aluminum body has the payload of a postal truck yet is far more aerodynamic. The truck weighs as much as a midsize sedan.

John E. Waters, Bright Automotive’s founder and the former developer of the battery system for General Motors’ groundbreaking EV1 electric car, says that each IDEA would save 1,500 gallons of fuel and 16 tons of carbon dioxide emissions a year over a standard utility truck. Waters says he is ready to begin assembly in his U.S. plant once a pending $450-million federal loan comes through.

Despite the appeal of the carbon savings, the fleet owners who are the trucks’ primary customers have more practical considerations. Bright’s executives are coy about the IDEA’s eventual price tag but assert that a customer with 2,000 trucks driving 80 miles a day five days a week could save $7.2 million a year. Right now that is probably not enough to justify large-scale purchases without additional rebates—or a price on carbon. Van Amburg estimates that going hybrid currently adds $30,000 to $50,000 in upfront costs per vehicle, although that figure should come down as production volumes increase.
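
That $7.2-million figure is easy to sanity-check against Bright’s own per-truck fuel estimate; only the diesel price in the sketch below is an assumption:

```python
# Sanity check: does 1,500 gallons saved per truck per year square with
# $7.2 million in annual savings for a 2,000-truck fleet?
# The diesel price is an assumed 2009-era figure; the other numbers
# are the article's.

trucks = 2000
gallons_saved_per_truck = 1500   # Bright's per-IDEA estimate (article)
diesel_price = 2.40              # $/gallon (assumed)

fleet_savings = trucks * gallons_saved_per_truck * diesel_price
print(f"implied fleet savings: ${fleet_savings / 1e6:.1f} million per year")
# -> $7.2 million, matching the claim for this usage pattern
```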

Improved battery technology will also help. Today the IDEA’s 13-kilowatt-hour lithium-ion battery pack accounts for nearly a quarter of the vehicle’s total cost. Much of the research being done for the batteries going into the Chevy Volt and other consumer plug-ins should also be applicable to commercial batteries. “For all the good we all want to do,” says David Lauzun, Bright’s vice president of product development, “these vehicles will not take over the world until it becomes the economic choice—‘I have to have them because it saves me money.’”

Bus Rapid Transit
Subwaylike bus lines mobilize the urban future
By Michael Moyer

For the first time in human civilization, more people now live in urban areas than in the countryside. This shift creates a number of dilemmas, not least of which is how to move people within the world’s rapidly growing metropolises. Pollution and traffic point away from car-based options, while light-rail systems are slow to construct and prohibitively expensive. One disarmingly simple—and cheap—possibility is Bus Rapid Transit, which is engineered to operate like a subway on wheels. In these systems, concrete dividers on existing roads separate high-capacity buses from the rest of traffic. Riders pay before boarding, then wait in enclosed stations. When a bus arrives, sliding partitions open to allow riders to board from a platform that is level with the bus floor. The traffic-free thoroughfares, quick boarding times, and modern, comfortable stations resemble light-rail systems more than the chaos of typical bus travel. In Bogotá, Colombia, which has had seven Bus Rapid Transit lines in operation since 2001, the buses handle 1.6 million trips a day. Its success has allowed the city to remove 7,000 private buses from its streets, reducing consumption of bus fuel and its associated pollution by more than 59 percent.

Ocean Overhaul
Marine zoning is a bold remedy for sick seas
By Sarah Simpson

These days not even many politicians deny that the oceans are ill. Protecting the health of coastal waters is now a matter of national policy in dozens of countries, including the U.S., and world leaders are beginning to prescribe a revolutionary remedy that conservationists have been promoting for years: marine planning and zoning.

The idea is a natural extension of management policies that have guided the development of cities and landscapes for nearly a century. Porn shops aren’t next to preschools, after all, and drilling rigs aren’t the centerpieces of national parks. Similarly, zoning advocates envision a mosaic of regional maps in which every watery space on the planet is designated for a particular purpose. Drilling and mining would be allowed only in certain parts of the ocean; fishing in others. The most critically threatened areas would be virtually off-limits.

Whereas people can easily find maps telling them what they can do where on land, the marine realm is a hodgepodge of rules emanating from an army of agencies, each one managing a single use or symptom. In the U.S., for example, one body regulates commercial fishing, usually a single species at a time. Another group manages toxic substances, still another seabed mining, and so on—some 20 federal agencies in all. They tend to make decisions without regard to what the others are doing, explains Duke University marine ecologist Larry B. Crowder. “Imagine all of the medical specialists visiting a patient in intensive care one at a time and never talking to one another,” he says. “It’s a wonder that the oceans aren’t in worse shape than they are now.”

Ocean advocates such as Crowder eagerly await the final recommendations of a special task force President Barack Obama charged with presenting a plan for overhauling management of U.S. waters, which extend 200 nautical miles offshore. The scope of such an undertaking is huge: the U.S. controls 4.4 million square miles of seascape, making the country’s underwater real estate 25 percent larger than its landmass. The committee’s preliminary report, released in September, suggests that the best way to minimize harmful human impacts on the oceans is to manage regions rather than symptoms.

Many environmentalists are hopeful that such plans will be implemented through the marine equivalent of municipal zoning, which would give them some influence in areas where they now have none. In zones where conservation is designated as the dominant activity, fishing and industrial activities such as mining would no longer have free rein. Under current rules, about the only way a conservation group can block a project it deems harmful—say, a new site for offshore drilling—is through expensive litigation.

So far, though, the president’s task force has been careful not to suggest that ocean zoning will be the only treatment plan, in great part because any effort to restrict commercial interests is bound to meet stiff opposition. “Zoning isn’t anybody’s favorite exercise,” notes John C. Ogden, director of the Florida Institute of Oceanography at the University of South Florida at Tampa. “Someone’s ox is always getting gored.” Most resistant to such change will most likely be the traditional users of the open ocean—namely, commercial fisheries and the petroleum industry. “They’ve had the place to themselves for a long time,” Ogden says.

Ogden and others are quick to point out, however, that zoning practices can benefit commerce as much as conservation. By giving up access to certain areas, industries gain the security of knowing their activities would be licensed in a more predictable and less costly manner than they are today, explains Josh Eagle, associate professor at the University of South Carolina School of Law. Now an oil company can apply for permits to drill virtually anywhere, but it takes on a significant financial risk each time. The business may dump millions of dollars into researching a new facility only to have a lawsuit derail it at the last moment. When opposing parties have more or less equal voices early in the planning process, Eagle says, they are less inclined to block one another’s activities once zones are drawn on a map.

Whether the final report of the president’s task force will promote ocean zoning explicitly is uncertain. But the group has already promised to overhaul the structure of ocean governance by proposing the creation of a National Ocean Council, whose job it will be to coordinate efforts of the myriad federal agencies now in charge.

The move comes just in time. Just as society is beginning to appreciate the enormous efforts it will take to preserve the health of the oceans, it must ask more of them—more energy, more food, and better resilience to coastal development and climate change. The oceans are in trouble not simply because of what people put in and take out but because governments have failed to manage those activities properly. Says Crowder: “We have to treat the oceans holistically, not one symptom at a time.”

The Power of Garbage
Trapped lightning could help zap trash and generate electricity
By John Pavlus

Trash is loaded with the energy trapped in its chemical bonds. Plasma gasification, a technology that has been in development for decades, could finally be ready to extract it.

In theory, the process is simple. Torches pass an electric current through a gas (often ordinary air) in a chamber to create a superheated plasma—an ionized gas with a temperature upward of 7,000 degrees Celsius, hotter than the surface of the sun. When this occurs naturally we call it lightning, and plasma gasification is literally lightning in a bottle: the plasma’s tremendous heat dissociates the molecular bonds of any garbage placed inside the chamber, converting organic compounds into syngas (a combination of carbon monoxide and hydrogen) and trapping everything else in an inert vitreous solid called slag. The syngas can be used as fuel in a turbine to generate electricity. It can also be used to create ethanol, methanol and biodiesel. The slag can be processed into materials suitable for use in construction.

In practice, the gasification idea has been unable to compete economically with traditional municipal waste processing. But the maturing technology has been coming down in cost, while energy prices have been on the rise. Now “the curves are finally crossing—it’s becoming cheaper to take the trash to a plasma plant than it is to dump it in a landfill,” says Louis Circeo, director of Plasma Research at the Georgia Tech Research Institute. Earlier this summer garbage-disposal giant Waste Management partnered with InEnTec, an Oregon-based start-up, to begin commercializing the latter’s plasma-gasification processes. And major pilot plants capable of processing 1,000 daily tons of trash or more are under development in Florida, Louisiana and California.

Plasma isn’t perfect. The toxic heavy metals sequestered in slag pass the Environmental Protection Agency’s leachability standards (and have been used in construction for years in Japan and France) but still give pause to communities considering building the plants. And although syngas-generated electricity has an undeniably smaller carbon footprint than coal—“For every ton of trash you process with plasma, you reduce the amount of CO2 going into the atmosphere by about two tons,” Circeo says—it is still a net contributor of greenhouse gases.

“It is too good to be true,” Circeo admits, “but the EPA has estimated that if all the municipal solid waste in the U.S. were processed with plasma to make electricity, we could produce between 5 and 8 percent of our total electrical needs—equivalent to about 25 nuclear power plants or all of our current hydropower output.” With the U.S. expected to generate a million tons of garbage every day by 2020, using plasma to reclaim some of that energy could be too important to pass up.
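
That 5-to-8-percent estimate can be roughly reproduced. In the sketch below, the net electricity per ton of trash and the average U.S. electric load are assumed round numbers; the million-tons-per-day waste projection is the article’s:

```python
# Rough reproduction of the 5-8 percent estimate. Net kWh per ton and
# average U.S. demand are assumed round figures; the waste projection
# is from the article.

waste_tons_per_day = 1_000_000   # projected U.S. garbage by 2020 (article)
net_kwh_per_ton = 600            # net electricity from plasma (assumed)
us_avg_demand_gw = 450           # average U.S. electric load (assumed)

avg_output_gw = waste_tons_per_day * net_kwh_per_ton / 24 / 1e6
share = avg_output_gw / us_avg_demand_gw
print(f"plasma output: ~{avg_output_gw:.0f} GW, "
      f"~{share:.0%} of U.S. electricity")
# ~25 GW -- roughly the output of 25 large nuclear reactors
```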

More Ideas to watch
By John Pavlus

Cement as a Carbon Sponge
Traditional cement production creates at least 5 percent of global carbon dioxide emissions, but new materials could create carbon-neutral cement. Start-up Novacem, supported by Imperial College London, uses magnesium oxide to make cement that naturally absorbs CO2 as it hardens. California-based Calera uses seawater to sequester carbon emissions from a nearby power plant in cement.

The New Honeybee
Colony collapse disorder (CCD) has killed more than a third of honeybee colonies since 2006. Farmers who depend on bees to pollinate such crops as almonds, peaches and apples are looking to the blue orchard bee to pick up the slack.

One efficient Osmia lignaria can pollinate as much territory as 50 honeybees, but the blue orchard bee is harder to cultivate because of its solitary nature. These pinch hitters won’t completely replace honeybees, but as scientists continue to grapple with CCD, they could act as an agricultural safety net.

Saltwater Crops
As the world’s freshwater supply becomes scarcer and food production needs balloon, salt-tolerant crops could ease the burden. Researchers at Australia’s University of Adelaide used genetic engineering to enhance a model crop’s natural ability to prevent saline buildup in its leaves, allowing the plant to thrive in conditions that would typically wither it. If the same gene tweak works in cereal crops such as rice and wheat—the researchers are testing them now—fallow lands destroyed by drought or overirrigation could become new breadbaskets.

The Omnipotence Machines
Tiny, ubiquitous sensors will allow us to index the physical world the way the Web maps cyberspace
By Gregory Mone

Earlier this year Hewlett-Packard announced the launch of its Central Nervous System for the Earth (CeNSE) project, a 10-year effort to embed up to a trillion pushpin-size sensors across the planet. Technologists say that the information gathered by this kind of ubiquitous sensing network could change our knowledge of the world as profoundly as the Internet has changed business. “People had no idea the Web was coming,” says technology forecaster Paul Saffo. “We are at that moment now with ubiquitous sensing. There is quite an astonishing revolution just around the corner.”

The spread of versatile sensors, or “motes,” and the ability of computers to analyze and either recommend or initiate responses to the data they generate, will not merely enhance our understanding of nature. They could also lead to buildings that manage their own energy use, bridges that flag engineers when in need of repair, cars that track traffic patterns and detect potholes, and home security systems that distinguish between the footfalls of an intruder and the dog, to name a few.

CeNSE is the boldest project yet announced, but HP is not the only organization developing the technology to make ubiquitous sensing possible. Intel is also designing novel sensor packages, as are numerous university labs.

For all the momentum in the field, though, this sensor-filled future is by no means inevitable. These devices will need to generate rich, reliable data and be rugged enough to survive tough environments. The sensor packages themselves will be small, but the computing effort required will be enormous. All the information they gather will have to be transmitted, hosted on server farms, and analyzed. Finally, someone is going to have to pay for it all. “There is the fundamental question of economics,” notes computer scientist Deborah Estrin of the University of California, Los Angeles. “Every sensor is a nonzero cost. There is maintenance, power, keeping them calibrated. You don’t just strew them around.”

In fact, HP senior researcher Peter Hartwell acknowledges that for CeNSE to hit its goals, the sensors will need to be nearly free. That is one of the reasons why HP is designing a single, do-everything, pushpin-size package stacked with a variety of gauges—light, temperature, humidity, vibration and strain, among others—instead of a series of devices for different tasks. Hartwell says that focusing on one versatile device will drive up volume, reducing the cost for each unit, but it could also allow HP to serve several clients at once with the same sensors.

Consider his chief engineering project, an ultrasensitive accelerometer. Housed inside a chip, the sensor tracks the motion of a tiny, internal movable platform relative to the rest of the chip. It can measure changes in acceleration 1,000 times as accurately as the technology in the Nintendo Wii.

Hartwell imagines situating one of these pins every 16 feet along a highway. Thanks to the temperature, humidity and light sensors, the motes could serve as mini weather stations. But the accelerometers’ vibration data could also be analyzed to determine traffic conditions—roughly how many cars are moving past and how quickly. The local highway department would be interested in this information, he guesses, but there are potential consumer applications, too. “Your wireless company might want to take that information and tell you how to get to the airport the fastest,” Hartwell says.
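
The scale of such a deployment is easy to underestimate. Below is a quick sketch of what a pin every 16 feet implies for a 100-mile highway; the spacing is Hartwell’s, while the packet size and reporting rate are assumptions:

```python
# How many motes does a 100-mile instrumented highway need, and how much
# raw data do they produce? Spacing is from the article; packet size and
# reporting rate are assumed.

FEET_PER_MILE = 5280
spacing_ft = 16             # one pushpin every 16 feet (article)
highway_miles = 100
bytes_per_reading = 100     # assumed packet size
readings_per_second = 1     # assumed reporting rate

motes = highway_miles * FEET_PER_MILE // spacing_ft
daily_bytes = motes * bytes_per_reading * readings_per_second * 86_400
print(f"{motes:,} motes, ~{daily_bytes / 1e9:.0f} GB of raw data per day")
# 33,000 motes and hundreds of gigabytes a day -- for one highway
```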

All of this gathering and transmission of data requires power, of course, and to guarantee an extended life, the HP pushpin will not rely solely on batteries. “It is going to have some sort of energy-scavenging ability,” Hartwell says. “Maybe a solar panel or a thermoelectric device to help keep the battery charged.”

With the power hurdle in mind, other groups are forgoing batteries altogether. At Intel Labs in Seattle, engineer Josh Smith has developed a sensor package that runs on wireless power. Like the HP pushpin, Intel’s WISP, or Wireless Identification and Sensing Platform, will include a variety of gauges, but it will also draw energy from the radio waves emitted by long-range radio-frequency ID chip readers. Smith says a single reader, plugged into a wall outlet, can already power and communicate with a network of prototype WISPs five to 10 feet away—a distance that should increase.

Smith cites many of the same infrastructure-related possibilities as Hartwell, along with a number of other uses. If WISPs were placed on standard household items such as cups, these tags could inform doctors about the rehabilitation progress of stroke victims. If the cups the patient normally uses remain stationary, Smith explains, then the individual probably is not up and moving around.

The potential applications for ubiquitous sensing are so broad—a physicist recently contacted him about using WISPs to monitor the temperature outside a proposed neutrino detector—that, as with the Internet, Smith says it is impossible to foresee them all. “In terms of the impact it is going to have on our lives,” Hartwell adds, “you haven’t seen anything yet.”

The Do-Anything Robot
Your PC can accomplish any computing task you ask of it. Why isn’t the same true for robots?
By Gregory Mone

Robots have proved to be valuable tools for soldiers, surgeons and homeowners hoping to keep the carpet clean. But in each case, they are designed and built specifically for the job. Now there is a movement under way to build multipurpose machines—robots that can navigate changing environments such as offices or living rooms and work with their hands.

All-purpose robots are not, of course, a new vision. “It’s been five or 10 years from happening for about 50 years,” says Eric Berger, co-director of the Personal Robotics Program at Willow Garage, a Silicon Valley start-up. The delay is in part because even simple tasks require a huge set of capabilities. For a robot to fetch a mug, for example, it needs to make sense of data gathered by a variety of sensors—laser scanners identifying potential obstacles, cameras searching for the target, force feedback in the fingers that grasp the mug, and more. Yet Berger and other experts are confident that real progress could be made in the next decade.

The problem, according to Willow Garage, is the lack of a common platform for all that computational effort. Instead of building on the capabilities of a single machine, everyone is designing robots, and the software to control them, from the ground up. To help change this, Willow Garage is currently producing 25 copies of its model PR2 (for “Personal Robot 2”), a two-armed, wheeled machine that can unplug an appliance, open doors and move through a room. Ten of the robots will stay in-house, but 10 more will go to outside research groups, and everyone will pool their advances. This way, Berger says, if you want to build the robotic equivalent of a Twitter, you won’t start by constructing a computer: “you build the thing that’s new.”

Pocket Translator
The military, short on linguists, is building smart phone–based devices to do the job
By Gregory Mone

Sakhr Software, a company that builds automatic language translators, recently unveiled a prototype smart phone application that transforms spoken English phrases into spoken Arabic, and vice versa, in near real time. The technology isn’t quite ready for your next trip to Cairo, but thanks to recent advances in machine-translation techniques, plus the advent of higher-fidelity microphones and increasing processing power in smart phones, this mobile technology could soon allow two people speaking different languages to have basic conversations.

Before the 1990s automatic translation meant programming in an endless list of linguistic rules, a technique that proved too labor-intensive and insufficiently accurate. Today’s leading programs—developed by BBN Technologies, IBM, Sakhr and others as part of a Defense Advanced Research Projects Agency effort to eliminate the military’s need for human translators—rely on machine-learning techniques instead. The software works from a database of parallel texts—for example, War and Peace in two different languages, translated United Nations speeches, and documents pulled off the Web. Algorithms identify short matching phrases across sources, and the software uses them to build statistical models that link English phrases to Arabic ones.
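
As a toy illustration of that statistical approach, the sketch below counts phrase co-occurrences across a tiny, made-up parallel corpus and scores translation candidates by relative frequency. Production systems such as BBN’s rely on far more sophisticated word-alignment models; this only conveys the flavor of learning from parallel text:

```python
# Toy phrase-table builder: count co-occurrences of short phrases across
# sentence-aligned parallel text, then score translation candidates by
# relative frequency. Real SMT systems use proper word-alignment models;
# the corpus below is invented for illustration.
from collections import Counter
from itertools import product

parallel = [
    ("good morning", "sabah alkhayr"),
    ("good evening", "masa alkhayr"),
    ("good morning friend", "sabah alkhayr sadiq"),
]

def ngrams(sentence, n_max=2):
    words = sentence.split()
    return [" ".join(words[i:i + n]) for n in range(1, n_max + 1)
            for i in range(len(words) - n + 1)]

pair_counts = Counter()
src_counts = Counter()
for en, ar in parallel:
    for e, a in product(ngrams(en), ngrams(ar)):
        pair_counts[(e, a)] += 1
    for e in ngrams(en):
        src_counts[e] += 1

# estimate P(arabic phrase | english phrase) by relative frequency
for (e, a), c in pair_counts.most_common(5):
    print(f"{e!r} -> {a!r}: {c / src_counts[e]:.2f}")
```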

John Makhoul, BBN’s chief scientist, says the current technology is at its best when confined to subject areas with specific phrases and terminology—translating a weather report from English into French, for example, or helping soldiers gather basic biographical information from people in the field. Makhoul envisions the first consumer applications, five years from now, being similarly constrained. A tourism-related translation app on a smart phone could help an American in Florence get directions from a non-English-speaking local, but they won’t chat about Renaissance art. “It is not going to work perfectly,” he says, “but it will do a pretty good job.”

Know if Disease Grows Inside You
Complex diseases have complex causes. Luckily, they also leave a multitude of traces
By Melinda Wenner

With the exception of certain infectious diseases, few of humanity’s ailments have cures. More than 560,000 Americans will die of cancer this year, and despite the 250,000 coronary bypass surgeries doctors perform annually, heart disease is still the country’s number-one killer.

The hardest diseases to cure are the ones that take the longest to develop. They are the end result of decades of complex molecular interactions inside your body. Yet this complexity also presents an opportunity. Scientists have discovered that these interactions leave discernible fingerprints on the body. By unweaving the complex tapestry of molecular clues—changes in the body’s proteins, nucleic acids and metabolites, collectively called biomarkers—doctors hope they will soon be able to not only detect disease but predict a coming illness in time to take action.

Biomarkers are not new. Since 1986 doctors have monitored prostate cancer by measuring blood levels of the protein known as prostate-specific antigen (PSA). But tests that rely on a single biomarker to detect disease are rare, because most disorders involve intricate changes in a collection of biomarkers.

Take schizophrenia: in January 2010 scientists will release a biomarker test that distinguishes schizophrenia from other psychiatric conditions. The test, which is being commercialized by Rules-Based Medicine, a laboratory in Austin, Tex., is based on the characteristics of about 40 blood-based proteins.

To find potentially useful biomarkers, researchers collect blood samples from thousands of healthy people and analyze them. Biomarker levels in these samples provide a baseline reading. Then they do the same for people with a specific condition such as diabetes or breast cancer. If reproducible differences emerge between the groups, scientists can use the patterns in the disease group to diagnose the same condition in others. By collecting samples over time, researchers can also go back and analyze early samples from individuals who later become ill to identify patterns indicative of early disease or high disease risk.
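
In its simplest form, that comparison is a per-marker separation test between the healthy baseline and the disease group. Here is a sketch on simulated data; real panels, such as the 40-protein schizophrenia test, combine many markers with far more rigorous statistics:

```python
# Minimal biomarker screen: simulate a healthy baseline group and a
# disease group, then rank each marker by how far the group means are
# separated relative to the healthy spread (a crude effect size).
# All data here are simulated for illustration.
import random
random.seed(1)

def sample(mean_val, sd_val, n=1000):
    return [random.gauss(mean_val, sd_val) for _ in range(n)]

# healthy samples vs. disease samples for two hypothetical proteins
markers = {
    "protein_A": (sample(10, 2), sample(14, 2)),   # strongly shifted
    "protein_B": (sample(5, 1), sample(5.2, 1)),   # barely shifted
}

def mean(xs):
    return sum(xs) / len(xs)

def stdev(xs):
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5

for name, (healthy, disease) in markers.items():
    effect = abs(mean(disease) - mean(healthy)) / stdev(healthy)
    verdict = "candidate biomarker" if effect > 1 else "weak signal"
    print(f"{name}: effect size {effect:.2f} -> {verdict}")
```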

Biophysical Corporation, a sister company to Rules-Based Medicine, is one of several companies that have developed blood-based biomarker tests and marketed them to the public [see “The Ultimate Blood Test,” by Philip Yam; Scientific American, June 2006]. The company searches for up to 250 biomarkers suggestive of cancer, inflammatory conditions, heart disease and other illnesses. Mark Chandler, Biophysical’s chair and CEO, says that the real value of the tests lies in long-term monitoring. A person could “get a test monthly, just a finger stick, that would be able to say, we have had a serious change here that is indicative of an early-stage cancer,” he explains.

Yet not all experts are convinced that the age of biomarkers is at hand. Cheryl Barton, an independent U.K.-based pharmaceutical consultant who authored a Business Insights market analysis report on biomarkers in 2006, says she remains “a little bit skeptical about how clinically useful they are.” A study of 5,000 subjects published in the Journal of the American Medical Association in July 2009 found that six cardiovascular biomarkers were only marginally better at predicting heart disease than were standard cardiovascular risk factors, such as whether the subjects smoked or had diabetes.

Adding to the overall difficulty, a person might suffer from two or more diseases—prostate cancer and heart disease, for example. No one knows how multiple diseases might affect overall biomarker signatures or how profiles will change as other diseases develop. “When you get to be 65 or 70, almost everybody has other conditions,” Chandler says. “We don’t know how to deal with that right now.” And scientists still need to discern which biomarkers are truly relevant to disease—a difficult task when working with blood, which contains tens of thousands of proteins at concentrations spanning more than 10 orders of magnitude.

Some companies have simplified the problem by avoiding blood altogether. LabCorp recently commercialized a biomarker test that analyzes colon cells in stool for the chemical signatures indicative of colorectal cancer. “The stool is in intimate contact with the lining of the colon, so it becomes much more highly populated with these rare molecules than would get into the bloodstream from colon cancer,” says Barry Berger, chief medical officer of Exact Sciences, a Madison, Wis.–based biotechnology company that developed the test technology.

Scientists are confident that they will eventually crack the more difficult problem of finding distinct disease signatures in the noisy data. “The evolutionary process, being complex and unknown, does not always give us an easy route,” Berger notes, “but it definitely gives us lots of opportunities.”

Satellites Diagnose Disease Outbreaks
Space-based data are helping to track and predict the spread of deadly diseases
By Katherine Harmon

Many contagious diseases spread through carriers such as birds and mosquitoes. These vectors in turn move with heat and rainfall. With this in mind, researchers have begun to use satellite data to monitor the environmental conditions that lead to disease. “Ideally, we could predict conditions that would result in some of these major outbreaks of cholera, malaria, even avian flu,” says Tim Ford of the University of New England at Biddeford and co-author of a paper on the subject published this past September in Emerging Infectious Diseases.

Satellite data have already been used to map the advance of the H5N1 avian influenza in Asia. The domestic duck, a common inhabitant of Southeast Asia’s rice paddies, is one of the main carriers of the disease. Xiangming Xiao, associate director of the University of Oklahoma’s Center for Spatial Analysis, uses satellite images to map agricultural patterns in the region. These maps show where the ducks are most likely to live and thus where the avian influenza is most likely to spread.

Migratory birds also carry the virus, but their travel patterns are more difficult to predict. Xiao and his colleagues combine the satellite imagery with satellite-gathered surface-temperature data to estimate the birds’—and thereby the virus’s—trajectory. Computer models then link these environmental drivers to the spread of the flu in human populations.

Of course, not all of the work can be outsourced to orbiting observatories. Xiao says that judging the severity of avian flu’s spread from satellite imaging required knowing details about the human populations as well—for instance, how likely certain communities were to raise ducks for poultry consumption. “Satellite monitoring has a capacity to provide consistent observation,” Xiao says. “On the other hand, the in situ observations are still very, very important, so the key is to combine those together. That is a real challenge.”

More Ideas to watch
By Melinda Wenner

Quick Clots
Emergency technicians could prevent up to 35 percent of prehospital trauma deaths if they had better and cheaper ways to stop blood loss. Now a University of Maryland–affiliated start-up called Trauma Solutions has developed a synthetic hydrogel that can clot blood by prompting the body to make fibrin, a protein that seals wounds and stops bleeding. Future iterations could simultaneously release such medicines as antibiotics and painkillers. Each application will cost about $5, compared with some natural blood-clotting substances that cost upward of $500.

Lab-on-a-Stamp
Liver damage is a major side effect of HIV/AIDS and tuberculosis drugs, yet few developing countries have enough trained scientists or equipment to monitor it. Nonprofit Cambridge, Mass.–based Diagnostics For All has developed an inexpensive fingernail-size device made almost entirely of paper that monitors liver damage using a single drop of blood. Channels in the paper guide blood to regions that change color depending on the levels of two damage-related liver enzymes.

Bacterial Toothpaste
Streptococcus mutans bacteria in the mouth decay teeth by converting sugars into enamel-eroding lactic acid. Florida-based Oragenics has genetically engineered a new strain of bacteria that converts sugars to trace amounts of alcohol instead. Because the new strain permanently displaces natural S. mutans, the therapy, which is currently in clinical trials, will be available as a one-time prescription that will protect teeth for life.

Green sidewalk makes electricity

Green sidewalk makes electricity — one footstep at a time – CNN.com.

“PaveGen” pavement slabs convert the energy from footsteps into electricity.

STORY HIGHLIGHTS
  • Engineer has developed paving slab that transforms energy from footsteps into electricity
  • First commercial order expected to power half of outdoor lighting at vast London shopping mall
  • Sustainability expert says slabs encourage sense of collective participation
  • Hopes one day to export across developing world to provide source of “off-grid” energy

London, England (CNN) — Paving slabs that convert energy from people’s footsteps into electricity are set to help power Europe’s largest urban mall, at the 2012 London Olympics site.

The recycled rubber “PaveGen” paving slabs harvest kinetic energy from the impact of people stepping on them and instantly deliver tiny bursts of electricity to nearby appliances. The slabs can also store energy for up to three days in an on-board battery, according to their creator.

In their first commercial application, 20 tiles will be scattered along the central crossing between London’s Olympic stadium and the recently opened Westfield Stratford City mall — which expects an estimated 30 million customers in its first year.

“That should be enough feet to power about half its (the mall’s) outdoor lighting needs,” said Laurence Kemball-Cook, a 25-year-old engineering graduate who developed the prototype during his final year of university in 2009.

The green slabs are designed to compress five millimeters when someone steps on them, but PaveGen will not share the precise mechanism responsible for converting absorbed kinetic energy into electricity.
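
Whatever the undisclosed mechanism, basic physics bounds what each footstep can yield. Below is a back-of-envelope estimate in which the pedestrian’s weight and the conversion efficiency are assumptions; only the five-millimeter travel is PaveGen’s figure:

```python
# Upper-bound energy per footstep: the work done compressing the slab.
# The 5 mm travel is from the article; body weight and conversion
# efficiency are assumptions.

mass_kg = 75        # assumed pedestrian
g = 9.81            # gravitational acceleration, m/s^2
travel_m = 0.005    # slab compresses 5 mm (article)
efficiency = 0.5    # assumed mechanical-to-electrical conversion

work_j = mass_kg * g * travel_m   # ~3.7 J of mechanical work per step
electrical_j = work_j * efficiency
print(f"~{electrical_j:.1f} J of electricity per step")
# a few joules per step -- tiny individually, useful only in aggregate
```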

A computer generated image showing PaveGen slabs installed on a subway staircase

Although each step produces only enough electricity to keep an LED street lamp lit for 30 seconds, Kemball-Cook says that the tiles are a real-world “crowdsourcing” application, harnessing small contributions from a large number of individuals.

“We recently came back from a big outdoor festival where we got over 250,000 footsteps — that was enough to charge 10,000 mobile phones,” said Kemball-Cook.

The young inventor envisages PaveGen systems being used to power off-grid appliances such as public lighting, illuminated street maps and advertising, and to be installed in areas of dense human traffic such as city centers, underground stations and school corridors.

“Our main test installation is at a school in Kent (southeast England) — where 1,100 kids have devoted their lives to stamping all over them for the last eight months,” said Kemball-Cook.

In its current form, the PaveGen paving slab contains a low-energy LED which lights up, expressing the energy transfer idea to the user but only consuming around 5% of the energy from each footstep.

“This is what I really enjoy about the design,” said Richard Miller, head of sustainability at the UK’s government-funded Technology Strategy Board.

“As much as it’s an effective, common-sense source of some sustainable electricity, it’s also a great way for people to engage with the issue of sustainability … to feel like they are part of the solution in a very immediate, fun and visual way that doesn’t make you do anything you wouldn’t already be doing,” said Miller.

Although generally enthusiastic about the product, Miller is reluctant, for the time being, to speculate about its far-reaching impact.

“As with all things of this nature, on a large scale and in the long term, its success will be determined by how cost-effective it is to produce … If it turns out to be expensive, then it will struggle to find a place as anything more than a niche application,” he said.

Kemball-Cook declines to comment on the cost of each slab, arguing that their current price is much higher than what it will be when they go into mass production.

That said, the company has already won a spate of awards, including the Big Idea category at the UK’s Ethical Business Awards and the Shell LiveWire Grand Ideas Award. PaveGen has also recently received a round of financing from a group of London-based angel investors, although the sum is undisclosed.

Kemball-Cook is confident that the slab is durable: over the course of a month, he added, it was subjected non-stop to a machine that replicates the pounding of footsteps.

“It’s also really easy to install as a retrofit on existing pavements, because they can be made to match their exact dimensions … you just replace one slab with another,” he said.

Looking to the future, Kemball-Cook would like to see the paving system introduced to the developing world, in areas that have a high footfall, but are off-grid, such as the slums in Mumbai.

“The average person takes 150 million steps in their lifetime, just imagine the potential,” he said.

Engineers can build a low-carbon world if we let them

Engineers can build a low-carbon world if we let them – opinion – 26 September 2011 – New Scientist.

The engineering solutions to combat climate change already exist. Politicians must be brave enough to use them before it’s too late

One word sums up the attitude of engineers towards climate change: frustration. Political inertia following the high-profile failure of 2009’s Copenhagen climate conference has coupled with a chorus of criticism from a vocal minority of climate-change sceptics. Add the current economic challenges and the picture looks bleak. Our planet is warming and we are doing woefully little to prevent it getting worse.

Engineers know there is so much more that we could do. While the world’s politicians have been locked in predominantly fruitless talks, engineers have been developing the technologies we need to bring down emissions and help create a more stable future.

Wind, wave and solar power, zero-emissions transport, low-carbon buildings and energy-efficiency technologies have all been shown to be feasible. They are just waiting for the political will to roll them out on a global scale. Various models, such as the European Climate Foundation’s Roadmap 2050, show that implementing these existing technologies would bring about an 85 per cent drop in carbon emissions by 2050. The idea that we need silver-bullet technologies to be developed before the green technology revolution can happen is a myth. The revolution is waiting to begin.

Climate call

The barriers preventing the creation of a low-carbon society are not technological but political and financial. That’s why at a landmark London conference convened by the UK’s Institution of Mechanical Engineers, 11 national engineering institutions representing 1.2 million engineers from across the globe, under the banner of the Future Climate project, made a joint call for action at December’s COP17 climate change conference in Durban, South Africa.

The statement calls on governments to move from warm words to solid actions. They need to introduce legislation and financial support to get these technologies out of the workshop and into our homes and businesses and onto our roads. Targeted regulation and taxation will also drive innovation. This will require bold politics, and spending at a time when money is scarce. It is far from unaffordable, however. The UK’s Committee on Climate Change, which advises the British government, continues to support the view of the Stern report – an assessment of the climate change challenge in the UK – that the move to a low-carbon society will cost no more than 1 per cent of GDP by 2050.

Resistance to wind turbines and the power lines they feed, nuclear power and electric cars, as well as the economic costs, all make public opinion a powerful brake on change. However, the alternative seems certain to be worse: beyond the challenges of a deteriorating climate, inaction carries a great long-term risk to our economy. The green technology revolution, just like the industrial revolution before it, will give jobs to those countries that have created the right conditions for it to flourish.

China in front

Which countries these will be is still an open question. India, Germany, Australia and the UK were among the nations signed up to the Future Climate statement, whereas the world’s largest greenhouse gas emitters – China and the US – were not. When it comes to investment in clean technology, however, that’s not the whole story.

Although China is continuing to build coal-fired electricity plants at an alarming rate to power its rapid economic growth, the UN Environment Programme confirmed last month that it is now by far the world’s biggest investor in renewable energy. Last year, China’s wind, solar and biomass power industries received $49 billion of new investment, a third of the global total, and it now has the largest installed wind capacity in the world. When predicting who the front runner in this next great technological revolution will be, it is difficult to see past the emerging superpower to the east.

The US is going in the opposite direction. A natural gas rush driven by the development of controversial “fracking” techniques over the past decade has echoes of the oil rush that transformed Texas a century ago. The Financial Times reports that just one company, BHP Billiton, is investing as much as $79 billion in US shale gas fields – over three times the amount invested in all US renewables in a year. This will secure cheap energy in the short term, but it is a finite resource and ultimately a dead end. In due course we could face the interesting prospect of the US turning to China to acquire its wind turbine technology.

Nuclear elephant

Investment in renewable energy is vital for a prosperous, low-carbon society. However, decision-makers cannot ignore the elephant in the room – nuclear power. The enormous cost of implementing 100 per cent renewable power is not realistic for most nations, so nuclear offers our best chance of making a low-carbon society achievable and affordable. Yet the incident at Fukushima earlier this year has reinforced some long-standing concerns.

Unlike road use or smoking, nuclear power stirs anxieties in many of us that are out of proportion with its true risks. This is not to be complacent about the potential danger of a nuclear plant, but it is striking that nuclear power has killed fewer than 5000 people in its entire history. Compare that with coal mining, which in just one year and in one country – China in 2006 – killed 4700.

Germany’s decision to phase out all nuclear power as a result of Fukushima will most likely have unintended consequences. The Association of German Engineers has estimated that it will cost €53 billion every year in Germany to close down its nuclear generation and switch to 100 per cent renewable energy. It will be interesting to see how public opinion, now so clearly against nuclear power, responds as the economic costs become apparent.

Any technological revolution requires two crucial ingredients – engineers to design, develop and manufacture the technology, and politicians to help create the legislative, behavioural and societal environment that allows change to happen. Today’s engineers have fulfilled their side of the bargain. It is time for our politicians to show their mettle.

Colin Brown is director of engineering at the UK’s Institution of Mechanical Engineers

Simulation shows it’s possible to tow an iceberg to drought areas

Simulation shows it’s possible to tow an iceberg to drought areas.

Image credit: Trevor Williams.

(PhysOrg.com) — Way back in the ’70s, Georges Mougin, then an engineering graduate, had a big idea. He suggested that icebergs floating around in the North Atlantic could be tethered and dragged south to places experiencing severe drought, such as the Sahel of West Africa. Mougin received some backing funds from a Saudi prince, but most “experts” at the time scoffed at his idea, and the whole scheme was eventually shelved.

Cut to 2009 and French software firm Dassault Systemes, which thought maybe Mougin was on to something after all and contacted him to suggest modeling the whole idea on a computer. After putting 15 engineers on the problem, the team concluded that towing an iceberg from the waters around Newfoundland to the Canary Islands off the northwest coast of Africa could be done and would take under five months, though it would cost nearly ten million dollars.

In the simulation, as in a real-world attempt, the selected iceberg would first be fitted with an insulating skirt to stave off melting; it would then be connected to a tugboat (and a kite sail) that would travel at about one knot (assuming assistance from ocean currents). In the simulated test, the iceberg arrived intact, having lost only 38 percent of its seven-million-ton mass.
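
The under-five-months figure squares with the geometry of the route. A quick check, in which the great-circle distance from Newfoundland to the Canaries is my rough assumption and the one-knot speed is the simulation’s:

```python
# Does 1 knot get an iceberg from Newfoundland to the Canary Islands
# in under five months? The route length is an assumed rough estimate;
# the speed is the simulation's figure.

route_nmi = 2600      # Newfoundland -> Canaries, nautical miles (assumed)
speed_knots = 1.0     # tug + kite sail + currents (simulation)

days = route_nmi / speed_knots / 24
print(f"~{days:.0f} days (~{days / 30:.1f} months) under way")
# ~108 days, comfortably inside the simulated five-month window
```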

A real-world project would of course require hauling a much bigger berg; experts estimate a 30 million ton iceberg could provide fresh water for half a million people for up to a year. There would also be the problem of transporting the water from the berg in the ocean to the drought-stricken people. The extraordinary costs for such a project would, it is assumed, come from the price tag for the skirt, five months of diesel fuel for the tugboat, the man-hours involved and then finally, distribution of the fresh water at the destination.

Scientists estimate that some 40,000 icebergs break away from the polar ice caps each year, though only a fraction of them would be large enough to be worth the time and expense of dragging them to a place experiencing a drought, such as the devastating one currently going on in the Horn of Africa.

Mougin, now 86 and reinvigorated by the results of the recent study, is trying to raise money for a real-world test of the idea.

A Plan to Power 100 Percent of the Planet with Renewables

A Plan to Power 100 Percent of the Planet with Renewables: Scientific American.

In December leaders from around the world will meet in Copenhagen to try to agree on cutting back greenhouse gas emissions for decades to come. The most effective step to implement that goal would be a massive shift away from fossil fuels to clean, renewable energy sources. If leaders can have confidence that such a transformation is possible, they might commit to an historic agreement. We think they can.

A year ago former vice president Al Gore threw down a gauntlet: to repower America with 100 percent carbon-free electricity within 10 years. As the two of us started to evaluate the feasibility of such a change, we took on an even larger challenge: to determine how 100 percent of the world’s energy, for all purposes, could be supplied by wind, water and solar resources, by as early as 2030. Our plan is presented here.

Scientists have been building to this moment for at least a decade, analyzing various pieces of the challenge. Most recently, a 2009 Stanford University study ranked energy systems according to their impacts on global warming, pollution, water supply, land use, wildlife and other concerns. The very best options were wind, solar, geothermal, tidal and hydroelectric power—all of which are driven by wind, water or sunlight (referred to as WWS). Nuclear power, coal with carbon capture, and ethanol were all poorer options, as were oil and natural gas. The study also found that battery-electric vehicles and hydrogen fuel-cell vehicles recharged by WWS options would largely eliminate pollution from the transportation sector.

Our plan calls for millions of wind turbines, water machines and solar installations. The numbers are large, but the scale is not an insurmountable hurdle; society has achieved massive transformations before. During World War II, the U.S. retooled automobile factories to produce 300,000 aircraft, and other countries produced 486,000 more. In 1956 the U.S. began building the Interstate Highway System, which after 35 years extended for 47,000 miles, changing commerce and society.

Is it feasible to transform the world’s energy systems? Could it be accomplished in two decades? The answers depend on the technologies chosen, the availability of critical materials, and economic and political factors.

Clean Technologies Only
Renewable energy comes from enticing sources: wind, which also produces waves; water, which includes hydroelectric, tidal and geothermal energy (water heated by hot underground rock); and sun, which includes photovoltaics and solar power plants that focus sunlight to heat a fluid that drives a turbine to generate electricity. Our plan includes only technologies that work or are close to working today on a large scale, rather than those that may exist 20 or 30 years from now.

To ensure that our system remains clean, we consider only technologies that have near-zero emissions of greenhouse gases and air pollutants over their entire life cycle, including construction, operation and decommissioning. For example, when burned in vehicles, even the most ecologically acceptable sources of ethanol create air pollution that will cause the same mortality level as when gasoline is burned. Nuclear power results in up to 25 times more carbon emissions than wind energy, when reactor construction and uranium refining and transport are considered. Carbon capture and sequestration technology can reduce carbon dioxide emissions from coal-fired power plants but will increase air pollutants and will extend all the other deleterious effects of coal mining, transport and processing, because more coal must be burned to power the capture and storage steps. Similarly, we consider only technologies that do not present significant waste disposal or terrorism risks.

In our plan, WWS will supply electric power for heating and transportation—industries that will have to revamp if the world has any hope of slowing climate change. We have assumed that most fossil-fuel heating (as well as ovens and stoves) can be replaced by electric systems and that most fossil-fuel transportation can be replaced by battery and fuel-cell vehicles. Hydrogen, produced by using WWS electricity to split water (electrolysis), would power fuel cells and be burned in airplanes and by industry.

Plenty of Supply
Today the maximum power consumed worldwide at any given moment is about 12.5 trillion watts (terawatts, or TW), according to the U.S. Energy Information Administration. The agency projects that in 2030 the world will require 16.9 TW of power as global population and living standards rise, with about 2.8 TW in the U.S. The mix of sources is similar to today’s, heavily dependent on fossil fuels. If, however, the planet were powered entirely by WWS, with no fossil-fuel or biomass combustion, an intriguing savings would occur. Global power demand would be only 11.5 TW, and U.S. demand would be 1.8 TW. That decline occurs because, in most cases, electrification is a more efficient way to use energy. For example, only 17 to 20 percent of the energy in gasoline is used to move a vehicle (the rest is wasted as heat), whereas 75 to 86 percent of the electricity delivered to an electric vehicle goes into motion.
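
A minimal sketch of that arithmetic, using midpoints of the quoted efficiency ranges (the 100-unit trip demand is just a normalization, not a figure from the plan):

```python
# Why electrification shrinks total energy demand: the same motion
# requires far less input energy when delivered as electricity.
wheel_energy = 100.0                 # energy actually needed for motion (normalized)

gasoline_efficiency = 0.185          # midpoint of the quoted 17-20%
ev_efficiency = 0.805                # midpoint of the quoted 75-86%

gasoline_input = wheel_energy / gasoline_efficiency   # ~541 units of fuel energy
ev_input = wheel_energy / ev_efficiency               # ~124 units of electricity

print(f"fuel energy needed:       {gasoline_input:.0f}")
print(f"electric energy needed:   {ev_input:.0f}")
print(f"reduction from switching: {1 - ev_input / gasoline_input:.0%}")  # ~77%
```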

Even if demand did rise to 16.9 TW, WWS sources could provide far more power. Detailed studies by us and others indicate that energy from the wind, worldwide, is about 1,700 TW. Solar, alone, offers 6,500 TW. Of course, wind and sun out in the open seas, over high mountains and across protected regions would not be available. If we subtract these and low-wind areas not likely to be developed, we are still left with 40 to 85 TW for wind and 580 TW for solar, each far beyond future human demand. Yet currently we generate only 0.02 TW of wind power and 0.008 TW of solar. These sources hold an incredible amount of untapped potential.
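
Putting those supply and demand figures side by side, in a sketch that uses only the numbers quoted above:

```python
# Comparing practically developable WWS resources with projected demand.
demand_2030_tw = 16.9                # business-as-usual 2030 projection
wind_low_tw, wind_high_tw = 40, 85   # developable wind, TW
solar_tw = 580                       # developable solar, TW

print(f"wind covers demand {wind_low_tw/demand_2030_tw:.1f}-{wind_high_tw/demand_2030_tw:.1f}x")
print(f"solar covers demand {solar_tw/demand_2030_tw:.1f}x")
print(f"tapped so far: {0.02 + 0.008:.3f} TW of that wind and solar potential")
```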

The other WWS technologies will help create a flexible range of options. Although all the sources can expand greatly, for practical reasons, wave power can be extracted only near coastal areas. Many geothermal sources are too deep to be tapped economically. And even though hydroelectric power now exceeds all other WWS sources, most of the suitable large reservoirs are already in use.

The Plan: Power Plants Required
Clearly, enough renewable energy exists. How, then, would we transition to a new infrastructure to provide the world with 11.5 TW? We have chosen a mix of technologies emphasizing wind and solar, with about 9 percent of demand met by mature water-related methods. (Other combinations of wind and solar could be as successful.)

Wind supplies 51 percent of the demand, provided by 3.8 million large wind turbines (each rated at five megawatts) worldwide. Although that quantity may sound enormous, it is interesting to note that the world manufactures 73 million cars and light trucks every year. Another 40 percent of the power comes from photovoltaics and concentrated solar plants, with about 30 percent of the photovoltaic output from rooftop panels on homes and commercial buildings. About 89,000 photovoltaic and concentrated solar power plants, averaging 300 megawatts apiece, would be needed. Our mix also includes 900 hydroelectric stations worldwide, 70 percent of which are already in place.
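
Those counts imply an average output well below nameplate capacity, i.e. a capacity factor. The plan does not state one explicitly, so the sketch below infers it from the quoted numbers (wind only, since the rooftop share of solar capacity is not enumerated):

```python
# Inferring the capacity factor implied by the plan's wind numbers.
# (Solar is omitted: rooftop capacity is not enumerated above.)
total_demand_tw = 11.5               # WWS-world demand, quoted earlier
wind_share = 0.51                    # wind's share of that demand
n_turbines = 3.8e6                   # turbines in the plan
turbine_mw = 5.0                     # rating of each turbine

nameplate_tw = n_turbines * turbine_mw / 1e6      # 19 TW of capacity
implied_cf = total_demand_tw * wind_share / nameplate_tw

print(f"nameplate wind capacity: {nameplate_tw:.0f} TW")
print(f"implied capacity factor: {implied_cf:.0%}")   # ~31%, a realistic figure
```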

Only about 0.8 percent of the wind base is installed today. The worldwide footprint of the 3.8 million turbines would be less than 50 square kilometers (smaller than Manhattan). When the needed spacing between them is figured, they would occupy about 1 percent of the earth’s land, but the empty space among turbines could be used for agriculture or ranching or as open land or ocean. The nonrooftop photovoltaics and concentrated solar plants would occupy about 0.33 percent of the planet’s land. Building such an extensive infrastructure will take time. But so did the current power plant network. And remember that if we stick with fossil fuels, demand by 2030 will rise to 16.9 TW, requiring about 13,000 large new coal plants, which themselves would occupy a lot more land, as would the mining to supply them.
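
A sketch of the land-use arithmetic; the Earth land-area and Manhattan figures are standard values, not from the plan:

```python
# Checking the footprint and spacing claims against Earth's land area.
EARTH_LAND_KM2 = 1.49e8        # standard figure, assumed here
MANHATTAN_KM2 = 59             # for the comparison in the text

turbine_footprint_km2 = 50     # physical footprint of 3.8 million turbines
spacing_fraction = 0.01        # land share once spacing is included
solar_fraction = 0.0033        # non-rooftop PV and CSP share

print(turbine_footprint_km2 < MANHATTAN_KM2)                              # True
print(f"spaced wind area: {spacing_fraction * EARTH_LAND_KM2:,.0f} km2")  # ~1.5M
print(f"solar plant area: {solar_fraction * EARTH_LAND_KM2:,.0f} km2")    # ~0.5M
```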

The Materials Hurdle
The scale of the WWS infrastructure is not a barrier. But a few materials needed to build it could be scarce or subject to price manipulation.

Enough concrete and steel exist for the millions of wind turbines, and both those commodities are fully recyclable. The most problematic materials may be rare-earth metals such as neodymium used in turbine gearboxes. Although the metals are not in short supply, the low-cost sources are concentrated in China, so countries such as the U.S. could be trading dependence on Middle Eastern oil for dependence on Far Eastern metals. Manufacturers are moving toward gearless turbines, however, so that limitation may become moot.

Photovoltaic cells rely on amorphous or crystalline silicon, cadmium telluride, or copper indium selenide and sulfide. Limited supplies of tellurium and indium could reduce the prospects for some types of thin-film solar cells, though not for all; the other types might be able to take up the slack. Large-scale production could be restricted by the silver that cells require, but finding ways to reduce the silver content could tackle that hurdle. Recycling parts from old cells could ameliorate material difficulties as well.

Three components could pose challenges for building millions of electric vehicles: rare-earth metals for electric motors, lithium for lithium-ion batteries and platinum for fuel cells. More than half the world’s lithium reserves lie in Bolivia and Chile. That concentration, combined with rapidly growing demand, could raise prices significantly. More problematic is the claim by Meridian International Research that not enough economically recoverable lithium exists to build anywhere near the number of batteries needed in a global electric-vehicle economy. Recycling could change the equation, but the economics of recycling depend in part on whether batteries are made with easy recyclability in mind, an issue the industry is aware of. The long-term use of platinum also depends on recycling; current available reserves would sustain annual production of 20 million fuel-cell vehicles, along with existing industrial uses, for fewer than 100 years.

Smart Mix for Reliability
A new infrastructure must provide energy on demand at least as reliably as the existing infrastructure. WWS technologies generally suffer less downtime than traditional sources. The average U.S. coal plant is offline 12.5 percent of the year for scheduled and unscheduled maintenance. Modern wind turbines have a downtime of less than 2 percent on land and less than 5 percent at sea. Photovoltaic systems are also down less than 2 percent of the time. Moreover, when an individual wind, solar or wave device is down, only a small fraction of production is affected; when a coal, nuclear or natural gas plant goes offline, a large chunk of generation is lost.

The main WWS challenge is that the wind does not always blow and the sun does not always shine in a given location. Intermittency problems can be mitigated by a smart balance of sources, such as generating a base supply from steady geothermal or tidal power, relying on wind at night when it is often plentiful, using solar by day and turning to a reliable source such as hydroelectric that can be turned on and off quickly to smooth out supply or meet peak demand. For example, interconnecting wind farms that are only 100 to 200 miles apart can compensate for hours of zero power at any one farm should the wind not be blowing there. Also helpful is interconnecting geographically dispersed sources so they can back up one another, installing smart electric meters in homes that automatically recharge electric vehicles when demand is low and building facilities that store power for later use.

Because the wind often blows during stormy conditions when the sun does not shine and the sun often shines on calm days with little wind, combining wind and solar can go a long way toward meeting demand, especially when geothermal provides a steady base and hydroelectric can be called on to fill in the gaps.
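
The statistical intuition behind interconnection can be sketched in a few lines; the 20 percent calm-hours figure and the independence of distant farms are illustrative assumptions, not data from the studies cited:

```python
# Toy model of geographic smoothing: if one farm is becalmed 20% of
# hours, and farms far enough apart behave roughly independently, the
# odds that every farm is becalmed at once shrink exponentially.
p_calm = 0.20                       # assumed fraction of zero-output hours

for n_farms in (1, 2, 4, 8):
    p_all_calm = p_calm ** n_farms  # independence assumption
    print(f"{n_farms} interconnected farms: all becalmed {p_all_calm:.4%} of hours")
```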

As Cheap as Coal
The mix of WWS sources in our plan can reliably supply the residential, commercial, industrial and transportation sectors. The logical next question is whether the power would be affordable. For each technology, we calculated how much it would cost a producer to generate power and transmit it across the grid. We included the annualized cost of capital, land, operations, maintenance, energy storage to help offset intermittent supply, and transmission. Today the costs of wind, geothermal and hydroelectric power are all less than seven cents a kilowatt-hour (¢/kWh); wave and solar are higher. But by 2020 and beyond wind, wave and hydro are expected to be 4¢/kWh or less.

For comparison, the average cost in the U.S. in 2007 of conventional power generation and transmission was about 7¢/kWh, and it is projected to be 8¢/kWh in 2020. Power from wind turbines, for example, already costs about the same or less than it does from a new coal or natural gas plant, and in the future wind power is expected to be the least costly of all options. The competitive cost of wind has made it the second-largest source of new electric power generation in the U.S. for the past three years, behind natural gas and ahead of coal.
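
As an illustration of what those per-kilowatt-hour differences mean at household scale, here is a sketch; the annual consumption figure is an assumed round number, not from the analysis above:

```python
# Rough annual savings at the quoted 2020 generation-cost projections.
ANNUAL_KWH = 11_000            # typical US household, assumed

conventional_2020 = 0.08       # $/kWh, projected conventional cost
wws_2020 = 0.04                # $/kWh, projected wind/wave/hydro cost

savings = ANNUAL_KWH * (conventional_2020 - wws_2020)
print(f"annual household savings at projected 2020 costs: ${savings:,.0f}")  # ~$440
```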

Solar power is relatively expensive now but should be competitive as early as 2020. A careful analysis by Vasilis Fthenakis of Brookhaven National Laboratory indicates that within 10 years, photovoltaic system costs could drop to about 10¢/kWh, including long-distance transmission and the cost of compressed-air storage of power for use at night. The same analysis estimates that concentrated solar power systems with enough thermal storage to generate electricity 24 hours a day in spring, summer and fall could deliver electricity at 10¢/kWh or less.

Transportation in a WWS world will be driven by batteries or fuel cells, so we should compare the economics of these electric vehicles with that of internal-combustion-engine vehicles. Detailed analyses by one of us (Delucchi) and Tim Lipman of the University of California, Berkeley, have indicated that mass-produced electric vehicles with advanced lithium-ion or nickel metal-hydride batteries could have a full lifetime cost per mile (including battery replacements) that is comparable with that of a gasoline vehicle, when gasoline sells for more than $2 a gallon.

When the so-called externality costs (the monetary value of damages to human health, the environment and climate) of fossil-fuel generation are taken into account, WWS technologies become even more cost-competitive.

Overall construction cost for a WWS system might be on the order of $100 trillion worldwide, over 20 years, not including transmission. But this is not money handed out by governments or consumers. It is investment that is paid back through the sale of electricity and energy. And again, relying on traditional sources would raise output from 12.5 to 16.9 TW, requiring thousands more of those plants, costing roughly $10 trillion, not to mention tens of trillions of dollars more in health, environmental and security costs. The WWS plan gives the world a new, clean, efficient energy system rather than an old, dirty, inefficient one.

Political Will
Our analyses strongly suggest that the costs of WWS will become competitive with traditional sources. In the interim, however, certain forms of WWS power will be significantly more costly than fossil power. Some combination of WWS subsidies and carbon taxes would thus be needed for a time. A feed-in tariff (FIT) program to cover the difference between generation cost and wholesale electricity prices is especially effective at scaling up new technologies. Combining FITs with a so-called declining clock auction, in which the right to sell power to the grid goes to the lowest bidders, provides continuing incentive for WWS developers to lower costs. As that happens, FITs can be phased out. FITs have been implemented in a number of European countries and a few U.S. states and have been quite successful in stimulating solar power in Germany.

Taxing fossil fuels or their use to reflect their environmental damages also makes sense. But at a minimum, existing subsidies for fossil energy, such as tax benefits for exploration and extraction, should be eliminated to level the playing field. Misguided promotion of alternatives that are less desirable than WWS power, such as farm and production subsidies for biofuels, should also be ended, because it delays deployment of cleaner systems. For their part, legislators crafting policy must find ways to resist lobbying by the entrenched energy industries.

Finally, each nation needs to be willing to invest in a robust, long-distance transmission system that can carry large quantities of WWS power from remote regions where it is often greatest—such as the Great Plains for wind and the desert Southwest for solar in the U.S.—to centers of consumption, typically cities. Reducing consumer demand during peak usage periods also requires a smart grid that gives generators and consumers much more control over electricity usage hour by hour.

A large-scale wind, water and solar energy system can reliably supply the world’s needs, significantly benefiting climate, air quality, water quality, ecology and energy security. As we have shown, the obstacles are primarily political, not technical. A combination of feed-in tariffs plus incentives for providers to reduce costs, elimination of fossil subsidies and an intelligently expanded grid could be enough to ensure rapid deployment. Of course, changes in the real-world power and transportation industries will have to overcome sunk investments in existing infrastructure. But with sensible policies, nations could set a goal of generating 25 percent of their new energy supply with WWS sources in 10 to 15 years and almost 100 percent of new supply in 20 to 30 years. With extremely aggressive policies, all existing fossil-fuel capacity could theoretically be retired and replaced in the same period, but with more modest and likely policies full replacement may take 40 to 50 years. Either way, clear leadership is needed, or else nations will keep trying technologies promoted by industries rather than vetted by scientists.

A decade ago it was not clear that a global WWS system would be technically or economically feasible. Having shown that it is, we hope global leaders can figure out how to make WWS power politically feasible as well. They can start by committing to meaningful climate and renewable energy goals now.

Note: This article was originally printed with the title, “A Path to Sustainable Energy by 2030.”


Can Planting Vegetables in Vacant Lots Save Cleveland?

Can Planting Vegetables in Vacant Lots Save Cleveland? | Wired Science | Wired.com.

Backyard vegetables can fight crime, improve health, and boost the economy.

By transforming its vacant lots, backyards and roof-tops into farming plots, the city of Cleveland could meet all of its fresh produce, poultry and honey needs, calculate economists from Ohio State University. These steps would save up to $155 million annually, boost employment and scale back obesity.

“Post-industrial cities like Cleveland are struggling with more and more unused land; these become sources of crime,” said Parwinder Grewal, co-author of the study “Can cities become self-reliant in food?”, published July 20 in Cities.

“I was motivated to show how much food a city could actually produce by using this land,” he said. “We could address global problems through this way of gardening.”

Urban gardening improves health, reduces pollution, and creates local businesses, Grewal said. The population of Cleveland, which Grewal considers a typical post-industrial city, peaked near one million in 1950 and has been declining ever since. Today scarcely half a million people call Cleveland home.


As industrial jobs have dried up, the city’s exodus has accelerated. Unable to keep up their properties, many former residents have abandoned their homes. Vacant lots are proliferating, and currently number more than 20,000, according to the Cleveland City Planning Commission.

Ten percent of Clevelanders have been diagnosed with diabetes, as compared to the national average of 8 percent, and more than a third are obese. Among cities with a population between 100,000 and 500,000, it is the seventh most dangerous, according to one crime ranking. Growing tomatoes and beans, and keeping bees and chickens, would change all this, Grewal said. Studies have shown that gardens improve community health, reduce crime and increase property values.

Cleveland city planners have placed special emphasis on programs to foster urban gardening in the past five to 10 years; however, Grewal’s visions are on a more ambitious scale.

In the most intensive scenario he outlines, 80 percent of all vacant lots, 62 percent of business rooftops, and 9 percent of residential lots would be devoted to growing food, allowing the city to meet up to 100 percent of its fresh food needs. Grewal, who grows the bulk of his own food in his backyard, believes that his propositions are realistic and practical. The largest barrier is convincing citizens to garden.

“No discredit to the value of Grewal’s study,” said Kim Scott, a Cleveland City Planner and urban gardening specialist, “but articulating an idea is a different experience from implementing it.”

While Cleveland might have enough land to be self-sufficient, it doesn’t yet have the labor force to make it happen, Scott said.

“A mental shift has to take place,” said Scott. “Many people don’t have a clue about farming. They lack the patience to eat whole foods, they lack the desire.”

Both Scott and Grewal hope that shift is coming. Cleveland now has hundreds of community gardens. Some residents are growing market gardens, cultivating and selling produce as a full-time job. The city is seeing the grandest show of large public gardens since the Victory Gardens of World War II, when 40 percent of U.S. vegetables came from private and public gardens.

“If we could do it then,” said Grewal, “we can do it now. And the more self-sufficient we design our cities to be, the longer human civilization can sustain itself.”

Image: Parwinder Grewal


Dry onion skin a treasure we're tossing out?

Dry onion skin has a use.

ScienceDaily (July 14, 2011) — More than 500,000 tonnes of onion waste are thrown away in the European Union each year. However, scientists say this could have a use as food ingredients. The brown skin and external layers are rich in fibre and flavonoids, while the discarded bulbs contain sulphurous compounds and fructans. All of these substances are beneficial to health.

Production of onion waste has risen over recent years in line with the growing demand for these bulbs. More than 500,000 tonnes of waste are generated in the European Union each year, above all in Spain, Holland and the United Kingdom, where it has become an environmental problem. The waste includes the dry brown skin, the outer layers, roots and stalks, as well as onions that are not big enough to be of commercial use, or onions that are damaged.

“One solution could be to use onion waste as a natural source of ingredients with high functional value, because this vegetable is rich in compounds that provide benefits for human health,” says Vanesa Benítez, a researcher at the Department of Agricultural Chemistry at the Autonomous University of Madrid (Spain).

Benítez’s research group worked with scientists from Cranfield University (United Kingdom) to carry out laboratory experiments to identify the substances and possible uses of each part of the onion. The results have been published in the journal Plant Foods for Human Nutrition.

According to the study, the brown skin could be used as a functional ingredient high in dietary fibre (principally the non-soluble type) and phenolic compounds, such as quercetin and other flavonoids (plant metabolites with medicinal properties). The two outer fleshy layers of the onion also contain fibre and flavonoids.

“Eating fibre reduces the risk of suffering from cardiovascular disease, gastrointestinal complaints, colon cancer, type-2 diabetes and obesity,” the researcher points out.

Phenolic compounds, meanwhile, help to prevent coronary disease and have anti-carcinogenic properties. The high levels of these compounds in the dry skin and the outer layers of the bulbs also give them high antioxidant capacity.

Meanwhile, the researchers suggest using the internal parts and whole onions that are thrown away as a source of fructans and sulphurous compounds. Fructans are prebiotics; in other words, they have beneficial health effects because they selectively stimulate the growth and activity of bacteria in the colon.

Sulphurous compounds reduce the accumulation of platelets, improving blood flow and cardiovascular health in general. They also have a positive effect on antioxidant and anti-inflammatory systems in mammals.

“The results show that it would be useful to separate the different parts of onions produced during the industrial process,” explains Benítez. “This would enable them to be used as a source of functional compounds to be added to other foodstuffs.”

Acceptance, not access, the biggest hurdle to eradicating disease

Shot in the Dark – By Charles Kenny | Foreign Policy.

In 2009, veterinarians at the U.N. Food and Agriculture Organization made a remarkable announcement: Rinderpest, a livestock-borne disease, would soon be eradicated. OK, so maybe it wasn’t front-page news, but rinderpest — which causes animals to develop fever, followed by diarrhea and (frequently) death — has over thousands of years been a recurring plague on human civilization. It has destroyed the food supplies of entire countries such as Ethiopia, which lost a third of its population to a rinderpest-related famine in the late 19th century. The FAO’s eradication effort, launched in 1992, marks only the second time a disease has been deliberately wiped off the face of the Earth; the first, better-known case was smallpox, which killed between 300 million and 500 million people over the course of the 20th century before its eradication in 1980.

On June 13, the global community tried for a repeat performance with a pledge drive, held by the Global Alliance for Vaccines and Immunization (GAVI). Thanks to support from aid agencies from Britain to Russia, as well as the Gates Foundation, GAVI raised $4.3 billion to immunize 250 million kids worldwide between now and 2015, protecting against diseases from tetanus to tuberculosis, whooping cough to diphtheria. It’s a daunting project, but one that is less implausible than it once was: The range of diseases that can be prevented is growing ever longer, and now includes HPV, rubella, typhoid, and Japanese encephalitis. Vaccines for malaria and dengue fever may not be far behind, and there’s even some hope for HIV. GAVI itself boasts a strong track record: Over the organization’s first decade, more than 5 million child deaths were prevented through more rapid introduction and increased coverage of vaccines in low-income countries. But, going forward, the alliance is going to have to think more about getting parents to vaccinate their kids — the demand side of health — especially if it wants to repeat the huge victory of wiping out a disease.


Although few in the public-health NGO community would like to admit it, eradicating diseases is at least as dependent on luck as it is on planning and persistence. Universal vaccination — the only nearly surefire means of eradication — is an impossibility in most countries. Even the best-resourced campaigns have to deal with the trouble of reaching remote villages over rutted roads to deliver vaccines that sometimes need to be kept refrigerated, often are difficult to administer, and can take multiple shots to take effect. Add to that the challenge of reaching people who often have no official registration or address, and you can see the problem.

Health professionals instead rely on the strategy of trying to vaccinate enough people, especially in the immediate period of an outbreak, so that the disease eventually retreats toward extinction — always a dicey prospect. Donald Henderson at Johns Hopkins University wrote of smallpox eradication that it “was achieved by only the narrowest of margins” while progress “wavered between success and disaster, often only to be decided by quixotic circumstance or extraordinary performances by field staff.”

Today, the world appears to be walking the same knife-edge with polio. The Global Polio Eradication Initiative was launched in 1988 when there were about a third of a million cases worldwide. Indigenous polio was eradicated in the Americas in 1991 and in China in 1996. By 1997, the worldwide total of cases was down to 7,000; by 2009, there were only 1,600. But new cases keep popping up: That same year saw outbreaks in Uganda, Mali, Togo, Ghana, Ivory Coast, and Kenya.

The problem wasn’t vaccine supply; the world has spent $8.2 billion on eradication programs, which bought both vaccines and the human infrastructure required to deliver them. Rather, it was a demand issue, one that hinged in particular on the attitude of governments and parents.

Take the example of the polio vaccination campaign in northern Nigeria in 2003, which responded to a particularly virulent outbreak that was threatening to spread. The governor of Kano state refused to support the vaccination campaign because of rumors that the vaccines were laced with drugs that would sterilize recipients — which he claimed was part of a U.S. conspiracy to depopulate the developing world.

To be fair to Kano’s former governor, only 23 percent of children across the whole of Nigeria were fully immunized in 2008, suggesting that general lack of information, fear of side effects, and the hassle of getting kids vaccinated probably played the larger part in low takeup of polio vaccine on the demand side (not to mention weaknesses in the vaccine delivery and administration system). But that only emphasizes the fact that without at least some support from both local officials and parents, there is no way to complete a vaccination program.

Issues with government prioritization help to account for the fact that about 55 percent of surveyed kids are fully vaccinated in low-income countries, according to my colleague Amanda Glassman at the Center for Global Development — but only 42 percent in lower-middle-income countries are. And in fact, demand-side problems affect the whole world, not just the developing parts of it that are typically the focus of immunization efforts. More and more parents in Europe and the United States have refused to vaccinate their kids over the fear (despite overwhelming evidence to the contrary) that the vaccines lead to autism.

What about the less conspiracy-minded corners of the world? MIT economists Abhijit Banerjee and Esther Duflo, authors of this year’s development blockbuster Poor Economics, studied vaccination rates in Udaipur, India and found that only 16 percent of children below age 2 there were fully immunized against the standard preventable diseases. That was due in part to the limited provision of immunization clinics, but it was also because of low demand for the free immunizations that were available: Even when a local NGO helped ensure regular and well-publicized visits by traveling immunization camps in parts of Udaipur, full immunization only increased from 6 percent in control villages to 18 percent in villages that saw regular camps. World Bank research in India suggests one reason why — many parents do not fully appreciate the health benefits of vaccination and so are unwilling to go to the effort of attending an immunization camp.

One effective response if the kids won’t come to the vaccines is to take the vaccines to the kids. If families are visited at home by a trusted health worker, the World Bank research suggests parents are very happy to see their kids stuck full of needles. But the trouble with such an approach is that house visits are a very expensive way to guarantee coverage. So Banerjee and Duflo tried adding an incentive to get parents to bring their kids to camps, instead — a 2-pound bag of lentils for each immunization and a set of plates if parents ensured their kids got the full program. That more than doubled the full immunization rate to 39 percent in villages where the incentives were offered and even increased immunization rates threefold in neighboring villages. And because the incentives increased the efficiency of the camps (i.e., how many kids were vaccinated each day), they actually reduced the overall cost of providing immunization coverage, from $56 per child without incentives to $28 per child.
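
The cost arithmetic behind that result is essentially fixed-cost spreading, sketched below. The daily throughput, camp cost, and lentil cost are illustrative assumptions chosen to roughly reproduce the per-child figures in the study; only the $56 and $28 figures come from the article.

```python
# Running an immunization camp is largely a fixed daily cost, so
# doubling throughput roughly halves the cost per child, even after
# paying for the lentil incentive.
kids_per_day_no_incentive = 10       # assumed throughput without incentives
kids_per_day_with_lentils = 22       # assumed: "more than doubled"
camp_cost_per_day = 560.0            # assumed fixed cost: staff, travel, vaccines
lentils_per_child = 1.0              # assumed cost of the incentive itself

cost_plain = camp_cost_per_day / kids_per_day_no_incentive
cost_incentive = camp_cost_per_day / kids_per_day_with_lentils + lentils_per_child

print(f"cost per child, no incentive:   ${cost_plain:.0f}")     # ~$56
print(f"cost per child, with incentive: ${cost_incentive:.0f}") # ~$26
```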

The long-term answer to raising vaccination levels worldwide is to spread knowledge of their safety and efficacy. But as both the polio vaccine scare in Nigeria and the recent gross irresponsibility of a doctor and his Hollywood acolyte over vaccines in the West demonstrate, that process can be complex. In the meantime, providing direct incentives for people to get their kids vaccinated is likely to have a more immediate impact on changing behavior — and that will reduce both the immense human costs of infectious disease and the considerable financial costs of preventing them.

Charles Kenny is a senior fellow at the Center for Global Development, a Schwartz fellow at the New America Foundation, and author, most recently, of Getting Better: Why Global Development Is Succeeding and How We Can Improve the World Even More. “The Optimist,” his column for ForeignPolicy.com, runs weekly.

'Super sand' to help clean up dirty drinking water

BBC News – ‘Super sand’ to help clean up dirty drinking water.

Contaminated water can be cleaned much more effectively using a novel, cheap material, say researchers.

Dubbed “super sand”, it could become a low-cost way to purify water in the developing world.

The technology involves coating grains of sand in an oxide of a widely available material called graphite – commonly used as lead in pencils.

The team describes the work in the American Chemical Society journal Applied Materials and Interfaces.

In many countries around the world, access to clean drinking water and sanitation facilities is still limited.

The World Health Organization states that “just 60% of the population in Sub-Saharan Africa and 50% of the population in Oceania [islands in the tropical Pacific Ocean] use improved sources of drinking-water.”

The graphite-coated sand grains might be a solution – especially as people have already used sand to purify water since ancient times.

Coating the sand

But with ordinary sand, filtering techniques can be tricky.


Wei Gao from Rice University in Texas, US, told BBC News that regular coarse sand was a lot less effective than fine sand when water was contaminated with pathogens, organic contaminants and heavy metal ions.

While fine sand is slightly better, water drains through it very slowly.

“Our product combines coarse sand with functional carbon material that could offer higher retention for those pollutants, and at the same time gives good throughput,” explained the researcher.

She said that the technique the team has developed to make the sand involves dispersing graphite oxide into water and mixing it with regular sand.

“We then heat the whole mixture up to 105C for a couple of hours to evaporate the water, and use the final product – ‘coated sand’ – to purify polluted water.”

Cost-efficient

Image: “Super sand” is made using regular sand – and it could become a low-cost way to purify water.

The lead scientist of the study, Professor Pulickel Ajayan, said it was possible to modify the graphite oxide in order to make it more selective and sensitive to certain pollutants – such as organic contaminants or specific metals in dirty water.

Another team member, Dr Mainak Majumder from Monash University in Melbourne, Australia, said it had another advantage – it was cheap.

“This material demonstrates comparable performance to some commercially available activated carbon materials,” he said.

“But given that this can be synthesized using room temperature processes and also from cheap graphite sources, it is likely to be cost-efficient.”

He pointed out that in Australia many mining companies extract graphite and they produce a lot of graphite-rich waste.

“This waste can be harnessed for water purification,” he said.

Fukushima update: TEPCO admits more meltdowns

Short Sharp Science: Fukushima update: TEPCO admits more meltdowns.

Andy Coghlan, reporter

Owners of the nuclear plant at Fukushima crippled by the earthquake and tsunami admitted today that two more of the six reactor units at the facility probably underwent meltdowns soon after the disaster on 11 March.

TEPCO acknowledged last week that fuel rods in reactor unit 1 probably melted down within as little as 16 hours of the quake.

Today, the company said that there were probably meltdowns in reactor units 2 and 3 as well, after the tsunami destroyed cooling systems needed to prevent meltdown through overheating of fuel rods.

Unit 3 probably melted down first, on 13 March, followed the next day by unit 2, after water levels fell below those needed to keep fuel cool enough to avoid meltdown. A day later, on 15 March, faulty valve systems caused an explosion in unit 2, which led to leakage of radioactive water into the sea.

Despite the meltdowns, there is no further danger because the melted cores are now safely covered by at least 3 metres of water, TEPCO officials told journalists today.

The company also challenged stories from last week suggesting that it was the quake alone, not the tsunami, that damaged reactor unit 1 and led to the meltdown. TEPCO says that further investigations, detailed in a report released on Monday, revealed that the quake “inflicted no damage on main equipment”.

But problems are mounting elsewhere, with storage space rapidly running out for the tens of thousands of tonnes of radioactive water that have collected in reactor buildings. A TEPCO spokesman is quoted in UK newspaper The Guardian as saying that dealing with the contaminated water could take until the end of the year, and that the volume could rise to as much as 200,000 tonnes. TEPCO is collaborating with the French firm Areva on a plan to recycle the tainted water through the reactors as a coolant.

Measures are also under way to improve cooling of spent reactor fuel being stored in the reactor buildings. Rods in reactor unit 4 caused problems early on when they caught fire through lack of cooling water, spreading clouds of radiation into the environment.

According to the Japanese Asahi Shimbun newspaper, TEPCO issued a plan on Saturday to install external heat exchangers for cooling storage ponds at units 1, 2 and 3. The exchangers will use cold “secondary” water to cool the “primary” water continuously circulating through the rods. Air-cooling equipment will be used to re-chill the secondary water as it circulates.

The newspaper reports that the temperature in unit 2’s storage pond is currently 70 to 80ºC, roughly double what it should be. With the exchanger installed, the company is confident the temperature can be lowered to 41ºC within a month. Exchangers will be fitted in June to the ponds in reactor units 1 and 3, and in July to the pond in unit 4.

With the nuclear industry under such a cloud, Japanese Prime Minister Naoto Kan is pressing on with plans to expand Japan’s reliance on renewable power. According to the Nikkei newspaper, he is considering plans to make installation of solar panels mandatory in all new buildings by 2030, and will make them public later this week at the G8 summit in Deauville, France.

Earlier this month, Kan signalled a move to renewables, especially wind power, but stressed that nuclear power would still be needed to meet the country’s energy needs.