What would happen if solar panels were free? What if it were possible to know everything about the world—not the Internet, but the living, physical world—in real time? What if doctors could forecast a disease years before it strikes? This is the promise of the World Changing Idea: a vision so simple yet so ambitious that its full impact is impossible to predict. Scientific American’s editorial and advisory boards have chosen projects in five general categories—Energy, Transportation, Environment, Electronics and Robotics, and Health and Medicine—that highlight the power of science and technology to improve the world. Some are in use now; others are emerging from the lab. But all of them show that innovation is the most promising elixir for what ails us. —The Editors
The No-Money-Down Solar Plan
A new wave of start-ups wants to install rooftop solar panels on your house. Upfront cost: nothing
By Christopher Mims
The biggest thing stopping the sun is money. Installing a rooftop array of solar panels large enough to produce all of the energy required by a building is the equivalent of prepaying its electricity bill for the next seven to 10 years—and that’s after federal and state incentives. An innovation in financing, however, has opened up an additional possibility for homeowners who want to reduce their carbon footprint and lower their electric bills: get the panels for free, then pay for the power as you go.
The system works something like a home mortgage. Organizations and individuals looking for a steady return on their investment, typically banks or municipal bond holders, use a pool of cash to pay for the solar panels. Directly or indirectly, homeowners buy the electricity produced by their own rooftop at a rate that is less, per kilowatt-hour, than they would pay for electricity from the grid. Investors get a safe investment—the latest generation of solar-panel technology works dependably for years—and homeowners get a break on their monthly bills, not to mention the satisfaction of significantly reducing their carbon footprint. “This is a way to get solar without putting any money down and to start saving money from day one. That’s a first,” says SolarCity co-founder Peter Rive.
SolarCity is the largest installer of household solar panels to have adopted this strategy. Founded in 2006 by two brothers who are also Silicon Valley–based serial entrepreneurs, SolarCity leases its panels to homeowners but gives the electricity away for free. The net effect is a much reduced utility bill (customers still need utility-delivered power when the sun isn’t out) plus a monthly SolarCity bill. The total for both comes out to less than the old bill. SunRun in San Francisco offers consumers a similar package, except that the company sells customers the electricity instead of leasing them the panels.
Cities such as Berkeley and Boulder are pioneering their own version of solar-panel financing by loaning individuals the entire amount required to pay for solar panels and installation. The project is paid for by municipal bonds, and the homeowner pays back the loan over 20 years as a part of the property tax bill. The effect is the same whichever route a consumer takes: the new obligation, in the form of taxes, a lease or a long-term contract for electricity, ends up costing less than the existing utility bill.
“What we’re really seeing is a transition in how we think about buying energy goods and services,” says Daniel M. Kammen, director of the Renewable and Appropriate Energy Laboratory at the University of California, Berkeley. Kammen, who did the initial analysis on Berkeley’s financing model, believes that by turning to financing, consumers can overcome the inherent disadvantage renewables have when compared with existing energy sources: the infrastructure for power from the grid has already been paid for and, in many cases, has been subsidized for decades.
All three approaches are rapidly expanding across the country. Although the Berkeley program is less than two years old, 10 states have passed legislation allowing their cities to set up similar bond-financed loan programs. If the Waxman-Markey climate bill passes, the option for cities to set up these programs will become federal law. SunEdison in Maryland is currently active in nine states. SolarCity, which has more than 4,000 customers, is active in California, Arizona and Oregon and has promised to announce additional states after the new year.
Right now it is not possible to lower the overall cost of rooftop solar to “grid parity,” that is, to the same price as electricity from local utility companies, without federal subsidies such as the investment tax credit, which lowers the tax bill of banks financing these projects. Those subsidies, which amount to 30 percent of the cost of a solar installation, are guaranteed for at least eight years. By then, SolarCity and its competitors claim they won’t need them.
“Grid parity is driven by multiple factors,” says Attila Toth, vice president of marketing at SunEdison, including the cost of capital, the cost of panels and their installation, and the intensity of sunlight in a given region. “It will occur in different states at different times, but, for example, we expect that California will be one of the first states in the U.S. to get to grid parity, sometime between three and five years from now.”
While the cost of electricity from fossil fuels has increased 3 to 5 percent a year for the past decade, the cost of solar panels has fallen on average 20 percent for every doubling of their installed base. Grid parity is where these trend lines cross—after that, solar has the potential to power more than just homes. It’s hardly a coincidence that Elon Musk, head of electric car company Tesla Motors, sits on SolarCity’s board of directors.
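The arithmetic behind those crossing trend lines can be sketched in a few lines of Python. The starting prices, the doubling time for installed capacity, and the use of 4 percent as the midpoint of grid-price growth are illustrative assumptions, not figures from the article, which supplies only the two trends themselves:

```python
# Back-of-envelope sketch of when the two trend lines cross.
# Starting prices and the doubling time are assumed for illustration.
grid_price = 0.12       # $/kWh for grid electricity (assumed)
solar_price = 0.25      # $/kWh levelized rooftop solar cost (assumed)
grid_growth = 0.04      # midpoint of the 3-5% annual rise
learning_rate = 0.20    # 20% cost drop per doubling of installed base
doubling_time = 2       # years per doubling of installed capacity (assumed)

years = 0
while solar_price > grid_price:
    years += 1
    grid_price *= 1 + grid_growth
    # Half a doubling completed each year at a 2-year doubling time:
    solar_price *= (1 - learning_rate) ** (1 / doubling_time)

print(f"Grid parity in ~{years} years "
      f"(solar {solar_price:.3f} vs grid {grid_price:.3f} $/kWh)")
```

Under these assumed starting prices, the lines cross in about five years—consistent with the three-to-five-year window Toth projects for California.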
More Ideas to watch
by Christopher Mims
The Gasoline Garden
It is the next step for biofuels: genetically engineered plant life that produces hydrocarbons as a by-product of its normal metabolism. The result will be fuel—common gasoline, even—using nothing but sunlight and CO2. In July, Exxon Mobil announced plans to spend more than $600 million in pursuit of algae that can accomplish the task. Joule Biotechnologies claims to have already succeeded, although the company has yet to reveal any details of its proprietary system.
Thorium Power
Uranium and plutonium are not the only fuels that can power a nuclear reactor. With an initial kick from more traditional fissile materials, thorium can set up a self-sustaining “breeder” reaction that produces uranium 233, which is well suited to nuclear power generation. The process has the added benefit of being resistant to nuclear proliferation, because its end products emit enough gamma rays to make the fuel dangerous to handle and easy to track.
Save Energy with Information
Studies show that simply making customers aware of their energy use lowers it by 5 to 15 percent. Smart meters allow customers to track their energy consumption minute by minute and appliance by appliance. Countless start-ups are offering the devices, and Google and Microsoft are independently partnering with local utilities to allow individuals to monitor their power usage over the Web.
Wind Power from the Stratosphere
According to a Stanford University study released in July, the high-altitude winds that constantly blow tens of thousands of feet above the earth hold enough energy to supply all of human civilization 100 times over. California’s Sky WindPower has proposed harvesting this energy by building fleets of giant, airborne, ground-tethered windmills, while Italy’s Kite Gen proposes to accomplish the same feat using kites.
Delivering the U.S. from Oil
Plug-in hybrid trucks are improving the long view of the short haul
By Amanda Schupak
Cargo trucks gulp about 40 percent of the fuel pumped in the U.S. While most attention focuses on improving the fuel economy of passenger vehicles, a major opportunity goes rumbling by. “Folks do not realize that the fuel use of even a small truck is equal to many, many cars,” says Bill Van Amburg, senior vice president of Calstart, a clean transportation technology nonprofit, and director of the Hybrid Truck Users Forum. “A utility truck as a hybrid would reduce more petroleum than nine Priuses.”
Some 1,300 commercial hybrids on the road today get up to twice the fuel efficiency of their conventional counterparts. But these traditional hybrids are inherently limited: they still draw all of their energy from petroleum-based fuel, merely using it more efficiently by capturing some of the energy otherwise lost during braking.
Plug-in hybrids, on the other hand, draw energy from the grid. They can drive for miles—in many cases, an entire day’s route—without using any fossil fuel at all. This shifts energy demand away from petroleum and toward grid-based sources. (Last year zero-carbon renewables and nuclear supplied 30 percent of all electric power in the U.S.)
In many ways, plug-in hybrid technology makes more sense for delivery trucks than for consumer sedans. A cargo truck runs a short daily route that includes many stops to aid in regenerative braking. Most of the U.S. Postal Service’s 200,000-plus mail trucks, for example, travel fewer than 20 miles a day. In addition, fleet vehicles return nightly to storage lots that have ready access to the 120- or 240-volt outlets required to charge them.
The Department of Energy recently launched the nation’s largest commercial plug-in hybrid program, a $45.4-million project to get 378 medium-duty vehicles on the road in early 2011. The trucks, which will go to 50 municipal and utility fleets, will feature a power system from Eaton, a large manufacturer of electrical components, on a Ford F-550 chassis. (For its part, Ford will wait for the market to prove itself before designing its own commercial plug-ins.) “These are going to start breaking free in 2011,” says Paul Scott, president of the Electric Vehicle Association of Southern California.
Start-up company Bright Automotive has a more ambitious plan. It aims to replace at least 50,000 trucks with plug-in hybrids by 2014. Bright’s IDEA prototype travels 40 miles on battery power before switching to a four-cylinder engine that gets 40 miles to the gallon. The streamlined aluminum body has the payload of a postal truck yet is far more aerodynamic. The truck weighs as much as a midsize sedan.
John E. Waters, Bright Automotive’s founder and the former developer of the battery system for General Motors’s groundbreaking EV1 electric car, says that each IDEA would save 1,500 gallons of fuel and 16 tons of carbon dioxide emissions a year over a standard utility truck. Waters says he is ready to begin assembly in his U.S. plant once a pending $450-million federal loan comes through.
Despite the appeal of the carbon savings, the fleet owners who are the trucks’ primary customers have more practical considerations. Bright’s executives are coy about the IDEA’s eventual price tag but assert that a customer with 2,000 trucks driving 80 miles a day five days a week could save $7.2 million a year. Right now that is probably not enough to justify large-scale purchases without additional rebates—or a price on carbon. Van Amburg estimates that going hybrid currently adds $30,000 to $50,000 in upfront costs per vehicle, although that figure should come down as production volumes increase.
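A quick back-of-envelope check ties together two of the numbers above: the 1,500 gallons each IDEA saves per year and the $7.2 million a fleet of 2,000 trucks could save. The fuel price that falls out of the division is an inference, not a figure from the article:

```python
# Consistency check on the article's fleet-savings claim.
trucks = 2000
gallons_saved_per_truck = 1500     # per year, from the article
fleet_savings = 7.2e6              # dollars per year, from the article

total_gallons = trucks * gallons_saved_per_truck
implied_fuel_price = fleet_savings / total_gallons   # inferred, not stated

miles_per_year = 80 * 5 * 52       # 80 mi/day, 5 days/wk, 52 weeks
savings_per_mile = fleet_savings / (trucks * miles_per_year)

print(f"Implied fuel price: ${implied_fuel_price:.2f}/gallon")
print(f"Savings: ${savings_per_mile:.3f} per truck-mile")
```

The two claims hang together at an implied fuel price of about $2.40 a gallon—roughly 17 cents saved per truck-mile, which puts Van Amburg’s $30,000-to-$50,000 hybrid premium in perspective: it takes years of driving to earn back.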
Improved battery technology will also help. Today the IDEA’s 13-kilowatt-hour lithium-ion battery pack accounts for nearly a quarter of the vehicle’s total cost. Much of the research being done for the batteries going into the Chevy Volt and other consumer plug-ins should also be applicable to commercial batteries. “For all the good we all want to do,” says David Lauzun, Bright’s vice president of product development, “these vehicles will not take over the world until it becomes the economic choice—‘I have to have them because it saves me money.’”
Bus Rapid Transit
Subwaylike bus lines mobilize the urban future
By Michael Moyer
For the first time in human civilization, more people now live in urban areas than in the countryside. This shift creates a number of dilemmas, not least of which is how to move people within the world’s rapidly growing metropolises. Pollution and traffic point away from car-based options, while light-rail systems are slow to construct and prohibitively expensive. One disarmingly simple—and cheap—possibility is Bus Rapid Transit, which is engineered to operate like a subway on wheels. In these systems, concrete dividers on existing roads separate high-capacity buses from the rest of traffic. Riders pay before boarding, then wait in enclosed stations. When a bus arrives, sliding partitions open to allow riders to board from a platform that is level with the bus floor. The traffic-free thoroughfares, quick boarding times, and modern, comfortable stations resemble light-rail systems more than the chaos of typical bus travel. In Bogotá, Colombia, which has had seven Bus Rapid Transit lines in operation since 2001, the buses handle 1.6 million trips a day. Their success has allowed the city to remove 7,000 private buses from its streets, reducing consumption of bus fuel and its associated pollution by more than 59 percent.
Marine zoning is a bold remedy for sick seas
By Sarah Simpson
These days not even many politicians deny that the oceans are ill. Protecting the health of coastal waters is now a matter of national policy in dozens of countries, including the U.S., and world leaders are beginning to prescribe a revolutionary remedy that conservationists have been promoting for years: marine planning and zoning.
The idea is a natural extension of management policies that have guided the development of cities and landscapes for nearly a century. Porn shops aren’t next to preschools, after all, and drilling rigs aren’t the centerpieces of national parks. Similarly, zoning advocates envision a mosaic of regional maps in which every watery space on the planet is designated for a particular purpose. Drilling and mining would be allowed only in certain parts of the ocean; fishing in others. The most critically threatened areas would be virtually off-limits.
Whereas people can easily find maps telling them what they can do where on land, the marine realm is a hodgepodge of rules emanating from an army of agencies, each one managing a single use or symptom. In the U.S., for example, one body regulates commercial fishing, usually a single species at a time. Another group manages toxic substances, still another seabed mining, and so on—some 20 federal agencies in all. They tend to make decisions without regard to what the others are doing, explains Duke University marine ecologist Larry B. Crowder. “Imagine all of the medical specialists visiting a patient in intensive care one at a time and never talking to one another,” he says. “It’s a wonder that the oceans aren’t in worse shape than they are now.”
Ocean advocates such as Crowder eagerly await the final recommendations of a special task force President Barack Obama charged with presenting a plan for overhauling management of U.S. waters, which extend 200 nautical miles offshore. The scope of such an undertaking is huge: the U.S. controls 4.4 million square miles of seascape, making the country’s underwater real estate 25 percent larger than its landmass. The committee’s preliminary report, released in September, suggests that the best way to minimize harmful human impacts on the oceans is to manage regions rather than symptoms.
Many environmentalists are hopeful that such plans will be implemented through the marine equivalent of municipal zoning, which would give them some influence in areas where they now have none. In zones where conservation is designated as the dominant activity, fishing and industrial activities such as mining would no longer have free rein. Under current rules, about the only way a conservation group can block a project it deems harmful—say, a new site for offshore drilling—is through expensive litigation.
So far, though, the president’s task force has been careful not to suggest that ocean zoning will be the only treatment plan, in great part because any effort to restrict commercial interests is bound to meet stiff opposition. “Zoning isn’t anybody’s favorite exercise,” notes John C. Ogden, director of the Florida Institute of Oceanography at the University of South Florida at Tampa. “Someone’s ox is always getting gored.” Most resistant to such change will most likely be the traditional users of the open ocean—namely, commercial fisheries and the petroleum industry. “They’ve had the place to themselves for a long time,” Ogden says.
Ogden and others are quick to point out, however, that zoning practices can benefit commerce as much as conservation. By giving up access to certain areas, industries gain the security of knowing their activities would be licensed in a more predictable and less costly manner than they are today, explains Josh Eagle, associate professor at the University of South Carolina School of Law. Now an oil company can apply for permits to drill virtually anywhere, but it takes on a significant financial risk each time. The business may dump millions of dollars into researching a new facility only to have a lawsuit derail it at the last moment. When opposing parties have more or less equal voices early in the planning process, Eagle says, they are less inclined to block one another’s activities once zones are drawn on a map.
Whether the final report of the president’s task force will promote ocean zoning explicitly is uncertain. But the group has already promised to overhaul the structure of ocean governance by proposing the creation of a National Ocean Council, whose job it will be to coordinate efforts of the myriad federal agencies now in charge.
The move comes just in time. Just as society is beginning to appreciate the enormous efforts it will take to preserve the health of the oceans, it must ask more of them—more energy, more food, and better resilience to coastal development and climate change. The reason the oceans are in trouble is not simply what people put into them and take out; it is the failure of governments to manage those activities properly. Says Crowder: “We have to treat the oceans holistically, not one symptom at a time.”
The Power of Garbage
Trapped lightning could help zap trash and generate electricity
By John Pavlus
Trash is loaded with the energy trapped in its chemical bonds. Plasma gasification, a technology that has been in development for decades, could finally be ready to extract it.
In theory, the process is simple. Torches pass an electric current through a gas (often ordinary air) in a chamber to create a superheated plasma—an ionized gas with a temperature upward of 7,000 degrees Celsius, hotter than the surface of the sun. When this occurs naturally we call it lightning, and plasma gasification is literally lightning in a bottle: the plasma’s tremendous heat dissociates the molecular bonds of any garbage placed inside the chamber, converting organic compounds into syngas (a combination of carbon monoxide and hydrogen) and trapping everything else in an inert vitreous solid called slag. The syngas can be used as fuel in a turbine to generate electricity. It can also be used to create ethanol, methanol and biodiesel. The slag can be processed into materials suitable for use in construction.
In practice, the gasification idea has been unable to compete economically with traditional municipal waste processing. But the maturing technology has been coming down in cost, while energy prices have been on the rise. Now “the curves are finally crossing—it’s becoming cheaper to take the trash to a plasma plant than it is to dump it in a landfill,” says Louis Circeo, director of plasma research at the Georgia Tech Research Institute. Earlier this summer garbage-disposal giant Waste Management partnered with InEnTec, an Oregon-based start-up, to begin commercializing the latter’s plasma-gasification processes. And major pilot plants capable of processing 1,000 or more tons of trash a day are under development in Florida, Louisiana and California.
Plasma isn’t perfect. The toxic heavy metals sequestered in slag pass the Environmental Protection Agency’s leachability standards (and have been used in construction for years in Japan and France) but still give pause to communities considering building the plants. And although syngas-generated electricity has an undeniably smaller carbon footprint than coal—“For every ton of trash you process with plasma, you reduce the amount of CO2 going into the atmosphere by about two tons,” Circeo says—it is still a net contributor of greenhouse gases.
“It is too good to be true,” Circeo admits, “but the EPA has estimated that if all the municipal solid waste in the U.S. were processed with plasma to make electricity, we could produce between 5 and 8 percent of our total electrical needs—equivalent to about 25 nuclear power plants or all of our current hydropower output.” With the U.S. expected to generate a million tons of garbage every day by 2020, using plasma to reclaim some of that energy could be too important to pass up.
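Circeo’s estimate can be sanity-checked with round numbers. The figure for total U.S. electricity generation and the size of a typical nuclear plant are assumptions for the check, not figures from the article:

```python
# Rough check of the claim: plasma-processing all U.S. municipal solid
# waste could supply 5-8% of electricity, about 25 nuclear plants' worth.
us_generation_twh = 4000          # ~annual U.S. generation, TWh (assumed)
trash_tons_per_day = 1_000_000    # projected for 2020, from the article

low = 0.05 * us_generation_twh    # 5% of generation, TWh/yr
high = 0.08 * us_generation_twh   # 8% of generation, TWh/yr

# One large nuclear plant: ~1 GW at ~90% capacity factor (assumed)
plant_twh = 1.0 * 8760 * 0.90 / 1000   # ~7.9 TWh per plant-year

tons_per_year = trash_tons_per_day * 365
print(f"5-8% of generation: {low:.0f}-{high:.0f} TWh/yr")
print(f"= {low / plant_twh:.0f}-{high / plant_twh:.0f} one-gigawatt plants")
print(f"Implied yield: {low * 1e9 / tons_per_year:.0f}-"
      f"{high * 1e9 / tons_per_year:.0f} kWh per ton of trash")
```

The low end of the range works out to about 25 one-gigawatt plants, matching the quoted figure, and implies each ton of trash yields several hundred kilowatt-hours of electricity.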
More Ideas to watch
By John Pavlus
Cement as a Carbon Sponge
Traditional cement production creates at least 5 percent of global carbon dioxide emissions, but new materials could create carbon-neutral cement. Start-up Novacem, supported by Imperial College London, uses magnesium oxide to make cement that naturally absorbs CO2 as it hardens. California-based Calera uses seawater to sequester carbon emissions from a nearby power plant in cement.
The New Honeybee
Colony collapse disorder (CCD) has killed more than a third of honeybee colonies since 2006. Farmers who depend on bees to pollinate such crops as almonds, peaches and apples are looking to the blue orchard bee to pick up the slack.
One efficient Osmia lignaria can pollinate as much territory as 50 honeybees, but the bees are harder to cultivate because of their solitary nature. These pinch hitters won’t completely replace honeybees, but as scientists continue to grapple with CCD, they could act as an agricultural safety net.
Saltwater Crops
As the world’s freshwater supply becomes scarcer and food production needs balloon, salt-tolerant crops could ease the burden. Researchers at Australia’s University of Adelaide used genetic engineering to enhance a model crop’s natural ability to prevent saline buildup in its leaves, allowing the plant to thrive in conditions that would typically wither it. If the same gene tweak works in cereal crops such as rice and wheat—the researchers are testing them now—fallow lands destroyed by drought or overirrigation could become new breadbaskets.
The Omnipotence Machines
Tiny, ubiquitous sensors will allow us to index the physical world the way the Web maps cyberspace
By Gregory Mone
Earlier this year Hewlett-Packard announced the launch of its Central Nervous System for the Earth (CeNSE) project, a 10-year effort to embed up to a trillion pushpin-size sensors across the planet. Technologists say that the information gathered by this kind of ubiquitous sensing network could change our knowledge of the world as profoundly as the Internet has changed business. “People had no idea the Web was coming,” says technology forecaster Paul Saffo. “We are at that moment now with ubiquitous sensing. There is quite an astonishing revolution just around the corner.”
The spread of versatile sensors, or “motes,” and the ability of computers to analyze and either recommend or initiate responses to the data they generate, will not merely enhance our understanding of nature. It could lead to buildings that manage their own energy use, bridges that flag engineers when in need of repair, cars that track traffic patterns and detect potholes, and home security systems that distinguish between the footfalls of an intruder and the dog, to name a few.
CeNSE is the boldest project yet announced, but HP is not the only organization developing the technology to make ubiquitous sensing possible. Intel is also designing novel sensor packages, as are numerous university labs.
For all the momentum in the field, though, this sensor-filled future is by no means inevitable. These devices will need to generate rich, reliable data and be rugged enough to survive tough environments. The sensor packages themselves will be small, but the computing effort required will be enormous. All the information they gather will have to be transmitted, hosted on server farms, and analyzed. Finally, someone is going to have to pay for it all. “There is the fundamental question of economics,” notes computer scientist Deborah Estrin of the University of California, Los Angeles. “Every sensor is a nonzero cost. There is maintenance, power, keeping them calibrated. You don’t just strew them around.”
In fact, HP senior researcher Peter Hartwell acknowledges that for CeNSE to hit its goals, the sensors will need to be nearly free. That is one of the reasons why HP is designing a single, do-everything, pushpin-size package stacked with a variety of gauges—light, temperature, humidity, vibration and strain, among others—instead of a series of devices for different tasks. Hartwell says that focusing on one versatile device will drive up volume, reducing the cost for each unit, and it could also allow HP to serve several clients at once with the same sensors.
Consider his chief engineering project, an ultrasensitive accelerometer. Housed inside a chip, the sensor tracks the motion of a tiny, internal movable platform relative to the rest of the chip. It can measure changes in acceleration 1,000 times as accurately as the technology in the Nintendo Wii.
Hartwell imagines situating one of these pins every 16 feet along a highway. Thanks to the temperature, humidity and light sensors, the motes could serve as mini weather stations. But the accelerometers’ vibration data could also be analyzed to determine traffic conditions—roughly how many cars are moving past and how quickly. The local highway department would be interested in this information, he guesses, but there are potential consumer applications, too. “Your wireless company might want to take that information and tell you how to get to the airport the fastest,” Hartwell says.
All of this gathering and transmission of data requires power, of course, and to guarantee an extended life, the HP pushpin will not rely solely on batteries. “It is going to have some sort of energy-scavenging ability,” Hartwell says. “Maybe a solar panel or a thermoelectric device to help keep the battery charged.”
With the power hurdle in mind, other groups are forgoing batteries altogether. At Intel Labs in Seattle, engineer Josh Smith has developed a sensor package that runs on wireless power. Like the HP pushpin, Intel’s WISP, or Wireless Identification and Sensing Platform, will include a variety of gauges, but it will also draw energy from the radio waves emitted by long-range radio-frequency ID chip readers. Smith says a single reader, plugged into a wall outlet, can already power and communicate with a network of prototype WISPs five to 10 feet away—a distance that should increase.
Smith cites many of the same infrastructure-related possibilities as Hartwell, along with a number of other uses. If WISPs were placed on standard household items such as cups, these tags could inform doctors about the rehabilitation progress of stroke victims. If the cups the patient normally uses remain stationary, Smith explains, then the individual probably is not up and moving around.
The potential applications for ubiquitous sensing are so broad—a physicist recently contacted him about using WISPs to monitor the temperature outside a proposed neutrino detector—that, as with the Internet, Smith says it is impossible to foresee them all. “In terms of the impact it is going to have on our lives,” Hartwell adds, “you haven’t seen anything yet.”
The Do-Anything Robot
Your PC can accomplish any computing task you ask of it. Why isn’t the same true for robots?
By Gregory Mone
Robots have proved to be valuable tools for soldiers, surgeons and homeowners hoping to keep the carpet clean. But in each case, they are designed and built specifically for the job. Now there is a movement under way to build multipurpose machines—robots that can navigate changing environments such as offices or living rooms and work with their hands.
All-purpose robots are not, of course, a new vision. “It’s been five or 10 years from happening for about 50 years,” says Eric Berger, co-director of the Personal Robotics Program at Willow Garage, a Silicon Valley start-up. The delay is in part because even simple tasks require a huge set of capabilities. For a robot to fetch a mug, for example, it needs to make sense of data gathered by a variety of sensors—laser scanners identifying potential obstacles, cameras searching for the target, force feedback in the fingers that grasp the mug, and more. Yet Berger and other experts are confident that real progress could be made in the next decade.
The problem, according to Willow Garage, is the lack of a common platform for all that computational effort. Instead of building on the capabilities of a single machine, everyone is designing robots, and the software to control them, from the ground up. To help change this, Willow Garage is currently producing 25 copies of its model PR2 (for “Personal Robot 2”), a two-armed, wheeled machine that can unplug an appliance, open doors and move through a room. Ten of the robots will stay in-house, but 10 more will go to outside research groups, and everyone will pool their advances. This way, Berger says, if you want to build the robotic equivalent of a Twitter, you won’t start by constructing a computer: “you build the thing that’s new.”
The military, short on linguists, is building smart phone–based devices to do the job
By Gregory Mone
Sakhr Software, a company that builds automatic language translators, recently unveiled a prototype smart phone application that transforms spoken English phrases into spoken Arabic, and vice versa, in near real time. The technology isn’t quite ready for your next trip to Cairo, but thanks to recent advances in machine-translation techniques, plus the advent of higher-fidelity microphones and increasing processing power in smart phones, this mobile technology could soon allow two people speaking different languages to have basic conversations.
Before the 1990s automatic translation meant programming in an endless list of linguistic rules, a technique that proved too labor-intensive and insufficiently accurate. Today’s leading programs—developed by BBN Technologies, IBM, Sakhr and others as part of a Defense Advanced Research Projects Agency effort to eliminate the military’s need for human translators—rely on machine-learning techniques instead. The software works from a database of parallel texts—for example, War and Peace in two different languages, translated United Nations speeches, and documents pulled off the Web. Algorithms identify short matching phrases across sources, and the software uses them to build statistical models that link English phrases to Arabic ones.
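The phrase-matching idea can be sketched at toy scale: from a handful of aligned sentence pairs, count how often each word in one language co-occurs with each word in the other, then keep the most likely pairing. The three-sentence “corpus” below is invented for illustration (English-French, echoing the weather-report example); real systems align multi-word phrases across millions of sentence pairs and use far more sophisticated statistics:

```python
# Minimal sketch of building a translation table from parallel text.
from collections import Counter, defaultdict

# Toy aligned corpus (invented for illustration).
parallel = [
    ("heavy rain tomorrow", "fortes pluies demain"),
    ("rain today", "pluies aujourd'hui"),
    ("sunny tomorrow", "ensoleille demain"),
]

# Count how often each English word appears in the same sentence
# pair as each French word.
cooccur = defaultdict(Counter)
for english, french in parallel:
    for e in english.split():
        for f in french.split():
            cooccur[e][f] += 1

# Translation table: for each English word, the French word that
# co-occurs with it most often across the corpus.
table = {e: counts.most_common(1)[0][0] for e, counts in cooccur.items()}
print(table["rain"], table["tomorrow"])
```

Even at this scale the statistics do the work: “rain” pairs with “pluies” and “tomorrow” with “demain” because each pairing recurs across sentences while the spurious ones do not.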
John Makhoul, BBN’s chief scientist, says the current technology is at its best when confined to subject areas with specific phrases and terminology—translating a weather report from English into French, for example, or helping soldiers gather basic biographical information from people in the field. Makhoul envisions the first consumer applications, five years from now, being similarly constrained. A tourism-related translation app on a smart phone could help an American in Florence get directions from a non-English-speaking local, but they won’t chat about Renaissance art. “It is not going to work perfectly,” he says, “but it will do a pretty good job.”
Know if Disease Grows Inside You
Complex diseases have complex causes. Luckily, they also leave a multitude of traces
By Melinda Wenner
With the exception of certain infectious diseases, few of humanity’s ailments have cures. More than 560,000 Americans will die of cancer this year, and despite the 250,000 coronary bypass surgeries doctors perform annually, heart disease is still the country’s number-one killer.
The hardest diseases to cure are the ones that take the longest to develop. They are the end result of decades of complex molecular interactions inside your body. Yet this complexity also presents an opportunity. Scientists have discovered that these interactions leave discernible fingerprints on the body. By unweaving the complex tapestry of molecular clues—changes in the body’s proteins, nucleic acids and metabolites, collectively called biomarkers—doctors hope they will soon be able not only to detect disease but also to predict a coming illness in time to take action.
Biomarkers are not new. Since 1986 doctors have monitored prostate cancer by measuring blood levels of the protein known as prostate-specific antigen (PSA). But tests that rely on a single biomarker to detect disease are rare, because most disorders involve intricate changes in a collection of biomarkers.
Take schizophrenia: in January 2010 scientists will release a biomarker test that distinguishes schizophrenia from other psychiatric conditions. The test, which is being commercialized by Rules-Based Medicine, a laboratory in Austin, Tex., is based on the characteristics of about 40 blood-based proteins.
To find potentially useful biomarkers, researchers collect blood samples from thousands of healthy people and analyze them. Biomarker levels in these samples provide a baseline reading. Then they do the same for people with a specific condition such as diabetes or breast cancer. If reproducible differences emerge between the groups, scientists can use the patterns in the disease group to diagnose the same condition in others. By collecting samples over time, researchers can also go back and analyze early samples from individuals who later become ill to identify patterns indicative of early disease or high disease risk.
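As a rough sketch of that comparison, the following takes two groups of hypothetical readings for a single biomarker and flags a reproducible difference using a standard effect-size measure (Cohen's d); the data and the 0.8 threshold are illustrative conventions, not any company's actual criteria:

```python
import statistics

# Hypothetical biomarker readings (arbitrary units)
healthy = [1.0, 1.2, 0.9, 1.1, 1.0, 1.3, 0.8, 1.1]   # baseline group
disease = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3, 2.1]   # disease group

def effect_size(baseline, cases):
    """Cohen's d: difference in group means in units of pooled spread."""
    m1, m2 = statistics.mean(baseline), statistics.mean(cases)
    s1, s2 = statistics.stdev(baseline), statistics.stdev(cases)
    pooled = ((s1 ** 2 + s2 ** 2) / 2) ** 0.5
    return (m2 - m1) / pooled

d = effect_size(healthy, disease)
# A reproducible group difference shows up as a large effect size;
# 0.8 is the conventional cutoff for a "large" effect
flagged = d > 0.8
```

In practice researchers run this kind of comparison across hundreds of candidate biomarkers at once and must correct for the many comparisons, which is part of why single-marker tests like PSA remain the exception.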
Biophysical Corporation, a sister company to Rules-Based Medicine, is one of several companies that have developed blood-based biomarker tests and marketed them to the public [see “The Ultimate Blood Test,” by Philip Yam; Scientific American, June 2006]. The company searches for up to 250 biomarkers suggestive of cancer, inflammatory conditions, heart disease and other illnesses. Mark Chandler, Biophysical’s chair and CEO, says that the real value of the tests lies in long-term monitoring. A person could “get a test monthly, just a finger stick, that would be able to say, we have had a serious change here that is indicative of an early-stage cancer,” he explains.
Yet not all experts are convinced that the age of biomarkers is at hand. Cheryl Barton, an independent U.K.-based pharmaceutical consultant who authored a Business Insights market analysis report on biomarkers in 2006, says she remains “a little bit skeptical about how clinically useful they are.” A study of 5,000 subjects published in the Journal of the American Medical Association in July 2009 found that six cardiovascular biomarkers were only marginally better at predicting heart disease than were standard cardiovascular risk factors, such as whether the subjects smoked or had diabetes.
Adding to the overall difficulty, a person might suffer from two or more diseases—prostate cancer and heart disease, for example. No one knows how multiple diseases might affect overall biomarker signatures or how profiles will change as other diseases develop. “When you get to be 65 or 70, almost everybody has other conditions,” Chandler says. “We don’t know how to deal with that right now.” And scientists still need to discern which biomarkers are truly relevant to disease—a difficult task when working with blood, which contains tens of thousands of proteins at concentrations spanning more than 10 orders of magnitude.
Some companies have simplified the problem by avoiding blood altogether. LabCorp recently commercialized a biomarker test that analyzes colon cells in stool for the chemical signatures indicative of colorectal cancer. “The stool is in intimate contact with the lining of the colon, so it becomes much more highly populated with these rare molecules than would get into the bloodstream from colon cancer,” says Barry Berger, chief medical officer of Exact Sciences, a Madison, Wis.–based biotechnology company that developed the test technology.
Scientists are confident that they will eventually crack the more difficult problem of finding distinct disease signatures in the noisy data. “The evolutionary process, being complex and unknown, does not always give us an easy route,” Berger notes, “but it definitely gives us lots of opportunities.”
Satellites Diagnose Disease Outbreaks
Space-based data are helping to track and predict the spread of deadly diseases
By Katherine Harmon
Many contagious diseases spread through carriers such as birds and mosquitoes. These vectors in turn move with heat and rainfall. With this in mind, researchers have begun to use satellite data to monitor the environmental conditions that lead to disease. “Ideally, we could predict conditions that would result in some of these major outbreaks of cholera, malaria, even avian flu,” says Tim Ford of the University of New England at Biddeford and co-author of a paper on the subject published this past September in Emerging Infectious Diseases.
Satellite data have already been used to map the advance of the H5N1 avian influenza in Asia. The domestic duck, a common inhabitant of Southeast Asia’s rice paddies, is one of the main carriers of the disease. Xiangming Xiao, associate director of the University of Oklahoma’s Center for Spatial Analysis, uses satellite images to map agricultural patterns in the region. These maps show where the ducks are most likely to live and thus where the avian influenza is most likely to spread.
Migratory birds also carry the virus, but their travel patterns are more difficult to predict. Xiao and his colleagues combine the satellite imagery with satellite-gathered surface-temperature data to estimate the birds’—and thereby the virus’s—trajectory. Computer models then link these environmental drivers to the spread of the flu in human populations.
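A toy version of such a model might link two satellite-derived variables to an outbreak-risk score through a logistic function. The coefficients, thresholds and grid cells below are invented purely for illustration; real models are fitted to surveillance data:

```python
import math

def outbreak_risk(temp_c, rainfall_mm):
    """Toy logistic model: risk rises with warmth and standing water.
    Coefficients are made up for illustration, not fitted values."""
    score = 0.3 * (temp_c - 20) + 0.02 * (rainfall_mm - 50)
    return 1 / (1 + math.exp(-score))

# Satellite-derived surface temperature (C) and rainfall (mm) per grid cell
# (hypothetical values for a rice paddy and an upland cell)
cells = {"paddy_a": (29.0, 180.0), "upland_b": (16.0, 30.0)}
risk = {name: outbreak_risk(t, r) for name, (t, r) in cells.items()}
```

The warm, wet paddy cell scores near 1 and the cool, dry upland cell near 0, which is the basic mechanism by which environmental drivers get mapped to where vectors, and hence disease, are likely to appear.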
Of course, not all of the work can be outsourced to orbiting observatories. Xiao says that judging the severity of avian flu’s spread from satellite imaging required knowing details about the human populations as well—for instance, how likely certain communities were to raise ducks for poultry consumption. “Satellite monitoring has a capacity to provide consistent observation,” Xiao says. “On the other hand, the in situ observations are still very, very important, so the key is to combine those together. That is a real challenge.”
More Ideas to Watch
By Melinda Wenner
Emergency technicians could prevent up to 35 percent of prehospital trauma deaths if they had better and cheaper ways to prevent blood loss. Now a University of Maryland–affiliated start-up called Trauma Solutions has developed a synthetic hydrogel that can clot blood by prompting the body to make fibrin, a protein that seals wounds and stops bleeding. Future iterations could simultaneously release such medicines as antibiotics and painkillers. Each application will cost about $5, compared with some natural blood-clotting substances that cost upward of $500.
Liver damage is a major side effect of HIV/AIDS and tuberculosis drugs, yet few developing countries have enough trained scientists or equipment to monitor it. Nonprofit Cambridge, Mass.–based Diagnostics For All has developed an inexpensive fingernail-size device made almost entirely of paper that monitors liver damage using a single drop of blood. Channels in the paper guide blood to regions that change color depending on the levels of two damage-related liver enzymes.
Streptococcus mutans bacteria in the mouth decay teeth by converting sugars into enamel-eroding lactic acid. Florida-based Oragenics has genetically engineered a new strain of bacteria that converts sugars to trace amounts of alcohol instead. Because the new strain permanently displaces natural S. mutans, the therapy, which is currently in clinical trials, will be available as a one-time prescription that will protect teeth for life.
The engineering solutions to combat climate change already exist. Politicians must be brave enough to use them before it’s too late
One word sums up the attitude of engineers towards climate change: frustration. Political inertia following the high-profile failure of 2009’s Copenhagen climate conference has coupled with a chorus of criticism from a vocal minority of climate-change sceptics. Add the current economic challenges and the picture looks bleak. Our planet is warming and we are doing woefully little to prevent it getting worse.
Engineers know there is so much more that we could do. While the world’s politicians have been locked in predominantly fruitless talks, engineers have been developing the technologies we need to bring down emissions and help create a more stable future.
Wind, wave and solar power, zero-emissions transport, low-carbon buildings and energy-efficiency technologies have all been shown to be feasible. They are just waiting for the political will to roll them out on a global scale. Various models, such as the European Climate Foundation’s Roadmap 2050, show that implementing these existing technologies would bring about an 85 per cent drop in carbon emissions by 2050. The idea that we need silver-bullet technologies to be developed before the green technology revolution can happen is a myth. The revolution is waiting to begin.
The barriers preventing the creation of a low-carbon society are not technological but political and financial. That’s why at a landmark London conference convened by the UK’s Institution of Mechanical Engineers, 11 national engineering institutions representing 1.2 million engineers from across the globe, under the banner of the Future Climate project, made a joint call for action at December’s COP17 climate change conference in Durban, South Africa.
The statement calls on governments to move from warm words to solid actions. They need to introduce legislation and financial support to get these technologies out of the workshop and into our homes and businesses and onto our roads. Targeted regulation and taxation will also drive innovation. This will require bold politics, and spending at a time when money is scarce. It is far from unaffordable, however. The UK’s Committee on Climate Change, which advises the British government, continues to support the view of the Stern report – an assessment of the climate change challenge in the UK – that the move to a low-carbon society will cost no more than 1 per cent of GDP by 2050.
Resistance to wind turbines and the power lines they feed, nuclear power and electric cars, as well as the economic costs, all make public opinion a powerful brake on change. However, the alternative seems certain to be worse. It is not only the challenges of a deteriorating climate: inaction also carries a great long-term risk to our economy. The green technology revolution, just like the industrial revolution before it, will give jobs to those countries that have created the right conditions for it to flourish.
Which countries these will be is still an open question. India, Germany, Australia and the UK were among the nations signed up to the Future Climate statement, whereas the world’s largest greenhouse gas emitters – China and the US – were not. When it comes to investment in clean technology, however, that’s not the whole story.
Although China is continuing to build coal-fired electricity plants at an alarming rate to power its rapid economic growth, the UN Environment Programme confirmed last month that it is now by far the world’s biggest investor in renewable energy. Last year, China’s wind, solar and biomass power industries received $49 billion of new investment, a third of the global total, and it now has the largest installed wind capacity in the world. When predicting who the front runner in this next great technological revolution will be, it is difficult to see past the emerging superpower to the east.
The US is going in the opposite direction. A natural gas rush driven by the development of controversial “fracking” techniques over the past decade has echoes of the oil rush that transformed Texas a century ago. The Financial Times reports that just one company, BHP Billiton, is investing as much as $79 billion in US shale gas fields – over three times the amount invested in all US renewables in a year. This will secure cheap energy in the short term, but it is a finite resource and ultimately a dead end. In due course we could face the interesting prospect of the US turning to China to acquire its wind turbine technology.
Investment in renewable energy is vital for a prosperous, low-carbon society. However, decision-makers cannot ignore the elephant in the room – nuclear power. The enormous cost of implementing 100 per cent renewable power is not realistic for most nations, so nuclear offers our best chance of making a low-carbon society achievable and affordable. Yet the incident at Fukushima earlier this year has reinforced some long-standing concerns.
Unlike road use or smoking, nuclear power stirs anxieties in many of us that are out of proportion with its true risks. This is not to be complacent about the potential danger of a nuclear plant, but it is striking that nuclear power has killed fewer than 5000 people in its entire history. Compare that with coal mining, which in just one year and in one country – China in 2006 – killed 4700.
Germany’s decision to phase out all nuclear power as a result of Fukushima will most likely have unintended consequences. The Association of German Engineers has estimated that it will cost €53 billion every year in Germany to close down its nuclear generation and switch to 100 per cent renewable energy. It will be interesting to see how public opinion, now so clearly against nuclear power, responds as the economic costs become apparent.
Any technological revolution requires two crucial ingredients – engineers to design, develop and manufacture the technology, and politicians to help create the legislative, behavioural and societal environment that allows change to happen. Today’s engineers have fulfilled their side of the bargain. It is time for our politicians to show their mettle.
Colin Brown is director of engineering at the UK’s Institution of Mechanical Engineers
Joel Shurkin, contributor
Travelling through the Arctic is notoriously difficult and climate change is making it even harder. But there is a way to rise above the problem: the latest generation of lighter-than-air vehicles. Canadian company Discovery Air has signed a contract with the UK’s Hybrid Air Vehicles (HAV) to buy around 45 new hybrid air vehicles. These aircraft will be used across Canada’s Northwest Territories.
Whether taking out lumber from the forests or helping people access remote villages, transportation in Arctic Canada can be extremely daunting. Most transportation is either by air, which is expensive, by boat, or by ice road. Rising winter temperatures, due to climate change, are likely to make Canada’s ice roads less stable and reduce the amount of time in winter in which they can safely be used.
Gordon Taylor, marketing director for HAV, says the vessels are technically neither airships nor blimps. While they do make use of non-explosive helium for lift, they also get substantial lift from the aerodynamic design of the fuselage.
HAV already has a major contract for hybrid vehicles with the US Defence Department for long-endurance surveillance vessels.
The vessels Discovery Air has ordered are HAV’s model 366, which Taylor says can carry 50 tonnes if they take off horizontally like an airplane and around 30 tonnes if they take off vertically. Not even the largest helicopters in the world can match that, explains Taylor.
One hundred and ten metres long, the vessels can reach altitudes of almost 3000 metres and can take off and land almost anywhere. The cargo will fit in the fuselage for very long trips or can hang beneath the ship for shorter ones. Later models can also be flown remotely.
September 13, 2011, 1:09 PM — You can buy that cool tablet today, and its useful life is probably three years on the outside. Something new and cool will be available in 2014 (no pre-announcements here, just predictions) and you’ll want to buy it. Perhaps you’ll use a vendor’s trade-in program to do something with the old one — after you’ve conveniently moved the data to your new machine. We hope.
There’s a huge opening for someone to get rich, developing a usable taxonomy of parts and materials so that products can be safely and profitably devolved. The way you do it is clear: find a method to describe parts in such a way that they can be taken apart and recycled or safely disposed of. The avalanche of tech products is unlikely to stop, and we expect even less time with them before the new thing arrives to tempt us.
You bought. Someone now has your old machine, with its data removed. What’s done with it then ranges from devolution to landfill fodder. Inside the derelict are a number of precious metals and, depending on the battery technology, a lump of lithium, nickel and/or other metals. Many smaller components are broken down into smaller and smaller bits until they’re either disposed of in a pile (in the ocean, a landfill, etc.) or smelted and separated into base elements. It’s an inefficient and labor-intensive process. Plastics can be reused, as can the stickers and the box that an item arrived in.
Lots of derelict products are shipped to SE Asia, where comparatively low labor costs compensate for the inefficiency of the process. It also leads to huge piles of ex-computer gear parts that pollute the groundwater in hideous ways. People are poisoned in the scavenging process, not to mention the evil piles of computer dung that are nuclear waste without the isotopes.
What’s needed is a way to directly mark every part in a machine. Some parts can be recycled more lucratively than others. Just as important, parts that are environmentally damaging, or that require special devolution processes, can be aggregated so that they don’t cause interim pollution, and recyclers can benefit from economies of scale in devolving hazardous materials.
Today, we use primitive marks to denote very basic (typically plastic) product composition. We have hazardous materials markers and identification and other markings to identify objects that can be either recycled or are hazardous/dangerous-to-handle.
My suggestion: use advanced barcodes to identify everything by a recycling mark that can be rapidly identified for devolution. The marking doesn’t have to be on an easily visible area, but it needs to be revealed somehow. The marks can be tiny, almost microscopic, yet recognized by modern bar code scanners. They could identify either specific categories of product materials, or by actual part number.
In the first case, generic markers can identify tens of millions of generic product identifications, making devolution and separation into elements for recycling vastly simpler than it is today. Specific identification then differentiates subsystems and elements that need specific handling requirements, or perhaps have vendor/manufacturer-specific (even mandated) devolution processes (including rewards).
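One way to picture such a scheme: a mark that packs a category code, a material code and a part number, which a recycler's scanner could parse and route. The mark format and handling codes below are hypothetical, not any existing standard:

```python
# Hypothetical mark format: CATEGORY-MATERIAL-PARTID, e.g. "BAT-LIION-004217".
# Codes are invented for illustration; a real taxonomy would be standardized.
HANDLING = {
    "BAT": "hazardous: route to battery recovery",
    "PCB": "smelt: precious-metal reclamation",
    "PLA": "shred: plastics reuse stream",
}

def route_part(mark):
    """Parse a recycling mark and return its devolution route."""
    category, material, part_id = mark.split("-")
    return {
        "category": category,
        "material": material,
        "part_id": part_id,
        "route": HANDLING.get(category, "hold: unknown category"),
    }
```

A generic category code covers the common case cheaply, while the part number leaves room for the vendor-specific devolution processes (and rewards) described above.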
Another reward potential is that most consumer and industrial products could benefit from the same marking scheme that would permit rapid and accurate product devolution. Junkyards across the world are full of unidentifiable bits and pieces of products gone by, ranging from building cranes to old Volkswagens to refrigerators and no one knows what this stuff is. There are various tests for precious metals (often using primitive magnets) and certain plastics, but many materials aren’t easily identified. So they rot, rust, and ooze back into the environment. Materials identification methodologies won’t be tough to deploy, and a government mandate seems unnecessary because the motivation to make money from recycled materials exists now.
If we don’t do this, the chances of high-efficiency recycling are vastly reduced, and the piles of useless and hazardous ex-computer junk grow taller. Just as every bill of materials includes parts and sources, we could systematically devolve products when their life cycle is over. What’s needed is an agreement to apply this methodology to the production process: deproduction. The devil will be in the details. Barcodes exist. Now we need a product identification taxonomy, a method to affix material markings, and a database access method that tells the devolvers how to make money.
We have iPhones, iPods and iPads. Why not an “iBank?”
This wouldn’t be an electronic gizmo that’s obsolete in a year, though. It would be a public-private partnership to bolster America’s infrastructure. It would create jobs, cut the deficit and repair what needs to be fixed all over the country.
An infrastructure bank, or iBank, solves a lot of problems without busting the budget. Instead of providing direct government grants or earmarks for specific projects, loans are made by a government-banking entity.
The U.S. is inexcusably late to the game on this time-tested idea. The European Investment Bank has financed some $350 billion in projects from 2005 through 2009. China spent 9 percent of its gross domestic product — also roughly $350 billion — to build subways, highways and high-speed rail in 2009 alone. Brazil invested $240 billion over the past three years.
The idea is not without high-level support. President Obama recently called for the creation of an iBank. In backing a U.S. iBank, Senator John Kerry of Massachusetts testified last year that “a national infrastructure bank will make Americans builders again.”
If the iBank became reality — and really it’s a necessity to compete in a globalized economy — there’s no shortage of projects. According to the American Society of Civil Engineers, more than $2 trillion is needed to fix U.S. bridges, dams, waterways and wastewater plants.
The sheer scale of a big fix is staggering: Some 69,000 bridges need to be repaired. The outdated electrical grid needs to be modernized everywhere. You can build solar plants and windmills all you want, but if you have no power lines to transport the electrons from the deserts and plains, you’re whistling in the wind.
Several spin-offs of an iBank have been floating around for years, and the idea already has support across the political spectrum. A “Clean Energy Bank” would fund solar energy equipment. Sen. Bernie Sanders of Vermont supports legislation that would install 10 million rooftop solar panels. Sen. Mark Kirk of Illinois proposed a “Lincoln Legacy” infrastructure bill.
How is the iBank different from just handing out the money to each Congressional district and letting the local representative decide where the money should go?
In Kerry’s vision, federal dollars would be matched with private dollars from pension funds and endowments. Kerry told Time’s Joe Klein recently that “a $10 billion federal contribution will leverage about $640 billion in private investments.” Kerry claims he has support from business, labor and Republican Senators.
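Kerry's figures imply a simple multiplier, which is easy to check: $640 billion of private investment from a $10 billion federal contribution is 64 private dollars mobilized per federal dollar:

```python
def leverage_multiplier(federal_bn, private_bn):
    """Private dollars mobilized per federal dollar."""
    return private_bn / federal_bn

# Kerry's figures: $10 billion federal seed, $640 billion private investment
multiplier = leverage_multiplier(10, 640)  # 64 private dollars per federal dollar
```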
Instead of doling out pork-barrel funding for bridges to nowhere, an independent board would decide which projects are needed most. It’s the inverse of a military base closing commission. Instead of shutting down facilities, this entity would greenlight and finance the most-worthy projects.
One thing an iBank wouldn’t be is another big-check stimulus plan, which Congress passed in 2009. That nearly $800 billion package was a huge fiscal band-aid to help states, school districts and wage earners through the recession. Yes, there were some public works projects that created short-term jobs, but the bulk of the money went to tax relief and the states.
The U.S. needs a new approach to economic triage. The June jobs report was nothing short of dismal as employment growth hit a wall with only 18,000 new jobs coming on the market.
Crumbling infrastructure will cost the U.S. economy nearly 1 million jobs and shave $3.1 trillion from gross domestic product by 2020, the American Society of Civil Engineers estimates.
What about the budget? Isn’t there a disconnect between the current passion for cutting the federal deficit and spending money to fix America?
There’s little question that putting people to work will help the economy. Working people pay income, sales and property taxes, which flow back into communities. The steadily employed buy homes, vehicles and appliances. Increased tax revenue in turn reduces the deficit.
The iBank may be able to accomplish what a decade of personal income and estate-tax cuts didn’t: Provide the necessary public-private capital to revive the economy. Not even Harry Potter can make magic work on the U.S. economy without some significant infrastructure investment.
This technology has HUGE non-military potential, from portable self-powered comm towers to situation monitoring and aerial recon, to portable power generation.
The famed Pentagon Q-branch boffinry hothouse, DARPA, has unveiled another ambitious plan to further US military-technical dominance. It has given $400m to American weapons globocorp Lockheed to develop a solar-powered robot radar airship, able to lurk in the stratosphere for a year at a time, potentially tracking individual people walking about on the ground across areas 1200km wide.
Yesterday’s contract announcement was for Phase 3 of DARPA’s Integrated Sensor Is Structure (ISIS) project, in which a flying sub-scale demonstrator will be built to prove that the concept can work as planned. Phases 1 and 2 consisted mostly of design studies and materials work.
The idea of ISIS is to hugely improve on what a normal airship can do, by using the ship itself as a radar antenna rather than carrying a separate piece of machinery – hence the name. DARPA believe this will hugely increase the size of radar antenna a stratospheric airship can carry, which in turn means the radar would deliver much better sensor resolution for much less power.
The lowered power requirements of the ISIS radar-ship, DARPA believes, will mean it can run on solar power. Excess energy generated during the day will be stored by cracking water into hydrogen: at night, this will be burned in fuel cells to keep the ship flying and its radar shining even in darkness.
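The day/night cycle can be sketched as a simple energy budget: daytime surplus goes into hydrogen, and some fraction comes back at night through the fuel cells. The 40 per cent round-trip efficiency and the sizing figures below are assumptions for illustration, not DARPA's numbers:

```python
def night_energy_available(day_surplus_kwh, round_trip_eff=0.40):
    """Energy recoverable at night from hydrogen made during the day.
    The 0.40 round-trip figure (electrolysis x fuel cell) is an assumption."""
    return day_surplus_kwh * round_trip_eff

def sized_correctly(day_surplus_kwh, night_load_kw, night_hours):
    """Does the stored hydrogen cover the ship's overnight load?"""
    return night_energy_available(day_surplus_kwh) >= night_load_kw * night_hours

# Hypothetical sizing: 300 kWh daytime surplus vs a 10 kW load over a 10-hour night
ok = sized_correctly(300, 10, 10)
```

The round-trip loss is the crux of the design: the solar array must harvest well over twice the overnight demand during daylight, which is why shrinking the radar's power draw matters so much.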
DARPA calculate that the ship should be able to cruise at 60 knots or sprint at 100, which will let it deploy from the US to a global troublespot in 10 days. It will then be able to hold station easily in the stratospheric “wind bucket” found at 65,000 to 70,000 feet, scanning the ground beneath it with its all-seeing radar mega-eye.
The performance of the massive scanner, according to DARPA, should be such that it can track unobscured “dismounts [people walking] across the entire line of sight” – in other words out to the horizon, which at operational height will be 600km away.
That said, the contract announcement suggests a slight bit of neck-winding, referring to an ability to track “all ground targets” to 300km. Closer in, the Pentagon boffins think, it will be capable of tracking such small objects even through overhanging foliage. Performance against easier airborne targets – planes, missiles etc. – would definitely be right out to the horizon at 600km.
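The 600km figure is consistent with the standard radar-horizon formula at the ship's operating altitude, once the usual 4/3-Earth-radius correction for atmospheric refraction is applied:

```python
import math

EARTH_RADIUS_M = 6_371_000
K = 4 / 3  # effective-Earth-radius factor accounting for radar refraction

def radar_horizon_km(altitude_ft):
    """Distance to the radar horizon from a given altitude."""
    h = altitude_ft * 0.3048  # feet to metres
    return math.sqrt(2 * K * EARTH_RADIUS_M * h) / 1000

d = radar_horizon_km(70_000)  # roughly 600 km, matching DARPA's figure
```

Without the refraction correction (K = 1) the purely geometric horizon from 70,000 feet comes out nearer 520km, so DARPA's 600km line of sight is the refracted radar horizon, not an optical one.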
If the ISIS can do all that DARPA suggest, it will handily trump most of the other aerial scanners in use by the US forces, including AWACS sky-scanner planes, the smaller E-2 Hawkeye AWACS that flies from US carriers, Joint STARS ground-sweeping tank sniffers, and the JLENS moored-balloon radar plan. The potential would be there perhaps to do without all these things, simply assigning a single ISIS ship in place of the several AWACS or whatever you formerly needed so as to keep one up on patrol.
An ISIS airship would potentially be vulnerable to enemy action, but at 70,000 feet only quite serious enemies – the sort who could also threaten AWACS or JSTARS aircraft – would have any chance of hitting it. And those planes carry large crews, whereas the ISIS is unmanned.
So this is potentially big news for the US military, the more so in that ISIS has now made it to Phase 3 – we’re no longer talking just about design studies here. The privacy/surveillance issues – the chance that ISIS spy-ships might lurk one day above US or allied territory, tracking every vehicle or even every person walking about – could be even more significant. Forget about numberplate cameras or face tracking; you’d have to live underground to avoid this sort of thing.
For those who’d like to know more, there’s a PDF on ISIS available from DARPA. ®
Combine sunlight and sewage and what do you get? Sanitation, of course.
Michael Hoffmann at the California Institute of Technology has been experimenting with solar-powered water treatment on a small scale. Now he plans to incorporate this technology into a portable toilet.
Sunlight powers an electrochemical reaction with human waste in water that generates microbe-killing oxidants and releases hydrogen gas. The researchers plan to collect the hydrogen in a fuel cell to power a light or possibly even a self-cleaning mechanism.
He received a grant this week from the Bill and Melinda Gates Foundation to build a prototype. He says he can build one toilet for $2000 and hopes to reduce the cost through design refinement and mass production.
This grant is part of the Gates Foundation’s latest global public health initiative to improve sanitation.
Several other awarded projects propose to build toilets that generate energy for the community, either processing solid waste into biological charcoal or vaporising it into plasma that generates hydrogen and carbon monoxide to run a fuel cell.
According to World Health Organization estimates, 2.6 billion people – about 40 per cent of the world’s population – do not have access to sanitation.
Its Japanese developers call it the ‘Futuristic Circular Flying Object’ and it’s designed to go where humans can’t.
The radio-controlled sphere, roughly the size of a basketball, was built for search and rescue operations, specifically to fly in and out of buildings weakened by earthquakes or other natural disasters.
The device uses its onboard camera to transmit live images of whatever it sees.
The black, open-work ball looks like a futuristic work of art, but it can hover for up to eight minutes and fly at 37mph — although it does slow down for open windows.
Fumiyuki Sato, at the Japanese Defense Ministry’s Technical Research and Development Institute, invented and built the vehicle for roughly 110,000 yen (£865 / $1,390) with parts purchased off the shelf at consumer electronics stores.
He said: ‘Because of its spherical shape, it can land in various positions and tumble to move around the ground.’
It zips through the air, glides smoothly around corners, and negotiates staircases with ease, all the while emitting a soft hum.
Measuring 42cm in diameter, it boasts eight manoeuvrable rudders, 16 spoilers and three gyro sensors to keep it upright. It is made of lightweight carbon fibre and styrene components, for a total weight of just 340g.
If its lithium batteries lose power, it’s been designed simply to roll to a stop to minimise the chance of damage.
‘When fully developed, it can be used at disaster sites, or anti-terrorism operations or urban warfare,’ Mr Sato said.
Meanwhile, he added, there’s the pure fun of testing it.
ScienceDaily (June 26, 2011) — In a paper published in Nature Photonics, U of T Engineering researchers report a new solar cell that may pave the way to inexpensive coatings that efficiently convert the sun’s rays to electricity.
The U of T researchers, led by Professor Ted Sargent, report the first efficient tandem solar cell based on colloidal quantum dots (CQD). “The U of T device is a stack of two light-absorbing layers — one tuned to capture the sun’s visible rays, the other engineered to harvest the half of the sun’s power that lies in the infrared,” said lead author Dr. Xihua Wang.
“We needed a breakthrough in architecting the interface between the visible and infrared junction,” said Sargent, a Professor of Electrical and Computer Engineering at the University of Toronto, who is also the Canada Research Chair in Nanotechnology. “The team engineered a cascade — really a waterfall — of nanometers-thick materials to shuttle electrons between the visible and infrared layers.”
According to doctoral student Ghada Koleilat, “We needed a new strategy — which we call the Graded Recombination Layer — so that our visible and infrared light-harvesters could be linked together efficiently, without any compromise to either layer.”
The team pioneered solar cells made using CQDs, nanoscale materials that can readily be tuned to respond to specific wavelengths of the visible and invisible spectrum. By capturing such a broad range of light waves (broader than conventional solar cells can absorb), tandem CQD solar cells can in principle reach efficiencies of up to 42 per cent. The best single-junction solar cells are constrained to a maximum of 31 per cent efficiency; in practice, the solar cells on house roofs and in consumer products run at 14 to 18 per cent efficiency. The work builds on the Toronto team's world-leading 5.6 per cent efficient colloidal quantum dot solar cells.
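To put those percentages in concrete terms, efficiency translates directly into electrical power per unit of sunlit area. A minimal sketch of the arithmetic, assuming the standard test irradiance of 1,000 watts per square metre (a common reference condition, not stated in the article):

```python
# Convert the efficiency figures quoted above into watts per square
# metre, assuming the standard test irradiance of 1000 W/m^2.
IRRADIANCE_W_PER_M2 = 1000

efficiencies = {
    "Toronto CQD cell today (5.6%)": 0.056,
    "typical rooftop panel (~16%)": 0.16,   # midpoint of the 14-18% range
    "single-junction limit (31%)": 0.31,
    "tandem CQD limit (42%)": 0.42,
}

# Power delivered per square metre at each efficiency level.
power = {label: IRRADIANCE_W_PER_M2 * eff for label, eff in efficiencies.items()}

for label, watts in power.items():
    print(f"{label}: {watts:.0f} W/m^2")
```

At that irradiance, the jump from a 31 per cent single-junction cell to a 42 per cent tandem cell is worth roughly an extra 110 watts from every square metre of panel.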
“Building efficient, cost-effective solar cells is a grand global challenge. The University of Toronto is extremely proud of its world-class leadership in the field,” said Professor Farid Najm, Chair of The Edward S. Rogers Sr. Department of Electrical & Computer Engineering.
Sargent is hopeful that in five years solar cells using the graded recombination layer published in the Nature Photonics paper will be integrated into building materials, mobile devices, and automobile parts.
“The solar community — and the world — needs a solar cell that is over 10% efficient, and that dramatically improves on today’s photovoltaic module price points,” said Sargent. “This advance lights up a practical path to engineering high-efficiency solar cells that make the best use of the diverse photons making up the sun’s broad palette.”
The publication was based in part on work supported by an award made by the King Abdullah University of Science and Technology (KAUST), by the Ontario Research Fund Research Excellence Program, and by the Natural Sciences and Engineering Research Council (NSERC) of Canada. Equipment from Angstrom Engineering and Innovative Technology enabled the research.