Category Archives: Information rules

IBM software patch upgrades power grid to version 2.0

IBM software patch upgrades power grid to version 2.0 | ExtremeTech.

In the past 120 years, the world’s aging energy grid has not seen much innovation. Companies are still making implementation decisions based on principles that were developed in the grid’s infancy. As the world faces a growing energy problem, with almost 7 billion people now walking the planet, companies are deciding it is time to bring about some change and take advantage of modern technology to help with resource conservation. A consortium that includes IBM, the Pacific Northwest National Laboratory, and several power companies has decided that a software update is the first major step to both regulate power consumption and integrate new renewable energy technologies; “Grid 2.0,” if you will. The consortium is getting ready to install the system across five states in the Pacific Northwest, after a successful trial run in Washington state last year.

The main idea behind the software platform is simple: give the power companies the ability to control energy consumption at peak times by making small changes in each home that add up to big savings in the zoomed-out, macro picture of the grid. This is accomplished by installing “smart” thermostats in the homes of customers who opt into the program. By giving these consumers rebates and other incentives, the providers increase adoption and help make the project viable. Simply put, the power company is able to remotely tweak your thermostat, ultimately reducing the energy your home uses. Taking a page from the airline executive who saved his company millions by removing one olive from each salad served in flight, the power providers aim to save energy through the sheer volume of micro-changes made across smart thermostats: each change is tiny, but together they save a large amount of energy overall.
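
To make the olive math concrete, here is a minimal back-of-the-envelope sketch; every number in it is an assumption chosen purely for illustration, not a figure from the consortium.

```python
# Back-of-the-envelope sketch of how tiny per-home tweaks add up at grid scale.
# All figures below are illustrative assumptions, not data from the program.

homes_enrolled = 60_000        # assumed number of opted-in households
avg_peak_load_kw = 4.0         # assumed average per-home demand at peak (kW)
savings_per_degree = 0.03      # assumed ~3% demand reduction per degree of setback
setback_degrees = 2            # a barely noticeable two-degree adjustment

per_home_kw = avg_peak_load_kw * savings_per_degree * setback_degrees
aggregate_mw = per_home_kw * homes_enrolled / 1_000

print(f"Per home: {per_home_kw:.2f} kW shaved off peak demand")
print(f"Program-wide: {aggregate_mw:.1f} MW of peak demand avoided")
```

A quarter of a kilowatt per house is the olive; the tens of megawatts across the program are the airline’s balance sheet.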

In addition to consumption control, IBM is helping the consortium tie other energy sources such as wind and solar into the grid and store their output for use during periods of high demand. Integration of renewable energy sources is a big part of the consortium’s overall plan for the future of the grid. On paper, this plan certainly looks like it could be a winner, but there are two large problems to overcome: first, dealing with Big Brother; second, dealing with infrastructure.

I’m afraid I can’t do that, Dave

Let’s start with the most obvious problem with this plan: the idea that a “Big Brother” company is going to be able to take control of a user’s thermostat and other household appliances. Serious concerns have been voiced about this idea. Exactly how much control will the company assert? If they make a change to your thermostat and you change it back, will the software reassert the temperature change? Could this lead to energy “caps” like the ones consumers already have on home and mobile internet connections? It sure would be a bummer to hit a cap on one of the coldest days of the year and not be able to heat your home.

The consumer is said to benefit from this plan by enjoying a flat rate based on the average consumption of the grid overall. At the end of a billing cycle, if a home has used significantly less than that average, the consumer gets a rebate. There is a problem here as well: companies could regulate power usage so that users never fall below that point. This might sound overly cynical and rather “Skynet” in nature, but the possibility is there, and the result would be heavy governmental regulation that could bog down the process until it’s no longer viable. There would need to be real transparency and some sort of real-time reporting of a home’s power usage. Consumers have the right to see how the software run by their power company is regulating their homes. Google had a great project called Google PowerMeter, now retired, which allowed users to see how much energy they were burning. Something like that is what is needed for this overhaul to work.
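
A toy version of that billing model makes the transparency point easier to see; the rate, the grid average, and the home’s usage below are all invented for illustration.

```python
# Toy illustration of the flat-rate-plus-rebate billing described above.
# The rate, the grid-wide average, and this home's usage are invented numbers.

grid_average_kwh = 900      # assumed grid-wide average monthly usage per home
flat_rate = 0.12            # assumed flat price per kWh ($)
home_usage_kwh = 720        # this home's metered usage for the month

bill = grid_average_kwh * flat_rate
rebate = max(0.0, (grid_average_kwh - home_usage_kwh) * flat_rate)

print(f"Flat monthly bill: ${bill:.2f}")
print(f"Rebate for coming in {grid_average_kwh - home_usage_kwh} kWh under average: ${rebate:.2f}")
print(f"Net cost: ${bill - rebate:.2f}")
```

The cynical scenario above is simply the utility nudging a home’s usage up toward the grid average so the rebate never triggers, which is exactly why real-time visibility into those numbers matters.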

Supersize me

It can be argued that this is exactly what is needed to help curb the high rate of energy consumption, especially in the US. Environmentalists hold that without this kind of plan in place, people will be inherently wasteful because that is what they have been taught to do. They may have a point: many of the “dumb” thermostats in homes are left unprogrammed despite having intricate scheduling systems. The issue is that any kind of control asserted by a company might be looked upon as draconian in parts of the country accustomed to expressing individual freedoms.

The other hurdle to leap over is the practical aspect of the power grid’s aging infrastructure. Simply put, some of it is ancient and costly to replace. It is a sound idea for this plan to start with software that takes advantage of existing infrastructure, because the cost of a complete overhaul would be staggering. Add in the fact that the software project was funded in large part by the Recovery Act of 2009 in the US, and there is a looming question of who will pay for the rest. Where does the power companies’ responsibility come in on the monetary side of an overhaul?

There are some logistical issues as well with the integration of renewable energy sources. Wind farms are usually pretty distant from heavily populated areas, and solar panel arrays are no different. Getting the power from these sources to the consumer is an issue because the current wiring in use is “lossy,” wasting precious energy in transport. One has to think of the grid as a large factory system, with supply, demand, and logistical challenges that rival UPS. Any overhaul would require research into carrying power over long distances without losing much of it along the way.

Problems aside, this is as good a plan as any out there at the moment to help reduce energy consumption. Make no mistake, this is a problem that we as a planet must face and work on together. We are in the infancy of the work that needs to be done. The exciting part is that from this kind of work come great innovations that can carry over into other technology, bringing new advances in ways that have not been thought of yet. The pursuit of overcoming the hurdles of this project will be nothing but positive on the whole. It will be exciting to see where this project goes.

20 Ways to Build a Cleaner, Healthier, Smarter World

World Changing Ideas: 20 Ways to Build a Cleaner, Healthier, Smarter World: Scientific American.

What would happen if solar panels were free? What if it were possible to know everything about the world—not the Internet, but the living, physical world—in real time? What if doctors could forecast a disease years before it strikes? This is the promise of the World Changing Idea: a vision so simple yet so ambitious that its full impact is impossible to predict. Scientific American’s editorial and advisory boards have chosen projects in five general categories—Energy, Transportation, Environment, Electronics and Robotics, and Health and Medicine—that highlight the power of science and technology to improve the world. Some are in use now; others are emerging from the lab. But all of them show that innovation is the most promising elixir for what ails us.  —The Editors

The No-Money-Down Solar Plan
A new wave of start-ups wants to install rooftop solar panels on your house. Upfront cost: nothing
By Christopher Mims

The biggest thing stopping the sun is money. Installing a rooftop array of solar panels large enough to produce all of the energy required by a building is the equivalent of prepaying its electricity bill for the next seven to 10 years—and that’s after federal and state incentives. A new innovation in financing, however, has opened up an additional possibility for homeowners who want to reduce their carbon footprint and lower their electric bills: get the panels for free, then pay for the power as you go.

The system works something like a home mortgage. Organizations and individuals looking for a steady return on their investment, typically banks or municipal bond holders, use a pool of cash to pay for the solar panels. Directly or indirectly, homeowners buy the electricity produced by their own rooftop at a rate that is less, per kilowatt-hour, than they would pay for electricity from the grid. Investors get a safe investment—the latest generation of solar-panel technology works dependably for years—and homeowners get a break on their monthly bills, not to mention the satisfaction of significantly reducing their carbon footprint. “This is a way to get solar without putting any money down and to start saving money from day one. That’s a first,” says SolarCity co-founder Peter Rive.

SolarCity is the largest installer of household solar panels to have adopted this strategy. Founded in 2006 by two brothers who are also Silicon Valley–based serial entrepreneurs, SolarCity leases its panels to homeowners but gives the electricity away for free. The net effect is a much reduced utility bill (customers still need utility-delivered power when the sun isn’t out) plus a monthly SolarCity bill. The total for both comes out to less than the old bill. SunRun in San Francisco offers consumers a similar package, except that the company sells customers the electricity instead of leasing them the panels.

Cities such as Berkeley and Boulder are pioneering their own version of solar-panel financing by loaning individuals the entire amount required to pay for solar panels and installation. The project is paid for by municipal bonds, and the homeowner pays back the loan over 20 years as a part of the property tax bill. The effect is the same whichever route a consumer takes: the new obligation, in the form of taxes, a lease or a long-term contract for electricity, ends up costing less than the existing utility bill.
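
However the deal is structured, the pitch is the same: the new monthly obligation comes in under the old utility bill. Here is a rough sketch of that comparison for a Berkeley-style loan, with every input assumed for illustration (real system costs, rates, and bills vary widely).

```python
# Rough comparison of a homeowner's monthly outlay before and after a
# Berkeley-style, property-tax-financed solar installation.
# Every input here is an assumption for illustration only.

def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard amortized loan payment."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

old_bill = 180.0            # assumed average monthly utility bill ($)
system_cost = 20_000.0      # assumed installed cost after incentives ($)
loan = monthly_payment(system_cost, annual_rate=0.06, years=20)
residual_bill = 25.0        # assumed bill for nighttime and cloudy-day power ($)

print(f"Loan repayment via property taxes: ${loan:.0f}/month")
print(f"New total (loan + residual utility bill): ${loan + residual_bill:.0f}/month")
print(f"Old utility bill: ${old_bill:.0f}/month")
```

With these particular guesses the homeowner comes out roughly ten dollars a month ahead; different assumptions shift the margin, but the structure of the comparison is the same for the lease and power-purchase models.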

“What we’re really seeing is a transition in how we think about buying energy goods and services,” says Daniel M. Kammen, director of the Renewable and Appropriate Energy Laboratory at the University of California, Berkeley. Kammen, who did the initial analysis on Berkeley’s financing model, believes that by turning to financing, consumers can overcome the inherent disadvantage renewables have when compared with existing energy sources: the infrastructure for power from the grid has already been paid for and, in many cases, has been subsidized for decades.

All three approaches are rapidly expanding across the country. Although the Berkeley program is less than two years old, 10 different states have passed legislation allowing their cities to set up a Berkeley-style bond-financed loan program. If the Waxman-Markey climate bill passes, the option for cities to set up these programs would become federal law. SunEdison in Maryland is currently active in nine states. SolarCity, which has more than 4,000 customers, is active in California, Arizona and Oregon and has promised to announce additional states after the new year.

Right now it is not possible to lower the overall cost of rooftop solar to “grid parity,” that is, to the same price as electricity from local utility companies, without federal subsidies such as the investment tax credit, which lowers the tax bill of banks financing these projects. Those subsidies, which amount to 30 percent of the cost of a solar installation, are guaranteed for at least eight years. By then, SolarCity and its competitors claim they won’t need them.

“Grid parity is driven by multiple factors,” says Attila Toth, vice president of marketing at SunEdison, including the cost of capital, the cost of panels and their installation, and the intensity of sunlight in a given region. “It will occur in different states at different times, but, for example, we expect that California will be one of the first states in the U.S. to get to grid parity, sometime between three and five years from now.”

While the cost of electricity from fossil fuels has increased 3 to 5 percent a year for the past decade, the cost of solar panels has fallen on average 20 percent for every doubling of their installed base. Grid parity is where these trend lines cross—after that, solar has the potential to power more than just homes. It’s hardly a coincidence that Elon Musk, head of electric car company Tesla Motors, sits on SolarCity’s board of directors.
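
Those two trend lines can be projected in a few lines of code. Everything here (today’s prices, the rate of increase, how quickly the installed base doubles) is an assumption for illustration, not data from the article.

```python
# Toy projection of the two "grid parity" trend lines described above.
# Starting prices and growth assumptions are invented for illustration.

grid_price = 0.12          # assumed retail grid electricity price today ($/kWh)
solar_price = 0.24         # assumed levelized rooftop-solar price today ($/kWh)
grid_growth = 0.04         # fossil-fuel electricity rising ~4% per year
doublings_per_year = 0.5   # assume the installed solar base doubles every two years
learning_rate = 0.20       # 20% cost drop per doubling of the installed base

years = 0
while solar_price > grid_price and years < 50:
    grid_price *= 1 + grid_growth
    solar_price *= (1 - learning_rate) ** doublings_per_year
    years += 1

print(f"Under these assumptions, the lines cross after roughly {years} years")
```

With these particular guesses the crossover lands about five years out, in the same ballpark as SunEdison’s three-to-five-year estimate for California; change any input and the date moves.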

More Ideas to watch
by Christopher Mims

The Gasoline Garden
It is the next step for biofuels: genetically engineered plant life that produces hydrocarbons as a by-product of its normal metabolism. The result will be fuel—common gasoline, even—using nothing but sunlight and CO2. In July, Exxon Mobil announced plans to spend more than $600 million in pursuit of algae that can accomplish the task. Joule Biotechnologies claims to have already succeeded, although the company has yet to reveal any details of its proprietary system.

Hot Nukes
Uranium and plutonium are not the only fuels that can power a nuclear reactor. With an initial kick from more traditional fissile materials, thorium can set up a self-sustaining “breeder” reaction that produces uranium 233, which is well suited to nuclear power generation. The process has the added benefit of being resistant to nuclear proliferation, because its end products emit enough gamma rays to make the fuel dangerous to handle and easy to track.

Save Energy with Information
Studies show that simply making customers aware of their energy use lowers it by 5 to 15 percent. Smart meters allow customers to track their energy consumption minute by minute and appliance by appliance. Countless start-ups are offering the devices, and Google and Microsoft are independently partnering with local utilities to allow individuals to monitor their power usage over the Web.

Wind Power from the Stratosphere
According to a Stanford University study released in July, the high-altitude winds that constantly blow tens of thousands of feet above the earth hold enough energy to supply all of human civilization 100 times over. California’s Sky WindPower has proposed harvesting this energy by building fleets of giant, airborne, ground-tethered windmills, while Italy’s Kite Gen proposes to accomplish the same feat using kites.

Delivering the U.S. from Oil
Plug-in hybrid trucks are improving the long view of the short haul
By Amanda Schupak

Cargo trucks gulp about 40 percent of the fuel pumped in the U.S. While most consumer attention focuses on improving the fuel economy of consumer vehicles, a major opportunity goes rumbling by. “Folks do not realize that the fuel use of even a small truck is equal to many, many cars,” says Bill Van Amburg, senior vice president of Calstart, a clean transportation technology nonprofit, and director of the Hybrid Truck Users Forum. “A utility truck as a hybrid would reduce more petroleum than nine Priuses.”

Some 1,300 commercial hybrids on the road today get up to twice the fuel efficiency of their conventional counterparts. But these traditional hybrids are inherently limited. They make more efficient use of petroleum-based fuel by capturing some of the energy lost during braking.

Plug-in hybrids, on the other hand, draw energy from the grid. They can drive for miles—in many cases, an entire day’s route—without using any fossil fuel at all. This shifts energy demand away from petroleum and toward grid-based sources. (Last year zero-carbon renewables and nuclear supplied 30 percent of all electric power in the U.S.)

In many ways, plug-in hybrid technology makes more sense for delivery trucks than for consumer sedans. A cargo truck runs a short daily route that includes many stops to aid in regenerative braking. Most of the U.S. Postal Service’s 200,000-plus mail trucks, for example, travel fewer than 20 miles a day. In addition, fleet vehicles return nightly to storage lots that have ready access to the 120- or 240-volt outlets required to charge them.

The Department of Energy recently launched the nation’s largest commercial plug-in hybrid program, a $45.4-million project to get 378 medium-duty vehicles on the road in early 2011. The trucks, which will go to 50 municipal and utility fleets, will feature a power system from Eaton, a large manufacturer of electrical components, on a Ford F-550 chassis. (For its part, Ford will wait for the market to prove itself before designing its own commercial plug-ins.) “These are going to start breaking free in 2011,” says Paul Scott, president of the Electric Vehicle Association of Southern California.

Start-up company Bright Automotive has a more ambitious plan. It aims to replace at least 50,000 trucks with plug-in hybrids by 2014. Bright’s IDEA prototype travels 40 miles on battery power before switching to a four-cylinder engine that gets 40 miles to the gallon. The streamlined aluminum body has the payload of a postal truck yet is far more aerodynamic. The truck weighs as much as a midsize sedan.

John E. Waters, Bright Automotive’s founder and the former developer of the battery system for General Motors’ groundbreaking EV1 electric car, says that each IDEA would save 1,500 gallons of fuel and 16 tons of carbon dioxide emissions a year over a standard utility truck. Waters says he is ready to begin assembly in his U.S. plant once a pending $450-million federal loan comes through.

Despite the appeal of the carbon savings, the fleet owners who are the trucks’ primary customers have more practical considerations. Bright’s executives are coy about the IDEA’s eventual price tag but assert that a customer with 2,000 trucks driving 80 miles a day five days a week could save $7.2 million a year. Right now that is probably not enough to justify large-scale purchases without additional rebates—or a price on carbon. Van Amburg estimates that going hybrid currently adds $30,000 to $50,000 in upfront costs per vehicle, although that figure should come down as production volumes increase.
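
That $7.2-million figure is easy to sanity-check against the per-truck number quoted above, given an assumed fuel price.

```python
# Sanity check of the fleet-savings claim, combining Bright Automotive's
# per-truck estimate (1,500 gallons saved per year) with an assumed fuel price.

trucks = 2_000
gallons_saved_per_truck = 1_500    # per-IDEA annual savings quoted above
fuel_price = 2.40                  # assumed price per gallon ($)

annual_savings = trucks * gallons_saved_per_truck * fuel_price
print(f"Fleet-wide fuel savings: ${annual_savings / 1e6:.1f} million per year")
```

The arithmetic lines up at roughly $2.40 a gallon; a different fuel price scales the savings proportionally.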

Improved battery technology will also help. Today the IDEA’s 13-kilowatt-hour lithium-ion battery pack accounts for nearly a quarter of the vehicle’s total cost. Much of the research being done for the batteries going into the Chevy Volt and other consumer plug-ins should also be applicable to commercial batteries. “For all the good we all want to do,” says David Lauzun, Bright’s vice president of product development, “these vehicles will not take over the world until it becomes the economic choice—‘I have to have them because it saves me money.’”

Bus Rapid Transit
Subwaylike bus lines mobilize the urban future
By Michael Moyer

For the first time in human civilization, more people now live in urban areas than in the countryside. This shift creates a number of dilemmas, not least of which is how to move people within the world’s rapidly growing metropolises. Pollution and traffic point away from car-based options, while light-rail systems are slow to construct and prohibitively expensive. One disarmingly simple—and cheap—possibility is Bus Rapid Transit, which is engineered to operate like a subway on wheels. In these systems, concrete dividers on existing roads separate high-capacity buses from the rest of traffic. Riders pay before boarding, then wait in enclosed stations. When a bus arrives, sliding partitions open to allow riders to board from a platform that is level with the bus floor. The traffic-free thoroughfares, quick boarding times, and modern, comfortable stations resemble light-rail systems more than the chaos of typical bus travel. In Bogotá, Colombia, which has had seven Bus Rapid Transit lines in operation since 2001, the buses handle 1.6 million trips a day. Its success has allowed the city to remove 7,000 private buses from its streets, reducing consumption of bus fuel and its associated pollution by more than 59 percent.

Ocean Overhaul
Marine zoning is a bold remedy for sick seas
By Sarah Simpson

These days not even many politicians deny that the oceans are ill. Protecting the health of coastal waters is now a matter of national policy in dozens of countries, including the U.S., and world leaders are beginning to prescribe a revolutionary remedy that conservationists have been promoting for years: marine planning and zoning.

The idea is a natural extension of management policies that have guided the development of cities and landscapes for nearly a century. Porn shops aren’t next to preschools, after all, and drilling rigs aren’t the centerpieces of national parks. Similarly, zoning advocates envision a mosaic of regional maps in which every watery space on the planet is designated for a particular purpose. Drilling and mining would be allowed only in certain parts of the ocean; fishing in others. The most critically threatened areas would be virtually off-limits.

Whereas people can easily find maps telling them what they can do where on land, the marine realm is a hodgepodge of rules emanating from an army of agencies, each one managing a single use or symptom. In the U.S., for example, one body regulates commercial fishing, usually a single species at a time. Another group manages toxic substances, still another seabed mining, and so on—some 20 federal agencies in all. They tend to make decisions without regard to what the others are doing, explains Duke University marine ecologist Larry B. Crowder. “Imagine all of the medical specialists visiting a patient in intensive care one at a time and never talking to one another,” he says. “It’s a wonder that the oceans aren’t in worse shape than they are now.”

Ocean advocates such as Crowder eagerly await the final recommendations of a special task force President Barack Obama charged with presenting a plan for overhauling management of U.S. waters, which extend 200 nautical miles offshore. The scope of such an undertaking is huge: the U.S. controls 4.4 million square miles of seascape, making the country’s underwater real estate 25 percent larger than its landmass. The committee’s preliminary report, released in September, suggests that the best way to minimize harmful human impacts on the oceans is to manage regions rather than symptoms.

Many environmentalists are hopeful that such plans will be implemented through the marine equivalent of municipal zoning, which would give them some influence in areas where they now have none. In zones where conservation is designated as the dominant activity, fishing and industrial activities such as mining would no longer have free rein. Under current rules, about the only way a conservation group can block a project it deems harmful—say, a new site for offshore drilling—is through expensive litigation.

So far, though, the president’s task force has been careful not to suggest that ocean zoning will be the only treatment plan, in great part because any effort to restrict commercial interests is bound to meet stiff opposition. “Zoning isn’t anybody’s favorite exercise,” notes John C. Ogden, director of the Florida Institute of Oceanography at the University of South Florida at Tampa. “Someone’s ox is always getting gored.” Most resistant to such change will most likely be the traditional users of the open ocean—namely, commercial fisheries and the petroleum industry. “They’ve had the place to themselves for a long time,” Ogden says.

Ogden and others are quick to point out, however, that zoning practices can benefit commerce as much as conservation. By giving up access to certain areas, industries gain the security of knowing their activities would be licensed in a more predictable and less costly manner than they are today, explains Josh Eagle, associate professor at the University of South Carolina School of Law. Now an oil company can apply for permits to drill virtually anywhere, but it takes on a significant financial risk each time. The business may dump millions of dollars into researching a new facility only to have a lawsuit derail it at the last moment. When opposing parties have more or less equal voices early in the planning process, Eagle says, they are less inclined to block one another’s activities once zones are drawn on a map.

Whether the final report of the president’s task force will promote ocean zoning explicitly is uncertain. But the group has already promised to overhaul the structure of ocean governance by proposing the creation of a National Ocean Council, whose job it will be to coordinate efforts of the myriad federal agencies now in charge.

The move comes just in time. Just as society is beginning to appreciate the enormous efforts it will take to preserve the health of the oceans, it must ask more of them—more energy, more food, and better resilience to coastal development and climate change. The reason the oceans are in trouble is not what people put in and take out. It is a failure of governments to manage these activities properly. Says Crowder: “We have to treat the oceans holistically, not one symptom at a time.”

The Power of Garbage
Trapped lightning could help zap trash and generate electricity
By John Pavlus

Trash is loaded with the energy trapped in its chemical bonds. Plasma gasification, a technology that has been in development for decades, could finally be ready to extract it.

In theory, the process is simple. Torches pass an electric current through a gas (often ordinary air) in a chamber to create a superheated plasma—an ionized gas with a temperature upward of 7,000 degrees Celsius, hotter than the surface of the sun. When this occurs naturally we call it lightning, and plasma gasification is literally lightning in a bottle: the plasma’s tremendous heat dissociates the molecular bonds of any garbage placed inside the chamber, converting organic compounds into syngas (a combination of carbon monoxide and hydrogen) and trapping everything else in an inert vitreous solid called slag. The syngas can be used as fuel in a turbine to generate electricity. It can also be used to create ethanol, methanol and biodiesel. The slag can be processed into materials suitable for use in construction.

In practice, the gasification idea has been unable to compete economically with traditional municipal waste processing. But the maturing technology has been coming down in cost, while energy prices have been on the rise. Now “the curves are finally crossing—it’s becoming cheaper to take the trash to a plasma plant than it is to dump it in a landfill,” says Louis Circeo, director of Plasma Research at the Georgia Tech Research Institute. Earlier this summer garbage-disposal giant Waste Management partnered with InEnTec, an Oregon-based start-up, to begin commercializing the latter’s plasma-gasification processes. And major pilot plants capable of processing 1,000 daily tons of trash or more are under development in Florida, Louisiana and California.

Plasma isn’t perfect. The toxic heavy metals sequestered in slag pass the Environmental Protection Agency’s leachability standards (and have been used in construction for years in Japan and France) but still give pause to communities considering building the plants. And although syngas-generated electricity has an undeniably smaller carbon footprint than coal—“For every ton of trash you process with plasma, you reduce the amount of CO2 going into the atmosphere by about two tons,” Circeo says—it is still a net contributor of greenhouse gases.

“It is too good to be true,” Circeo admits, “but the EPA has estimated that if all the municipal solid waste in the U.S. were processed with plasma to make electricity, we could produce between 5 and 8 percent of our total electrical needs—equivalent to about 25 nuclear power plants or all of our current hydropower output.” With the U.S. expected to generate a million tons of garbage every day by 2020, using plasma to reclaim some of that energy could be too important to pass up.
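
Circeo’s equivalence is easy to cross-check with rough figures for total U.S. generation and the annual output of a typical nuclear plant (both assumed round numbers here, not official statistics).

```python
# Rough cross-check of the EPA estimate quoted above. Total U.S. generation
# and per-plant nuclear output are assumed round numbers, not official figures.

us_generation_twh = 4_000              # assumed annual U.S. generation (TWh)
nuke_gw, capacity_factor = 1.0, 0.90   # assumed 1 GW plant running 90% of the time
nuke_twh = nuke_gw * capacity_factor * 8_760 / 1_000   # ~7.9 TWh per plant-year

for share in (0.05, 0.08):             # the 5-to-8-percent range from the quote
    plants = us_generation_twh * share / nuke_twh
    print(f"{share:.0%} of U.S. generation is the output of about {plants:.0f} nuclear plants")
```

The low end of the range works out to roughly 25 plants, which is the comparison Circeo makes; the high end is closer to 40.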

More Ideas to watch
By John Pavlus

Cement as a Carbon Sponge
Traditional cement production creates at least 5 percent of global carbon dioxide emissions, but new materials could create carbon-neutral cement. Start-up Novacem, supported by Imperial College London, uses magnesium oxide to make cement that naturally absorbs CO2 as it hardens. California-based Calera uses seawater to sequester carbon emissions from a nearby power plant in cement.

The New Honeybee
Colony collapse disorder (CCD) has killed more than a third of honeybee colonies since 2006. Farmers who depend on bees to pollinate such crops as almonds, peaches and apples are looking to the blue orchard bee to pick up the slack.

One efficient Osmia lignaria can pollinate as much territory as 50 honeybees, but the bees are harder to cultivate because of their solitary nature. These pinch hitters won’t completely replace honeybees, but as scientists continue to grapple with CCD, they could act as an agricultural safety net.

Saltwater Crops
As the world’s freshwater supply becomes scarcer and food production needs balloon, salt-tolerant crops could ease the burden. Researchers at Australia’s University of Adelaide used genetic engineering to enhance a model crop’s natural ability to prevent saline buildup in its leaves, allowing the plant to thrive in conditions that would typically wither it. If the same gene tweak works in cereal crops such as rice and wheat—the researchers are testing them now—fallow lands destroyed by drought or overirrigation could become new breadbaskets.

The Omnipotence Machines
Tiny, ubiquitous sensors will allow us to index the physical world the way the Web maps cyberspace
By Gregory Mone

Earlier this year Hewlett-Packard announced the launch of its Central Nervous System for the Earth (CeNSE) project, a 10-year effort to embed up to a trillion pushpin-size sensors across the planet. Technologists say that the information gathered by this kind of ubiquitous sensing network could change our knowledge of the world as profoundly as the Internet has changed business. “People had no idea the Web was coming,” says technology forecaster Paul Saffo. “We are at that moment now with ubiquitous sensing. There is quite an astonishing revolution just around the corner.”

The spread of versatile sensors, or “motes,” and the ability of computers to analyze and either recommend or initiate responses to the data they generate, will not merely enhance our understanding of nature. It could lead to buildings that manage their own energy use, bridges that flag engineers when in need of repair, cars that track traffic patterns and detect potholes, and home security systems that distinguish between the footfalls of an intruder and the dog, to name a few.

CeNSE is the boldest project yet announced, but HP is not the only organization developing the technology to make ubiquitous sensing possible. Intel is also designing novel sensor packages, as are numerous university labs.

For all the momentum in the field, though, this sensor-filled future is by no means inevitable. These devices will need to generate rich, reliable data and be rugged enough to survive tough environments. The sensor packages themselves will be small, but the computing effort required will be enormous. All the information they gather will have to be transmitted, hosted on server farms, and analyzed. Finally, someone is going to have to pay for it all. “There is the fundamental question of economics,” notes computer scientist Deborah Estrin of the University of California, Los Angeles. “Every sensor is a nonzero cost. There is maintenance, power, keeping them calibrated. You don’t just strew them around.”

In fact, HP senior researcher Peter Hartwell acknowledges that for CeNSE to hit its goals, the sensors will need to be nearly free. That is one of the reasons why HP is designing a single, do-everything, pushpin-size package stacked with a variety of gauges—light, temperature, humidity, vibration and strain, among others—instead of a series of devices for different tasks. Hartwell says that focusing on one versatile device will drive up volume, reducing the cost for each unit, but it could also allow HP to serve several clients at once with the same sensors.

Consider his chief engineering project, an ultrasensitive accelerometer. Housed inside a chip, the sensor tracks the motion of a tiny, internal movable platform relative to the rest of the chip. It can measure changes in acceleration 1,000 times as accurately as the technology in the Nintendo Wii.

Hartwell imagines situating one of these pins every 16 feet along a highway. Thanks to the temperature, humidity and light sensors, the motes could serve as mini weather stations. But the accelerometers’ vibration data could also be analyzed to determine traffic conditions—roughly how many cars are moving past and how quickly. The local highway department would be interested in this information, he guesses, but there are potential consumer applications, too. “Your wireless company might want to take that information and tell you how to get to the airport the fastest,” Hartwell says.
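
Neither HP nor the highway departments have published a processing pipeline, but the basic idea of turning a mote’s vibration trace into a traffic count can be sketched in a few lines. The threshold, the refractory window, and the fake readings below are all invented for illustration.

```python
# Simplified sketch of inferring traffic from a roadside mote's accelerometer:
# count bursts of vibration above a threshold, with a short refractory window
# so a single passing vehicle is not counted twice. All values are invented.

def count_vehicle_passes(samples, threshold=0.5, refractory=20):
    passes, cooldown = 0, 0
    for s in samples:
        if cooldown > 0:
            cooldown -= 1            # still inside the window for the last vehicle
        elif abs(s) > threshold:
            passes += 1              # a new burst of vibration: count one vehicle
            cooldown = refractory
    return passes

# Fake minute of readings: background noise with a few strong bursts.
readings = [0.02] * 100 + [0.9, 0.8, 0.7] + [0.03] * 200 + [1.1, 0.95] + [0.02] * 100
print(count_vehicle_passes(readings), "vehicles detected")
```

A real deployment would work on calibrated, timestamped data with far smarter signal processing, but the principle is the same: vibration peaks per minute stand in for vehicles per minute.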

All of this gathering and transmission of data requires power, of course, and to guarantee an extended life, the HP pushpin will not rely solely on batteries. “It is going to have some sort of energy-scavenging ability,” Hartwell says. “Maybe a solar panel or a thermoelectric device to help keep the battery charged.”

With the power hurdle in mind, other groups are forgoing batteries altogether. At Intel Labs in Seattle, engineer Josh Smith has developed a sensor package that runs on wireless power. Like the HP pushpin, Intel’s WISP, or Wireless Identification and Sensing Platform, will include a variety of gauges, but it will also draw energy from the radio waves emitted by long-range radio-frequency ID chip readers. Smith says a single reader, plugged into a wall outlet, can already power and communicate with a network of prototype WISPs five to 10 feet away—a distance that should increase.

Smith cites many of the same infrastructure-related possibilities as Hartwell, along with a number of other uses. If WISPs were placed on standard household items such as cups, these tags could inform doctors about the rehabilitation progress of stroke victims. If the cups the patient normally uses remain stationary, Smith explains, then the individual probably is not up and moving around.

The potential applications for ubiquitous sensing are so broad—a physicist recently contacted him about using WISPs to monitor the temperature outside a proposed neutrino detector—that, as with the Internet, Smith says it is impossible to foresee them all. “In terms of the impact it is going to have on our lives,” Hartwell adds, “you haven’t seen anything yet.”

The Do-Anything Robot
Your PC can accomplish any computing task you ask of it. Why isn’t the same true for robots?
By Gregory Mone

Robots have proved to be valuable tools for soldiers, surgeons and homeowners hoping to keep the carpet clean. But in each case, they are designed and built specifically for the job. Now there is a movement under way to build multipurpose machines—robots that can navigate changing environments such as offices or living rooms and work with their hands.

All-purpose robots are not, of course, a new vision. “It’s been five or 10 years from happening for about 50 years,” says Eric Berger, co-director of the Personal Robotics Program at Willow Garage, a Silicon Valley start-up. The delay is in part because even simple tasks require a huge set of capabilities. For a robot to fetch a mug, for example, it needs to make sense of data gathered by a variety of sensors—laser scanners identifying potential obstacles, cameras searching for the target, force feedback in the fingers that grasp the mug, and more. Yet Berger and other experts are confident that real progress could be made in the next decade.

The problem, according to Willow Garage, is the lack of a common platform for all that computational effort. Instead of building on the capabilities of a single machine, everyone is designing robots, and the software to control them, from the ground up. To help change this, Willow Garage is currently producing 25 copies of its model PR2 (for “Personal Robot 2”), a two-armed, wheeled machine that can unplug an appliance, open doors and move through a room. Ten of the robots will stay in-house, but 10 more will go to outside research groups, and everyone will pool their advances. This way, Berger says, if you want to build the robotic equivalent of a Twitter, you won’t start by constructing a computer: “you build the thing that’s new.”

Pocket Translator
The military, short on linguists, is building smart phone–based devices to do the job
By Gregory Mone

Sakhr Software, a company that builds automatic language translators, recently unveiled a prototype smart phone application that transforms spoken English phrases into spoken Arabic, and vice versa, in near real time. The technology isn’t quite ready for your next trip to Cairo, but thanks to recent advances in machine-translation techniques, plus the advent of higher-fidelity microphones and increasing processing power in smart phones, this mobile technology could soon allow two people speaking different languages to have basic conversations.

Before the 1990s automatic translation meant programming in an endless list of linguistic rules, a technique that proved too labor-intensive and insufficiently accurate. Today’s leading programs—developed by BBN Technologies, IBM, Sakhr and others as part of a Defense Advanced Research Projects Agency effort to eliminate the military’s need for human translators—rely on machine-learning techniques instead. The software works from a database of parallel texts—for example, War and Peace in two different languages, translated United Nations speeches, and documents pulled off the Web. Algorithms identify short matching phrases across sources, and the software uses them to build statistical models that link English phrases to Arabic ones.
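
As a toy illustration of that statistical idea (not Sakhr’s or BBN’s actual algorithms), the sketch below counts word co-occurrences across a three-sentence “parallel corpus”; real systems work on millions of sentence pairs with far more sophisticated alignment models.

```python
# Toy illustration of learning translations from parallel text: count how often
# each English word co-occurs with each (transliterated) Arabic word across
# aligned sentence pairs, then keep the most frequent pairing as a crude table.
# The corpus is a stand-in invented for this example.

from collections import Counter, defaultdict

parallel_corpus = [
    ("good morning", "sabah alkhayr"),
    ("good evening", "masa alkhayr"),
    ("good morning friend", "sabah alkhayr sadiq"),
]

cooccurrence = defaultdict(Counter)
for english, arabic in parallel_corpus:
    for e_word in english.split():
        for a_word in arabic.split():
            cooccurrence[e_word][a_word] += 1

# For each English word, the Arabic word it most often appears alongside.
crude_table = {e: counts.most_common(1)[0][0] for e, counts in cooccurrence.items()}
print(crude_table)   # e.g. 'good' -> 'alkhayr', 'morning' -> 'sabah'
```

Even this crude counting pulls a few sensible pairings out of three sentences (and gets others wrong), which is why real systems need vastly more data and proper probabilistic alignment.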

John Makhoul, BBN’s chief scientist, says the current technology is at its best when confined to subject areas with specific phrases and terminology—translating a weather report from English into French, for example, or helping soldiers gather basic biographical information from people in the field. Makhoul envisions the first consumer applications, five years from now, being similarly constrained. A tourism-related translation app on a smart phone could help an American in Florence get directions from a non-English-speaking local, but they won’t chat about Renaissance art. “It is not going to work perfectly,” he says, “but it will do a pretty good job.”

Know if Disease Grows Inside You
Complex diseases have complex causes. Luckily, they also leave a multitude of traces
By Melinda Wenner

With the exception of certain infectious diseases, few of humanity’s ailments have cures. More than 560,000 Americans will die of cancer this year, and despite the 250,000 coronary bypass surgeries doctors perform annually, heart disease is still the country’s number-one killer.

The hardest diseases to cure are the ones that take the longest to develop. They are the end result of decades of complex molecular interactions inside your body. Yet this complexity also presents an opportunity. Scientists have discovered that these interactions leave discernible fingerprints on the body. By unweaving the complex tapestry of molecular clues—changes in the body’s proteins, nucleic acids and metabolites, collectively called biomarkers—doctors hope they will soon be able to not only detect disease but predict a coming illness in time to take action.

Biomarkers are not new. Since 1986 doctors have monitored prostate cancer by measuring blood levels of the protein known as prostate-specific antigen (PSA). But tests that rely on a single biomarker to detect disease are rare, because most disorders involve intricate changes in a collection of biomarkers.

Take schizophrenia: in January 2010 scientists will release a biomarker test that distinguishes schizophrenia from other psychiatric conditions. The test, which is being commercialized by Rules-Based Medicine, a laboratory in Austin, Tex., is based on the characteristics of about 40 blood-based proteins.

To find potentially useful biomarkers, researchers collect blood samples from thousands of healthy people and analyze them. Biomarker levels in these samples provide a baseline reading. Then they do the same for people with a specific condition such as diabetes or breast cancer. If reproducible differences emerge between the groups, scientists can use the patterns in the disease group to diagnose the same condition in others. By collecting samples over time, researchers can also go back and analyze early samples from individuals who later become ill to identify patterns indicative of early disease or high disease risk.
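
A minimal sketch of that group comparison, using made-up measurements for two hypothetical markers and a crude effect-size cutoff in place of the statistics a real study would use:

```python
# Minimal sketch of biomarker screening as described above: compare a healthy
# baseline group with a disease group and flag markers whose means differ by a
# large margin relative to their spread. Data and cutoff are invented.

from statistics import mean, stdev

healthy = {"marker_A": [1.0, 1.2, 0.9, 1.1, 1.0],
           "marker_B": [5.0, 5.2, 4.8, 5.1, 4.9]}
disease = {"marker_A": [1.1, 0.9, 1.0, 1.2, 1.0],
           "marker_B": [7.9, 8.3, 8.1, 7.8, 8.0]}

for marker in healthy:
    h, d = healthy[marker], disease[marker]
    pooled_sd = (stdev(h) + stdev(d)) / 2
    effect = abs(mean(d) - mean(h)) / pooled_sd
    verdict = "candidate biomarker" if effect > 2 else "no clear signal"
    print(f"{marker}: effect size {effect:.1f} -> {verdict}")
```

Real panels involve hundreds of markers, thousands of samples, and careful correction for multiple comparisons, but the underlying move is the same: look for differences between groups that are reproducibly larger than the noise.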

Biophysical Corporation, a sister company to Rules-Based Medicine, is one of several companies that has developed blood-based biomarker tests and marketed them to the public [see “The Ultimate Blood Test,” by Philip Yam; Scientific American, June 2006]. The company searches for up to 250 biomarkers suggestive of cancer, inflammatory conditions, heart disease and other illnesses. Mark Chandler, Biophysical’s chair and CEO, says that the real value of the tests lies in long-term monitoring. A person could “get a test monthly, just a finger stick, that would be able to say, we have had a serious change here that is indicative of an early-stage cancer,” he explains.

Yet not all experts are convinced that the age of biomarkers is at hand. Cheryl Barton, an independent U.K.-based pharmaceutical consultant who authored a Business Insights market analysis report on biomarkers in 2006, says she remains “a little bit skeptical about how clinically useful they are.” A study of 5,000 subjects published in the Journal of the American Medical Association in July 2009 found that six cardiovascular biomarkers were only marginally better at predicting heart disease than were standard cardiovascular risk factors, such as whether the subjects smoked or had diabetes.

Adding to the overall difficulty, a person might suffer from two or more diseases—prostate cancer and heart disease, for example. No one knows how multiple diseases might affect overall biomarker signatures or how profiles will change as other diseases develop. “When you get to be 65 or 70, almost everybody has other conditions,” Chandler says. “We don’t know how to deal with that right now.” And scientists still need to discern which biomarkers are truly relevant to disease—a difficult task when working with blood, which contains tens of thousands of proteins at concentrations spanning more than 10 orders of magnitude.

Some companies have simplified the problem by avoiding blood altogether. LabCorp recently commercialized a biomarker test that analyzes colon cells in stool for the chemical signatures indicative of colorectal cancer. “The stool is in intimate contact with the lining of the colon, so it becomes much more highly populated with these rare molecules than would get into the bloodstream from colon cancer,” says Barry Berger, chief medical officer of Exact Sciences, a Madison, Wis.–based biotechnology company that developed the test technology.

Scientists are confident that they will eventually crack the more difficult problem of finding distinct disease signatures in the noisy data. “The evolutionary process, being complex and unknown, does not always give us an easy route,” Berger notes, “but it definitely gives us lots of opportunities.”

Satellites Diagnose Disease Outbreaks
Space-based data are helping to track and predict the spread of deadly diseases
By Katherine Harmon

Many contagious diseases spread through carriers such as birds and mosquitoes. These vectors in turn move with heat and rainfall. With this in mind, researchers have begun to use satellite data to monitor the environmental conditions that lead to disease. “Ideally, we could predict conditions that would result in some of these major outbreaks of cholera, malaria, even avian flu,” says Tim Ford of the University of New England at Biddeford and co-author of a paper on the subject published this past September in Emerging Infectious Diseases.

Satellite data have already been used to map the advance of the H5N1 avian influenza in Asia. The domestic duck, a common inhabitant of Southeast Asia’s rice paddies, is one of the main carriers of the disease. Xiangming Xiao, associate director of the University of Oklahoma’s Center for Spatial Analysis, uses satellite images to map agricultural patterns in the region. These maps show where the ducks are most likely to live and thus where the avian influenza is most likely to spread.

Migratory birds also carry the virus, but their travel patterns are more difficult to predict. Xiao and his colleagues combine the satellite imagery with satellite-gathered surface-temperature data to estimate the birds’—and thereby the virus’s—trajectory. Computer models then link these environmental drivers to the spread of the flu in human populations.

Of course, not all of the work can be outsourced to orbiting observatories. Xiao says that judging the severity of avian flu’s spread from satellite imaging required knowing details about the human populations as well—for instance, how likely certain communities were to raise ducks for poultry consumption. “Satellite monitoring has a capacity to provide consistent observation,” Xiao says. “On the other hand, the in situ observations are still very, very important, so the key is to combine those together. That is a real challenge.”

More Ideas to watch
By Melinda Wenner

Quick Clots
Emergency technicians could prevent up to 35 percent of prehospital trauma deaths if they had better and cheaper ways to prevent blood loss. Now a University of Maryland–affiliated start-up called Trauma Solutions has developed a synthetic hydrogel that can clot blood by prompting the body to make fibrin, a protein that seals wounds and stops bleeding. Future iterations could simultaneously release such medicines as antibiotics and painkillers. Each application will cost about $5, compared with some natural blood-clotting substances that cost upward of $500.

Lab-on-a-Stamp
Liver damage is a major side effect of HIV/AIDS and tuberculosis drugs, yet few developing countries have enough trained scientists or equipment to monitor it. Nonprofit Cambridge, Mass.–based Diagnostics For All has developed an inexpensive fingernail-size device made almost entirely of paper that monitors liver damage using a single drop of blood. Channels in the paper guide blood to regions that change color depending on the levels of two damage-related liver enzymes.

Bacterial Toothpaste
Streptococcus mutans bacteria in the mouth decay teeth by converting sugars into enamel-eroding lactic acid. Florida-based Oragenics has genetically engineered a new strain of bacteria that converts sugars to trace amounts of alcohol instead. Because the new strain permanently displaces natural S. mutans, the therapy, which is currently in clinical trials, will be available as a one-time prescription that will protect teeth for life.

'Disturbing' levels of cyber-raids

Top GCHQ spook warns of ‘disturbing’ levels of cyber-raids • The Register.

With a crunch conference on government cyber-security starting tomorrow, the director of government spook den GCHQ, Iain Lobban, said Britain had faced a “disturbing” number of digital attacks in recent months.

Attackers had targeted citizens’ data, credit card numbers and industry secrets, Lobban said.

“I can attest to attempts to steal British ideas and designs – in the IT, technology, defence, engineering and energy sectors as well as other industries – to gain commercial advantage or to profit from secret knowledge of contractual arrangements,” the eavesdropping boss added in his article for The Times.

According to Foreign Secretary William Hague there were more than 600 “malicious” attacks on government systems every day, while criminals could snap up Brits’ stolen card details online for just 70 pence a throw.

The statement was paired with the announcement of a £650m investment in cyber-security over the next four years, with both Hague and Lobban arguing that industry and government need to work together to pull off a safe, resilient system.

Countries that could not protect their banking systems and intellectual property will be at a serious disadvantage in future, Hague told The Times.

The government could have its work cut out, though: security software maker Symantec today suggests that businesses are cutting back on cyber-security and are less aware of and engaged with the big threats than they were last year. Symantec was specifically staring at industries integral to national security.

It found that only 82 percent of them participated in government protection programmes, down 18 points since last year.

Symantec reckoned that reduced manpower meant companies had less time to focus on big structural threats.

“The findings of this survey are somewhat alarming, given recent attacks like Nitro and Duqu that have targeted critical infrastructure providers,” said Dean Turner, a director at Symantec.

“Having said that, limitations on manpower and resources as mentioned by respondents help explain why critical infrastructure providers have had to prioritise and focus their efforts on more day-to-day cyber threats.” ®

New bankruptcy ripples may emerge

Insight: New bankruptcy ripples may emerge | Reuters.

Three years after the collapse of Lehman Brothers touched off a tidal wave of bankruptcy filings, corporate failures may be about to pick up again, with some big-name companies among those struggling for survival.

Companies in a range of businesses, including hair salons, restaurants, renewable energy, and the paper industry, have tumbled into Chapter 11 in the past few months.

The weak economy, lackluster consumer spending, a shaky junk-bond market and increasingly tight lending practices are also threatening struggling companies in industries as diverse as shipping, tourism, media, energy and real estate.

AMR Corp’s American Airlines may need to go to court to restructure its labor contracts, though a spokesman for the airline reiterated on Monday that bankruptcy is not the company’s goal or preference.

Kodak confirmed that a law firm known for taking companies through bankruptcy has been advising on strategy as attempts to overcome the loss of its traditional photography business falter. It has denied any intention of filing for bankruptcy.

Some bankruptcy and restructuring experts warn a fresh U.S. recession could trigger a string of failures to rival the one that followed Lehman Brothers, which in 2008 filed the biggest bankruptcy in U.S. history.

“It’s getting busier for everyone I know,” said Jay Goffman, global head of the Corporate Restructuring Group at law firm Skadden Arps, Slate, Meagher & Flom. “I think 2012 will be a busy year and 2013 and 2014 will be extraordinarily busy years in restructuring.”

No one is currently predicting a second Lehman-type collapse. Its $639 billion bankruptcy came after a loss of confidence in the investment bank as asset values plummeted, leading to the drying up of credit lines.

In fact, predicting a bankruptcy wave at all is a tricky task, experts say. It could depend on several unknowns: how much money banks and other institutions are willing to lend troubled companies, whether the economy lands in a double-dip recession and what happens in the European debt crisis.

The sovereign debt crisis in Europe could be the most important X factor. Even the experts who say a bankruptcy crisis is not coming, because current low interest rates make it easy for companies to get cash to finance their way out of trouble, concede that the euro zone’s problems could trigger defaults here.

“It is possible that one or two sovereign debt defaults would increase the pressure we’d feel in the U.S. credit market. Then we might see an environment like we had in 2008,” said Peter Fitzsimmons, president for North America for turnaround advisory firm AlixPartners LLP.

MORE FILINGS

Chapter 11 filings are picking up, bankruptcy data show. Ten companies with at least $100 million in assets filed for bankruptcy in September, the most since 17 filed in April, which was the busiest month since 2009, according to Bankruptcydata.com.

Recent failures included renewable energy companies Evergreen Solar and Solyndra. The latter collapsed in a politically-charged bankruptcy after taking a $535 million loan from the federal government.

Other recent bankruptcies include glossy magazine paper manufacturer NewPage Corp, which was the largest bankruptcy of the year and the largest non-financial company filing since 2009; Graceway Pharmaceuticals, which makes skin creams; Hussey Copper Corp., which makes the copper bars used in switchboards; and the Dallas Stars of the National Hockey League.

So far this month, five companies with more than $100 million in assets have filed, including the Friendly’s ice cream chain and wireless broadband company Open Range Communications Inc.

It is difficult to predict trends in filings. For example, experts who focused on macroeconomic credit indicators and default projections in 2006 or 2007 wouldn’t in many cases have been prepared for the severity of failures that followed.

In 2009, General Motors, Chrysler Group, LyondellBasell Industries and General Growth Properties all filed for bankruptcy, contributing to a record number of filings and topping the list of the largest bankruptcies ever.

At the same time, some experts were predicting an even deeper and longer list of corporate collapses. But within a year of bankruptcy filings breaking records, banks and other financial institutions were buying debt and lending, making it easy for companies to finance their way out of trouble.

Two months after Lehman failed, the U.S. Federal Reserve slashed rates to near zero. Once confidence began to return to the debt markets, investors flocked to high-yield bonds sold by ailing companies, allowing them to refinance.

Other failing companies were able to “amend and extend” – or to critics, “amend and pretend” – by striking new borrowing terms with lenders that delayed debt maturities in the hopes the economy would rebound smartly and business would pick up.

Those measures often avoided operational overhauls, creating what some experts called “zombie companies” that cut staff and prices to survive, but were too sick to invest in new projects.

Bankruptcy court allows troubled companies to shed debt and also become more operationally efficient as they renegotiate labor contracts, as airlines have done, or reject pricey store leases, which retailers often do.

But these changes do not always work, especially when companies find little support among suppliers or creditors for their turnaround plans. Bankrupt book chain Borders, for instance, recently closed its doors after failing to find a buyer.

In addition, confidence in the economy and easy access to debt allowed companies to complete restructurings in 2009 and 2010 with business plans and debt loads that were based on an economic pickup that has now faltered. That could create the potential for trouble at companies that have already restructured once.

SIGNS OF TROUBLE

Restructuring advisers agree that a dimming economic outlook will force lenders to make some tough calls about troubled companies. Those who see a broader wave of bankruptcies expect the economy to dip back into recession as the U.S. government cuts spending and Europe’s debt problems worsen.

They also look beyond the equity market for less visible signs of trouble. They see a junk-bond market that has suffered its worst sell-off since the Fed cut rates to near zero in 2008 and falling loan market prices as lenders reduce their exposure to weak borrowers.

There are even troubling signs coming from otherwise sanguine rating agencies that assess corporate debt. Moody’s noted that the number of downgraded liquidity ratings for troubled companies rose for a third straight month in September, an ominous sign that was similar to the third quarter of 2007 when the economy last slid into recession.

Indeed, one analyst said the Evergreen Solar bankruptcy as well as the recent filing of restaurant operator Real Mex Restaurants Inc show that weak companies are finding it hard to borrow. Both failed to reach the kind of refinancing deal with creditors that until recently was saving many troubled companies from Chapter 11.

“The idea that a couple of companies can’t even go to existing lenders for a real lifeline is quite telling right now,” said Kevin Starke, an analyst with CRT Capital Group, a brokerage that specializes in distressed securities.

LACK OF HOME RUNS

Still, not everyone is convinced more bankruptcies are on the way. Jim Hogan, the head of GE Capital’s restructuring finance unit who works with a lot of medium-sized companies, said he expects only a gradual increase in business, limited to the weakest industries.

“I’m not telling anyone internally I’m expecting some big home runs for us,” Hogan said.

Some said the current rise in bankruptcy filings is routine as fatigued lenders pull the plug on deadbeat companies. While debt and equity markets may have recently been in a swoon, many credit indicators generally show Corporate America to be in decent health.

For example, corporate balance sheets are stuffed with cash, and the rate of corporate loan defaults is expected to end the year at 0.23 percent, well below the historical average of 3.57 percent, according to Standard & Poor’s.

One of the biggest concerns of recent years, a looming “wall of maturities” of bonds that come due in the next few years, has largely been refinanced, according to Moody’s.

Despite this, the level of debt held by consumers, the federal government and the corporate sector weighs heavily on the economy and will likely spell trouble for some major companies.

“You have this huge overhang of debt. You don’t see a significant amount of improvement in the economy. How long can that continue?” said Jay Indyke, chair of the bankruptcy and restructuring practice at law firm Cooley LLP.

(This story corrects Jay Goffman’s title in paragraph 7 to head of restructuring, from co-head)

(Reporting by Tom Hals in Wilmington, Delaware, Susan Zeidler in Los Angeles, Caroline Humer in New York; Additional reporting by Nick Brown in New York; Editing by Martha Graybow and Martin Howell)

Chronicling the Ecological Impact of C. Columbus

Chronicling the Ecological Impact of Columbus’ Journey | Magazine.

Columbus’ discovery of the New World unleashed centuries of geopolitical turmoil. But humans weren’t the only creatures whose fortunes were forever altered. Entire species of plants and animals either thrived or suffered as well. In the book 1493, author (and Wired contributor) Charles C. Mann traces the far-reaching biological consequences of Columbus’ journey across the ocean blue. “There is a Rube Goldberg aspect to this,” Mann says. “Things are connected in ways that you would never expect.” And just as with human societies, some organisms came out on top, while others were radically subjugated. Here are a few key flora and fauna and how they weathered the storm.

  1. PLANTAINS ENABLE FIRE ANTS. The African plantain is plagued by insects called scale. Back in Africa, predators help keep these pests in check. But when the fruit was brought to Hispaniola, it received no such aid. So the bugs proliferated—along with fire ants, which fed on the other insects’ sugary excrement. Both pests thrived until their unchecked appetites destroyed the local plantain crop.
  2. RUBBER CONQUERS ORCHIDS. For centuries, orchids thrived in the jungles of Southeast Asia. The damp terrain and omnipresent mist provided the perfect environment for the moisture-loving epiphytes. But when rubber trees from the Amazon rain forest were imported to southern China, their thirst for water dried out the soil. The once-plentiful morning fog began to disappear. Soon the orchids started to as well.
  3. EARTHWORMS STARVE TREES AND POWER UP MAIZE. Before being brought to the US, the common earthworm aided farmers in England by humbly tilling their soil. But once transplanted, the wrigglers’ tunneling disrupted the nutrient-absorbing fungi on the roots of sugar maples, causing the trees’ decline. And by aerating the newly cleared land, the worm allowed crops like maize to grow year-round.
  4. POTATOES BATTLE NEW FOE. In its Andean motherland, the resilient potato grew in all shapes and sizes. But as the mighty tuber spread across the globe, its varieties dwindled to a monoculture—an easy target for opponents in adopted lands. None was quite so vicious as the Colorado potato beetle. Carried to North America in the manes of traveling horses, the bug became a permanent scourge to the plant in regions around the world.

How to Talk to a Climate Skeptic

How to Talk to a Climate Skeptic: Responses to the most common skeptical arguments on global warming | A Grist Special Series | Grist.

Below is a complete listing of the articles in “How to Talk to a Climate Skeptic,” a series by Coby Beck containing responses to the most common skeptical arguments on global warming. There are four separate taxonomies; arguments are divided by stage of denial, scientific topic, type of argument, and level of sophistication.

Individual articles will appear under multiple headings and may even appear in multiple subcategories in the same heading.

Stages of Denial

  1. There’s nothing happening
    1. Inadequate evidence
    2. Contradictory evidence
    3. No consensus
  2. We don’t know why it’s happening
    1. Models don’t work
    2. Prediction is impossible
    3. We can’t be sure
  3. Climate change is natural
    1. It happened before
    2. It’s part of a natural change
    3. It’s not caused by CO2
  4. Climate change is not bad
    1. The effects are good
  5. Climate change can’t be stopped
    1. Too late
    2. It’s someone else’s problem
    3. Economically infeasible

Scientific Topics

  1. Temperature
  2. Atmosphere
  3. Extreme events
    1. Temperature records
  4. Cryosphere
    1. Glaciers
    2. Sea ice
    3. Ice sheets
  5. Oceans
  6. Modeling
    1. Scenarios
    2. Uncertainties
  7. Climate forcings
    1. Solar influences
    2. Greenhouse gases
    3. Aerosols
  8. Paleo climate
    1. Holocene
    2. Ice ages
    3. Geologic history
  9. Scientific process

Types of Argument

  1. Uninformed
  2. Misinformed
  3. Cherry Picking
  4. Urban Myths
  5. FUD
  6. Non Scientific

Levels of Sophistication

  1. Silly
  2. Naive
  3. Specious
  4. Scientific

UBS: Our risk systems did detect £1.3bn rogue trader – We just didn't do anything about it

UBS: Our risk systems did detect £1.3bn rogue trader – ComputerworldUK.com.

But the warnings were ignored, bank claims

UBS has insisted its IT systems did detect unusual and unauthorised trading activity, before rogue trader Kweku Adoboli ran up a $2 billion (£1.3 billion) loss on the bank’s derivatives desk.

Interim chief executive Sergio Ermotti, who is running the company following Oswald Grubel’s resignation last month, sent a memo to employees saying the bank is aware that its systems did detect the rogue activity.


In the memo, seen by the Wall Street Journal, Ermotti wrote: “Our internal investigation indicates that risk and operational systems did detect unauthorised or unexplained activity but this was not sufficiently investigated nor was appropriate action taken to ensure existing controls were enforced.”

He added: “We have to be straight with ourselves. In no circumstances should something like this ever occur. The fact that it did is evidence of a failure to exercise appropriate controls.”

The news comes as the heads of UBS’ global equities business, Francois Gouws and Yassine Bouhara, also resigned.

Meanwhile, regulatory and internal investigations continue into the problem.

“Criminal law prevents us from disclosing detailed information at the moment,” Ermotti wrote. He added that the company was taking “immediate and decisive action based on the findings of our own review of what happened”.

The bank insisted it was working to improve its risk and control framework, adding that it was clarifying rules and processes.


Quake-prone Japanese Area Runs Disaster System on Force.com

Quake-prone Japanese Area Runs Disaster System on Force.com | PCWorld.

A coastal region of Japan due for a major earthquake and possible tsunamis has implemented a cloud-based disaster management system run by Salesforce.com.

Shizuoka Prefecture, on Japan’s eastern coast in the central region of the country, lies curled around an undersea trough formed by the junction of two tectonic plates. It has been rocked by repeated large temblors in past centuries, collectively called “Tokai earthquakes,” and the central government has warned that with underground stresses high another is imminent.

The local prefectural government began to build a new disaster management system last year, the initial version of which went live in July. It is based on Salesforce.com’s platform-as-a-service offering, Force.com, which hosts hundreds of thousands of applications.

“It would have cost a lot more to run our own servers and network, and if a disaster happened managing something like that would be very difficult, especially if the prefecture office was damaged,” said Keisuke Uchiyama, a Shizuoka official who works with the system.

Japanese prefectures are the rough equivalent of states.

The system is currently hosted on Salesforce.com’s servers in the U.S. and goes live when an official disaster warning is issued by the government. It links up information about key infrastructure such as roads, heliports and evacuation centers.

Salesforce.com says it combines GIS (geographic information system) data with XML sent from Japan’s Meteorological Agency. Users can also send email updates from the field using their mobile phones, with GPS coordinates and pictures attached.
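
To make that data flow concrete, here is a minimal, hypothetical sketch of how a field report carrying GPS coordinates and a photo reference might be written into a Force.com custom object through Salesforce’s standard REST API. The object and field names (Field_Report__c, Latitude__c and so on) are invented for illustration and are not taken from the Shizuoka system; only the generic /services/data/.../sobjects/ endpoint pattern is Salesforce’s own.

    # Illustrative sketch only: pushes a disaster field report into a
    # hypothetical Force.com custom object via the Salesforce REST API.
    # The object/field names are invented; a real deployment would use
    # whatever schema the prefecture's application actually defines.
    import requests

    INSTANCE_URL = "https://example.my.salesforce.com"  # placeholder org URL
    ACCESS_TOKEN = "REPLACE_WITH_OAUTH_TOKEN"            # obtained via OAuth

    def submit_field_report(lat, lon, description, photo_url):
        """Create one Field_Report__c record carrying GPS coordinates."""
        endpoint = f"{INSTANCE_URL}/services/data/v52.0/sobjects/Field_Report__c/"
        payload = {
            "Latitude__c": lat,
            "Longitude__c": lon,
            "Description__c": description,
            "Photo_URL__c": photo_url,
        }
        resp = requests.post(
            endpoint,
            json=payload,
            headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        )
        resp.raise_for_status()
        return resp.json()["id"]  # Salesforce returns the ID of the new record

    if __name__ == "__main__":
        report_id = submit_field_report(
            34.9756, 138.3828,  # roughly central Shizuoka City
            "Road blocked by debris near evacuation center",
            "https://example.org/photos/report-001.jpg",
        )
        print("Created field report", report_id)

In the Shizuoka setup the inbound path from the field is reportedly email from mobile phones rather than a direct API call, but the end result is the same kind of record landing on Force.com as in the sketch above.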

Uchiyama said the original plan was to allow open access, but budget cuts forced that to be postponed and it is now available only to government workers and disaster-related groups. The system was implemented with a budget of about 200 million yen (US$2.6 million) over its first two years, down from an original allotment of about 500 million yen over three years.

He said it was used to keep track of the situation last week when a powerful typhoon swept through central Japan.

The obvious downside to a hosted system is that key infrastructure is often destroyed during natural disasters. After the powerful earthquake and tsunami that hit Japan’s northeastern coast in March, some seaside towns were completely devastated and went weeks without basics like power or mobile phone service. Local communities turned to word-of-mouth and public bulletin boards to spread information and search for survivors.

“If the network gets cut, it’s over,” said Uchiyama.

IMF says US and Europe risk double-dip recession

US and Europe risk double-dip recession, warns IMF | Business | guardian.co.uk.

International Monetary Fund’s World Economic Outlook says slow, bumpy recovery could be jeopardised by Europe’s debt crisis or over-hasty attempts to cut America’s budget deficit

A protest on Wall Street. Confidence has fallen and the risks are on the downside, the IMF said in its half-yearly report. Photograph: Keystone/Rex Features

The International Monetary Fund warned on Tuesday that the United States and the eurozone risk being plunged back into recession unless policymakers tackle the problems facing the world’s two biggest economic forces.

In its half-yearly health check, the Washington-based fund said the global economy was “in a dangerous place” and that its forecast of a slow, bumpy recovery would be jeopardised by a deepening of Europe’s sovereign debt crisis or over-hasty attempts to rein in America’s budget deficit.

“Global activity has weakened and become more uneven, confidence has fallen sharply recently, and downside risks are growing,” the IMF said as it cut its global growth forecast for both 2011 and 2012.

The IMF also cut its growth forecasts for the UK economy and advised George Osborne to ease the pace of deficit reduction in the event of any further downturn in activity.

The IMF’s World Economic Outlook cited the Japanese tsunami and the rise in oil prices prompted by the unrest in north Africa and the Middle East as two of a “barrage” of shocks to hit the international economy in 2011. It said it now expected the global economy to expand by 4% in both 2011 and 2012, cuts of 0.3 points and 0.5 points since it last published forecasts three months ago.

“The structural problems facing the crisis-hit advanced economies have proven even more intractable than expected, and the process of devising and implementing reforms even more complicated. The outlook for these economies is thus for a continuing, but weak and bumpy, expansion,” the IMF said.

Speaking at a press conference in Washington, Olivier Blanchard, the IMF’s economic counsellor, said there was “a widespread perception” that policymakers in the euro area had lost control of the crisis.

“Europe must get its act together,” Blanchard said, adding that it was “absolutely essential” that measures agreed by policymakers in July, including a bigger role for the European Financial Stability Fund (EFSF), should be made operational soon.

“The eurozone is a major source of worry. This is a call to arms,” he said.

Blanchard said the fund was cutting its growth forecasts because the two balancing acts needed to ensure recovery from the recession of 2008-09 have stalled. Governments were cutting budget deficits but the private sector was failing to make up for the lost demand. Meanwhile, the global imbalances between deficit countries such as the US and surplus countries such as China looked like getting worse rather than better.

“Markets have become more sceptical about the ability of governments to stabilise their public debt. Worries have spread from countries on the periphery of Europe to countries in the core, and to others, including Japan and the US,” Blanchard said.

He added that there was a risk that low growth and fiscal and financial weaknesses could easily feed on each other.

“Lower growth makes fiscal consolidation harder. And fiscal consolidation may lead to even lower growth. Lower growth weakens banks. And weaker banks lead to tighter bank lending and lower growth.” As a result, there were “clear downside risks” to the fund’s new forecasts.

Developing nations lead the way

In its report, the IMF said it expected the strong performance of the leading emerging nations to be the main driving force behind growth in the world economy. China’s growth rate is forecast to ease back slightly, from 9.5% in 2011 to 9% in 2012, while India is predicted to expand by 7.5% in 2012 after 7.8% growth in 2011.

Sub-Saharan Africa is expected to continue to post robust growth, up from 5.2% in 2011 to 5.8% in 2012.

The rich developed countries, by contrast, are forecast to grow by just under 2%, slightly faster than the 1.6% pencilled in by the IMF for 2011.

“However, this assumes that European policymakers contain the crisis in the euro periphery area, that US policymakers strike a judicious balance between support for the economy and medium-term fiscal consolidation, and that volatility in global financial markets does not escalate.”

“The risks are clearly to the downside,” the IMF added, pointing to two particular concerns – that policymakers in the eurozone lose control of the sovereign debt crisis, and that the US economy could weaken as a result of political impasse in Washington, a deteriorating housing market or a slide in shares on Wall Street. It said the European Central Bank should consider cutting interest rates and that the Federal Reserve should stand ready to provide more “unconventional support”.

It said: “Either of these two eventualities would have severe implications for global growth. The renewed stress could undermine financial markets and institutions in advanced economies, which remain unusually vulnerable. Commodity prices and global trade and capital flows would likely decline abruptly, dragging down growth in developing countries.”

The IMF said that in its downside scenario, the eurozone and the US could fall back into recession, with activity some three percentage points lower in 2012 than envisaged. Currently, the fund is expecting the US to grow by 1.8% in 2012 and the eurozone by 1.1%.

“In the euro area, the adverse feedback loop between weak sovereign and financial institutions needs to be broken. Fragile financial institutions must be asked to raise more capital, preferably through private solutions. If these are not available, they will have to accept injections of public capital or support from the EFSF, or be restructured or closed.”

The IMF urged Republicans and Democrats in Washington to settle their differences: “Deep political differences leave the course of US policy highly uncertain. There is a serious risk that hasty fiscal cutbacks will further weaken the outlook without providing the long-term reforms required to reduce debt to more sustainable levels.”

Post-9/11 U.S. intelligence reforms take root but problems remain

Post-9/11 U.S. intelligence reforms take root, problems remain | Reuters.

(Reuters) – U.S. intelligence agencies will forever be scarred by their failure to connect the dots and detect the September 11 plot, but a decade later efforts to break down barriers to information-sharing are taking root.

Changing a culture of “need-to-know” to “need-to-share” does not come easily in spy circles. Some officials say they worry, a decade later, about a future attack in which it turns out that U.S. spy agencies had clues in their vast vaults of data but did not put them together, or even know they existed.

Yet significant changes, both big and small, have broken down barriers between agencies, smoothed information-sharing and improved coordination, U.S. intelligence experts say.

From issuing a blue badge to everyone working in the sprawling intelligence community to symbolize a common identity, to larger moves of mixing employees from different agencies, the goal is singular — to prevent another attack.

“We’re much further ahead,” David Shedd, Defense Intelligence Agency deputy director, said of the ability to connect the dots compared with 10 years ago. Still, signs of a plot to attack the United States could be missed again.

“My worst fear, and I suspect probably one that would come true, is that in any future would-be or actual attack, God forbid, we will be able to find the dots again somewhere because of simply how much data is collected,” Shedd said.

The political response to the failure to stop the attack was the 2002 creation of the Department of Homeland Security, pulling together 22 agencies to form the third largest U.S. Cabinet department behind the Pentagon and Veterans Affairs.

That was followed by the creation in late 2004 of the Director of National Intelligence to oversee all the spy agencies, as recommended by the bipartisan 9/11 commission.

Previously, the CIA director held a dual role of also overseeing the multitude of intelligence agencies. But in the aftermath of the 2001 attacks, policymakers decided that was too big of a job for one person to do effectively.

‘THERE ARE PROBLEMS’

Critics argued then and now that the reforms were the government’s usual response to crises — create more bureaucracy. But others see much-needed change.

“It has been a tremendous improvement,” said Lee Hamilton, who was the 9/11 commission vice chair. “It’s not seamless, there are problems, and we’ve still got a ways to go.”

The 2001 attacks involving airliners hijacked by al Qaeda operatives killed nearly 3,000 people in New York, Pennsylvania and the Pentagon. Various U.S. intelligence and law enforcement agencies had come across bits of information suggesting an impending attack but failed to put the pieces together.

The CIA had information about three of the 19 hijackers at least 20 months before the attacks; the National Security Agency had information linking one of the hijackers with al Qaeda leader Osama bin Laden’s network; the CIA knew one hijacker had entered the United States but did not tell the FBI; and an FBI agent warned of suspicious Middle Eastern men taking flying lessons.

Have the reforms made America safer? Officials say yes, and point to the U.S. operation that killed bin Laden in Pakistan in May that demanded coordination among intelligence agencies and the military. But there is an inevitable caveat: no one can guarantee there will never be another attack on U.S. soil.

On Christmas Day 2009, a Nigerian man linked to an al Qaeda off-shoot tried unsuccessfully to light explosives sewn into his underwear on a flight to Detroit from Amsterdam. It turned out U.S. authorities had pockets of information about him.

President Barack Obama used a familiar September 11 phrase to describe the 2009 incident as “a failure to connect the dots of intelligence that existed across our intelligence community.”

Roger Cressey, a former White House National Security Council counterterrorism official, resurrected another September 11 phrase: “It was a failure of imagination.”

The intelligence community had not seen al Qaeda in the Arabian Peninsula, a Yemen-based al Qaeda off-shoot, as capable of striking the U.S. homeland. If the “underwear bomber” threat had originated in Pakistan “they would have gone to battle stations immediately,” Cressey said.

Some proposed changes in how authorities would respond to another successful attack still are pending. For example, creation of a common communication system for police, firefighters and other emergency personnel remains tangled up in political wrangling in Congress over how to implement it.

“This is a no-brainer,” Hamilton said. “The first responders at the scene of a disaster ought to be able to talk with one another. They cannot do it today in most jurisdictions.”

Former leaders of the 9/11 commission issued a report card saying nine of its 41 recommendations remain unfinished.

WHERE’S THE POWER?

The Office of the Director of National Intelligence has experienced growing pains as overseer of the 17 spy agencies, churning through four chiefs in six years.

Tensions over turf, confusion about the DNI’s role, and problems herding agencies with very powerful chiefs of their own all came to a crescendo when retired Admiral Dennis Blair, the third DNI, tried to assert authority over CIA station chiefs, who represent the agency in different countries.

“The position of chief of station is one of the crown jewels of the CIA, and they don’t want anyone playing with their crown jewels,” said Mark Lowenthal, a former senior U.S. intelligence official.

After a dust-up with CIA Director Leon Panetta, who now is defense secretary, it was Blair who was sent packing.

“I think the mistake that some have made is to have viewed the DNI and the Director of CIA as an either/or proposition rather than the power of the two working together,” the DIA’s Shedd said in an interview in his office.

“There is a history of where that hasn’t worked so well, I believe it is working much better today,” said Shedd, who has worked at the DNI, CIA and National Security Council.

Intelligence experts say in the current administration, Obama’s top homeland security and counterterrorism adviser John Brennan arguably has more power than any of them because he has the president’s ear. It’s a reminder that, bureaucratic reform or no, personalities count in making national security policy.

The improved sharing of secret data has led to yet another set of problems. The deluge of bits and bytes has subjected intelligence analysts to information overload as they try to sift through it all for relevant pieces.

“Our analysts still are spending way too much time on finding the information rather than on the analysis of the information,” Shedd said. “There is just too much data to go find it all.”

The intelligence community wants a system developed that would automatically process information from multiple agencies and then make the connections for the analysts.
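
As a loose illustration of what “making the connections” could mean in practice, the toy sketch below groups reports from different, entirely fictional agency feeds by a shared identifier, such as a passport number, so that related items surface together for an analyst. It is a record-linkage caricature under made-up data, not a description of any actual intelligence system.

    # Toy record-linkage sketch: cluster reports from multiple (fictional)
    # agency feeds that share an identifier, so related items surface together.
    from collections import defaultdict

    reports = [
        {"agency": "Agency A", "passport": "X1234567", "note": "visa application flagged"},
        {"agency": "Agency B", "passport": "X1234567", "note": "one-way ticket purchased"},
        {"agency": "Agency C", "passport": "Z7654321", "note": "routine border crossing"},
    ]

    def link_by_identifier(records, key="passport"):
        """Group records sharing the same identifier; keep only multi-agency matches."""
        linked = defaultdict(list)
        for rec in records:
            linked[rec[key]].append(rec)
        return {k: v for k, v in linked.items()
                if len({r["agency"] for r in v}) > 1}

    for ident, items in link_by_identifier(reports).items():
        print(f"Linked case for {ident}:")
        for item in items:
            print(f"  [{item['agency']}] {item['note']}")

The hard part at scale is not the grouping itself but deciding which identifiers are reliable enough to link on, which is where the information-overload problem Shedd describes comes back in.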

But greater inroads into sharing data across agencies do not guarantee that another attack will be averted.

The threat has evolved and officials now are increasingly concerned about a “lone wolf” plot by an individual, not tied to any militant group, that may be more difficult to uncover.

“Those threats will not come to our attention because of an intelligence community intercept,” said John Cohen, a senior Department of Homeland Security counterterrorism official.

“They will come to our attention because of an alert police officer, an alert deputy sheriff, an alert store owner, an alert member of the public sees something that is suspicious and reports it,” Cohen said.

One measure of the success of post-9/11 reforms is that a decade later the United States has not had a similar attack.

“Now that could be luck, that could be skill, we don’t really know,” Hamilton said. “But in all likelihood what we have done, including the establishment of the Department of Homeland Security and the transformation in intelligence and FBI, has certainly been helpful.”

(Editing by Warren Strobel and Will Dunham)