Category Archives: NETWORKS!

U.S. Defense Lawyers Are Crippling the Nation's Ability to Wage Cyberwar

Cyberwar, Lawyers, and the U.S.: Denial of Service – By Stewart Baker | Foreign Policy.

Lawyers don’t win wars. But can they lose one?

We’re likely to find out, and soon. Lawyers across the U.S. government have raised so many show-stopping legal questions about cyberwar that they’ve left the military unable to fight or even plan for a war in cyberspace. But the only thing they’re likely to accomplish is to make Americans less safe.

No one seriously denies that cyberwar is coming. Russia pioneered cyberattacks in its conflicts with Georgia and Estonia, and cyberweapons went mainstream when the developers of Stuxnet sabotaged Iran’s Natanz uranium-enrichment plant, setting back the Islamic Republic’s nuclear weapons program more effectively than a 500-pound bomb ever could. In war, weapons that work get used again.

Unfortunately, it turns out that cyberweapons may work best against civilians. The necessities of modern life — pipelines, power grids, refineries, sewer and water lines — all run on the same industrial control systems that Stuxnet subverted so successfully. These systems may be even easier to sabotage than the notoriously porous computer networks that support our financial and telecommunications infrastructure.

And the consequences of successful sabotage would be devastating. The body charged with ensuring the resilience of power supplies in North America admitted last year that a coordinated cyberattack on the continent’s power system “could result in long-term (irreparable) damage to key system components” and could “cause large population centers to lose power for extended periods.” Translated from that gray prose, this means that foreign militaries could reduce many U.S. cities to the state of post-Katrina New Orleans — and leave them that way for months.

Can the United States keep foreign militaries out of its networks? Not today. Even America’s premier national security agencies have struggled to respond to this new threat. Very sophisticated network defenders with vital secrets to protect have failed to keep attackers out. Consider RSA, a security company that makes online credentials used widely by the Defense Department and defense contractors. Hackers from China compromised RSA’s system so badly that the company was forced to offer all its customers a new set of credentials. Imagine the impact on Ford’s reputation if it had to recall and replace every Ford still on the road; that’s what RSA is experiencing now.

HBGary, another well-respected security firm, suffered an attack on its system that put thousands of corporate emails in the public domain, some so embarrassing that the CEO lost his job. And Russian intelligence was able to extract large amounts of information from classified U.S. networks — which are not supposed to touch the Internet — simply by infecting the thumb drives that soldiers were using to move data from one system to the next. Joel Brenner, former head of counterintelligence for the Office of the Director of National Intelligence, estimates in his new book, America the Vulnerable, that billions of dollars in research and design work have been stolen electronically from the Defense Department and its contractors.

In short, even the best security experts in and out of government cannot protect their own most precious secrets from network attacks. But the attackers need not stop at stealing secrets. Once they’re in, they can just as easily sabotage the network to cause the “irreparable” damage that electric-grid guardians fear.

No agency has developed good defenses against such attacks. Unless the United States produces new technologies and new strategies to counter these threats, the hackers will get through. So far, though, what the United States has mostly produced is an outpouring of new law-review articles, new legal opinions, and, remarkably, new legal restrictions.

Across the federal government, lawyers are tying themselves in knots of legalese. Military lawyers are trying to articulate when a cyberattack can be classed as an armed attack that permits the use of force in response. State Department and National Security Council lawyers are implementing an international cyberwar strategy that relies on international law “norms” to restrict cyberwar. CIA lawyers are invoking the strict laws that govern covert action to prevent the Pentagon from launching cyberattacks.

Justice Department lawyers are apparently questioning whether the military violates the law of war if it does what every cybercriminal has learned to do — cover its tracks by routing attacks through computers located in other countries. And the Air Force recently surrendered to its own lawyers, allowing them to order that all cyberweapons be reviewed for “legality under [the law of armed conflict], domestic law and international law” before cyberwar capabilities are even acquired.

The result is predictable, and depressing. Top Defense Department officials recently adopted a cyberwar strategy that simply omitted any plan for conducting offensive operations, even as Marine Gen. James Cartwright, then vice chairman of the Joint Chiefs of Staff, complained publicly that a strategy dominated by defense would fail: “If it’s OK to attack me and I’m not going to do anything other than improve my defenses every time you attack me, it’s very difficult to come up with a deterrent strategy.”

Today, just a few months later, Cartwright is gone, but the lawyers endure. And apparently the other half of the U.S. cyberwar strategy will just have to wait until the lawyers can agree on what kind of offensive operations the military is allowed to mount.

***

We’ve been in this spot before. In the first half of the 20th century, the new technology of air power transformed war at least as dramatically as information technology has in the last quarter-century. Then, as now, our leaders tried to use the laws of war to stave off the worst civilian harms that this new form of war made possible.

Tried and failed.

By the 1930s, everyone saw that aerial bombing would have the capacity to reduce cities to rubble in the next war. Just a few years earlier, the hellish slaughter in the trenches of World War I had destroyed the Victorian world; now air power promised to bring the same carnage to soldiers’ homes, wives, and children.

In Britain, some leaders expressed hardheaded realism about this grim possibility. Former Prime Minister Stanley Baldwin, summing up his country’s strategic position in 1932, showed a candor no recent American leader has dared to match. “There is no power on Earth that can protect [British citizens] from being bombed,” he said. “The bomber will always get through…. The only defense is in offense, which means that you have got to kill more women and children more quickly than the enemy if you want to save yourselves.”

The Americans, however, still hoped to head off the nightmare. Their tool of choice was international law. (Some things never change.) When war broke out in Europe on Sept. 1, 1939, President Franklin D. Roosevelt sent a cable to all the combatants seeking express limits on the use of air power. Citing the potential horrors of aerial bombardment, he called on all combatants to publicly affirm that their armed forces “shall in no event, and under no circumstances, undertake the bombardment from the air of civilian populations or of unfortified cities.”

Roosevelt had a pretty good legal case. The 1899 Hague conventions on the laws of war, adopted as the Wright brothers were tinkering their way toward Kitty Hawk, declared that in bombardments, “all necessary steps should be taken to spare as far as possible edifices devoted to religion, art, science, and charity, hospitals, and places where the sick and wounded are collected, provided they are not used at the same time for military purposes.” The League of Nations had also declared that in air war, “the intentional bombing of civilian populations is illegal.”

But FDR didn’t rely just on law. He asked for a public pledge that would bind all sides in the new war — and, remarkably, he got it. The horror at aerial bombardment of civilians ran so deep in that era that Britain, France, Germany, and Poland all agreed to FDR’s bargain, before nightfall on Sept. 1, 1939.

Nearly a year later, with the Battle of Britain raging in the air, the Luftwaffe was still threatening to discipline any pilot who bombed civilian targets. The deal had held. FDR’s accomplishment began to look like a great victory for the international law of war — exactly what the lawyers and diplomats now dealing with cyberwar hope to achieve.

But that’s not how this story ends.

On the night of Aug. 24, 1940, a Luftwaffe air group made a fateful navigational error. Aiming for oil terminals along the Thames River, they miscalculated, instead dropping their bombs in the civilian heart of London.

It was a mistake. But that’s not how British Prime Minister Winston Churchill saw it. He insisted on immediate retaliation. The next night, British bombers hit (arguably military) targets in Berlin for the first time. The military effect was negligible, but the political impact was profound. German Luftwaffe commander Hermann Göring had promised that the Luftwaffe would never allow a successful attack on Berlin. The Nazi regime was humiliated, the German people enraged. Ten days later, Adolf Hitler told a wildly cheering crowd that he had ordered the bombing of London: “Since they attack our cities, we will extirpate theirs.”

The Blitz was on.

In the end, London survived. But the extirpation of enemy cities became a permanent part of both sides’ strategy. No longer an illegal horror to be avoided at all costs, the destruction of enemy cities became deliberate policy. Later in the war, British strategists would launch aerial attacks with the avowed aim of causing “the destruction of German cities, the killing of German workers, and the disruption of civilized life throughout Germany.” So much for the Hague conventions, the League of Nations resolution, and even the explicit pledges given to Roosevelt. All these “norms” for the use of air power were swept away by the logic of the technology and the predictable psychology of war.

***

American lawyers’ attempts to limit the scope of cyberwar are just as certain to fail as FDR’s limits on air war — and perhaps more so.

It’s true that half a century of limited war has taught U.S. soldiers to operate under strict restraints, in part because winning hearts and minds has been a higher priority than destroying the enemy’s infrastructure. But it’s unwise to put too much faith in the notion that this change is permanent. Those wars were limited because the stakes were limited, at least for the United States. Observing limits had a cost, but one the country could afford. In a way, that was true for the Luftwaffe, too, at least at the start. They were on offense, and winning, after all. But when the British struck Berlin, the cost was suddenly too high. Germans didn’t want law and diplomatic restraint; they wanted retribution — an eye for an eye. When cyberwar comes to America and citizens start to die for lack of power, gas, and money, it’s likely that they’ll want the same.

More likely, really, because Roosevelt’s bargain was far stronger than any legal restraints we’re likely to see on cyberwar. Roosevelt could count on a shared European horror at the aerial destruction of cities. The modern world has no such understanding — indeed, no such shared horror — regarding cyberwar. Quite the contrary. For some of America’s potential adversaries, the idea that both sides in a conflict could lose their networked infrastructure holds no horror. For some, a conflict that reduces both countries to eating grass sounds like a contest they might be able to win.

What’s more, cheating is easy and strategically profitable. America’s compliance will be enforced by all those lawyers. Its adversaries’ compliance will be enforced by, well, by no one. It will be difficult, if not impossible, to find a return address on their cyberattacks. They can ignore the rules and say — hell, they are saying — “We’re not carrying out cyberattacks. We’re victims too. Maybe you’re the attacker. Or maybe it’s Anonymous. Where’s your proof?”

Even if all sides were genuinely committed to limiting cyberwar, as they were in 1939, history shows that it only takes a single error to break the legal limits forever. And error is inevitable. Bombs dropped by desperate pilots under fire go astray — and so do cyberweapons. Stuxnet infected thousands of networks as it searched blindly for Iran’s uranium-enrichment centrifuges. The infections lasted far longer than intended. Should we expect fewer errors from code drafted in the heat of battle and flung at hazard toward the enemy?

Of course not. But the lesson of all this for the lawyers and the diplomats is stark: Their effort to impose limits on cyberwar is almost certainly doomed.

No one can welcome this conclusion, at least not in the United States. The country has advantages in traditional war that it lacks in cyberwar. Americans are not used to the idea that launching even small wars on distant continents may cause death and suffering at home. That is what drives the lawyers — they hope to maintain the old world. But they’re being driven down a dead end.

If America wants to defend against the horrors of cyberwar, it needs first to face them, with the candor of a Stanley Baldwin. Then the country needs to charge its military strategists, not its lawyers, with constructing a cyberwar strategy for the world we live in, not the world we’d like to live in.

That strategy needs both an offense and a defense. The offense must be powerful enough to deter every adversary with something to lose in cyberspace, so it must include a way to identify attackers with certainty. The defense, too, must be realistic, making successful cyberattacks more difficult and less effective because resilience and redundancy have been built into U.S. infrastructure.

Once the United States has a strategy for winning a cyberwar, it can ask the lawyers for their thoughts. But it can’t be done the other way around.

In 1941, the British sent their most modern battleship, the Prince of Wales, to Southeast Asia to deter a Japanese attack on Singapore. For 150 years, having the largest and most modern navy was all that was needed to project British power around the globe. Like the American lawyers who now oversee defense and intelligence, British admirals preferred to believe that the world had not changed. It took Japanese bombers 10 minutes to put an end to their fantasy, to the Prince of Wales, and to hundreds of brave sailors’ lives.

We should not wait for our own Prince of Wales moment in cyberspace.

A second Great Depression: Eight drastic policy measures necessary to prevent global economic collapse?

A second Great Depression: Eight drastic policy measures necessary to prevent global economic collapse. – By Nouriel Roubini – Slate Magazine.

The latest economic data suggest that recession is returning to most advanced economies, with financial markets now reaching levels of stress unseen since the collapse of Lehman Bros. in 2008. The risks of an economic and financial crisis even worse than the previous one—now involving not just the private sector, but also near-insolvent governments—are significant. So, what can be done to minimize the fallout of another economic contraction and prevent a deeper depression and financial meltdown?

First, we must accept that austerity measures, necessary to avoid a fiscal train wreck, have recessionary effects on output. So, if countries in the Eurozone’s periphery such as Greece or Portugal are forced to undertake fiscal austerity, countries able to provide short-term stimulus should do so and postpone their own austerity efforts. These countries include the United States, the United Kingdom, Germany, the core of the Eurozone, and Japan. Infrastructure banks that finance needed public infrastructure should be created as well.

Second, while monetary policy has limited impact when the problems are excessive debt and insolvency rather than illiquidity, credit easing, rather than just quantitative easing, can be helpful. The European Central Bank should reverse its mistaken decision to hike interest rates. More monetary and credit easing is also required from the U.S. Federal Reserve, the Bank of Japan, the Bank of England, and the Swiss National Bank. Inflation will soon be the last problem that central banks will fear, as renewed slack in goods, labor, real estate, and commodity markets feeds disinflationary pressures.

Third, to restore credit growth, Eurozone banks and banking systems that are undercapitalized should be strengthened with public financing in a European Union-wide program. To avoid an additional credit crunch as banks deleverage, banks should be given some short-term forbearance on capital and liquidity requirements. Also, since the U.S. and EU financial systems remain unlikely to provide credit to small and medium-size enterprises, direct government provision of credit to solvent but illiquid SMEs is essential.

Fourth, large-scale liquidity provision for solvent governments is necessary to avoid a spike in spreads and loss of market access that would turn illiquidity into insolvency. Even with policy changes, it takes time for governments to restore their credibility. Until then, markets will keep pressure on sovereign spreads, making a self-fulfilling crisis likely.

Today, Spain and Italy are at risk of losing market access. Official resources need to be tripled—through a larger European Financial Stability Facility, Eurobonds, or massive ECB action—to avoid a disastrous run on these sovereigns.


Fifth, debt burdens that cannot be eased by growth, savings, or inflation must be rendered sustainable through orderly debt restructuring, debt reduction, and conversion of debt into equity. This needs to be carried out for insolvent governments, households, and financial institutions alike.

Sixth, even if Greece and other peripheral Eurozone countries are given significant debt relief, economic growth will not resume until competitiveness is restored. And, without a rapid return to growth, more defaults—and social turmoil—cannot be avoided.

There are three options for restoring competitiveness within the Eurozone, all requiring a real depreciation—and none of which is viable:

  • A sharp weakening of the euro toward parity with the U.S. dollar, which is unlikely, as the United States is weak, too.
  • A rapid reduction in unit labor costs, via acceleration of structural reform and productivity growth relative to wage growth, is also unlikely, as that process took 15 years to restore competitiveness to Germany.
  • A five-year cumulative 30 percent deflation in prices and wages—in Greece, for example—which would mean five years of deepening and socially unacceptable depression. Even if feasible, this amount of deflation would exacerbate insolvency, given a 30 percent increase in the real value of debt.

Because these options cannot work, the sole alternative is an exit from the Eurozone by Greece and some other current members. Only a return to a national currency—and a sharp depreciation of that currency—can restore competitiveness and growth.

Leaving the common currency would, of course, threaten collateral damage for the exiting country and raise the risk of contagion for other weak Eurozone members. The balance-sheet effects on euro debts caused by the depreciation of the new national currency would thus have to be handled through an orderly and negotiated conversion of euro liabilities into the new national currencies. Appropriate use of official resources, including for recapitalization of Eurozone banks, would be needed to limit collateral damage and contagion.

Seventh, the reasons for advanced economies’ high unemployment and anemic growth are structural, including the rise of competitive emerging markets. The appropriate response to such massive changes is not protectionism. Instead, the advanced economies need a medium-term plan to restore competitiveness and jobs via massive new investments in high-quality education, job training and human-capital improvements, infrastructure, and alternative/renewable energy. Only such a program can provide workers in advanced economies with the tools needed to compete globally.

Eighth, emerging-market economies have more policy tools left than advanced economies do, and they should ease monetary and fiscal policy. The International Monetary Fund and the World Bank can serve as lender of last resort to emerging markets at risk of losing market access, conditional on appropriate policy reforms. And countries like China that rely excessively on net exports for growth should accelerate reforms, including more rapid currency appreciation, in order to boost domestic demand and consumption.

The risks ahead are not just of a mild double-dip recession, but of a severe contraction that could turn into the Great Depression II, especially if the Eurozone crisis becomes disorderly and leads to a global financial meltdown. Wrong-headed policies during the first Great Depression led to trade and currency wars, disorderly debt defaults, deflation, rising income and wealth inequality, poverty, desperation, and social and political instability that eventually led to the rise of authoritarian regimes and World War II. The best way to avoid the risk of repeating such a sequence is bold and aggressive global policy action now.

Read this story at Project Syndicate.

IMF says US and Europe risk double-dip recession

US and Europe risk double-dip recession, warns IMF | Business | guardian.co.uk.

International Monetary Fund’s World Economic Outlook says slow, bumpy recovery could be jeopardised by Europe’s debt crisis or over-hasty attempts to cut America’s budget deficit

A protest on Wall Street. Confidence has fallen and the risks are on the downside, the IMF said in its half-yearly report. Photograph: Keystone/Rex Features

The International Monetary Fund warned on Tuesday that the United States and the eurozone risk being plunged back into recession unless policymakers tackle the problems facing the world’s two biggest economic forces.

In its half-yearly health check, the Washington-based fund said the global economy was “in a dangerous place” and that its forecast of a slow, bumpy recovery would be jeopardised by a deepening of Europe’s sovereign debt crisis or over-hasty attempts to rein in America’s budget deficit.

“Global activity has weakened and become more uneven, confidence has fallen sharply recently, and downside risks are growing,” the IMF said as it cut its global growth forecast for both 2011 and 2012.

The IMF also cut its growth forecasts for the UK economy and advised George Osborne to ease the pace of deficit reduction in the event of any further downturn in activity.

The IMF’s World Economic Outlook cited the Japanese tsunami and the rise in oil prices prompted by the unrest in north Africa and the Middle East as two of a “barrage” of shocks to hit the international economy in 2011. It said it now expected the global economy to expand by 4% in both 2011 and 2012, cuts of 0.3 and 0.5 percentage points respectively since it last published forecasts three months ago.

“The structural problems facing the crisis-hit advanced economies have proven even more intractable than expected, and the process of devising and implementing reforms even more complicated. The outlook for these economies is thus for a continuing, but weak and bumpy, expansion,” the IMF said.

Speaking at a press conference in Washington, Olivier Blanchard, the IMF’s economic counsellor, said there was “a widespread perception” that policymakers in the euro area had lost control of the crisis.

“Europe must get its act together,” Blanchard said, adding that it was “absolutely essential” that measures agreed by policymakers in July, including a bigger role for the European Financial Stability Fund (EFSF), should be made operational soon.

“The eurozone is a major source of worry. This is a call to arms,” he said.

Blanchard said the fund was cutting its growth forecasts because the two balancing acts needed to ensure recovery from the recession of 2008-09 have stalled. Governments were cutting budget deficits but the private sector was failing to make up for the lost demand. Meanwhile, the global imbalances between deficit countries such as the US and surplus countries such as China looked like getting worse rather than better.

“Markets have become more sceptical about the ability of governments to stabilise their public debt. Worries have spread from countries on the periphery of Europe to countries in the core, and to others, including Japan and the US,” Blanchard said.

He added that there was a risk that low growth and fiscal and financial weaknesses could easily feed on each other.

“Lower growth makes fiscal consolidation harder. And fiscal consolidation may lead to even lower growth. Lower growth weakens banks. And weaker banks lead to tighter bank lending and lower growth.” As a result, there were “clear downside risks” to the fund’s new forecasts.

Developing nations lead the way

In its report, the IMF said it expected the strong performance of the leading emerging nations to be the main driving force behind growth in the world economy. China’s growth rate is forecast to ease back slightly, from 9.5% in 2011 to 9% in 2012, while India is predicted to expand by 7.5% in 2012 after 7.8% growth in 2011.

Sub-Saharan Africa is expected to continue to post robust growth, up from 5.2% in 2011 to 5.8% in 2012.

The rich developed countries, by contrast, are forecast to grow by just under 2%, slightly faster than the 1.6% pencilled in by the IMF for 2011.

“However, this assumes that European policymakers contain the crisis in the euro periphery area, that US policymakers strike a judicious balance between support for the economy and medium-term fiscal consolidation, and that volatility in global financial markets does not escalate.”

“The risks are clearly to the downside,” the IMF added, pointing to two particular concerns – that policymakers in the eurozone lose control of the sovereign debt crisis, and that the US economy could weaken as a result of political impasse in Washington, a deteriorating housing market or a slide in shares on Wall Street. It said the European Central Bank should consider cutting interest rates and that the Federal Reserve should stand ready to provide more “unconventional support”.

It said: “Either of these two eventualities would have severe implications for global growth. The renewed stress could undermine financial markets and institutions in advanced economies, which remain unusually vulnerable. Commodity prices and global trade and capital flows would likely decline abruptly, dragging down growth in developing countries.”

The IMF said that in its downside scenario, the eurozone and the US could fall back into recession, with activity some three percentage points lower in 2012 than envisaged. Currently, the fund is expecting the US to grow by 1.8% in 2012 and the eurozone by 1.1%.

“In the euro area, the adverse feedback loop between weak sovereign and financial institutions needs to be broken. Fragile financial institutions must be asked to raise more capital, preferably through private solutions. If these are not available, they will have to accept injections of public capital or support from the EFSF, or be restructured or closed.”

The IMF urged Republicans and Democrats in Washington to settle their differences: “Deep political differences leave the course of US policy highly uncertain. There is a serious risk that hasty fiscal cutbacks will further weaken the outlook without providing the long-term reforms required to reduce debt to more sustainable levels.”

Post-9/11 U.S. intelligence reforms take root but problems remain

Post-9/11 U.S. intelligence reforms take root, problems remain | Reuters.

(Reuters) – U.S. intelligence agencies will forever be scarred by their failure to connect the dots and detect the September 11 plot, but a decade later efforts to break down barriers to information-sharing are taking root.

Changing a culture of “need-to-know” to “need-to-share” does not come easily in spy circles. Some officials say they worry, a decade later, about a future attack in which it turns out that U.S. spy agencies had clues in their vast vaults of data but did not put them together, or even know they existed.

Yet significant changes, both big and small, have broken down barriers between agencies, smoothed information-sharing and improved coordination, U.S. intelligence experts say.

From issuing a blue badge to everyone working in the sprawling intelligence community to symbolize a common identity, to larger moves of mixing employees from different agencies, the goal is singular — to prevent another attack.

“We’re much further ahead,” David Shedd, Defense Intelligence Agency deputy director, said of the ability to connect the dots compared with 10 years ago. Still, signs of a plot to attack the United States could be missed again.

“My worst fear, and I suspect probably one that would come true, is that in any future would-be or actual attack, God forbid, we will be able to find the dots again somewhere because of simply how much data is collected,” Shedd said.

The political response to the failure to stop the attack was the 2002 creation of the Department of Homeland Security, pulling together 22 agencies to form the third largest U.S. Cabinet department behind the Pentagon and Veterans Affairs.

That was followed by the creation in late 2004 of the Director of National Intelligence to oversee all the spy agencies, as recommended by the bipartisan 9/11 commission.

Previously, the CIA director held a dual role of also overseeing the multitude of intelligence agencies. But in the aftermath of the 2001 attacks, policymakers decided that was too big of a job for one person to do effectively.

‘THERE ARE PROBLEMS’

Critics argued then and now that the reforms were the government’s usual response to crises — create more bureaucracy. But others see much-needed change.

“It has been a tremendous improvement,” said Lee Hamilton, who was the 9/11 commission vice chair. “It’s not seamless, there are problems, and we’ve still got a ways to go.”

The 2001 attacks involving airliners hijacked by al Qaeda operatives killed nearly 3,000 people in New York, Pennsylvania and the Pentagon. Various U.S. intelligence and law enforcement agencies had come across bits of information suggesting an impending attack but failed to put the pieces together.

The CIA had information about three of the 19 hijackers at least 20 months before the attacks; the National Security Agency had information linking one of the hijackers with al Qaeda leader Osama bin Laden’s network; the CIA knew one hijacker had entered the United States but did not tell the FBI; and an FBI agent warned of suspicious Middle Eastern men taking flying lessons.

Have the reforms made America safer? Officials say yes, and point to the U.S. operation that killed bin Laden in Pakistan in May, which demanded coordination among intelligence agencies and the military. But there is an inevitable caveat: no one can guarantee there will never be another attack on U.S. soil.

On Christmas Day 2009, a Nigerian man linked to an al Qaeda off-shoot tried unsuccessfully to light explosives sewn into his underwear on a flight to Detroit from Amsterdam. It turned out U.S. authorities had pockets of information about him.

President Barack Obama used a familiar September 11 phrase to describe the 2009 incident as “a failure to connect the dots of intelligence that existed across our intelligence community.”

Roger Cressey, a former White House National Security Council counterterrorism official, resurrected another September 11 phrase: “It was a failure of imagination.”

The intelligence community had not seen al Qaeda in the Arabian Peninsula, a Yemen-based al Qaeda off-shoot, as capable of striking the U.S. homeland. If the “underwear bomber” threat had originated in Pakistan “they would have gone to battle stations immediately,” Cressey said.

Some proposed changes in how authorities would respond to another successful attack still are pending. For example, creation of a common communication system for police, firefighters and other emergency personnel remains tangled up in political wrangling in Congress over how to implement it.

“This is a no-brainer,” Hamilton said. “The first responders at the scene of a disaster ought to be able to talk with one another. They cannot do it today in most jurisdictions.”

Former leaders of the 9/11 commission issued a report card saying nine of its 41 recommendations remain unfinished.

WHERE’S THE POWER?

The Office of the Director of National Intelligence has experienced growing pains as overseer of the 17 spy agencies, churning through four chiefs in six years.

Tensions over turf, confusion about the DNI’s role, and problems herding agencies with very powerful chiefs of their own all came to a crescendo when retired Admiral Dennis Blair, the third DNI, tried to assert authority over CIA station chiefs, who represent the agency in different countries.

“The position of chief of station is one of the crown jewels of the CIA, and they don’t want anyone playing with their crown jewels,” said Mark Lowenthal, a former senior U.S. intelligence official.

After a dust-up with CIA Director Leon Panetta, who now is defense secretary, it was Blair who was sent packing.

“I think the mistake that some have made is to have viewed the DNI and the Director of CIA as an either/or proposition rather than the power of the two working together,” the DIA’s Shedd said in an interview in his office.

“There is a history of where that hasn’t worked so well, I believe it is working much better today,” said Shedd, who has worked at the DNI, CIA and National Security Council.

Intelligence experts say in the current administration, Obama’s top homeland security and counterterrorism adviser John Brennan arguably has more power than any of them because he has the president’s ear. It’s a reminder that, bureaucratic reform or no, personalities count in making national security policy.

The improved sharing of secret data has led to yet another set of problems. The deluge of bits and bytes has subjected intelligence analysts to information overload as they try to sift through it all for relevant pieces.

“Our analysts still are spending way too much time on finding the information rather than on the analysis of the information,” Shedd said. “There is just too much data to go find it all.”

The intelligence community wants a system developed that would automatically process information from multiple agencies and then make the connections for the analysts.
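
In software terms, that is largely a record-linkage problem: the same person or event described slightly differently in different agencies’ databases. Here is a minimal sketch with invented records, using a crude name-similarity score where real systems use far richer (and classified) features:

```python
# Minimal record-linkage sketch: surface candidate "connections" between
# person records held by two hypothetical agencies via fuzzy name matching.
# Records and threshold are invented for illustration.
from difflib import SequenceMatcher

agency_a = [{"id": "A-17", "name": "John Q. Example", "note": "visa application"}]
agency_b = [{"id": "B-42", "name": "Jon Q Example", "note": "field office tip"}]

def name_similarity(a: str, b: str) -> float:
    """Similarity in [0, 1], ignoring case, punctuation, and spacing."""
    norm = lambda s: "".join(ch for ch in s.lower() if ch.isalnum())
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

THRESHOLD = 0.85  # arbitrary illustrative cutoff

for rec_a in agency_a:
    for rec_b in agency_b:
        score = name_similarity(rec_a["name"], rec_b["name"])
        if score >= THRESHOLD:
            print(f"candidate link ({score:.2f}): "
                  f"{rec_a['id']} ({rec_a['note']}) <-> {rec_b['id']} ({rec_b['note']})")
```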

But greater inroads into sharing data across agencies do not guarantee that another attack will be averted.

The threat has evolved and officials now are increasingly concerned about a “lone wolf” plot by an individual, not tied to any militant group, that may be more difficult to uncover.

“Those threats will not come to our attention because of an intelligence community intercept,” said John Cohen, a senior Department of Homeland Security counterterrorism official.

“They will come to our attention because of an alert police officer, an alert deputy sheriff, an alert store owner, an alert member of the public sees something that is suspicious and reports it,” Cohen said.

One measure of the success of post-9/11 reforms is that a decade later the United States has not had a similar attack.

“Now that could be luck, that could be skill, we don’t really know,” Hamilton said. “But in all likelihood what we have done, including the establishment of the Department of Homeland Security and the transformation in intelligence and FBI, has certainly been helpful.”

(Editing by Warren Strobel and Will Dunham)

Can we count on cell networks in disasters?

Can we count on cell networks in disasters? | Signal Strength – CNET News.

Andrea Mancuso was working just north of the World Trade Center on September 11, 2001, when two planes struck the towers. Soon after, she was the only person around who seemed to have cell phone service.

“I walked from downtown to Lincoln Center (about 4.5 miles) before I was able to hail a cab with four strangers,” she said. “Everyone was upset, and no one had a cell phone signal except me. I passed my phone around like a hot potato all the way to Harlem. Everyone including the cab driver graciously and tearfully called their families.”

Her story, of course, is not unique. For hours, family members and co-workers frantically tried to contact people they knew in Lower Manhattan.

The network failure could partially be pinned on infrastructure damage. Cell towers were destroyed in the attacks, along with switching equipment used for landline phones. But another cause of the problem was the huge surge in traffic from people trying to find loved ones or letting others know they were OK.

Since 9/11, wireless networks have been tested time and again, and their performance has been shaky. A major blackout in the Northeast in 2003, Hurricane Katrina in 2005, and the Minneapolis bridge collapse in 2007 put strains on local networks. Cellular service in New York City even ground to a halt last month because of a minor earthquake centered several hundred miles away.

Undoubtedly, with each crisis, operators have learned more about what they can do to keep service up and running. But there’s a flip side to that growing expertise: we’re more dependent than ever on cell phones.

In September 2001, there were between 118 million and 128 million wireless subscribers, according to data compiled by the CTIA Wireless Association. At the end of 2010, there were 302 million, or more than 96 percent of Americans. To keep up with the surge, wireless operators have spent billions of dollars upgrading their networks, adding more than 125,000 new cell sites since the end of 2001.

 


But is that enough? Despite anecdotal evidence to the contrary, industry experts interviewed by CNET believe that carriers have made huge strides on network reliability by doing simple things like expansion and adding generator backups, and complex things like forming emergency response units. They believe we’re in far better shape than we were 10 years ago…to a point.

“There’s no question the networks are in a much better position today than they were in 2001 to handle a significant crisis, even in the face of a staggering increase in users,” said Charles Golvin, an analyst at Forrester Research. “But it’s also important for people to make contingency plans for communications, in case the network isn’t working.”

On the plus side
There’s reason for that guarded optimism. Carriers have added spectrum and high-capacity connections from their cell towers to the wired networks that transport voice and data traffic. And they’ve hardened their networks with equipment that can withstand heavy wind and rain, as well as ensure that the equipment remains functional when commercial power is lost.

In fact, all the major wireless carriers have increased the number of cell sites with backup power supplies. They’ve also increased the number of cell sites on wheels that they can roll into locations that have had infrastructure damage.

“A good proportion of the cellular base stations across all the major carriers now have some kind of battery or generator for backup power,” said Gerard Hallaren, an equities analyst at JRPG Research. “That wasn’t the case back in 2001.”

Verizon Wireless, for example, has for years been installing backup generators and batteries to many of its cell sites. During the 2003 blackout that kept much of the Northeast in the dark for hours, Verizon’s customers could still communicate when customers from other carriers could not.

There’s also that human element, in terms of specialized units that can quickly move into an area.

“In the event of a natural or man-made disaster, like 9/11, we have multiple groups within AT&T who are equipped to respond quickly to repair and restore network capabilities,” AT&T spokesman Mark Siegel said. “Our network disaster recovery team is a great example of that. They were deployed after 9/11 to assist in restorations, and we’ve invested $600 million in our NDR team since its formation.”

There are also new services. Enhanced 911, which allows 911 operators to locate callers on a cell phone, is a prime example. A decade ago, cell phones in the U.S. didn’t yet support the technology. Today, every phone sold in the U.S. is capable of providing location information to emergency 911 operators.

While text messaging was gaining popularity in Europe and elsewhere in 2001, SMS services in the U.S. weren’t used much by wireless subscribers because they worked only within carrier networks. This meant that on September 11, 2001, someone who wanted to send a text message to a family member or loved one could send it only to someone who subscribed to the same carrier.

A few months later, in November 2001, carriers began to connect their networks for text messaging, allowing subscribers on different networks to exchange texts. Today, more than 187 billion text messages cross U.S. wireless networks each month.

Text messaging has become a critical form of communication during a crisis. In the lead-up to Hurricane Irene on the East Coast last month, wireless operators and public-safety officials were asking consumers to use text messaging during the storm instead of making voice calls to help alleviate network congestion.

Text messaging is a better way of communicating in a crisis for several reasons. To start, the messages are small and consume only a small amount of network resources. Second, messages are sent on a cell phone’s signaling channel. This means that they’re in a separate “lane” from voice and data messages, so they may have a clear path when the voice network is congested. And if the network is too congested even to send a text, the message can be stored. When service resumes, the message is sent.
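
A toy sketch of that store-and-forward behavior: messages are always accepted, queued when the channel is saturated, and drained as per-interval capacity allows. The real SMSC machinery is, of course, far more involved:

```python
# Toy store-and-forward sketch of the SMS behavior described above:
# a congested message is queued, not dropped, and delivered when
# signaling capacity frees up. Capacity figure is invented.
from collections import deque

class SmsCenter:
    def __init__(self, capacity_per_tick: int):
        self.capacity = capacity_per_tick  # deliverable messages per interval
        self.queue = deque()

    def submit(self, msg: str):
        self.queue.append(msg)  # always accepted: stored if it can't go now

    def tick(self) -> list:
        """Deliver as many queued messages as this interval's capacity allows."""
        return [self.queue.popleft()
                for _ in range(min(self.capacity, len(self.queue)))]

smsc = SmsCenter(capacity_per_tick=2)
for i in range(5):
    smsc.submit(f"I'm OK #{i}")  # surge: five messages at once
print(smsc.tick())  # two delivered immediately
print(smsc.tick())  # two more the next interval
print(smsc.tick())  # the last one gets through
```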

The technology has become such a ubiquitous and reliable form of communication during an emergency that the Federal Communications Commission is working on rules to allow 911 call centers to accept SMS text messages, as well as photos, videos, and data communications, to provide more information to first responders for assessing and responding to emergencies.

Congestion problem
The biggest problem wireless networks face today in a crisis is a rapid surge in usage: the networks don’t have enough capacity to absorb the spike in call volume. Cellular networks are designed to handle a certain number of calls in each cell site or region, with wireless operators provisioning for average usage volumes plus just enough headroom to handle ordinary spikes in demand.

The problem occurs when a disaster hits, and thousands of people all at once pick up their phones to call someone, send a text message, update Twitter, and so on. There simply isn’t enough capacity in the network to allow everyone in a cell site to make a phone call at the same time.
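
Capacity planners traditionally size voice channels with the Erlang B formula, which estimates the fraction of call attempts blocked for a given offered load and channel count. A quick sketch, with purely illustrative numbers, shows how fast blocking climbs once a surge pushes traffic past the engineered load:

```python
# Erlang B: the classic blocking-probability formula used to size
# circuit-switched capacity such as voice channels at a cell site.
# All numbers below are illustrative, not taken from the article.
def erlang_b(offered_erlangs: float, channels: int) -> float:
    """Probability that a new call finds all channels busy."""
    b = 1.0  # blocking with zero channels
    for k in range(1, channels + 1):
        b = (offered_erlangs * b) / (k + offered_erlangs * b)
    return b

CHANNELS = 50
for load in (40, 50, 100, 200):  # offered traffic in erlangs
    print(f"{load:>3} erlangs on {CHANNELS} channels -> "
          f"{erlang_b(load, CHANNELS):.1%} of calls blocked")
```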

Steve Largent, the president of the CTIA Wireless Association, argues that more wireless spectrum is needed to ensure that more “lanes” for data can be opened up for wireless operators to direct traffic to during a crisis.

“Crisis situations are a perfect example of why it’s so important that the government makes more wireless spectrum available,” he said.

While more spectrum could help, it’s unclear if it would ever be cost-effective for wireless operators to configure their networks to withstand the highest demand for network resources. Analyst Gerard Hallaren said most networks are designed to handle only about 20 percent to 40 percent of maximum traffic, with 40 percent being on the conservative side.

“It’s just economic insanity for any carrier to try to solve the congestion problem,” he said. “It’s cost-prohibitive to build a network that could serve 330 million people at the same time. A service like that would cost hundreds of dollars a month, and people are not willing to pay that much for cell phone service.”

That said, the carriers say they’ve made improvements to their networks and are trying to alleviate the issue.

“We continue to build redundancies into our network and increase capacity so that it is not overwhelmed by ‘sudden calling events,'” AT&T’s Siegel said.

New generations of cellular technology have also helped make wireless more available during a crisis. The move from 2G to 3G, and now to 4G, will offer carriers more efficiencies in how they use their spectrum, which could also be a benefit during an emergency to alleviate network congestion.

The other major difference between September 2001 and now is that the mobile Internet as we know it today did not exist. Third-generation, or so called 3G, wireless networks were not deployed, and most people did not have access to the Internet from their cell phones. Facebook, Twitter, and other social-networking apps that people access easily from their cell phones today to share pictures, updates, and other information weren’t even invented back then. While this traffic also increases the load on networks, sometimes it’s easier for users to get through to these sites than to make voice connections via their cell phones.

“People have so many more ways of communicating with each other now to tell someone where they are or that they are all right,” Forrester’s Golvin said. “Having these communication alternatives is a huge improvement over where we were a decade ago.”

Public-safety officials had their own communications challenges responding to the terrorist attacks on September 11, 2001. Tomorrow CNET will explain the problems first responders had that day and why the public-safety community is still waiting for its own wireless network.

Raindrop Tracker Points to Better Environmental Awareness

Go with the flow : Nature : Nature Publishing Group.

It might seem impossible to get lost in the modern world with its ubiquity of digital maps, but there is more than one way to be lost. Truly knowing where you are goes beyond pinpointing your position. It means knowing where your water comes from and where it goes, where your electricity is generated and where your rubbish ends up. It means being aware of what plants and animals live nearby and what kind of soil lies beneath your feet.

For example, an undergraduate at a rainy Butler University in Indianapolis, Indiana, can use his or her smartphone to instantly calculate a route to the nearest Starbucks coffee shop. But chances are that he or she remains ignorant of how the rain flows through the city on its way to the White River, the Mississippi and, finally, the Gulf of Mexico.

Enter Raindrop, a phone application that combines sewer and watercourse maps with the software that makes getting a caffeine fix so easy. Tap the map and watch the path of a single raindrop flow from your location through streams, culverts and pipes into the river. The app, due to launch next month, was funded by the US National Oceanic and Atmospheric Administration and put together by a team led by ecologist Timothy Carter at Butler. It is currently limited to Indianapolis, but similar efforts could be designed for other cities.
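
Under the hood, an app like this mostly has to walk a drainage graph downstream from the tapped location. Here is a minimal sketch, with invented upstream features feeding the river chain named in the article:

```python
# Minimal sketch of the downstream walk behind an app like Raindrop.
# The drainage network is a graph where each feature points to the next
# one downstream; the upstream entries here are invented for illustration.
downstream = {
    "storm drain at 46th St": "campus culvert",   # invented
    "campus culvert": "neighborhood creek",        # invented
    "neighborhood creek": "White River",           # invented link
    "White River": "Mississippi River",            # chain named in the article
    "Mississippi River": "Gulf of Mexico",
}

def trace(start: str) -> list:
    """Follow downstream links until the water reaches its terminus."""
    path = [start]
    while path[-1] in downstream:
        path.append(downstream[path[-1]])
    return path

print(" -> ".join(trace("storm drain at 46th St")))
```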

A better appreciation of watercourses and other hidden networks can only strengthen human connections to ecosystems, biogeochemical cycles and resource flows, and will arguably make people more likely to support science and environmental causes. Making available the data that science and society produce in these innovative ways can help people to find themselves — even if they had no idea that they were lost.

We Need a Materials Taxonomy to Solve the Final Steps in the Recycling Chain

Want to be a billionaire and a hero? Solve the final steps in the recycling chain | ITworld.


Your challenge: Develop a usable taxonomy of parts and materials so that products can be safely and profitably devolved.

By Tom Henderson

 

You can buy that cool tablet today, and its useful life is probably three years on the outside. Something new and cool will be available in 2014 (no pre-announcements here, just predictions) and you’ll want to buy it. Perhaps you’ll use a vendor’s trade-in program to do something with the old one — after you’ve conveniently moved the data to your new machine. We hope.


There’s a huge opening for someone to get rich, developing a usable taxonomy of parts and materials so that products can be safely and profitably devolved. The way you do it is clear: find a method to describe parts in such a way that they can be taken apart and recycled or safely disposed of. The avalanche of tech products is unlikely to stop, and we expect even less time with them before the new thing arrives to tempt us.

You bought. Someone now has your old machine, with its data removed. What’s done with it then ranges from devolution to landfill fodder. Inside the derelict are a number of precious metals and, depending on the battery technology, a lump of lithium, nickel, and/or other metals. Many of the smaller bits inside will be reduced to smaller and smaller pieces until they’re either dumped in a pile (in the ocean, a landfill, etc.) or smelted and separated into base elements. It’s an inefficient and labor-intensive process. Plastics can be reused, as can the stickers and box that an item arrived in.

Lots of derelict products are shipped to Southeast Asia, where comparatively low labor costs help offset the inefficiency of the process. It also leads to huge piles of ex-computer gear that pollute the groundwater in hideous ways. People are poisoned in the scavenging process, not to mention the evil piles of computer dung that are nuclear waste without the isotopes.

What’s needed is a way to mark, directly, every part in a machine. Some parts will be more lucratively recycled. Importantly, parts that are environmentally damaging, or that require special devolution processes, can be aggregated so that they don’t cause interim pollution, and recyclers can benefit from scale in the devolution of hazardous materials.

Today, we use primitive marks to denote very basic (typically plastic) product composition. We have hazardous materials markers and identification and other markings to identify objects that can be either recycled or are hazardous/dangerous-to-handle.

My suggestion: use advanced barcodes to identify everything with a recycling mark that can be rapidly read for devolution. The marking doesn’t have to be on an easily visible area, but it needs to be revealed somehow. The marks can be tiny, almost microscopic, yet recognizable by modern barcode scanners. They could identify either specific categories of product materials or actual part numbers.

In the first case, generic markers can identify tens of millions of generic product identifications, making devolution and separation into elements for recycling vastly simpler than it is today. Specific identification then differentiates subsystems and elements that need specific handling requirements, or perhaps have vendor/manufacturer-specific (even mandated) devolution processes (including rewards).
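
To make that two-level scheme concrete, here is one hypothetical encoding (not an existing standard): a compact mark carrying a generic material-category code, optionally extended with a vendor and part number for manufacturer-specific devolution:

```python
# Hypothetical two-level recycling mark, as sketched above: a generic
# material-category code for bulk sorting plus an optional part number
# for vendor-specific devolution. Codes and categories are invented.
MATERIAL_CATEGORIES = {
    "LI-ION": "lithium-ion battery: hazardous, aggregate for specialist handling",
    "PCB-AU": "circuit board with gold-bearing contacts: smelt for recovery",
    "ABS":    "ABS plastic housing: shred and re-pelletize",
}

def make_mark(category: str, vendor: str = "", part: str = "") -> str:
    """Encode a mark like 'RCY:LI-ION' or 'RCY:LI-ION:ACME:PN-1234'."""
    fields = ["RCY", category] + ([vendor, part] if vendor and part else [])
    return ":".join(fields)

def read_mark(mark: str) -> str:
    """Decode a mark into a handling instruction for the recycler."""
    fields = mark.split(":")
    handling = MATERIAL_CATEGORIES.get(fields[1], "unknown: route to manual triage")
    suffix = f" (vendor process: {fields[2]} {fields[3]})" if len(fields) == 4 else ""
    return handling + suffix

tag = make_mark("LI-ION", "ACME", "PN-1234")
print(tag, "->", read_mark(tag))
```

The mark string itself could be rendered as any sufficiently dense machine-readable symbol; the taxonomy behind it is the hard part.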

Another potential reward is that most consumer and industrial products could benefit from the same marking scheme, permitting rapid and accurate product devolution. Junkyards across the world are full of unidentifiable bits and pieces of products gone by, ranging from building cranes to old Volkswagens to refrigerators, and no one knows what this stuff is. There are various tests for precious metals (often using primitive magnets) and certain plastics, but many materials aren’t easily identified. So they rot, rust, and ooze back into the environment. Materials-identification methodologies won’t be tough to deploy, and a government mandate seems unnecessary because the motivation to make money from recycled materials exists now.

If we don’t do this, the chances of high-efficiency recycling will be vastly reduced, and the piles of useless and hazardous ex-computer junk will grow taller. Just as every bill of materials includes parts and sources, we could systematically devolve products when their life cycle is over. What’s needed is an agreement to apply this methodology to the production process: deproduction. The devil of the details will come. Barcodes exist. Now we need a product-identification taxonomy, a method to affix material markings, and a database access method that tells the devolvers how to make money.

An Impeccable Financial Disaster

An Impeccable Disaster – NYTimes.com.

 

On Thursday Jean-Claude Trichet, the president of the European Central Bank or E.C.B. — Europe’s equivalent to Ben Bernanke — lost his sang-froid. In response to a question about whether the E.C.B. is becoming a “bad bank” thanks to its purchases of troubled nations’ debt, Mr. Trichet, his voice rising, insisted that his institution has performed “impeccably, impeccably!” as a guardian of price stability.


Indeed it has. And that’s why the euro is now at risk of collapse.

Financial turmoil in Europe is no longer a problem of small, peripheral economies like Greece. What’s under way right now is a full-scale market run on the much larger economies of Spain and Italy. At this point countries in crisis account for about a third of the euro area’s G.D.P., so the common European currency itself is under existential threat.

And all indications are that European leaders are unwilling even to acknowledge the nature of that threat, let alone deal with it effectively.

I’ve complained a lot about the “fiscalization” of economic discourse here in America, the way in which a premature focus on budget deficits turned Washington’s attention away from the ongoing jobs disaster. But we’re not unique in that respect, and in fact the Europeans have been much, much worse.

Listen to many European leaders — especially, but by no means only, the Germans — and you’d think that their continent’s troubles are a simple morality tale of debt and punishment: Governments borrowed too much, now they’re paying the price, and fiscal austerity is the only answer.

Yet this story applies, if at all, to Greece and nobody else. Spain in particular had a budget surplus and low debt before the 2008 financial crisis; its fiscal record, one might say, was impeccable. And while it was hit hard by the collapse of its housing boom, it’s still a relatively low-debt country, and it’s hard to make the case that the underlying fiscal condition of Spain’s government is worse than that of, say, Britain’s government.

So why is Spain — along with Italy, which has higher debt but smaller deficits — in so much trouble? The answer is that these countries are facing something very much like a bank run, except that the run is on their governments rather than, or more accurately as well as, their financial institutions.

Here’s how such a run works: Investors, for whatever reason, fear that a country will default on its debt. This makes them unwilling to buy the country’s bonds, or at least not unless offered a very high interest rate. And the fact that the country must roll its debt over at high interest rates worsens its fiscal prospects, making default more likely, so that the crisis of confidence becomes a self-fulfilling prophecy. And as it does, it becomes a banking crisis as well, since a country’s banks are normally heavily invested in government debt.
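
That feedback loop is easy to make concrete in a toy simulation: let the rate investors demand rise with the debt ratio, and let that rate in turn drive the debt ratio. With identical fundamentals, a calm market converges while a fearful one spirals. All parameters below are invented for illustration, not calibrated to Spain or Italy:

```python
# Toy model of a self-fulfilling run on a sovereign, per the mechanism
# described above. Invented parameters; not calibrated to any country.
def simulate(debt_ratio: float, base_rate: float, fear: float,
             growth: float, primary_surplus: float, years: int) -> list:
    """Debt/GDP path when the risk premium rises with debt above 60% of GDP."""
    path = [debt_ratio]
    for _ in range(years):
        rate = base_rate + fear * max(0.0, debt_ratio - 0.60)
        # standard debt dynamics: d' = d * (1 + r) / (1 + g) - primary surplus
        debt_ratio = debt_ratio * (1 + rate) / (1 + growth) - primary_surplus
        path.append(debt_ratio)
    return path

# Identical fundamentals; only investor fear differs.
calm  = simulate(0.90, 0.03, fear=0.02, growth=0.02, primary_surplus=0.02, years=10)
panic = simulate(0.90, 0.03, fear=0.25, growth=0.02, primary_surplus=0.02, years=10)
print("calm :", " ".join(f"{d:.2f}" for d in calm))   # drifts down: solvent
print("panic:", " ".join(f"{d:.2f}" for d in panic))  # compounds up: insolvent
```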

Now, a country with its own currency, like Britain, can short-circuit this process: if necessary, the Bank of England can step in to buy government debt with newly created money. This might lead to inflation (although even that is doubtful when the economy is depressed), but inflation poses a much smaller threat to investors than outright default. Spain and Italy, however, have adopted the euro and no longer have their own currencies. As a result, the threat of a self-fulfilling crisis is very real — and interest rates on Spanish and Italian debt are more than twice the rate on British debt.

Which brings us back to the impeccable E.C.B.

What Mr. Trichet and his colleagues should be doing right now is buying up Spanish and Italian debt — that is, doing what these countries would be doing for themselves if they still had their own currencies. In fact, the E.C.B. started doing just that a few weeks ago, and produced a temporary respite for those nations. But the E.C.B. immediately found itself under severe pressure from the moralizers, who hate the idea of letting countries off the hook for their alleged fiscal sins. And the perception that the moralizers will block any further rescue actions has set off a renewed market panic.

Adding to the problem is the E.C.B.’s obsession with maintaining its “impeccable” record on price stability: at a time when Europe desperately needs a strong recovery, and modest inflation would actually be helpful, the bank has instead been tightening money, trying to head off inflation risks that exist only in its imagination.

And now it’s all coming to a head. We’re not talking about a crisis that will unfold over a year or two; this thing could come apart in a matter of days. And if it does, the whole world will suffer.

So will the E.C.B. do what needs to be done — lend freely and cut rates? Or will European leaders remain too focused on punishing debtors to save themselves? The whole world is watching.

Huge blackout takes down Southern California, Mexican border

Huge blackout hits Southern California, Mexican border | Reuters.

Traffic and pedestrians move through a powerless intersection following a power outage in Cardiff, California, September 8, 2011.

Credit: Reuters/Mike Blake

SAN DIEGO | Fri Sep 9, 2011 9:22am EDT

(Reuters) – A massive blackout caused by “human failure” left nearly 5 million people without power in parts of California, Arizona and Mexico on Thursday, and officials said many residents might be without power for a day or more.

The outage, apparently triggered by an employee who carried out a procedure at a substation in Arizona, snarled traffic on Southern California freeways, knocked out water supplies in parts of San Diego County and Tijuana and sent some elderly residents to emergency rooms.

San Diego International Airport canceled all outbound flights, traffic came to a standstill as the city’s street lights quit, and the city’s fire department rescued about 70 people from stalled elevators.

San Diego schools were ordered closed until Monday as utilities could not guarantee they would be able to turn on the lights in classrooms.

“There was a very major outage, a region-wide outage,” San Diego Gas and Electric President Mike Niggli said. “There’s no doubt this has never happened before to our system.”

But police in California’s second-largest city, located between Los Angeles and the Mexican border, reported no major problems, and hospitals successfully switched to backup power, the Scripps Health chain said.

‘HUMAN FAILURE’

The ill-fated procedure in Arizona first caused the failure of a high-voltage power line supplying electricity to Southern California before unleashing a domino effect across the Southwest, officials said.

That in turn led to the shutdown of California’s San Onofre nuclear power plant, a second major source of power for the San Diego area, San Diego Gas and Electric said.
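
For readers curious about the mechanics of such a domino effect, here is a deliberately toy load-redistribution sketch in Python: when a line trips, its load shifts onto the surviving lines, and any line pushed past its capacity trips in turn. The line names, loads, and capacities are invented for illustration and have no relation to the actual Southwest grid or to what investigators eventually found.

```python
# A toy load-redistribution cascade: one line trips, its load shifts
# onto the survivors, and anything pushed past capacity trips in turn.
# All names and numbers are invented; this is not the actual grid.

# line loads and capacities in arbitrary megawatt-like units
lines = {
    "AZ-trunk":   {"load": 400, "capacity": 500},
    "AZ-CA-tie":  {"load": 720, "capacity": 800},
    "SanOnofre":  {"load": 750, "capacity": 900},
    "SD-local-1": {"load": 400, "capacity": 600},
    "SD-local-2": {"load": 450, "capacity": 600},
}

def trip(name, failed):
    """Take one line out and spread its load evenly over the survivors."""
    failed.add(name)
    shed = lines[name]["load"]
    lines[name]["load"] = 0
    survivors = [n for n in lines if n not in failed]
    for n in survivors:
        lines[n]["load"] += shed / len(survivors)

def cascade(first):
    """Trip `first`, then keep tripping whatever is over capacity."""
    failed = set()
    trip(first, failed)
    print(f"{first} tripped (initial event)")
    tripped_any = True
    while tripped_any:
        tripped_any = False
        for name, line in lines.items():
            if name not in failed and line["load"] > line["capacity"]:
                print(f"{name} overloaded "
                      f"({line['load']:.0f} > {line['capacity']}), tripping")
                trip(name, failed)
                tripped_any = True
    print(f"cascade complete: {len(failed)} of {len(lines)} lines down")

cascade("AZ-trunk")  # a single procedural error at one substation
```

In this made-up network, tripping the single Arizona trunk line overloads the tie to California, which drops the San Onofre connection and then the local San Diego lines, taking the whole toy grid down in a few steps.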

San Diego Gas and Electric said in a tweet that all 1.4 million of its customers in the San Diego area were without power. Blackouts also affected 3.5 million people in Baja California, according to local emergency services and state authorities.

The city of Yuma, Arizona, reported that more than 50,000 people had lost power.

“There appears to be two failures here — one is human failure and the other is a system failure. Both of those will be addressed,” said Damon Gross, a spokesman for Arizona utility APS.

By early evening, crews had restored service in the section of the line that triggered the massive event and had begun to restore power to parts of San Diego County.

Electricity returned late on Thursday to the central San Diego neighborhood of Normal Heights, where many families earlier in the evening had embraced the darkness by throwing outdoor barbecue parties on their front lawns.

By 11 p.m., San Diego Gas and Electric reported that power was restored to 165,000 customers in San Diego and Orange counties. But the utility warned that all power would not be restored overnight and urged customers to conserve energy.

‘LINES EVERYWHERE NOW’

Mexico’s Federal Electricity Commission said 180,000 customers had been brought back online in Baja California. The commission said it was making progress in getting power back on in the state capital, Mexicali, as well as in Ensenada and Rosarito.

Stuck without refrigeration, employees at the Cardiff Seaside Market, a grocery and specialty food store in Cardiff-by-the-Sea, north of San Diego, started grilling their inventory of fresh steaks and tuna in the parking lot and selling it cooked to passersby for cash.

Meanwhile, a line of about 50 customers waited at the front door for their turn to be led inside by a clerk to do their shopping in groups of two or three at a time.

“It’s real hectic, there’s lines everywhere now. But the customers are happy, everyone’s patient, everyone’s in a good mood, and we’re serving them as quickly as we can,” manager John Shamam, 33, said as he served up a plate of tuna.

Many of the tweets from San Diego residents revolved around air conditioning. “I’m going to die of heat in this house with no AC!” wrote Ashleigh Marie. “What am I supposed to dooo.”

But San Diego resident Kiersten White tweeted that the power outage “makes me glad I don’t have air conditioning to begin with … nothing to miss!”

Other people were having a harder time. “Trapped outside of our rooms at the hotel,” tweeted Rob Myers, visiting from Washington.

Blackouts hit Mexico’s Northern Baja California state in the afternoon, knocking out power to hundreds of maquiladora export assembly plants in the sprawling industrial powerhouse of Tijuana, south of San Diego.

The blackouts knocked out stoplights at intersections across Tijuana, causing traffic snarl-ups, and also cut power to hospitals and government offices. The border crossing at Otay Mesa was closed to all but pedestrian traffic.

 

****************************

Fri Sep 9, 2011 9:05am EDT

* Blackout left nearly 5 million without power
* San Onofre nuclear plant remains shut

NEW YORK, Sept 9 (Reuters) – Power companies in Southern California restored electricity to most customers by early Friday after a massive blackout on Thursday left nearly 5 million people in parts of California, Arizona and Mexico in the dark.

Although the Sept. 8 outage, apparently caused by human error, was just a tenth the size of the 2003 blackout that left about 50 million people without power in the eastern United States and Canada, it will surely rank as one of the biggest blackouts in recent history - certainly one of the biggest caused by human error.

Sempra Energy's (SRE.N) San Diego Gas & Electric power company said it restored power to its 1.4 million customers at 3:25 a.m. Western time on Friday.

That was almost 12 hours after a major electric transmission system outage in western Arizona and the loss of a key connection with the 2,150-megawatt San Onofre nuclear power plant in California resulted in the most widespread power outage in the company's history, SDG&E said.

Blackouts also affected 3.5 million people in Baja California, according to local officials.

San Onofre, which is operated by Edison International's (EIX.N) Southern California Edison, shut on Thursday and remained out of service early Friday, according to the U.S. Nuclear Regulatory Commission.

"Restoring power in the aftermath of the loss of the entire local grid serving San Diego and southern Orange counties was a monumental task," David Geier, SDG&E vice president of electric operations, said in a release.

"The restoration process, however, has left our local power grid very fragile and we are asking our customers to conserve electricity throughout the day Friday," Geier said.

SDG&E and the California ISO, which operates the power grid for much of the state, said they would focus on maintaining and ensuring the integrity of the local power system for the next few days before determining the sequence of events that led to the outage and establishing practices and procedures to ensure that outages such as the Sept. 8 event are not repeated.

"There appears to be two failures here -- one is human failure and the other is a system failure. Both of those will be addressed," said Damon Gross, a spokesman for Pinnacle West Capital's (PNW.N) Arizona utility Arizona Public Service.

(Reporting by Scott DiSavino; Editing by Alden Bentley)
