Category Archives: DMAN

How to make green steel

CultureLab: How to make steel go green – with songs!

Michael Marshall, environment reporter


This is something you don’t see every day: a substantial, carefully-researched book on how to reform our manufacturing industries, paired with an album of songs on the same theme.

Let’s start with the book. Sustainable Materials: With Both Eyes Open tackles a particularly thorny question: how can we cut our greenhouse gas emissions to a safe level, without shutting down essential industries? It focuses on steel and aluminium, which between them account for 28 per cent of all industrial emissions, although later chapters briefly consider cement, paper and plastics as well.

This is a follow-up book to David MacKay’s much-vaunted Sustainable Energy – Without the Hot Air. Both feature academics from the University of Cambridge carefully working out how we can transform an emissions-heavy sector of the economy.

The eight authors, led by Julian Allwood and Jonathan Cullen, first take a close look at how steel and aluminium are produced from their respective ores, asking “how much can the metals industry do to clean up its act?” The answer they come up with: “plenty, but nowhere near enough”.

So they take a second approach, asking whether we can redesign the things we make to use less metal, use them for longer, and recycle their components when they wear out. This also offers plenty of options. Reassuringly, when the two approaches are combined, the total emissions cuts are substantial.

 

Some of the ideas they come up with are so simple, I wondered why no one thought of them before. For instance, the average fridge lasts about 10 years, and gets thrown out when the compressor fails. This is a small part, but it takes a lot of work to replace so it’s cheaper to buy a new fridge. If fridges were redesigned so that the compressor was easy to replace, they would last far longer. “You shouldn’t have to buy two fridges in your lifetime,” they say.

Of course, this is another example of a solution for climate change that involves huge numbers of people taking concerted action. The problem is people’s disinclination to get off their backsides.

It’s quite a technical book, so it may not have much popular appeal, despite its nicely chatty style. But for policy-makers trying to cut emissions, and anyone in manufacturing, it should be required reading.

And so to the album, a collaboration between Allwood and soprano Adey Grummet, which is much better than it has any right to be. Worthy music on eco-conscious themes can sound like Spinal Tap’s Listen to the Flower People, but With Both Eyes Open actually contains a couple of good tunes.

The strongest songs get away from the details of materials science and become universal. The opening track, You Gotta Start, is an up-tempo number extolling the virtues of having a go, even when you don’t know exactly what you need to do. It’s not just about sustainability.

Similarly, the title track is a passionate call to arms, urging people to move away from blind consumerism. The closing line – “the stuff of life is life and not just stuff” – is better and more relevant than anything Coldplay will write next year.

Given how specialist the subject matter is, I’m not sure how many people the album will really appeal to. Of the 12 songs, I only expect to keep the two I’ve highlighted on my MP3 player. Unfortunately, the rest just restate ideas from the book in a slightly less clear way.

I worry that the album will give people, particularly policy-makers, the impression that the book is somehow flaky and not worth paying attention to. That would be a crying shame, because the book’s lessons are clear, well-supported, and vital.

Book information
Sustainable Materials: With Both Eyes Open
by Julian Allwood and Jonathan Cullen
UIT Cambridge
Free online or £31.82

Can disaster aid win hearts and minds?

A Friend in Need – By Charles Kenny | Foreign Policy.

BY CHARLES KENNY | OCTOBER 31, 2011

On Tuesday last week, Turkey reversed its previous stance and decided to accept aid from Israel to help deal with the tragic earthquake that had struck the country’s east. Shipments of portable housing units began the next day. Turkey’s Foreign Minister Ahmet Davutoglu was quick to emphasize that accepting aid did not signal an improvement in diplomatic relations between the two countries, strained ever since Israel’s raid of a Turkish aid flotilla bound for Gaza in 2010 — likely a response to the perception that aid can buy off recipient governments, even if it can’t change popular attitudes. The irony is that the humanitarian assistance that responds to disasters — unlike the majority of aid that goes to long-term development projects — might be the one case where that logic is sometimes reversed.

At a time when the United States’ aid budget is confronted by an army of hatchet-wielding deficit hawks among the Republican Party’s congressional majority and presidential candidates, some aid proponents are making the case that development and humanitarian assistance are powerful tools to buy friends and influence people. And it is true that aid has long been used to grease the often-rusty wheels of diplomacy. The Camp David Accords between Egypt and Israel were cemented with the help of an aid package worth an average of $2 billion a year to Egypt. Since 1985, U.S. law has mandated that the U.S. Agency for International Development (USAID) take account of would-be aid recipients’ voting patterns at the United Nations — awarding larger aid packages to those who vote with America. Political scientists David Carter at Pennsylvania State and Randall Stone at the University of Rochester note that this kind of carrot-minded approach has been successful, influencing countries’ votes on decisions that the U.S. State Department declares politically important.

Twisting politicians’ arms is one thing, but changing popular attitudes is another matter entirely. Look again at Egypt: Despite the country being one of the largest recipients of USAID financing over the past 30 years, Pew surveys suggest only 20 percent of Egyptians have a favorable view of the United States — considerably less than half of the U.S. favorability rating in former Cold War foe Russia. Popular opinion in Egypt is driven by other factors, not least broader U.S. foreign policy in the region. (A propensity to invade neighboring countries doesn’t help.) And development assistance just isn’t a major factor in the financial fortunes of the average citizen. Maybe that was true back in 1990, when net overseas development assistance to the country equaled 36 percent of government expenditures. But by 2008, that figure was just 3 percent — only a little more than one-tenth the value of tourism and one-seventh that of manufacturing exports.

Aid’s limited impact on public opinion usually applies even when the aid is specifically focused on winning converts. A study by consultant Michael Kleinman and Mark Bradbury, a director at the Rift Valley Institute, looked at U.S. military aid for small projects in Kenya designed to improve popular support for the U.S. military presence there, and found that it didn’t. Attitudes were shaped by faith, the relationship between target populations and the Kenyan state, U.S. foreign policy, and events in Somalia — not by a U.S.-financed well or asphalt road. A German aid agency-financed 2010 study, using repeated surveys in Afghanistan’s Takhar and Kunduz provinces, found that in a comparatively peaceful period between 2005 and 2007, development aid did have a small, short-lived positive impact on the general attitudes of Afghan respondents towards foreign peace-building operations in their backyard. But this impact disappeared as threat perceptions rose between 2007 and 2009. Not surprisingly, other factors — in this case, how many people were getting shot — were just more important than who was cutting the checks.

But there is evidence of an exception to the rule that money can’t buy love, and it involves disaster assistance. Four years after a 2005 earthquake in northern Pakistan, economists Tahir Andrabi of Pomona College and Jishnu Das of the World Bank surveyed attitudes towards foreigners in the region. They found trust in foreigners was significantly higher in areas where humanitarian aid had been concentrated than in other areas — dropping off by six percentage points for each 10 kilometers of distance from the fault line.

Why might recipients react differently and more positively to disaster relief assistance than they do to other forms of aid? In part it is surely the simple gratitude felt by people who have just lost much of what they had in a flood or earthquake. But it is also because such aid is more plausibly given without a broader political motive. Although U.S. food aid flows according to the size of the surplus domestic crop as much as recipient need, using humanitarian relief to reward or punish countries for U.N. voting records or other diplomatic policies presents a practical challenge — you can’t schedule a disaster. Recipients appear to understand that, and are more likely to view such aid as given in good faith. In the Pakistan case, for example, Andrabi and Das note that the positive impact on attitudes was related to a significant on-the-ground presence of foreigners who were assumed to have purely humanitarian motivations — aid distribution was not perceived to be (and wasn’t) linked to war-fighting efforts.

Aid is likely to be a more effective foreign policy tool when it comes to persuading governments to do things that lack popular support. Creating that popular support in the first place is much harder. Perhaps Turkey’s Davutoglu is right to say that even government relations won’t improve in the case of Israeli disaster aid — after all, U.S. humanitarian support in the aftermath of Iran’s Bam earthquake only temporarily thawed diplomatic tensions. On the other hand, maybe the assistance can play a small role in improving popular opinion towards Israel in Turkey. For good or ill, that’s one more reason for governments to respond with open hearts and open checkbooks whenever disaster strikes worldwide.

Thailand: Super-canal may prevent floods

Thailand: Super-canal may prevent floods – CNN.

Thai authorities are considering the construction of a super-express waterway through Bangkok to prevent future floods similar to the one that has crippled the Thai capital and brought manufacturing in other parts of the country to a standstill.

A team of disaster experts from Chulalongkorn University in Bangkok is now investigating permanent solutions to the disaster that has left hundreds dead.

“One of the urgent solutions is a super-express floodway,” Thanawat Jarupongsakul, from the university’s Unit for Disaster and Land Information Studies, told the Bangkok Post.


Under the plan, existing natural canals — some of them more than 100 kilometers (62 miles) long — would be linked in a 200-km “super-highway” that would divert the course of floodwaters from the north.

The super-canal would hold 1.6 billion cubic meters of water and drain run-off at a rate of 6,000 cubic meters per second — the equivalent of two and a half Olympic-sized swimming pools a second.
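The quoted equivalence is easy to sanity-check. A minimal Python sketch, assuming the standard Olympic pool volume of 2,500 cubic metres (the other figures come from the article):

```python
# Figures from the article: 1.6 billion m3 of storage, drained at 6,000 m3/s.
# Assumed: an Olympic-sized pool holds 2,500 m3 (50 m x 25 m x 2 m).
CANAL_CAPACITY_M3 = 1.6e9
DRAIN_RATE_M3_PER_S = 6_000
OLYMPIC_POOL_M3 = 2_500

pools_per_second = DRAIN_RATE_M3_PER_S / OLYMPIC_POOL_M3
print(f"{pools_per_second:.1f} pools per second")  # 2.4, i.e. about two and a half

hours_to_empty = CANAL_CAPACITY_M3 / DRAIN_RATE_M3_PER_S / 3600
print(f"{hours_to_empty:.0f} hours to drain a full canal")  # 74
```

At the quoted rate, then, the full 1.6 billion cubic metres could in principle be discharged in roughly three days.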

“This idea is much cheaper than digging a new river as a floodway,” Thanawat said.

He said the proposed scheme would involve the construction of a kilometer-wide exclusion zone next to the floodway to prevent properties from being inundated, and a raised highway on both sides of the canal.

The super-express floodway would then drain upstream run-off directly into the sea.

The university team is also looking at other flood-prevention measures such as a better early-warning system, improved water resource management, a flood tax, the use of a flood-risk map for urban development and groundwater-use controls.

“Now, the government must stop [trying to] solve flood problems with political methods,” Thanawat told the Bangkok Post. He said poor water management rather than excess rain had caused this year’s severe flooding, adding that natural swamps in the west of Thailand’s Central Plains, which once absorbed water flow, had been developed into industrial and residential areas, blocking the natural floodway.

While giant flood tunnels in the Bangkok metropolitan area could drain floodwater from the city, they could not cope with a massive inundation from the north.

“If there is no step forward, foreign investors will eventually disappear from the country and the next generation will still be worried about whether flooding will happen or not,” he said.

Crop scientists now fret about heat not just water

NewsDaily: Crop scientists now fret about heat not just water.

By Christine Stebbins | Posted 2011/10/24 at 10:49 am EDT

CHICAGO, Oct. 24, 2011 (Reuters) — Crop scientists in the United States, the world’s largest food exporter, are pondering an odd question: could the danger of global warming really be the heat?

Haze from forest fires engulfs La Paz city, August 23, 2010. REUTERS/David Mercado

For years, as scientists have assembled data on climate change and pointed with concern at melting glaciers and other visible changes in the life-giving water cycle, the impact on seasonal rains and irrigation has worried crop watchers most.

What would breadbaskets like the U.S. Midwest, the Central Asian steppes, the north China Plain or Argentine and Brazilian crop lands be like without normal rains or water tables?

Those were seen as longer-term issues of climate change.

But scientists now wonder if a more immediate issue is an unusual rise in day-time and, especially, night-time summer temperatures being seen in crop belts around the world.

Interviews with crop researchers at American universities paint the same picture: high temperatures have already shrunk the output of many crops and vegetables.

“We don’t grow tomatoes in the deep South in the summer. Pollination fails,” said Ken Boote, a crop scientist with the University of Florida.

The same goes for snap beans which can no longer be grown in Florida during the summer, he added.

“As temperatures rise we are going to have trouble maintaining the yields of crops that we already have,” said Gerald Nelson, an economist with the International Food Policy Research Institute (IFPRI) who is leading a global project initially funded by the Bill and Melinda Gates Foundation to identify new crop varieties adapted to climate change.

“When I go around the world, people are much less skeptical, much more concerned about climate change,” said David Lobell, a Stanford University agricultural scientist.

Lobell was one of three authors of a much-discussed 2011 climate study of world corn, wheat, soybean and rice yields over the last three decades (1980-2008). It concluded that heat, not rainfall, was affecting yields the most.

“The magnitude of recent temperature trends is larger than those for precipitation in most situations,” the study said.

“We took a pretty conservative approach and still found sizable impacts. They certainly are happening already and not just something that will or might happen in the future,” Lobell told Reuters in an interview.

CONCERNS GROWING

Scientists at an annual meeting of U.S. agronomists last week in San Antonio said the focus was climate change.

“Its impact on agriculture systems, impacts on crops, mitigation strategies with soil management — a whole range of questions was being asked about climate change,” said Jerry Hatfield, Laboratory Director at the National Soil Tilth Laboratory in Ames, Iowa.

“The biggest thing is high night-time temperatures have a negative impact on yield,” Hatfield added, noting that the heat affects evaporation and the life process of the crops.

“One of the consequences of rising temperatures … is to compress the life cycle of that plant. The other key consequence is that when the atmosphere gets warmer the atmospheric demand for water increases,” Hatfield said.

“These are simple things that can occur and have tremendous consequences on our ability to produce a stable supply of food or feed or fiber,” he said.

Boote at the University of Florida found that rice and sorghum plants failed to produce grain (a breakdown of what he calls “pollen viability”) when the average 24-hour temperature is 95 degrees Fahrenheit (35 Celsius). That equates to highs of 104 F during the day and 86 F at night, he said.
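Boote’s equivalence checks out arithmetically. A quick sketch, using only the figures quoted above:

```python
def f_to_c(f: float) -> float:
    """Convert degrees Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

day_high_f, night_low_f = 104, 86

# The 24-hour average of the quoted day/night extremes:
avg_f = (day_high_f + night_low_f) / 2
print(avg_f)          # 95.0 F
print(f_to_c(avg_f))  # 35.0 C
```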

The global seed industry has set a high bar to boost crop yields by 2050 to feed a hungry world. Scientists said that the impact of heat on plant growth needs more focus and study.

“If you look at a lot of crop insurance claims, farmers say it is the lack of water that caused the plant to die,” said Wolfram Schlenker, assistant professor at Columbia University.

“But I think it’s basically different sides of the same coin because the water requirement of the plant increases tremendously if it’s hot,” he said.

“The private sector understands the threats coming from climate change and has significant research programs in regards to drought tolerance. They focus less on higher temperatures, but that’s a tougher challenge,” Nelson said.

“We are responding with a number of initiatives … the primary one is focusing on drought tolerance,” said John Soper, vice president in charge of global seed development for DuPont’s Pioneer Hi-Bred, a top U.S. seed producer.

Pioneer launched a conventionally bred drought-tolerant corn hybrid seed in the western U.S. Corn Belt this spring, selected for its yield advantage over other varieties.

“We have some early results in from Texas that show that is exactly how they are behaving. They currently have a 6 percent advantage over normal products in those drought zones,” Soper said.

Roy Steiner, deputy director for agricultural development for the Bill & Melinda Gates Foundation, said the foundation is focused on current agricultural effects of climate change.

“It’s amazing that there are still people who think that it’s not changing. Everywhere we go we’re seeing greater variability, the rains are changing and the timing of the rains is creating a lot more vulnerability,” Steiner said.

“Agriculture is one of those things that needs long-term planning, and we are very short-cycled thinking,” he said. “There are going to be some real shocks to the system. Climate is the biggest challenge. Demand is not going away.”

New oil remediation tech takes $1 million X Prize

Disc Spins Its Way to $1 Million Oil Spill Clean Up Prize: Scientific American Podcast.

When oil started spewing from BP’s Macondo well in April 2010, there weren’t too many options for cleaning it up. Concentrated slicks on the ocean surface could be set ablaze. Booms kept oil off the shores, as long as the waves stayed calm. And chemical dispersants of unknown toxicity could be sprayed to break up oil patches.

But thanks to the Wendy Schmidt X Prize, that won’t be true next time. A company from Illinois known as Elastec / American Marine won the $1 million first prize by tripling previous cleanup rates.

Over 10 weeks this summer, 10 finalists out of 350 entrants demonstrated their technology at the largest outdoor saltwater wave test facility in North America. Elastec’s Grooved Disc Skimmer scooped up 4,670 gallons of oil per minute and didn’t leave much behind.

The machine looks like a giant, thick, grooved vinyl record spinning at high speed to capture nearly 90 percent of the oil on the water. And there’s no shortage of oil spills for it to work on. The latest is underway in New Zealand, as a stranded cargo ship leaks heavy oil onto a coral reef and local beaches.

Quake-prone Japanese Area Runs Disaster System on Force.com

Quake-prone Japanese Area Runs Disaster System on Force.com | PCWorld.

A coastal region of Japan due for a major earthquake and possible tsunamis has implemented a cloud-based disaster management system run by Salesforce.com.

Shizuoka Prefecture, on Japan’s eastern coast in the central region of the country, lies curled around an undersea trough formed by the junction of two tectonic plates. It has been rocked by repeated large temblors in past centuries, collectively called “Tokai earthquakes,” and the central government has warned that with underground stresses high another is imminent.

The local prefectural government began to build a new disaster management system last year, the initial version of which went live in July. It is based on Salesforce.com’s platform-as-a-service offering, Force.com, which hosts hundreds of thousands of applications.

“It would have cost a lot more to run our own servers and network, and if a disaster happened managing something like that would be very difficult, especially if the prefecture office was damaged,” said Keisuke Uchiyama, a Shizuoka official who works with the system.

Japanese prefectures are the rough equivalent of states.

The system is currently hosted on Salesforce.com’s servers in the U.S. and goes live when an official disaster warning is issued by the government. It links up information about key infrastructure such as roads, heliports and evacuation centers.

Salesforce.com says it combines GIS (geographic information system) data with XML sent from Japan’s Meteorological Agency. Users can also send email updates from the field using their mobile phones, with GPS coordinates and pictures attached.

Uchiyama said the original plan was to allow open access, but budget cuts forced that to be postponed and it is now available only to government workers and disaster-related groups. The system was implemented with a budget of about 200 million yen (US$2.6 million) over its first two years, down from an original allotment of about 500 million yen over three years.

He said it was used to keep track of the situation last week when a powerful typhoon swept through central Japan.

The obvious downside to a hosted system is that key infrastructure is often destroyed during natural disasters. After the powerful earthquake and tsunami that hit Japan’s northeastern coast in March, some seaside towns were completely devastated and went weeks without basics like power or mobile phone service. Local communities turned to word-of-mouth and public bulletin boards to spread information and search for survivors.

“If the network gets cut, it’s over,” said Uchiyama.

Two Plus Two Equals Five – A 2nd look at disaster death tolls

Two Plus Two Equals Five – By Philip Walker | Foreign Policy.

The death toll and level of destruction immediately following a disaster are always difficult to determine, but over time a consensus usually emerges between governments and aid organizations. But, as David Rieff points out, “Sadly, over the course of the past few decades, exaggeration seems to have become the rule in the world of humanitarian relief.… These days, only the most extreme, most apocalyptic situations are likely to move donors in the rich world.” And with donor fatigue an ever-present possibility, it is no surprise then that later studies that contradict the original, inflated estimates are criticized — or worse, ignored — for seemingly undermining the humanitarian cause.

Arriving at these estimates is no easy endeavor, as government agencies and relief organizations are rarely able to survey entire populations. Instead, emergency management experts rely on sound statistical and epidemiological techniques. But debating and questioning the numbers behind man-made and natural disasters is not just an academic exercise: the implications are huge. For example, relief agencies were restricted from operating in Darfur, partly because of Sudan’s anger that the U.S.-based Save Darfur Coalition had estimated that 400,000 people were killed in the region. Moreover, the U.N. Security Council used the International Rescue Committee’s death toll of 5.4 million in the Congo to put together its largest peacekeeping operation ever. Similarly, government aid pledges increase or decrease depending upon the extent of the disaster. Numbers do matter, and much depends upon their validity and credibility. What follows is a look at some recent disasters where the numbers just don’t match up.

Above, a view of some of the destruction in Banda Aceh, Indonesia, a week after the devastating earthquake and tsunami struck on Dec. 26, 2004. According to the U.S. Geological Survey, 227,898 people died and about 1.7 million people were displaced in 14 countries in Southeast Asia, South Asia, and East Africa. Indonesia, the country hardest hit by the disaster, initially claimed that 220,000 people had died or gone missing but ended up revising that number down to around 170,000.

THE DEADLIEST WAR IN THE WORLD

Discrepancy: 5.4 million vs. 900,000 dead in the Democratic Republic of the Congo between 1998 and 2008

The Democratic Republic of the Congo (DRC) has seen more than its fair share of conflict over the past 15 years. The war in the DRC officially broke out in 1998 and although the conflict technically ended in 2003 when the transitional government took over, fighting has continued in many of the country’s provinces. The conflict has been dubbed “Africa’s World War,” both due to the magnitude of the devastation and the number of African countries that have, at different times, been involved in the conflict. According to a widely cited 2008 report by the New York-based International Rescue Committee (IRC), “an estimated 5.4 million people have died as a consequence of the war and its lingering effects since 1998,” making it the world’s deadliest crisis since World War II. The organization is one of the largest providers of humanitarian aid in the Congo and is therefore deemed one of the few reliable sources on the conflict.

However, Andrew Mack, director of the Human Security Report Project at Simon Fraser University in Canada, said the IRC study did not employ appropriate scientific methodologies and that in reality far fewer people have died in the Congo. “When we used an alternative measure of the pre-war mortality rate with the IRC estimates from their final three surveys, the figure dropped from 2.83 million to under 900,000,” Mack argued. (He also argued that international relief agencies — such as the International Rescue Committee — face a potential conflict of interest because they depend on donations that, in turn, are stimulated by their studies of death tolls. Those studies should be done by independent experts, not by relief agencies that depend on donations, he says.)

Above, the body of a young man lying on the central market avenue of Ninzi, about 25 miles north of Bunia, where on June 20, 2003, Lendu militias launched an attack, killing and mutilating at least 22 civilians.

Discrepancy: 400,000 vs. 15,000 women raped in the Democratic Republic of the Congo between 2006 and 2007

A June 2011 study in the American Journal of Public Health found that 400,000 women aged 15-49 were raped in the DRC over a 12-month period in 2006 and 2007. The shockingly high number is equivalent to four women being raped every five minutes. Perhaps even more alarming, the new number is 26 times higher than the 15,000 rapes that the United Nations reported during the same period.
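Both headline conversions follow directly from the study’s annual total; a quick check in Python, using only the figures quoted above:

```python
study_total = 400_000   # rapes over a 12-month period, per the AJPH study
un_reported = 15_000    # rapes reported by the United Nations for the same period

minutes_per_year = 365 * 24 * 60
per_five_minutes = study_total / minutes_per_year * 5
print(round(per_five_minutes, 1))  # 3.8, i.e. roughly four every five minutes

ratio = study_total / un_reported
print(round(ratio, 1))  # 26.7, quoted as "26 times higher"
```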

Maria Eriksson Baaz, a Swedish academic from the University of Gothenburg, has called the study into question by arguing that it is based on out-of-date and questionable figures. As a long-time researcher on women’s rights in the DRC, Baaz claims that extrapolations made from these figures cannot be backed up scientifically. In a recent interview with the BBC, she said it was difficult to collect reliable data in the Congo and that women sometimes claim to be victims in order to get free health care. “Women who have been raped can receive free medical care while women who have other conflict-related injuries or other problems related to childbirth have to pay,” she said. “In a country like the DRC, with [its] extreme poverty where most people can simply not afford health care, it’s very natural this happens.”

Above, Suzanne Yalaka breastfeeds her baby Barunsan on Dec. 11, 2003, in Kalundja, South Kivu province. Her son is the consequence of her being raped by ten rebels from neighboring Burundi. She was left behind by her husband and her husband’s family.

NORTH KOREAN FAMINE

Discrepancy: 2.4 million vs. 220,000 dead in North Korea between 1995 and 1998

Due to the regime’s secretive nature, reliable statistics on the 1990s famine in North Korea are hard to come by. Yet, surprisingly, on May 15, 2001, at a UNICEF conference in Beijing, Choe Su-hon, one of Pyongyang’s nine deputy foreign ministers at the time, stated that between 1995 and 1998, 220,000 North Koreans died in the famine. Compared with outside estimates, these figures were on the low end — presumably because it was in the regime’s interest to minimize the death toll.

A 1998 report by U.S. congressional staffers, who had visited the country, found that from 1995 to 1998 between 900,000 and 2.4 million people had died as a result of food shortages. It noted that other estimates by exile groups were substantially higher but that these numbers were problematic because they were often based on interactions with refugees from the northeastern province of North Hamgyong, which was disproportionately affected by the famine.

Above, North Koreans rebuilding a dike in Mundok county, South Pyongan province, in September 1997, following an August tidal wave after typhoon Winnie. The rebuilding effort was part of an emergency food-for-work project organized by the World Food Program. According to a former North Korean government official, during the famine — from 1993 to 1999 — life expectancy fell from 73.2 to 66.8 years and infant mortality almost doubled, from 27 to 48 per 1,000 births.

GENOCIDE IN DARFUR

Discrepancy: 400,000 vs. 60,000 dead in Darfur between 2003 and 2005

In 2006, three years after the conflict in Darfur began, Sudanese President Omar al-Bashir publicly criticized the United Nations for exaggerating the extent of the fighting in Darfur. “The figure of 200,000 dead is false and the number of dead is not even 9,000,” he proclaimed. At the same time, outside groups like the Save Darfur Coalition and various governments, including the United States, were having a difficult time producing concrete numbers as well. Their only consensus was that the real death toll was many times higher than the numbers provided by Bashir.

In 2005, a year after U.S. Secretary of State Colin Powell told a U.S. congressional committee that the ethnic violence in Darfur amounted to “genocide,” Deputy Secretary of State Robert Zoellick estimated the death toll at between 60,000 and 160,000. Zoellick was widely criticized for understating the numbers. The World Health Organization estimated that 70,000 people had died over a seven-month period alone. At the same time, researchers for the Coalition for International Justice contended that 396,563 people had died in Darfur. Today, the Sudanese authorities claim that since the conflict began in 2003, 10,000 people have died, while the U.N. estimates that over 300,000 have been killed and another 2.7 million have been displaced.

Above, an armed Sudanese rebel arrives on Sept. 7, 2004, at the abandoned village of Chero Kasi less than an hour after Janjaweed militiamen set it ablaze in the violence-plagued Darfur region.

CYCLONE NARGIS 

Discrepancy: 138,000 vs. unknown death toll in Burma in 2008

Tropical cyclone Nargis made landfall in southern Burma on May 2, 2008, leaving a trail of death and destruction before petering out the next day. It devastated much of the fertile Irrawaddy delta and Yangon, the nation’s main city. Nargis brought about the worst natural disaster in the country’s history — with a death toll that may have exceeded 138,000, according to a study by the Georgia Institute of Technology. But, with a vast number of people still unaccounted for three years later, the death toll might even be higher. The Burmese authorities allegedly stopped counting for fear of political fallout.

It’s more common for countries hit by a devastating disaster to share their plight with the world and plead for a robust relief effort, but in the aftermath of cyclone Nargis the Burmese military regime sought to maintain control over news of the disaster — restricting access to journalists and censoring the release of information and images. Moreover, the United Nations and other relief agencies were initially banned from setting up operations. At the time, with over 700,000 homes blown away, the U.N. and the Red Cross estimated that over 2.5 million people were in desperate need of aid.

Above, school teacher Hlaing Thein stands on the wreckage of a school destroyed by cyclone Nargis in Mawin village in the Irrawaddy delta region on June 9, 2008.

 

Two Plus Two Equals Five

What numbers can we trust? A second look at the death toll from some of the world’s worst disasters.

BY PHILIP WALKER | AUGUST 17, 2011

EARTHQUAKE IN HAITI

Discrepancy: 318,000 vs. 46,000-85,000 dead in Haiti in 2010

The devastating earthquake of Jan. 12, 2010, killed over 318,000 people and left over 1.5 million homeless, according to the Haitian government. International relief organizations generally estimate anywhere between 200,000 and 300,000 deaths.

However, a recently leaked report compiled for USAID by a private consulting firm claims that the death toll is likely between 46,000 and 85,000, and that roughly 900,000 people were displaced by the earthquake. The report has not yet been published, but its alleged findings have already been disputed by both Haitian authorities and the United Nations. Even the U.S. State Department, for now, is reluctant to endorse it, saying “internal inconsistencies” in some of the statistical analysis are currently being investigated prior to publication.

PAKISTAN FLOODS

Discrepancy: Large numbers affected vs. small death toll in Pakistan in 2010

A young girl washes the mud from her toy at a water pump in the middle of collapsed buildings at a refugee camp near Nowshera in northwest Pakistan on Sept. 23, 2010.

Figures provided by the United Nations and Pakistan’s government estimate that 20 million people were affected by the 2010 summer floods — the worst in the country’s history. Almost 2,000 people died, 3,000 were injured, 2 million homes were damaged or destroyed, and over 12 million people were left in need of emergency food aid, according to Pakistan’s National and Provincial Disaster Management Authority. Flood waters wiped out entire villages and vast stretches of farmland, affecting an area roughly the size of England. After surveying 15 key sectors across the country, the World Bank and Asian Development Bank announced in Oct. 2010 an estimated $9.7 billion in damage — an amount more than twice that of Pakistan’s 2005 earthquake, which killed approximately 86,000 people. U.N. Secretary-General Ban Ki-moon characterized the destruction as more dire than that caused by the 2004 Indian Ocean tsunami and the Pakistani earthquake combined. “In the past I have visited the scenes of many natural disasters around the world, but nothing like this,” he stated.

David Rieff warns that, “By continually upping the rhetorical ante, relief agencies, whatever their intentions, are sowing the seeds of future cynicism, raising the bar of compassion to the point where any disaster in which the death toll cannot be counted in the hundreds of thousands, that cannot be described as the worst since World War II or as being of biblical proportions, is almost certainly condemned to seem not all that bad by comparison.” This was the case in Pakistan, where the number affected by the flooding was gigantic but the death toll was relatively low — especially compared to the Haiti earthquake a few months earlier. As a result, the United Nations and other aid organizations were unable to raise large sums for the relief effort compared to previous disasters. “Right now, our level of needs in terms of funding is huge compared to what we’ve been receiving, even though this is the largest, by far, humanitarian crisis we’ve seen in decades,” said Louis-George Arsenault, director of emergency operations for UNICEF, in an interview with the BBC in Aug. 2010.

As David Meltzer, senior vice president of international services for the American Red Cross, discerningly put it, “Fortunately, the death toll [in Pakistan] is low compared to the tsunami and the quake in Haiti. … The irony is, our assistance is focused on the living — and the number of those in need is far greater than in Haiti.”

 

NASA Satellite to Crash to Earth Today

Heads up! NASA satellite descends toward fiery doom | The Space Shot – CNET News.

NASA’s decommissioned Upper Atmosphere Research Satellite, out of gas and out of control, is not descending toward re-entry as rapidly as expected, officials say, likely delaying the satellite’s kamikaze plunge to Earth by a few hours, to late Friday or early Saturday.

Experts expect more than two dozen chunks of debris to survive re-entry and hit the ground in a 500-mile-long footprint somewhere along the satellite’s orbital track. But given the bus-size, 6.3-ton satellite’s trajectory and the vast areas of ocean and sparsely populated land UARS passes over, experts say it is unlikely any falling debris will result in injuries or significant property damage.

Additional radar tracking is required to pinpoint when–and where–the satellite will make its final descent.

 

A chart showing the latest predicted entry point for the Upper Atmosphere Research Satellite, based on data from U.S. Strategic Command. Because of uncertainty about the satellite’s behavior as it approaches the discernible atmosphere, the timing of the re-entry could change by several hours either way.

(Credit: William Harwood/MacDoppler Pro)

“As of 10:30 a.m. EDT on Sept. 23, 2011, the orbit of UARS was 100 miles by 105 miles (160 km by 170 km),” NASA said in a brief update. “Re-entry is expected late Friday, Sept. 23, or early Saturday, Sept. 24, Eastern Daylight Time. Solar activity is no longer the major factor in the satellite’s rate of descent. The satellite’s orientation or configuration apparently has changed, and that is now slowing its descent.

“There is a low probability any debris that survives re-entry will land in the United States, but the possibility cannot be discounted because of this changing rate of descent. It is still too early to predict the time and location of re-entry with any certainty, but predictions will become more refined in the next 12 to 18 hours.”

A subsequent update from U.S. Strategic Command, which operates a global radar network used to monitor more than 20,000 objects in low-Earth orbit, predicted the satellite would re-enter sometime around 11:34 p.m. EDT Friday as the spacecraft flies over the southern Indian Ocean. But the prediction was uncertain by several hours, and at orbital velocities of 5 miles per second, just 10 minutes of uncertainty translates into 3,000 miles of uncertainty in position.
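The scale of that uncertainty is easy to verify with the round numbers quoted above (a back-of-the-envelope check, not Strategic Command's actual model):

```python
# At low-Earth-orbit speed, a small timing error becomes a large
# along-track position error. Figures are the article's round numbers.
orbital_speed_mi_per_s = 5       # roughly 5 miles per second
timing_error_s = 10 * 60         # a 10-minute window of uncertainty

position_error_mi = orbital_speed_mi_per_s * timing_error_s
print(position_error_mi)         # 3000 miles, matching the figure above
```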

For comparison, some 42.5 tons of wreckage from the shuttle Columbia hit the ground in a footprint stretching from central Texas to Louisiana when the orbiter broke apart during re-entry in 2003. No one on the ground was injured and no significant property damage was reported.

Tracking data is expected to improve as the day wears on, and subsequent updates should be more precise.

The centerpiece of a $750 million mission, the Upper Atmosphere Research Satellite was launched from the shuttle Discovery in September 1991. The solar-powered satellite studied a wide variety of atmospheric phenomena, including the depletion of Earth’s ozone layer 15 to 30 miles up.

The long-lived satellite was decommissioned in 2005 and one side of its orbit was lowered using the last of its fuel to hasten re-entry and minimize the chances of orbital collisions that could produce even more orbital debris. No more fuel is available for maneuvering and the satellite’s re-entry will be “uncontrolled.”

Nick Johnson, chief scientist with NASA’s Orbital Debris Program at the Johnson Space Center in Houston, told reporters last week he expects most of the satellite to burn up as it slams into the dense lower atmosphere at more than 17,000 mph. But computer software used to analyze possible re-entry outcomes predicts 26 pieces of debris will survive to impact the surface in a 500-mile-long down-range footprint.

“We looked at those 26 pieces and how big they are and we’ve looked at the fact they can hit anywhere in the world between 57 north and 57 south and we looked at what the population density of the world is,” he said. “Numerically, it comes out to a chance of 1-in-3,200 that one person anywhere in the world might be struck by a piece of debris. Those are obviously very, very low odds that anybody’s going to be impacted by this debris.”
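Risk figures like Johnson's 1-in-3,200 are typically derived as an expected-casualty calculation: the combined "casualty area" of the surviving debris multiplied by the mean population density under the orbital track. Here is a minimal sketch of that arithmetic with placeholder inputs (illustrative assumptions, not NASA's actual values):

```python
# Illustrative only: both inputs below are made-up placeholders,
# not the values NASA's debris-analysis software actually uses.
total_casualty_area_m2 = 20.0      # combined footprint of surviving pieces
mean_pop_density_per_m2 = 15e-6    # people per square metre, 57N to 57S

expected_casualties = total_casualty_area_m2 * mean_pop_density_per_m2
odds = 1 / expected_casualties
print(f"roughly a 1-in-{odds:,.0f} chance that anyone is struck")
```

Because the expected number of casualties is far below one, its reciprocal can be read directly as "1-in-N" odds.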

Can we count on cell networks in disasters?

Can we count on cell networks in disasters? | Signal Strength – CNET News.

Andrea Mancuso was working just north of the World Trade Center on September 11, 2001, when two planes struck the towers. Soon after, she was the only person around who seemed to have cell phone service.

“I walked from downtown to Lincoln Center (about 4.5 miles) before I was able to hail a cab with four strangers,” she said. “Everyone was upset, and no one had a cell phone signal except me. I passed my phone around like a hot potato all the way to Harlem. Everyone including the cab driver graciously and tearfully called their families.”

Her story, of course, is not unique. For hours, family members and co-workers frantically tried to contact people they knew in Lower Manhattan.

The network failure could partially be pinned on infrastructure damage. Cell towers were destroyed in the attacks, along with switching equipment used for landline phones. But another cause of the problem was the huge surge in traffic from people trying to find loved ones or letting others know they were OK.

Since 9/11, wireless networks have been tested time and again, and their performance has been shaky. A major blackout in the Northeast in 2003, Hurricane Katrina in 2005, and the Minneapolis bridge collapse in 2007 put strains on local networks. Cellular service in New York City even ground to a halt last month because of a minor earthquake centered several hundred miles away.

Undoubtedly, with each crisis, operators have learned more about what they can do to keep service up and running. But there’s a flip side to that growing expertise: we’re more dependent than ever on cell phones.

In September 2001, there were between 118 million and 128 million wireless subscribers in the U.S., according to data compiled by the CTIA Wireless Association. At the end of 2010, there were 302 million, or more than 96 percent of Americans. To keep up with the surge, wireless operators have spent billions of dollars upgrading their networks, adding more than 125,000 new cell sites since the end of 2001.

 

But is that enough? Despite anecdotal evidence to the contrary, industry experts interviewed by CNET believe that carriers have made huge strides on network reliability by doing simple things like expansion and adding generator backups, and complex things like forming emergency response units. They believe we’re in far better shape than we were 10 years ago…to a point.

“There’s no question the networks are in a much better position today than they were in 2001 to handle a significant crisis, even in the face of a staggering increase in users,” said Charles Golvin, an analyst at Forrester Research. “But it’s also important for people to make contingency plans for communications, in case the network isn’t working.”

On the plus side
There’s reason for that guarded optimism. Carriers have added spectrum and high-capacity connections from their cell towers to the wired networks that transport voice and data traffic. And they’ve hardened their networks with equipment that can withstand heavy wind and rain, as well as ensure that the equipment remains functional when commercial power is lost.

In fact, all the major wireless carriers have increased the number of cell sites with backup power supplies. They’ve also increased the number of cell sites on wheels that they can roll into locations that have had infrastructure damage.

“A good proportion of the cellular base stations across all the major carriers now have some kind of battery or generator for backup power,” said Gerard Hallaren, an equities analyst at JRPG Research. “That wasn’t the case back in 2001.”

Verizon Wireless, for example, has for years been installing backup generators and batteries to many of its cell sites. During the 2003 blackout that kept much of the Northeast in the dark for hours, Verizon’s customers could still communicate when customers from other carriers could not.

There’s also that human element, in terms of specialized units that can quickly move into an area.

“In the event of a natural or man-made disaster, like 9/11, we have multiple groups within AT&T who are equipped to respond quickly to repair and restore network capabilities,” AT&T spokesman Mark Siegel said. “Our network disaster recovery team is a great example of that. They were deployed after 9/11 to assist in restorations, and we’ve invested $600 million in our NDR team since its formation.”

There are also new services. Enhanced 911, which allows 911 operators to locate callers on a cell phone, is a prime example. A decade ago, cell phones in the U.S. didn’t yet support the technology. Today, every phone sold in the U.S. is capable of providing location information to emergency 911 operators.

While SMS text messaging was gaining popularity in Europe and elsewhere in 2001, it wasn’t used much by wireless subscribers in the U.S. because messages could be exchanged only within a carrier’s own network. This meant that on September 11, 2001, someone could send a text message to a family member or loved one only if the recipient subscribed to the same carrier.

A few months later, in November 2001, carriers began to connect their networks for text messaging, allowing subscribers on different networks to exchange texts. Today, more than 187 billion text messages cross U.S. wireless networks each month.

Text messaging has become a critical form of communication during a crisis. In the lead-up to Hurricane Irene on the East Coast last month, wireless operators and public-safety officials were asking consumers to use text messaging during the storm instead of making voice calls to help alleviate network congestion.

Text messaging is a better way of communicating in a crisis for several reasons. To start, the messages are small and consume only a small amount of network resources. Second, messages are sent on a cell phone’s signaling channel. This means that they’re in a separate “lane” from voice and data messages, so they may have a clear path when the voice network is congested. And if the network is too congested even to send a text, the message can be stored. When service resumes, the message is sent.

The technology has become such a ubiquitous and reliable form of communication during an emergency that the Federal Communications Commission is working on rules to allow 911 call centers to accept SMS text messages, as well as photos, videos, and data communications, to provide more information to first responders for assessing and responding to emergencies.

Congestion problem
The biggest problem wireless networks face today in a crisis is a rapid surge in usage: the networks don’t have enough capacity to handle the spike in call volume. Cellular networks are designed to handle a certain number of calls in each cell site or region, with wireless operators carefully calculating how much capacity is needed to serve average usage while leaving just enough headroom to handle ordinary spikes in demand.

The problem occurs when a disaster hits, and thousands of people all at once pick up their phones to call someone, send a text message, update Twitter, and so on. There simply isn’t enough capacity in the network to allow everyone in a cell site to make a phone call at the same time.

Steve Largent, the president of the CTIA Wireless Association, argues that more wireless spectrum is needed to ensure that more “lanes” for data can be opened up for wireless operators to direct traffic to during a crisis.

“Crisis situations are a perfect example of why it’s so important that the government makes more wireless spectrum available,” he said.

While more spectrum could help, it’s unclear if it would ever be cost-effective for wireless operators to configure their networks to withstand the highest demand for network resources. Analyst Gerard Hallaren said most networks are designed to handle only about 20 percent to 40 percent of maximum traffic, with 40 percent being on the conservative side.

“It’s just economic insanity for any carrier to try to solve the congestion problem,” he said. “It’s cost-prohibitive to build a network that could serve 330 million at the same time. A service like that would cost hundreds of dollars a month, and people are not willing to pay that much for cell phone service.”
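Hallaren's point about over-provisioning can be made concrete with the Erlang B formula, the classic tool telephone engineers use to relate offered traffic, channel count, and the probability that a call is blocked. This is a generic sketch with hypothetical traffic figures, not any carrier's actual model:

```python
def erlang_b(offered_erlangs: float, channels: int) -> float:
    """Blocking probability via the standard Erlang B recurrence."""
    b = 1.0
    for m in range(1, channels + 1):
        b = offered_erlangs * b / (m + offered_erlangs * b)
    return b

# A hypothetical cell sector with 40 voice channels:
normal = erlang_b(30.0, 40)     # typical busy-hour load
surge = erlang_b(150.0, 40)     # a 5x disaster-driven calling surge
print(f"normal day: {normal:.1%} of calls blocked; surge: {surge:.1%}")
```

Because blocking rises steeply once offered traffic exceeds the channel count, sizing a network for a rare five-fold surge means paying for capacity that sits idle almost all the time — Hallaren's "economic insanity."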

That said, the carriers say they’ve made improvements to their networks and are trying to alleviate the issue.

“We continue to build redundancies into our network and increase capacity so that it is not overwhelmed by ‘sudden calling events,'” AT&T’s Siegel said.

New generations of cellular technology have also helped make wireless more available during a crisis. The move from 2G to 3G, and now to 4G, will offer carriers more efficiencies in how they use their spectrum, which could also be a benefit during an emergency to alleviate network congestion.

The other major difference between September 2001 and now is that the mobile Internet as we know it today did not exist. Third-generation, or so-called 3G, wireless networks were not yet deployed, and most people did not have access to the Internet from their cell phones. Facebook, Twitter, and other social-networking apps that people access easily from their cell phones today to share pictures, updates, and other information weren’t even invented back then. While this traffic also increases the load on networks, sometimes it’s easier for users to get through to these sites than to make voice connections via their cell phones.

“People have so many more ways of communicating with each other now to tell someone where they are or that they are all right,” Forrester’s Golvin said. “Having these communication alternatives is a huge improvement over where we were a decade ago.”

Public-safety officials had their own communications challenges responding to the terrorist attacks on September 11, 2001. Tomorrow CNET will explain the problems first responders had that day and why the public-safety community is still waiting for their own wireless network.

Forensic Meteorology Becomes New Growth Industry as Weather-Related Damage Intensifies

CSI: Mother Nature–Forensic Meteorology a New Growth Industry as Weather-Related Damage Intensifies: Scientific American.

WEATHER-DAMAGE LIABILITY?: Trees can be helpful to forensic meteorologists in solving cases. Image: Keegan Mullaney/Flickr

As Irene battered the East Coast two weeks ago, Frank Lombardo knew that his work would begin only after the rain and wind stopped and the floods receded.

That’s because as a forensic meteorologist, Lombardo is often called on to consult on legal and insurance cases resulting from violent storms. His job, and that of any forensic meteorologist, is to reconstruct the weather conditions that occurred at a specific time and location in question by retrieving and analyzing archived atmospheric data and re-creating a time line of meteorological events.

“As soon as something happens…whether there’s a catastrophic event or a minor localized event, forensic meteorologists understand things will quiet down, but in a few years from now, it will get into the courts,” says Lombardo, president of WeatherWorks, Inc. The Hackettstown, N.J.–based company provides meteorological expertise to public and private sector organizations, including the media.

Described as a combination of science, art and interpretation, forensic meteorology mirrors the work that detectives do to solve crimes. Cases may involve whether lightning sparked a fire or, if someone slips and falls, whether ice on a property was to blame. Data comes from various sources, including observations, weather stations at airports, Doppler radar and satellite imagery, National Weather Service bulletins, and even tidal gauges. Forensic meteorologists may also take their own measurements, such as wind velocity. Cases are mainly site-specific, and much of the problem-solving involves knowing what synoptic, or generalized, data is needed to reconstruct the micrometeorology at a particular location.

“A lot of what we depend on is experience, but we need tools of the industry, such as Doppler radar and good observations” to solve mysteries related to weather, Lombardo says.

He recounts one of his cases in which a crane collapsed near a building, injuring the operator. It was a blustery day, and the wind threshold of the crane ranged from approximately 48 to 56 kilometers per hour, according to the manufacturer. Hired by the operator, his charge was to determine how the localized weather influenced the crane’s fall. Lombardo visited the site on a day that had similar conditions as when the accident occurred, and on noticing that the crane was positioned near a nine-meter wall, wondered if that had influenced the site’s wind velocity. He measured the wind speed using an anemometer, noting that the wind intersected the wall at a 70 to 80 degree angle. By calculating simple vectors, he discovered that the wind speed near the crane was around 29 to 48 kilometers per hour, right at the edge of what the crane could withstand. “I secured information that supported the case that the [worker] shouldn’t have been operating the crane,” Lombardo says. “It wasn’t his fault. It was a function of the wind converging on the wall, which increased the wind pressure on the crane, causing it to collapse.”
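The "simple vectors" step in a case like this amounts to basic trigonometry: resolving the measured wind into components perpendicular and parallel to the wall. The sketch below uses illustrative numbers, not Lombardo's actual measurements:

```python
import math

# Resolve a wind vector into components relative to a wall face.
# Numbers are illustrative placeholders, not case data.
wind_speed_kmh = 40.0       # measured free-stream wind speed
angle_to_wall_deg = 75.0    # angle at which the wind meets the wall

theta = math.radians(angle_to_wall_deg)
perpendicular_kmh = wind_speed_kmh * math.sin(theta)  # pressed onto the wall
parallel_kmh = wind_speed_kmh * math.cos(theta)       # deflected along it
print(f"perpendicular: {perpendicular_kmh:.1f} km/h, "
      f"parallel: {parallel_kmh:.1f} km/h")
```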

Sometimes, data that is needed to decipher how atmospheric conditions affected a particular location and case is not available. Stephen Wistar, a forensic meteorologist with AccuWeather, consulted on hundreds of cases relating to Hurricane Katrina. Most of his investigations centered on insurance claims about whether wind or storm surge caused property damage. He lacked access to much of the typical data he would have used in similar circumstances because many of the standard tools, such as weather stations and tidal gauges, failed when the storm hit. Instead, he utilized a massive computer model called “ADvanced CIRCulation” (ADCIRC), which predicts tidal and storm surge elevations and velocities over large areas. Combined with information he retrieved from Doppler radar sites outside of New Orleans, and on-site investigations he conducted himself, he was able to reconstruct time lines of property damage and state which hit property first—the wind or the water.

In his Katrina inquiries, Wistar also discovered that “trees were very helpful” in solving cases. In areas of Mobile, Ala., where complete neighborhoods were torn apart, he often examined tree damage for clues about what caused property destruction. He knew the direction of the wind at various points in the storm, both before and after the storm surge, and therefore could determine which direction trees would have fallen at those same points. By noting fallen tree locations and directions, he determined time lines for property devastation, even when there were no structures left standing.

“Part of our job is to filter the data and understand what makes and doesn’t make sense,” Wistar says. But above all, “my job is to tell the truth.”

With the climate changing, forensic meteorologists’ work will not diminish. “‘Extreme’ will become the new normal,” Lombardo says. “Our most difficult tasks as forensic meteorologists are dealing with these extreme events, and how the forensic meteorology, insurance and legal industries are going to react.” New definitions of what is “extreme” will affect potential claims against municipalities, Wistar adds.

Another concern is the uptick in weather and related events occurring on a planet-wide basis, such as El Niño and La Niña. Generally, forensic meteorologists’ examinations are limited to a specific site. “If climate change continues to occur, however, and we see more worldwide events in increasing frequency, will that change how we look at local events? It may,” ponders Lombardo. He cites a global weather phenomenon, called atmospheric blocking, as an example of a planet-wide occurrence that has already affected his forensic micrometeorology endeavors. Atmospheric blocking obstructs winds that come across the Pacific and forces them north into Alaska, Siberia and the North Pole. The winds then head south, “creating a pool of cold Arctic air that moves into the United States, providing a source for ice storms to develop,” he says. In the past two years, up and down the eastern seaboard, atmospheric blocking has directly led to snowfall in areas where heavy snow is uncommon, and consequently, “hundreds of slip-and-fall cases” have come across Lombardo’s desk. “[Atmospheric blocking] results in localized storms that produce the conditions that are favorable to generating future forensic work,” he says.

But the future holds other concerns—and opportunities—for forensic meteorologists. As Wistar notes, more people have migrated to regions on the planet “where the weather tends to be more dangerous,” such as the southern U.S. With more people in harm’s way, there will undoubtedly be more legal, insurance and engineering cases in which forensic meteorologists’ expert contributions will be vital.