Category Archives: Honesty-Truth

Two Plus Two Equals Five – A 2nd look at disaster death tolls

Two Plus Two Equals Five – By Philip Walker | Foreign Policy.

The death toll and level of destruction immediately following a disaster are always difficult to determine, but over time a consensus usually emerges between governments and aid organizations. But, as David Rieff points out, “Sadly, over the course of the past few decades, exaggeration seems to have become the rule in the world of humanitarian relief.… These days, only the most extreme, most apocalyptic situations are likely to move donors in the rich world.” And with donor fatigue an ever-present possibility, it is no surprise then that later studies that contradict the original, inflated estimates are criticized — or worse, ignored — for seemingly undermining the humanitarian cause.

Arriving at these estimates is no easy endeavor, as government agencies and relief organizations are rarely able to survey entire populations. Instead, emergency management experts rely on sound statistical and epidemiological techniques. But debating and questioning the numbers behind man-made and natural disasters is not just an academic exercise: the implications are huge. For example, relief agencies were restricted from operating in Darfur, partly because of Sudan’s anger that the U.S.-based Save Darfur Coalition had estimated that 400,000 people were killed in the region. Moreover, the U.N. Security Council used the International Rescue Committee’s death toll of 5.4 million in the Congo to put together its largest peacekeeping operation ever. Similarly, government aid pledges increase or decrease depending upon the extent of the disaster. Numbers do matter, and much depends upon their validity and credibility. What follows is a look at some recent disasters where the numbers just don’t match up.

Above, a view of some of the destruction in Banda Aceh, Indonesia, a week after the devastating earthquake and tsunami struck on Dec. 26, 2004. According to the U.S. Geological Survey, 227,898 people died and about 1.7 million people were displaced in 14 countries in Southeast Asia, South Asia, and East Africa. Indonesia, the country hardest hit by the disaster, initially claimed that 220,000 people had died or gone missing but later revised that number down to around 170,000.

THE DEADLIEST WAR IN THE WORLD

Discrepancy: 5.4 million vs. 900,000 dead in the Democratic Republic of the Congo between 1998 and 2008

The Democratic Republic of the Congo (DRC) has seen more than its fair share of conflict over the past 15 years. The war in the DRC officially broke out in 1998 and although the conflict technically ended in 2003 when the transitional government took over, fighting has continued in many of the country’s provinces. The conflict has been dubbed “Africa’s World War,” both due to the magnitude of the devastation and the number of African countries that have, at different times, been involved in the conflict. According to a widely cited 2008 report by the New York-based International Rescue Committee (IRC), “an estimated 5.4 million people have died as a consequence of the war and its lingering effects since 1998,” making it the world’s deadliest crisis since World War II. The organization is one of the largest providers of humanitarian aid in the Congo and is therefore deemed one of the few reliable sources on the conflict.

However, Andrew Mack, director of the Human Security Report Project at Simon Fraser University in Canada, said the IRC study did not employ appropriate scientific methodologies and that in reality far fewer people have died in the Congo. “When we used an alternative measure of the pre-war mortality rate, we found that [the estimate from] the IRC’s final three surveys dropped from 2.83 million to under 900,000,” Mack argued. (He also argued that international relief agencies — such as the International Rescue Committee — face a potential conflict of interest because they depend on donations that, in turn, are stimulated by their studies of death tolls. Those studies should be done by independent experts, not by relief agencies that depend on donations, he says.)
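The dispute turns on the baseline. Excess-death estimates multiply the gap between the observed and the assumed pre-war mortality rate by population and time, so even a modest change in the baseline swings the total enormously. A minimal sketch of the calculation, using purely illustrative numbers (not the actual survey figures):

```python
def excess_deaths(observed_cmr, baseline_cmr, population, months):
    """Excess deaths = (observed - baseline) crude mortality rate,
    in deaths per 1,000 people per month, times population times months."""
    return (observed_cmr - baseline_cmr) / 1000 * population * months

# Hypothetical inputs: 60 million people tracked over 40 months,
# with an observed rate of 2.1 deaths per 1,000 per month.
pop, months, observed = 60_000_000, 40, 2.1

# A low assumed baseline yields a very large excess toll...
print(excess_deaths(observed, 1.5, pop, months))  # 1,440,000

# ...while a modestly higher baseline cuts it by more than 80 percent.
print(excess_deaths(observed, 2.0, pop, months))  # 240,000
```

This is why Mack's substitution of a higher pre-war mortality rate, on its own, collapses a multi-million estimate to under a million.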

Above, the body of a young man lying on the central market avenue of Ninzi, about 25 miles north of Bunia, where on June 20, 2003, Lendu militias launched an attack, killing and mutilating at least 22 civilians.

Discrepancy: 400,000 vs. 15,000 women raped in the Democratic Republic of the Congo between 2006 and 2007

A June 2011 study in the American Journal of Public Health found that 400,000 women aged 15-49 were raped in the DRC over a 12-month period in 2006 and 2007. The shockingly high number is equivalent to four women being raped every five minutes. Perhaps even more alarming, the new number is 26 times higher than the 15,000 rapes that the United Nations reported during the same period.
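The "four women every five minutes" figure follows directly from the annual total; a quick arithmetic check:

```python
rapes_per_year = 400_000

# Number of five-minute intervals in a year.
five_min_intervals = 365 * 24 * 60 / 5  # 105,120

rate = rapes_per_year / five_min_intervals
print(rate)  # about 3.8, i.e. roughly four every five minutes
```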

Maria Eriksson Baaz, a Swedish academic from the University of Gothenburg, has called the study into question by arguing that it is based on out-of-date and questionable figures. As a long-time researcher on women’s rights in the DRC, Baaz claims that extrapolations made from these figures cannot be backed up scientifically. In a recent interview with the BBC, she said it was difficult to collect reliable data in the Congo and that women sometimes claim to be victims in order to get free health care. “Women who have been raped can receive free medical care while women who have other conflict-related injuries or other problems related to childbirth have to pay,” she said. “In a country like the DRC, with [its] extreme poverty where most people can simply not afford health care, it’s very natural this happens.”

Above, Suzanne Yalaka breastfeeds her baby Barunsan on Dec. 11, 2003, in Kalundja, South Kivu province. Her son was born after she was raped by ten rebels from neighboring Burundi; her husband and his family subsequently abandoned her.

NORTH KOREAN FAMINE

Discrepancy: 2.4 million vs. 220,000 dead in North Korea between 1995 and 1998

Due to the regime’s secretive nature, reliable statistics on the 1990s famine in North Korea are hard to come by. Yet, surprisingly, on May 15, 2001, at a UNICEF conference in Beijing, Choe Su-hon, one of Pyongyang’s nine deputy foreign ministers at the time, stated that between 1995 and 1998, 220,000 North Koreans died in the famine. Compared with outside estimates, these figures were on the low end — presumably because it was in the regime’s interest to minimize the death toll.

A 1998 report by U.S. congressional staffers, who had visited the country, found that from 1995 to 1998 between 900,000 and 2.4 million people had died as a result of food shortages. It noted that other estimates by exile groups were substantially higher but that these numbers were problematic because they were often based on interactions with refugees from the northeastern province of North Hamgyong, which was disproportionately affected by the famine.

Above, North Koreans rebuilding a dike in Mundok county, South Pyongan province, in September 1997, following tidal surges caused by Typhoon Winnie that August. The rebuilding effort was part of an emergency food-for-work project organized by the World Food Program. According to a former North Korean government official, during the famine — from 1993 to 1999 — life expectancy fell from 73.2 to 66.8 years and infant mortality nearly doubled, from 27 to 48 per 1,000.

GENOCIDE IN DARFUR

Discrepancy: 400,000 vs. 60,000 dead in Darfur between 2003 and 2005

In 2006, three years after the conflict in Darfur began, Sudanese President Omar al-Bashir publicly criticized the United Nations for exaggerating the extent of the fighting in Darfur. “The figure of 200,000 dead is false and the number of dead is not even 9,000,” he proclaimed. At the same time, outside groups like the Save Darfur Coalition and various governments, including the United States, were having a difficult time producing concrete numbers as well. Their only consensus was that the real death toll was exponentially higher than the numbers provided by Bashir.

In 2005, a year after U.S. Secretary of State Colin Powell told a U.S. congressional committee that the ethnic violence in Darfur amounted to “genocide,” Deputy Secretary of State Robert Zoellick estimated the death toll at between 60,000 and 160,000. Zoellick was widely criticized for understating the numbers. The World Health Organization estimated that 70,000 people had died over a seven-month period alone. At the same time, researchers for the Coalition for International Justice contended that 396,563 people had died in Darfur. Today, the Sudanese authorities claim that since the conflict began in 2003, 10,000 people have died, while the U.N. estimates that over 300,000 have been killed and another 2.7 million have been displaced.

Above, an armed Sudanese rebel arrives on Sept. 7, 2004, at the abandoned village of Chero Kasi less than an hour after Janjaweed militiamen set it ablaze in the violence-plagued Darfur region.

CYCLONE NARGIS 

Discrepancy: 138,000 vs. unknown death toll in Burma in 2008

Tropical cyclone Nargis made landfall in southern Burma on May 2, 2008, leaving a trail of death and destruction before petering out the next day. It devastated much of the fertile Irrawaddy delta and Yangon, the nation’s main city. Nargis brought about the worst natural disaster in the country’s history — with a death toll that may have exceeded 138,000, according to a study by the Georgia Institute of Technology. But, with a vast number of people still unaccounted for three years later, the death toll might even be higher. The Burmese authorities allegedly stopped counting for fear of political fallout.

It’s more common for countries hit by a devastating disaster to share their plight with the world and plead for a robust relief effort, but in the aftermath of cyclone Nargis the Burmese military regime sought to maintain control over news of the disaster — restricting access to journalists and censoring the release of information and images. Moreover, the United Nations and other relief agencies were initially banned from setting up operations. At the time, with over 700,000 homes blown away, the U.N. and the Red Cross estimated that over 2.5 million people were in desperate need of aid.

Above, school teacher Hlaing Thein stands on the wreckage of a school destroyed by cyclone Nargis in Mawin village in the Irrawaddy delta region on June 9, 2008.

 


What numbers can we trust? A second look at the death toll from some of the world’s worst disasters.

BY PHILIP WALKER | AUGUST 17, 2011

EARTHQUAKE IN HAITI

Discrepancy: 318,000 vs. 46,000-85,000 dead in Haiti in 2010

The devastating earthquake of Jan. 12, 2010, killed over 318,000 people and left over 1.5 million people homeless, according to the Haitian government. International relief organizations generally estimate anywhere between 200,000 and 300,000 deaths.

However, a recently leaked report compiled for USAID by a private consulting firm claims that the death toll is likely between 46,000 and 85,000, and that roughly 900,000 people were displaced by the earthquake. The report has not yet been published, but its alleged findings have already been disputed by both Haitian authorities and the United Nations. Even the U.S. State Department, for now, is reluctant to endorse it, saying “internal inconsistencies” in some of the statistical analysis are currently being investigated prior to publication.

PAKISTAN FLOODS

Discrepancy: Large numbers affected vs. small death toll in Pakistan in 2010

A young girl washes the mud from her toy at a water pump amid collapsed buildings at a refugee camp near Nowshera in northwest Pakistan on Sept. 23, 2010.

Figures provided by the United Nations and Pakistan’s government estimate that 20 million people were affected by the 2010 summer floods — the worst in the country’s history. Almost 2,000 people died, 3,000 were injured, 2 million homes were damaged or destroyed, and over 12 million people were left in need of emergency food aid, according to Pakistan’s National and Provincial Disaster Management Authority. Flood waters wiped out entire villages and vast stretches of farmland, affecting an area roughly the size of England. After surveying 15 key sectors across the country, in October 2010 the World Bank and Asian Development Bank estimated the damage at $9.7 billion — more than twice that of Pakistan’s 2005 earthquake, which killed approximately 86,000 people. U.N. Secretary-General Ban Ki-moon characterized the destruction as more dire than that caused by the 2004 Indian Ocean tsunami and the Pakistani earthquake combined. “In the past I have visited the scenes of many natural disasters around the world, but nothing like this,” he stated.

David Rieff warns that, “By continually upping the rhetorical ante, relief agencies, whatever their intentions, are sowing the seeds of future cynicism, raising the bar of compassion to the point where any disaster in which the death toll cannot be counted in the hundreds of thousands, that cannot be described as the worst since World War II or as being of biblical proportions, is almost certainly condemned to seem not all that bad by comparison.” This was the case in Pakistan, where the number affected by the flooding was gigantic but the death toll was relatively low — especially compared to the Haiti earthquake a few months earlier. As a result, the United Nations and other aid organizations were unable to raise large sums for the relief effort compared to previous disasters. “Right now, our level of needs in terms of funding is huge compared to what we’ve been receiving, even though this is the largest, by far, humanitarian crisis we’ve seen in decades,” said Louis-George Arsenault, director of emergency operations for UNICEF, in an interview with the BBC in Aug. 2010.

As David Meltzer, senior vice president of international services for the American Red Cross, discerningly put it, “Fortunately, the death toll [in Pakistan] is low compared to the tsunami and the quake in Haiti. … The irony is, our assistance is focused on the living — and the number of those in need is far greater than in Haiti.”

 

Climate sceptics take note: raw data you wanted now available

OK, climate sceptics: here’s the raw data you wanted – environment – 28 July 2011 – New Scientist.

Anyone can now view for themselves the raw data that was at the centre of last year’s “climategate” scandal.

Temperature records going back 150 years from 5113 weather stations around the world were yesterday released to the public by the Climatic Research Unit (CRU) at the University of East Anglia in Norwich, UK. The only records missing are from 19 stations in Poland, which refused to allow them to be made public.

“We released [the dataset] to dispel the myths that the data have been inappropriately manipulated, and that we are being secretive,” says Trevor Davies, the university’s pro-vice-chancellor for research. “Some sceptics argue we must have something to hide, and we’ve released the data to pull the rug out from those who say there isn’t evidence that the global temperature is increasing.”

Hand it over

The university was ordered to release the data by the UK Information Commissioner’s Office, following a freedom-of-information request for the raw data from researchers Jonathan Jones of the University of Oxford and Don Keiller of Anglia Ruskin University in Cambridge, UK.

Davies says that the university initially refused on the grounds that the data is not owned by the CRU but by the national meteorological organisations that collect the data and share it with the CRU.

When the CRU’s refusal was overruled by the information commissioner, the UK Met Office was recruited to act as a go-between and obtain permission to release all the data.

Poland refused. The information commissioner also overruled Trinidad and Tobago’s wish to withhold the data it supplied for latitudes between 30 degrees north and 40 degrees south, because Jones and Keiller had specifically requested those records in their FOI request and they had previously been shared with other academics.

The price

The end result is that all the records are there, except for Poland’s. Davies’s only worry is that the decision to release the Trinidad and Tobago data against its wishes may discourage the open sharing of data in the future. Other research organisations may from now on be reluctant to pool data they wish to be kept private.

Thomas Peterson, chief scientist at the National Climatic Data Center of the US National Oceanic and Atmospheric Administration (NOAA) and president of the Commission for Climatology at the World Meteorological Organization, agrees there might be a cost to releasing the data.

“I have historic temperature data from automatic weather stations on the Greenland ice sheet that I was able to obtain from Denmark only because I agreed not to release them,” he says. “If countries come to expect that sharing of any data with anyone will eventually lead to strong pressure for them to fully release those data, will they be less willing to collaborate in the future?”

Davies is confident that genuine and proper analysis of the raw data will reproduce the same incontrovertible conclusion – that global temperatures are rising. “The conclusion is very robust,” he says, explaining that the CRU’s dataset of land temperatures tallies with those from other independent research groups around the world, including those generated by NOAA and NASA.

“Should people undertake analyses and come up with different conclusions, the way to present them is through publication in peer-reviewed journals, so we know it’s been through scientific quality control,” says Davies.

No convincing some people

Other mainstream researchers and defenders of the consensus are not so confident that the release will silence the sceptics. “One can hope this might put an end to the interminable discussion of the CRU temperatures, but the experience of GISTEMP – another database that’s been available for years – is that the criticisms will continue because there are some people who are never going to be satisfied,” says Gavin Schmidt of Columbia University in New York.

“Sadly, I think this will just lead to a new round of attacks on CRU and the Met Office,” says Bob Ward, communications director of the Grantham Research Institute on Climate Change and the Environment at the London School of Economics. “Sceptics will pore through the data looking for ways to criticise the processing methodology in an attempt to persuade the public that there’s doubt the world has warmed significantly.”

The CRU and its leading scientist, Phil Jones, were at the centre of the so-called “climategate” storm in 2009 when the unit was accused of withholding and manipulating data. It was later cleared of the charge.

Why Is The Federal Government Running Ads Secretly Created & Owned By NBC Universal?

Why Is The Federal Government Running Ads Secretly Created & Owned By NBC Universal? | Techdirt.

from the so-that’s-how-it-works… dept

We certainly suspected this when New York City first announced that it was running a series of silly and misleading videos as part of a media campaign to “Stop Piracy in NYC,” but now it’s been confirmed that these videos were not, in fact, New York City’s, but are purely NBC Universal’s. At the time, NYC had “thanked” NBC Universal (among others), but had not admitted that NBC Universal “owned” and had created the videos themselves. However, in response to one of the Freedom of Information requests that I filed with New York City, the city noted that the videos are property of NBC Universal. I had asked for any licensing info between NYC and Homeland Security/ICE because ICE was using the same videos. Since NYC had clearly suggested that those videos were the creation of the NYC government, I assumed that ICE must have licensed the videos from NYC. However, NYC responded to my request by saying that there was no such info to hand over, because it did not license the videos to Homeland Security. And the reason was that NYC did not own the videos:

The Mayor’s Office of Media and Entertainment has no records responsive to your request. Please note that NBC Universal owns the material, not the City of New York.

That’s fascinating information. Of course, I had also filed a separate FOI request for any info on the licensing agreement between NYC and NBC Universal. As of this writing there has been no response from NYC, in violation of New York State’s Freedom of Information Law, which requires a response within 5 business days (we’re way beyond that).

Still, at least give NYC credit for making it clear that NBC Universal had a hand in the creation of the videos, even if it left out the rather pertinent information that it created and owned the videos. While I find it immensely troubling that a municipal government would run PSAs created by corporate interests (without making that clear), I’m extremely troubled by the news that the federal government would run those same videos with absolutely no mention of the fact that the videos were created and owned by a private corporation with a tremendous stake in the issue.

Could you imagine how the press would react if, say, the FDA ran PSAs that were created and owned by McDonald’s without making that clear to the public? How about if the Treasury Department ran a PSA created and owned by Goldman Sachs? So, shouldn’t we be asking serious questions about why Homeland Security and ICE are running a one-sided, misleading corporate propaganda video, created and owned by a private company, without mentioning the rather pertinent information of who made it?

Does Homeland Security work for the US public… or for NBC Universal?

Effects of climate change in Arctic more extensive than expected; a clever proposal for climate change legislation

Effects of climate change in Arctic more extensive than expected, report finds.

ScienceDaily (May 4, 2011) — A much-reduced snow cover, a shorter winter season and thawing tundra: the effects of climate change in the Arctic are already here. And the changes are taking place significantly faster than previously thought. This is what emerges from a new research report on the Arctic, presented in Copenhagen this week. Margareta Johansson, from Lund University, is one of the researchers behind the report.

Together with Terry Callaghan, a researcher at the Royal Swedish Academy of Sciences, Margareta is the editor of the two chapters on snow and permafrost.

“The changes we see are dramatic. And they are not coincidental. The trends are unequivocal and deviate from the norm when compared with a longer term perspective,” she says.

The Arctic is one of the parts of the globe that is warming up fastest today. Measurements of air temperature show that the most recent five-year period has been the warmest since 1880, when monitoring began. Other data, from tree rings among other things, show that the summer temperatures over the last decades have been the highest in 2000 years. As a consequence, the snow cover in May and June has decreased by close to 20 per cent. The winter season has also become almost two weeks shorter — in just a few decades. In addition, the temperature in the permafrost has increased by between half a degree and two degrees.

“There is no indication that the permafrost will not continue to thaw,” says Margareta Johansson.

Large quantities of carbon are stored in the permafrost.

“Our data shows that there is significantly more than previously thought. There is approximately double the amount of carbon in the permafrost as there is in the atmosphere today,” says Margareta Johansson.

The carbon comes from organic material which was “deep frozen” in the ground during the last ice age. As long as the ground is frozen, the carbon remains stable. But as the permafrost thaws there is a risk that carbon dioxide and methane, a greenhouse gas more than 20 times more powerful than carbon dioxide, will be released, which could increase global warming.

“But it is also possible that the vegetation which will be able to grow when the ground thaws will absorb the carbon dioxide. We still know very little about this. With the knowledge we have today we cannot say for sure whether the thawing tundra will absorb or produce more greenhouse gases in the future,” says Margareta Johansson.

Effects of this type, so-called feedback effects, are of major significance for how extensive global warming will be in the future. Margareta Johansson and her colleagues present nine different feedback effects in their report. One of the most important right now is the reduction of the Arctic’s albedo. The decrease in the snow- and ice-covered surfaces means that less solar radiation is reflected back out into the atmosphere. It is absorbed instead, with temperatures rising as a result. Thus the Arctic has entered a stage where it is itself reinforcing climate change.

The outlook is no brighter. Climate models show that temperatures will rise by a further 3 to 7 degrees. In Canada, the uppermost metres of permafrost will thaw on approximately one fifth of the surface currently covered by permafrost. The equivalent figure for Alaska is 57 per cent. The length of the winter season and the snow coverage in the Arctic will continue to decrease, and the glaciers in the area will probably lose between 10 and 30 per cent of their total mass. All this within this century, and with grave consequences for ecosystems, existing infrastructure and human living conditions.

New estimates also show that by 2100, the sea level will have risen by between 0.9 and 1.6 metres, which is approximately twice the increase predicted by the UN’s panel on climate change, IPCC, in its 2007 report. This is largely due to the rapid melting of the Arctic icecap. Between 2003 and 2008, the melting of the Arctic icecap accounted for 40 per cent of the global rise in sea level.

“It is clear that great changes are at hand. It is all happening in the Arctic right now. And what is happening there affects us all,” says Margareta Johansson.

The report “Impacts of climate change on snow, water, ice and permafrost in the Arctic” has been compiled by close to 200 polar researchers. It is the most comprehensive synthesis of knowledge about the Arctic that has been presented in the last six years. The work was organised by the Arctic Council’s working group for environmental monitoring (the Arctic Monitoring and Assessment Programme) and will serve as the basis for the IPCC’s fifth report, which is expected to be ready by 2014.

Besides Margareta Johansson, Torben Christensen from Lund University also took part in the work.

More information on the report and The Arctic as a messenger for global processes – climate change and pollution conference in Copenhagen can be found at:

http://amap.no/Conferences/Conf2011/

**************************

2 °C or not 2 °C? That is the climate question

Targets to limit the global temperature rise won’t prevent climate disruption. Tim Lenton says that policy-makers should focus on regional impacts.

As a scientist who works on climate change, I am not comfortable with recommending policy. Colleagues frown on it, and peer review of scientific papers slams anything that could be construed as policy prescription. Yet climate science is under scrutiny in multiple arenas, and climate scientists have been encouraged to engage more openly in societal debate.

I don’t want to write policies, but I do want to ensure that global efforts to tackle the climate problem are consistent with the latest science, and that all useful policy avenues remain open. Ongoing negotiations for a new climate treaty aim to establish a target to limit the global temperature rise to 2 °C above the average temperature before the industrial revolution. But that is not enough.

The target is linked to the United Nations Framework Convention on Climate Change (UNFCCC), which aims to “prevent dangerous anthropogenic interference with the climate system”. But that noble objective is nearly 20 years old and is framed too narrowly, in terms of the “stabilization of greenhouse gas concentrations in the atmosphere”. Long-term goals to limit temperature or concentrations have so far failed to produce effective short-term action, because they do not have the urgency to compel governments to put aside their own short-term interests.

“Global average warming is not the only kind of climate change that is dangerous.”

Global average warming is not the only kind of climate change that is dangerous, and long-lived greenhouse gases are not the only cause of dangerous climate change. Target setters need to take into account all the factors that threaten to tip elements of Earth’s climate system into a different state, causing events such as irreversible loss of major ice sheets, reorganizations of oceanic or atmospheric circulation patterns and abrupt shifts in critical ecosystems.

Such ‘large-scale discontinuities’ are arguably the biggest cause for climate concern. And studies show that some could occur before global warming reaches 2 °C, whereas others cannot be meaningfully linked to global temperature.

Disruption of the south- or east-Asian monsoons would constitute dangerous climate change, as would a repeat of historic droughts in the Sahel region of Africa or a widespread dieback of the Amazon rainforest. These phenomena are not directly dependent on global average temperature, but on localized warming that alters temperature gradients between regions. In turn, these gradients are influenced by uneven distribution of anthropogenic aerosols in the atmosphere.

Equally, an abrupt shift in the regions in which dense masses of water form in the North Atlantic could dangerously amplify sea-level rises along the northeastern seaboard of the United States. But the point at which that will occur depends on the speed of climate change more than its magnitude.

Even when a threshold can be directly related to temperature, as with the melting of ice sheets, it is actually the net energy input that is important. The rapid warming of the Arctic in recent years is attributable less to increasing carbon dioxide levels than to reductions in emissions of sulphate aerosols (which have a cooling effect), and to increases in levels of warming agents, including black-carbon aerosols and the shorter-lived greenhouse gases methane and tropospheric ozone.

Ultimately, crucial climate events are driven by changes in energy fluxes. However, the one metric that unites them, radiative forcing, is missing from most discussions of dangerous climate change. Radiative forcing measures the change in the net imbalance of energy that enters and leaves the lower atmosphere; it is a better guide to danger than greenhouse-gas concentrations or global warming. It takes into account almost all anthropogenic activities that affect our climate, including emissions of methane, ozone-producing gases and hydrofluorocarbons, and changes in land use and aerosol levels.

I suggest that the UNFCCC be extended. The climate problem, and the political targets presented as a solution, should be aimed at restricting anthropogenic radiative forcing to limit the rate and gradients of climate change, before limiting its eventual magnitude.


How would this help? A given level of radiative forcing is reached long before the resulting global temperature change is fully realized, which brings urgency to the policy process. The 2 °C target would translate into a radiative forcing of about 2.5 watts per square metre (W m−2), but to protect major ice sheets, we might need a tougher global target of 1.5 W m−2. We will still need a binding target to limit long-term global warming. And because CO2 levels remain the most severe threat in the long term, a separate target could tackle cumulative carbon emissions. But while we wait for governments to reach an agreement on CO2, we can get to work on shorter-lived radiative-forcing agents.
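For CO2 alone, the relationship between concentration and forcing is commonly approximated by the simplified logarithmic expression dF = 5.35 ln(C/C0) from Myhre et al., used in IPCC reports. A rough sketch of what the two forcing targets would imply if CO2 were the only forcing agent, assuming a pre-industrial baseline of 280 ppm:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified expression for CO2 radiative forcing (Myhre et al.):
    dF = 5.35 * ln(C / C0), in watts per square metre."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def co2_for_forcing(target_wm2, c0_ppm=280.0):
    """Invert the expression: the CO2 concentration that would,
    by itself, produce a given radiative forcing."""
    return c0_ppm * math.exp(target_wm2 / 5.35)

print(round(co2_for_forcing(2.5)))  # ~447 ppm for the 2.5 W m-2 level
print(round(co2_for_forcing(1.5)))  # ~371 ppm for the tougher 1.5 W m-2 target
```

Under this approximation, 1.5 W m−2 corresponds to a CO2 concentration already below today's levels once other warming agents are counted, which illustrates why Lenton looks to shorter-lived forcing agents for near-term headroom.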

The beauty of this approach is that it opens separate policy avenues for different radiative-forcing agents, and regional treaties to control those with regional effects. For example, hydrofluorocarbon emissions could be tackled under a modification of the 1987 Montreal Protocol, which aimed to halt ozone depletion. And emissions of black-carbon aerosols and ozone-producing gases could be regulated under national policies to limit air pollution. This would both break the political impasse on CO2 and help to protect vulnerable elements of the Earth system.

Tim Lenton is professor of Earth system science in the College of Life and Environmental Sciences, University of Exeter, UK. e-mail: t.m.lenton@exeter.ac.uk

 

WikiLeaks wars: Digital conflict spills into real life

WikiLeaks wars: Digital conflict spills into real life – tech – 15 December 2010 – New Scientist.

Editorial: Democracy 2.0: The world after WikiLeaks

WHILE it is not, as some have called it, the “first great cyberwar”, the digital conflict over information sparked by WikiLeaks amounts to the greatest incursion of the online world into the real one yet seen.

In response to the taking down of the WikiLeaks website after it released details of secret diplomatic cables, a leaderless army of activists has gone on the offensive. It might not have started a war, but the conflict is surely a sign of future battles.

No one is quite sure what the ultimate political effect of the leaks will be. What the episode has done, though, is show what happens when the authorities attempt to silence what many people perceive as a force for freedom of information. It has also shone a light on the evolving world of cyber-weapons (see “The cyber-weapon du jour”).

WikiLeaks was subjected to a distributed denial of service (DDoS) attack, which floods the target website with massive amounts of traffic in an effort to force it offline. The perpetrator of the attack is unknown, though an individual calling himself the Jester has claimed responsibility.

WikiLeaks took defensive action by moving to Amazon’s EC2 web hosting service, but the respite was short-lived as Amazon soon dumped the site, saying that WikiLeaks violated its terms of service. WikiLeaks responded via Twitter: “If Amazon are so uncomfortable with the first amendment, they should get out of the business of selling books”.

With WikiLeaks wounded and its founder Julian Assange in custody, a certain section of the internet decided to fight back. Armed with freely available software, activists using the name “Anonymous” launched Operation Avenge Assange, targeting DDoS attacks of their own at the online services that had dropped WikiLeaks.

These efforts have so far had limited success, in part due to the nature of Anonymous. It is not a typical protest group with leaders or an organisational structure, but more of a label that activists apply to themselves. Anonymous has strong ties to 4chan.org, a notorious and anarchic message board responsible for many of the internet’s most popular memes, such as Rickrolling and LOLcats. The posts of unidentified 4chan users are listed as from “Anonymous”, leading to the idea of a collective anonymous campaigning force.

This loose group has previously taken action both on and offline against a number of targets, including Scientologists and the Recording Industry Association of America, but the defence of WikiLeaks is their most high-profile action yet. Kristinn Hrafnsson, a spokesman for WikiLeaks, said of the attacks: “We believe they are a reflection of public opinion on the actions of the targets.”

The “public” have certainly played a key role. The kind of DDoS attacks perpetrated by Anonymous are usually performed by botnets – networks of “zombie” computers hijacked by malicious software and put to use without their owner’s knowledge. Although Anonymous activists have employed traditional botnets in their attacks, the focus now seems to be on individuals volunteering their computers to the cause.

“I think there are two groups of people involved,” says Tim Stevens of the Centre for Science and Security Studies at King’s College London. The first group are the core of Anonymous, who have the technological know-how to bring down websites. The second group are ordinary people angry at the treatment of WikiLeaks and wanting to offer support. “Anonymous are providing the tools for these armchair activists to get involved,” says Stevens.

The human element of Anonymous is both a strength and a weakness. Though the group’s freely available LOIC (Low Orbit Ion Cannon) software makes it easy for anyone to sign up to the cause, a successful DDoS requires coordinated attacks. This is often done through chat channels, where conversations range from the technical – “I have Loic set to 91.121.92.84 and channel set to #loic, is that correct” – to the inane – “please send me some nutella ice cream”.

There are continual disagreements about whom to attack and when, though new tactics also emerge from the chat, such as Leakspin, an effort to highlight some of the less-publicised leaks, and Leakflood, a kind of analogue DDoS that attempts to block corporate fax machines with copies of the cables.

These chat channels are also occasionally knocked offline by DDoS attacks. Some blame “the feds”, but could governments – US or otherwise – actually be involved? (see “Are states unleashing the dogs of cyberwar?”)

The US Department of Defense’s recently launched Cyber Command has a dual remit: to defend US interests online and conduct offensive operations. Cyber Command is meant to defend .mil and .gov web domains, but do commercial websites qualify too? “Is PayPal really that important to national security that the US military would have a role in defending it?” asks Stevens, who also teaches in the Department of War Studies at King’s College London. “The US doesn’t have an answer to that particular conundrum, and they’re not alone – nobody does”.

The difficulty comes in assessing whether DDoS attacks are an act of cyberwar, a cybercrime or more akin to online civil disobedience.

Individual LOIC users may not even be breaking the law. “All that DDoS does is send the normal kind of traffic that a website receives,” says Lilian Edwards, professor of internet law at the University of Strathclyde in Glasgow, UK. “That has always been the legal problem with regulating DDoS – each individual act is in fact authorised by the site, but receiving 10 million of them isn’t.”

It’s hard to say what will happen next. Anonymous might continue its attempt to cause as much disruption as possible, but it could just as easily become fragmented and give up. With no leaders or central structure, it is unlikely to be stopped by a few arrests or server takedowns but may equally find it difficult to coordinate well enough to have an impact.

More worrying is the prospect that more organised groups may follow Anonymous’s example. If that happens, who will be responsible for stopping them – and will they be able to?

Read more: Are states unleashing the dogs of cyber war?