The United Nations has called an emergency meeting to discuss the Horn of Africa drought, which it says has already claimed tens of thousands of lives. Famine was declared in two regions of Somalia on Wednesday where 3.7 million people are in need of urgent humanitarian assistance. Another eight million people need food assistance in neighboring countries including Kenya and Ethiopia. United Nations Secretary-General Ban Ki-moon calls the situation a “catastrophic combination of conflict, high food prices and drought” and has appealed for immediate aid. We go to Nairobi for an update from Kiki Gbeho of the U.N. Office for the Coordination of Humanitarian Affairs. We also speak with Christian Parenti, author of “Tropic of Chaos: Climate Change and the New Geography of Violence.” “This was predicted long ago by people on the ground,” Parenti says. “It’s a combination of war, climate change and very bad policy, particularly an embrace of radical free market policies by regional governments that mean the withdrawal of support for pastoralists, the type of people you saw with their dead cattle.” [includes rush transcript]
Michael Marshall, environment reporter
This is something you don’t see every day: a substantial, carefully researched book on how to reform our manufacturing industries, paired with an album of songs on the same theme.
Let’s start with the book. Sustainable Materials: With Both Eyes Open tackles a particularly thorny question: how can we cut our greenhouse gas emissions to a safe level, without shutting down essential industries? It focuses on steel and aluminium, which between them account for 28 per cent of all industrial emissions, although later chapters briefly consider cement, paper and plastics as well.
This is a follow-up book to David MacKay’s much-vaunted Sustainable Energy – Without the Hot Air. Both feature academics from the University of Cambridge carefully working out how we can transform an emissions-heavy sector of the economy.
The eight authors, led by Julian Allwood and Jonathan Cullen, first take a close look at how steel and aluminium are produced from their respective ores, asking “how much can the metals industry do to clean up its act?” The answer they come up with: “plenty, but nowhere near enough”.
So they take a second approach, asking whether we can redesign the things we make to use less metal, use them for longer, and recycle their components when they wear out. This also offers plenty of options. Reassuringly, when the two approaches are combined, the total emissions cuts are substantial.
Some of the ideas they come up with are so simple that I wondered why no one had thought of them before. For instance, the average fridge lasts about 10 years and gets thrown out when the compressor fails. The compressor is a small part, but it takes so much work to replace that it’s cheaper to buy a new fridge. If fridges were redesigned so that the compressor was easy to replace, they would last far longer. “You shouldn’t have to buy two fridges in your lifetime,” they say.
Of course, this is another example of a solution for climate change that involves huge numbers of people taking concerted action. The problem is people’s disinclination to get off their backsides.
It’s quite a technical book, so it may not have much popular appeal, despite its nicely chatty style. But for policy-makers trying to cut emissions, and anyone in manufacturing, it should be required reading.
And so to the album, a collaboration between Allwood and soprano Adey Grummet, which is much better than it has any right to be. Worthy music on eco-conscious themes can sound like Spinal Tap’s Listen to the Flower People, but With Both Eyes Open actually contains a couple of good tunes.
The strongest songs get away from the details of materials science and become universal. The opening track, You Gotta Start, is an up-tempo number extolling the virtues of having a go, even when you don’t know exactly what you need to do. It’s not just about sustainability.
Similarly, the title track is a passionate call to arms, urging people to move away from blind consumerism. The closing line – “the stuff of life is life and not just stuff” – is better and more relevant than anything Coldplay will write next year.
Given how specialist the subject matter is, I’m not sure how many people the album will really appeal to. Of the 12 songs, I only expect to keep the two I’ve highlighted on my MP3 player. Unfortunately, the rest just restate ideas from the book in a slightly less clear way.
I worry that the album will give people, particularly policy-makers, the impression that the book is somehow flaky and not worth paying attention to. That would be a crying shame, because the book’s lessons are clear, well-supported, and vital.
WATER FALL: Unusually low water levels in many Chinese rivers have contributed to a big drop in hydropower production. Image: Tomasz Dunn/Flickr
SHANGHAI — China has set ambitious goals for itself to develop hydropower to help mitigate the risks of climate change, but increasing extreme weather events likely rooted in climate change are now sabotaging the goals’ foundations.
The latest blow came in September, when many major rivers across China shrank to unusually low levels, with less than 20 percent of their water remaining along some stretches. As a result, the nation’s hydroelectric generation dropped by almost a quarter compared with last year, and the monthly decline in output has widened steadily since July, according to a recent government statement.
As water stocks in key hydro stations decline, the regular dry season is approaching. The resulting stress on hydroelectric generation will last into next year, the statement said.
The Chinese government has yet to explain why the water flows slumped. But experts blame climate change, warning of more droughts to come in areas traditionally blessed with water.
If this expectation comes true, it will hamper China’s hydropower sector, which contributes most of the country’s carbon-free electricity. It will also threaten a national strategy of transmitting electricity from resource-rich western China to the country’s power-hungry manufacturing sector, most of which is in the east.
For Guangdong province, on China’s east coast, this threat has already become a daily reality. Because its western neighbors failed to send as much electricity as usual this year, the manufacturing hub, which has the capacity to produce more than half of the world’s desktop computers and toys, has been forced to ration electricity.
Turbines left high and dry
China Southern Power Grid, the region’s electricity distributor, attributed the energy shortage partly to the slump in hydropower output.
As of July, on average, less than half of the company’s installed hydropower capacity had enough water to turn its turbines, its statistics show. And several major hydro stations, built as part of the west-to-east electricity transmission plan, failed to do their jobs.
Goupitan, the largest hydroelectric station in Guizhou province, reportedly produced only 10 percent of its normal daily output because of shrinking water flows. And at Longtan, a hydro station in the Guangxi region, this year’s lack of rain left the reservoir dozens of meters lower than in previous years.
“This will definitely negatively affect our hydroelectric production from now until next summer,” said Li Yanguang, who handles public relations at the power station. Asked whether next summer, the regular rainy season, could improve the situation, Li answered cautiously.
“This totally depends on weather,” he said. “We can’t predict that.”
Hydro growth plan sticks despite falling power output
But Lin Boqiang, one of China’s leading energy experts, is confident that the nation’s hydroelectric generation will go in only one direction: down.
“If climate change caused this year’s water flow decreases, which I think it did, then its impact [on rivers] will be long term. It will take a toll on China’s hydroelectric output and push up the cost of using it,” explained Lin, who directs the China Center for Energy Economics Research at Xiamen University.
Still, from Lin’s point of view, such setbacks are no match for China’s appetite for tapping more water power. China, already the world’s largest hydropower user, plans to add another 120 gigawatts of capacity by 2015, a crucial step toward greening 15 percent of its power mix by the end of the decade.
Yang Fuqiang, a senior climate and energy expert at the Natural Resources Defense Council, agreed that China’s hydropower plan will stand, though not primarily for energy supply concerns.
Although a climate-resilient approach is factored into the design of hydro projects, China is still likely to suffer a decline in hydroelectric output, Yang said. But the nation can seek more clean energy from the sun and wind, which he says won’t be affected by climate change, and get the electricity generated elsewhere via a smart grid, the advanced transmission infrastructure China has been building.
So what’s the point of keeping hydro?
“In the future, the importance of hydro projects won’t lie in power generation, but in water management,” Yang explained. “They help control floods, keep shipping moving and store water, a function that [water-scarce] China badly needs.”
Reprinted from Climatewire with permission from Environment & Energy Publishing, LLC. www.eenews.net, 202-628-6500
In April and May this year, two small earthquakes struck the UK near the town of Blackpool. Suspicion immediately fell on hydraulic fracturing, known as fracking – a controversial process to extract natural gas by fracturing the surrounding rock. A report has now confirmed that fracking caused the earthquakes.
New Scientist looks at what happened, and whether fracking is likely to cause more earthquakes.
When and where did the earthquakes happen?
A magnitude-2.3 earthquake occurred on 1 April, followed by a magnitude-1.5 quake on 27 May. Both occurred close to the Preese Hall drilling site, where Cuadrilla Resources was using fracking to extract gas from a shale bed.
Initial studies by the British Geological Survey (BGS) suggested that the quakes were linked to Cuadrilla’s fracking activities. The epicentre of the second quake was within 500 metres of the drilling site, at a depth of 2 kilometres. Less information was available on the first quake, but it seems to have been similar.
The link with fracking has now been confirmed by an independent report commissioned by Cuadrilla, Geomechanical Study of Bowland Shale Seismicity, which states: “Most likely, the repeated seismicity was induced by direct injection of fluid into the fault zone.”
The two geologists who wrote the report ran detailed models to show that the fracking could – and most likely did – provoke the quakes.
How did the fracking cause the earthquakes?
Fracking works by injecting huge volumes of water into the rocks surrounding a natural gas deposit. The water fractures the rocks, creating dozens of cracks through which the gas can escape to the surface.
The UK quakes were not caused by the violent rupturing of the rocks, as you might expect, but by the presence of water. This lubricates the rocks and pushes them apart, allowing them to slip past each other. “It’s a bit like oiling the fault,” says Brian Baptie of the BGS.
Seismologists have not been able to find the fault that moved, probably because it is tiny. Baptie says the surface area of the fault is likely to be just 100 metres by 100 metres, and that the rocks moved by about 1 centimetre – the seismological equivalent of a needle in a haystack.
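Baptie’s figures can be sanity-checked with the standard seismological relation between fault size and magnitude: the seismic moment is rigidity times fault area times slip, and the Hanks-Kanamori formula converts that moment into a moment magnitude. This is a rough sketch, not from the article; the rigidity value is an assumed typical figure for crustal rock.

```python
# Back-of-envelope check: does a 100 m x 100 m fault slipping ~1 cm
# produce a quake of roughly the recorded magnitude?
import math

mu = 3.0e10        # shear modulus (rigidity) of rock, Pa -- assumed typical value
area = 100 * 100   # fault surface area, m^2 (Baptie's estimate)
slip = 0.01        # fault slip, m (about 1 centimetre)

m0 = mu * area * slip                       # seismic moment, N*m
mw = (2.0 / 3.0) * math.log10(m0) - 6.07    # Hanks-Kanamori moment magnitude

print(f"moment = {m0:.1e} N*m, Mw = {mw:.1f}")
```

With these assumed inputs the result lands close to magnitude 2.2, consistent with the recorded magnitude-2.3 event, which suggests Baptie’s fault dimensions are plausible.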
So should we expect lots more earthquakes from fracking?
It’s difficult to say. Fracking has been going on in the US for decades, and has become much more common in recent years, yet evidence that it causes earthquakes has so far been elusive. “This is one of the first times felt earthquakes have been associated with fracking,” Baptie says.
The Cuadrilla report says the earthquakes occurred because of a rare combination of circumstances: the fault was already under stress, was brittle enough to fracture and had space for large amounts of water that could lubricate it. The report says this is unlikely to happen again at the Preese Hall site.
Baptie is not so sure. He says small faults are probably common in deep rocks, but go undetected because of their size. “It seems quite possible, given the same injection scheme in the same well, that there could be further earthquakes,” he says.
Cuadrilla is proposing to monitor seismic activity around its fracking site. If earthquakes begin to occur, it could reduce the flow of water into the well, or even pump it back out, preventing the bigger quakes. Baptie says such monitoring is now necessary to avoid further quakes at fracking sites.
Are these earthquakes dangerous?
Not particularly. Magnitude-2.3 earthquakes can shake the ground enough for people to notice, especially if they occur close to the surface, but damage is normally limited to objects falling off shelves.
According to Baptie, the UK gets an average of 15 magnitude-2.3 earthquakes every year, so the quakes produced by the fracking are not out of the ordinary.
Climate change threatens to turn the planet into a stormy, overheated mess: That much we know. But according to 28 leading scientists, greenhouse gas pollution is but one of nine environmental factors critical to humanity’s future. If their boundaries are stretched too far, Earth’s environment could be catastrophically altered — and three have already been broken, with several others soon to follow.
This grim diagnosis, published Wednesday in Nature, is the most ambitious assessment of planetary health to date. It’s a first-draft users’ manual for an era that scientists dub the “anthropocene,” in which nearly seven billion resource-hungry humans have come to dominate ecological change on Earth. The scientists’ quantifications are open to argument, but not the necessity of their perspective.
“It’s a crude attempt to map the environmental space in which we can operate,” said Jon Foley, director of the University of Minnesota’s Institute on the Environment and one of the paper’s lead authors. “We need to keep our activities in a certain range, or the planet could tip into a state we haven’t seen in the history of our civilization.”
Thresholds for atmospheric carbon dioxide and ozone have already been described, and are widely known to the public. But the scientists say five other factors are just as important: ocean acidification, nitrogen and phosphorus pollution, land use, freshwater use and biodiversity. They say chemical pollution and atmospheric aerosols may also be essential, but can’t yet be quantified.
Values for the proposed boundaries are still just estimates, and don’t account for how pushing one could affect another — how, for example, acidification that kills plankton could make it harder for the ocean to absorb CO2 and rebound from nitrogen pollution. Ecological models still can’t capture the entirety of Earth’s biological, geological and chemical processes, and it’s impossible to run whole-Earth experiments — except, arguably, for the experiment that’s going on now.
Despite those uncertainties, one aspect of Earth’s behavior is becoming clear. Records of global transitions between geological ages, and of regional changes between environmental stages, suggest that planet-wide change could happen relatively quickly. It might not take thousands or millions of years for Earth’s environment to be altered. It could happen in centuries, perhaps even decades.
Exactly what Earth would look like is difficult to predict in detail, but it could be radically different from the mild environment that has prevailed for the last 10,000 years. It was temperate stability that nurtured the rise of civilization, and it should continue for thousands of years to come, unless humanity keeps pushing the limits.
“The Earth of the last 10,000 years has been more recognizable than the Earth we may have 100 years from now. It won’t be Mars, but it won’t be the Earth that you and I know,” said Foley. “This is the single most defining problem of our time. Will we have the wisdom to be stewards of a world we’ve come to dominate?”
Foley’s team put the atmospheric carbon dioxide threshold at 350 parts per million, a level the Intergovernmental Panel on Climate Change says should keep Earth’s average temperature from rising by more than four degrees Fahrenheit. Current atmospheric CO2 levels are already approaching 400 parts per million.
Also exceeded are the limits for species loss, which the scientists set at 10 extinctions per year per million species, and for nitrogen use, pegged at 35 million tons per year. The current extinction rate is ten times higher than advised, potentially compromising the ability of ecosystems to process nutrients. The use of nitrogen, which is needed for fertilizer but causes oxygen-choking algae blooms, is nearly four times higher than recommended.
On the positive side, atmospheric levels of ultraviolet radiation-blocking ozone are safe, thanks to a 1987 ban on ozone-destroying chemicals. Total rates of ocean acidification, freshwater consumption and land use are also acceptable, but those thresholds are expected to be exceeded in coming decades.
The seven boundary points are certain to be controversial, and Nature commissioned seven separate critiques by leading experts in each field.
William Schlesinger, president of the Cary Institute of Ecosystem Studies, said the recommended nitrogen limit “seems arbitrary.” Echoing his words was Steve Bass of the International Institute for Environment and Development, who said the 15 percent cap on land devoted to agriculture could as easily be 10 or 20 percent.
International Water Management Institute researcher David Molden said the 4,000 cubic kilometer ceiling on freshwater use — roughly one-third of all freshwater — “may be too high.” Myles Allen, an Oxford University climatologist, argued that CO2 emissions should be counted in a different way. Cristian Samper, director of the Smithsonian’s National Museum of Natural History, said that taxonomic family loss is a more relevant measure than species loss.
According to Foley, who called his team’s threshold values a “cave painting” version of the true limits, the paper is less important for its details than its approach. And though the critics argued over the numbers, all agreed that exceeding them will be disastrous.
“Planetary boundaries are a welcome new approach,” wrote Molden. “It is imperative that we act now on several fronts to avert a calamity far greater than what we envision from climate change.”
Peter Brewer, an ocean chemist at the Monterey Bay Aquarium Research Institute, criticized the paper’s lack of proposed solutions. Given the ongoing failure of governments and citizens to follow their scientists’ advice on climate change, something more than dire warnings is clearly needed.
“Is it truly useful to create a list of environmental limits without serious plans for how they may be achieved?” Brewer wrote. “Without recognition of what would be needed economically and politically to enforce such limits, they may become just another stick to beat citizens with.”
“It’s unsatisfactory, I agree. We don’t answer the question of how to keep humanity from crossing the boundaries,” said Johan Rockstrom, director of the Stockholm Environment Institute and a lead author of the Nature paper. “That’s the next challenge. To stay within planetary boundaries, we need tremendous social transformation.”
Note: The Nature paper is an edited version of the full article, which is available from the Stockholm Resilience Centre.
Citations: “A safe operating space for humanity.” By Johan Rockström, Will Steffen, Kevin Noone, Åsa Persson, F. Stuart Chapin, III, Eric F. Lambin, Timothy M. Lenton, Marten Scheffer, Carl Folke, Hans Joachim Schellnhuber, Björn Nykvist, Cynthia A. de Wit, Terry Hughes, Sander van der Leeuw, Henning Rodhe, Sverker Sörlin, Peter K. Snyder, Robert Costanza, Uno Svedin, Malin Falkenmark, Louise Karlberg, Robert W. Corell, Victoria J. Fabry, James Hansen, Brian Walker, Diana Liverman, Katherine Richardson, Paul Crutzen, Jonathan A. Foley. Nature, Vol. 461 No. 7263, September 24, 2009.
“Thresholds risk prolonged degradation.” By William Schlesinger. Nature, Vol. 461 No. 7263, September 24, 2009.
“Keep off the grass.” By Steve Bass. Nature, Vol. 461 No. 7263, September 24, 2009.
“Tangible targets are critical.” By Myles Allen. Nature, Vol. 461 No. 7263, September 24, 2009.
“Identifying abrupt change.” By Mario J. Molina. Nature, Vol. 461 No. 7263, September 24, 2009.
“The devil is in the detail.” By David Molden. Nature, Vol. 461 No. 7263, September 24, 2009.
“Consider all consequences.” By Peter Brewer. Nature, Vol. 461 No. 7263, September 24, 2009.
“Rethinking biodiversity.” By Cristian Samper. Nature, Vol. 461 No. 7263, September 24, 2009.
What does it mean?
In itself, not much: Seven billion is just a one-digit flicker from 6,999,999,999. But the number carries a deep existential weight, symbolizing themes central to humanity’s relationship with the rest of life on Earth.
For context, let’s consider a few other numbers. The first: 10,000. That’s approximately how many Homo sapiens existed 200,000 years ago, the point at which scientists mark the divergence of our species from the rest of the Homo genus, of which we are the sole survivors.
From those humble origins, humans — thanks to our smarts, long-distance running skills, verbal ability and skill with plants — proliferated at an almost inconceivable rate.
Some may note that, in a big-picture biological sense, humanity has rivals: In total biomass, ants weigh as much as we do, oceanic krill weigh more than both of us combined, and bacteria dwarf us all. Those are interesting factoids, but they obscure a larger point.
Ants and krill and bacteria occupy an entirely different ecological level. A more appropriate comparison is between humans and other apex predators: that is precisely the ecological role humans evolved to play, and the one we still fill beneath our civilized veneer.
According to a back-of-the-envelope calculation, there are about 1.7 million other top-level, land-dwelling, mammalian predators on Earth. Put another way: For every non-human mammal sharing our niche, there are more than 4,000 of us.
In short, humans are Earth’s great omnivore, and our omnivorous nature can only be understood at global scales. Scientists estimate that 83 percent of the terrestrial biosphere is under direct human influence. Crops cover some 12 percent of Earth’s land surface, and account for more than one-third of terrestrial biomass. One-third of all available fresh water is diverted to human use.
Altogether, roughly 20 percent of Earth’s net terrestrial primary production, the sheer volume of life produced on land on this planet every year, is harvested for human purposes — and, to return to the comparative factoids, it’s all for a species that accounts for .00018 percent of Earth’s non-marine biomass.
We are the .00018 percent, and we use 20 percent. The purpose of that number isn’t to induce guilt, or blame humanity. The point of that number is perspective. At this snapshot in life’s history, at — per the insights of James C. Rettie, who imagined life on Earth as a yearlong movie — a few minutes after 11:45 p.m. on December 31, we are big. Very big.
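The back-of-the-envelope ratios above are easy to reproduce. A minimal sketch, using only the article’s own estimates as inputs:

```python
# Reproducing the article's back-of-envelope ratios (all inputs are the
# article's own figures, not independent data).
humans = 7.0e9            # current human population
wild_predators = 1.7e6    # other top-level, land-dwelling mammalian predators

ratio = humans / wild_predators
print(f"humans per wild apex predator: {ratio:,.0f}")  # a little over 4,000

biomass_share = 0.00018   # humanity's share of non-marine biomass, in percent
npp_share = 20.0          # share of net terrestrial primary production used, in percent
print(f"we harvest ~{npp_share / biomass_share:,.0f}x our share of land biomass")
```

The first ratio confirms the “more than 4,000 of us” figure; the second makes vivid just how disproportionate a 20 percent harvest is for a species that is .00018 percent of the biomass.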
But as we’ve become big, much of life has had to get out of the way. When modern Homo sapiens started scrambling out of East Africa, the average extinction rate of other mammals was, in scientific terms, one per million species-years. It is now 100 times that, a rate that threatens a collapse of non-human life on Earth.
In regard to that number, environmentalists usually say that humanity’s fate depends on the life around us. That’s debatable. Humans are adaptable and perfectly capable of living in squalor, without clean air or clean water or birds in the trees. If not, there wouldn’t be 7 billion of us. Conservation is a moral question, and probably not a utilitarian imperative.
But the fact remains that, for all of humanity to experience a material standard of living now enjoyed by a tiny fraction, we’d need four more Earths. It’s just not possible. And that, in the end, is the significance of 7 billion. It’s a challenge.
In just a few minutes of evolutionary time, humanity has become a force to be measured in terms of the entirety of life itself. How do we, the God species, want to live? For the answer, check back at 8 billion.
- Plant owner: A “cold shutdown” of damaged reactors could be completed by the end of the year
- Government officials say the removal of nuclear fuel should begin by 2021
- The panel predicts it will take more than 10 years to remove nuclear fuel
- The damaged reactors might not be retired until at least 2041
Tokyo (CNN) — The decommissioning of four reactors at the crippled Fukushima Daiichi nuclear power plant will likely take more than 30 years to complete, according to a report by Japanese officials.
The draft report, released by Japan’s Atomic Energy Commission of the Cabinet Office on Friday, said the removal of debris — or nuclear fuel — should begin by the end of 2021.
“We set a goal to start taking out the debris within a 10-year period, and it is estimated that it would take 30 years or more (after the cold shutdown) to finish decommissioning because the process at Fukushima would be complicated,” the report states.
Last month, the plant’s owner — Tokyo Electric Power Company — said engineers might be able to complete the cold shutdown of damaged reactors by the end of the year.
Temperatures in the three reactors where meltdowns occurred in the wake of the historic March 11 earthquake and tsunami have already been brought down below 100 degrees Celsius (212 degrees Fahrenheit), but the company has to maintain those conditions for some time before declaring the reactors in cold shutdown, Tokyo Electric spokesman Yoshikazu Nagai said.
Experts have said it will take years — perhaps decades — to fully clean up the worst nuclear disaster since Chernobyl. Hydrogen explosions blew apart the No. 1 and No. 3 reactor housings, while another hydrogen blast is suspected to have damaged the No. 2 reactor. Fires, believed to have been caused by heat from the No. 4 spent fuel pool, damaged that unit’s reactor building.
The atomic energy commission’s report noted it took 10 years to remove nuclear fuel after the 1979 Three Mile Island disaster in the United States. The commission predicted removing fuel at Fukushima would require more time because the situation is more severe.
The disaster at the Fukushima Daiichi nuclear plant in March released far more radiation than the Japanese government has claimed. So concludes a study [1] that combines radioactivity data from across the globe to estimate the scale and fate of emissions from the shattered plant.
The study also suggests that, contrary to government claims, pools used to store spent nuclear fuel played a significant part in the release of the long-lived environmental contaminant caesium-137, which could have been prevented by prompt action. The analysis has been posted online for open peer review by the journal Atmospheric Chemistry and Physics.
Andreas Stohl, an atmospheric scientist with the Norwegian Institute for Air Research in Kjeller, who led the research, believes that the analysis is the most comprehensive effort yet to understand how much radiation was released from Fukushima Daiichi. “It’s a very valuable contribution,” says Lars-Erik De Geer, an atmospheric modeller with the Swedish Defense Research Agency in Stockholm, who was not involved with the study.
The reconstruction relies on data from dozens of radiation monitoring stations in Japan and around the world. Many are part of a global network to watch for tests of nuclear weapons that is run by the Comprehensive Nuclear-Test-Ban Treaty Organization in Vienna. The scientists added data from independent stations in Canada, Japan and Europe, and then combined those with large European and American caches of global meteorological data.
Stohl cautions that the resulting model is far from perfect. Measurements were scarce in the immediate aftermath of the Fukushima accident, and some monitoring posts were too contaminated by radioactivity to provide reliable data. More importantly, exactly what happened inside the reactors — a crucial part of understanding what they emitted — remains a mystery that may never be solved. “If you look at the estimates for Chernobyl, you still have a large uncertainty 25 years later,” says Stohl.
Nevertheless, the study provides a sweeping view of the accident. “They really took a global view and used all the data available,” says De Geer.
Japanese investigators had already developed a detailed timeline of events following the 11 March earthquake that precipitated the disaster. Hours after the quake rocked the six reactors at Fukushima Daiichi, the tsunami arrived, knocking out crucial diesel back-up generators designed to cool the reactors in an emergency. Within days, the three reactors operating at the time of the accident overheated and released hydrogen gas, leading to massive explosions. Radioactive fuel recently removed from a fourth reactor was being held in a storage pool at the time of the quake, and on 14 March the pool overheated, possibly sparking fires in the building over the next few days.
But accounting for the radiation that came from the plant has proved much harder than reconstructing this chain of events. The latest report from the Japanese government, published in June, says that the plant released 1.5 × 10¹⁶ becquerels of caesium-137, an isotope with a 30-year half-life that is responsible for most of the long-term contamination from the plant [2]. A far larger amount of xenon-133, 1.1 × 10¹⁹ Bq, was released, according to official government estimates.
The new study challenges those numbers. On the basis of its reconstructions, the team claims that the accident released around 1.7 × 10¹⁹ Bq of xenon-133, greater than the estimated total radioactive release of 1.4 × 10¹⁹ Bq from Chernobyl. The fact that three reactors exploded in the Fukushima accident accounts for the huge xenon tally, says De Geer.
Xenon-133 does not pose serious health risks because it is not absorbed by the body or the environment. Caesium-137 fallout, however, is a much greater concern because it will linger in the environment for decades. The new model shows that Fukushima released 3.5 × 10¹⁶ Bq of caesium-137, roughly twice the official government figure, and half the release from Chernobyl. The higher number is obviously worrying, says De Geer, although ongoing ground surveys are the only way to truly establish the public-health risk.
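To put the competing caesium-137 figures in perspective, this small sketch compares the two estimates and traces the isotope’s 30-year half-life with the standard exponential decay law. The becquerel figures are those quoted in this article; everything else follows from the half-life.

```python
# Comparing the study's caesium-137 estimate with the official figure,
# then sketching how long the contamination lingers (30-year half-life).
study_cs137 = 3.5e16      # Bq, new model's estimate (as quoted in the text)
official_cs137 = 1.5e16   # Bq, Japanese government estimate

print(f"study / official = {study_cs137 / official_cs137:.1f}x")  # roughly twice

half_life = 30.0  # years, caesium-137
for years in (30, 60, 100):
    remaining = 0.5 ** (years / half_life)   # fraction left after 'years'
    print(f"after {years} years: {remaining:.0%} of the caesium-137 remains")
```

The decay figures show why a 30-year half-life means decades of contamination: even a century on, about a tenth of the released caesium-137 is still present.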
Stohl believes that the discrepancy between the team’s results and those of the Japanese government can be partly explained by the larger data set used. Japanese estimates rely primarily on data from monitoring posts inside Japan [3], which never recorded the large quantities of radioactivity that blew out over the Pacific Ocean and eventually reached North America and Europe. “Taking account of the radiation that has drifted out to the Pacific is essential for getting a real picture of the size and character of the accident,” says Tomoya Yamauchi, a radiation physicist at Kobe University who has been measuring radioisotope contamination in soil around Fukushima.
Stohl adds that he is sympathetic to the Japanese teams responsible for the official estimate. “They wanted to get something out quickly,” he says. The differences between the two studies may seem large, notes Yukio Hayakawa, a volcanologist at Gunma University who has also modelled the accident, but uncertainties in the models mean that the estimates are actually quite similar.
The new analysis also claims that the spent fuel being stored in the unit 4 pool emitted copious quantities of caesium-137. Japanese officials have maintained that virtually no radioactivity leaked from the pool. Yet Stohl’s model clearly shows that dousing the pool with water caused the plant’s caesium-137 emissions to drop markedly (see ‘Radiation crisis’). The finding implies that much of the fallout could have been prevented by flooding the pool earlier.
The Japanese authorities continue to maintain that the spent fuel was not a significant source of contamination, because the pool itself did not seem to suffer major damage. “I think the release from unit 4 is not important,” says Masamichi Chino, a scientist with the Japanese Atomic Energy Authority in Ibaraki, who helped to develop the Japanese official estimate. But De Geer says the new analysis implicating the fuel pool “looks convincing”.
The latest analysis also presents evidence that xenon-133 began to vent from Fukushima Daiichi immediately after the quake, and before the tsunami swamped the area. This implies that even without the devastating flood, the earthquake alone was sufficient to cause damage at the plant.
The Japanese government’s report has already acknowledged that the shaking at Fukushima Daiichi exceeded the plant’s design specifications. Anti-nuclear activists have long been concerned that the government has failed to adequately address geological hazards when licensing nuclear plants (see Nature 448, 392–393; 2007), and the whiff of xenon could prompt a major rethink of reactor safety assessments, says Yamauchi.
The model also shows that the accident could easily have had a much more devastating impact on the people of Tokyo. In the first days after the accident the wind was blowing out to sea, but on the afternoon of 14 March it turned back towards shore, bringing clouds of radioactive caesium-137 over a huge swathe of the country (see ‘Radioisotope reconstruction’). Where precipitation fell, along the country’s central mountain ranges and to the northwest of the plant, higher levels of radioactivity were later recorded in the soil; thankfully, the capital and other densely populated areas had dry weather. “There was a period when quite a high concentration went over Tokyo, but it didn’t rain,” says Stohl. “It could have been much worse.”
Additional reporting by David Cyranoski and Rina Nozawa.
- Stohl, A. et al. Atmos. Chem. Phys. Discuss. 11, 28319-28394 (2011).
- Chino, M. et al. J. Nucl. Sci. Technol. 48, 1129-1134 (2011).
There is much we do not understand about Earth’s climate. That is hardly surprising, given the complex interplay of physical, chemical and biological processes that determines what happens on our planet’s surface and in its atmosphere.
Despite this, we can be certain about some things. For a start, the planet is warming, and human activity is largely responsible. But how much is Earth on course to warm by? What will the global and local effects be? How will it affect our lives?
In these articles, Michael Le Page sifts through the evidence to provide a brief guide to what we currently do – and don’t – know about the planet’s most burning issue.