December 5, 2013
The magnitude 9.0 Tohoku-Oki earthquake that struck Japan on 11 March 2011, killing more than 15,000 people and setting off a devastating tsunami from which the nation is still recovering, raised troubling questions: What made such a powerful earthquake possible, and could it happen again, in Japan or somewhere else?
An international group of scientists that drilled miles beneath the Pacific Ocean and into the earthquake fault now has answers to these questions, reported in a trio of papers published today in Science.
The epicenter of the 2011 quake was in an unusual spot, about 130 kilometers east of Sendai, Japan, just off the nation’s northern coast. This area is a subduction zone, where the Pacific plate dives beneath the Eurasian plate. Strong earthquakes are possible here, but scientists hadn’t thought that there was enough energy to produce one larger than magnitude 7.5. They were wrong, and they’ve been eager to find out more about what made the fault capable of producing such a large quake.
A little over a year after the earthquake, the deep sea drilling vessel Chikyu was tasked with the mission to drill into the fault off the Japanese coast and install a temperature observatory. By taking the temperature of a fault after an earthquake, scientists can measure how much energy was released in the quake and calculate a fault’s friction—how easily the rocks rub against each other.
“One way to look at the friction of these big blocks is to compare them to cross-country skis on snow,” Robert Harris, a study co-author and geophysicist at Oregon State University, said in a statement. “At rest, the skis stick to the snow and it takes a certain amount of force to make them slide. Once you do, the ski’s movement generates heat and it takes much less force to continue the movement…. The same thing happens with an earthquake.”
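The logic of the temperature measurement can be sketched with a back-of-the-envelope calculation. The shear stress, heat capacity and fault-zone thickness below are illustrative assumptions, not values from the study; only the ~50-meter slip comes from this article:

```python
# Rough sketch: frictional heat on a fault plane and the temperature
# anomaly it leaves behind. All numbers except the slip are assumptions.

shear_stress = 0.6e6   # average shear stress during slip, Pa (assumed)
slip = 50.0            # fault slip, m (the ~50 m reported for Tohoku)
rho_c = 3.0e6          # volumetric heat capacity of rock, J/(m^3 K) (typical)
thickness = 10.0       # rock layer the heat has spread into, m (assumed)

heat_per_area = shear_stress * slip            # J/m^2 dissipated by friction
delta_T = heat_per_area / (rho_c * thickness)  # resulting temperature rise, K

print(f"heat released: {heat_per_area:.1e} J/m^2")
print(f"temperature anomaly: {delta_T:.2f} K")
```

Run in reverse, this is the study’s approach: a measured temperature anomaly constrains the heat released, and since the slip is known independently, the shear stress, and hence the friction, follows.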
Getting that temperature measurement was tricky. The Chikyu team had to drill 850 meters into the seafloor, which itself was 6,900 meters below the ocean’s surface. They had to deal with bad weather, and the fault itself was still shifting, putting the instruments at risk.
The difficult work paid off, though: the measurements revealed residual heat from the earthquake, from which the scientists could calculate the fault’s friction. It turned out to be remarkably low. Bottom line: “The Tohoku fault is more slippery than anyone expected,” Emily Brodsky, a study co-author and geophysicist at the University of California, Santa Cruz, said in another statement.
The slippery nature of the fault helps to explain some characteristics of the 2011 quake. The fault slipped an unprecedented 50 meters and the rupture, which began deep underground, reached the surface where it caused a sudden disturbance in the ocean and set off the tsunami.
The drilling and laboratory tests also revealed another characteristic of the fault that made it so dangerous. The low friction can be attributed to incredibly fine clay sediment within the fault. “It’s the slipperiest clay you can imagine,” Christie Rowe, a study co-author and geologist at McGill University, said in a statement. “If you rub it between your fingers, it feels like a lubricant.” Incidentally, the area between the Pacific and Eurasian plates that experiences slip is also very thin, less than five meters across, which would make it the thinnest known fault zone on the planet.
Measuring the earthquake’s thermal signal was a first for science. It “was a major accomplishment,” Harris said, “but there is still a lot we don’t yet know.” For example, researchers don’t yet know how generalizable these results are to other subduction zones across the world or what effect the thinness of fault zones has on earthquake hazards. Nonetheless, the drilling results “suggest that the shallow megathrust at the Japan Trench has special traits not seen in many other subduction zones,” Kelin Wang of Natural Resources Canada and Masataka Kinoshita of the Japan Agency for Marine-Earth Science and Technology—the agency that runs the Chikyu—wrote in an accompanying Perspectives article.
Similar conditions may be rare, but they do exist in some places of the north Pacific, such as the Kamchatka Peninsula in Russia and the Aleutian Islands in Alaska, notes Rowe. Deep sea drilling shows that these regions have the same unusually slippery clay that lowered the friction in the Japan fault.
But the rarity of the Japan fault’s unusual circumstances shouldn’t put scientists, or the public, at ease, Wang and Kinoshita say. Such huge, shallow slip isn’t necessary for a devastating tsunami to form, and it wasn’t what caused either the 2010 Chile tsunami that destroyed 370,000 homes or the 2004 Indian Ocean tsunami that killed nearly 230,000 people. “It’s hard to say how generalizable these results are until we look at other faults,” Brodsky added. “But this lays the foundation for a better understanding of earthquakes and, ultimately, a better ability to identify earthquake hazards.”
November 25, 2013
Official estimates of U.S. emissions of the greenhouse gas methane may be far too low, according to a report published today by the Proceedings of the National Academy of Sciences. Oil and gas production is contributing much more methane than either the U.S. Environmental Protection Agency (EPA) or the best global survey of the greenhouse gas assume.
Carbon dioxide tends to get the most attention in climate change discussions because it’s the greenhouse gas most responsible for the changes we’re now seeing on Earth. But methane (CH4) is also a potent heat-trapper: pound for pound, it traps 70 times more heat than carbon dioxide (CO2). Methane has a shorter atmospheric lifespan, however, sticking around for only about ten years, compared to a century for CO2.
Like carbon dioxide, methane has been on the rise. Atmospheric concentrations of CH4 have increased from around 680 to 715 parts per billion (ppb) before the Industrial Revolution to approximately 1,800 ppb today. Determining where all that extra methane is coming from is important for efforts to reduce greenhouse gas emissions and limit future climate change effects.
The EPA currently lists livestock production as the biggest methane contributor, followed by, in order, natural gas production, landfills and coal mining. Methane measurements made from aircraft, however, are calling that order, and the EPA’s methane estimates, into question. The EPA and the Emissions Database for Global Atmospheric Research (EDGAR) both use a “bottom-up” method of estimating methane, which depends on taking samples and calculating how much methane comes from known emitters, such as livestock herds and petroleum fields, then adding it all up. The aircraft studies take a “top-down” approach instead, starting with measurements of methane in atmospheric samples.
In the new study, Scot M. Miller of Harvard University and colleagues used aircraft-based sampling and a National Oceanic and Atmospheric Administration/Department of Energy air-sampling network to tally 12,694 observations of methane from across the United States in 2007 and 2008. They then used those observations and a computer model to create estimates of monthly methane emissions. The analysis found large differences between their observations and the EPA and EDGAR estimates: The new figures were 1.5 times greater than those of the EPA and 1.7 times those from EDGAR.
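The difference between the two accounting approaches can be illustrated with a toy calculation. The category totals below are invented for illustration; only the roughly 1.5-fold gap mirrors the study’s nationwide finding:

```python
# Toy comparison of "bottom-up" and "top-down" methane accounting.
# All figures are illustrative (Tg CH4/yr), not real inventory values.

bottom_up_inventory = {
    "livestock": 9.0,      # biggest category in the EPA-style inventory
    "natural_gas": 7.0,
    "landfills": 6.0,
    "coal_mining": 3.0,
}
inventory_total = sum(bottom_up_inventory.values())  # sum over known sources

# A top-down estimate instead starts from atmospheric measurements and
# infers the total emissions needed to explain them (assumed value here).
top_down_total = 37.5

scaling = top_down_total / inventory_total
print(f"bottom-up: {inventory_total:.1f} Tg/yr, "
      f"top-down: {top_down_total:.1f} Tg/yr, ratio: {scaling:.1f}x")
```

When the top-down total exceeds the bottom-up sum, as in the study, either some known source categories are underestimated or some sources are missing from the inventory entirely.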
Nearly a quarter of the nation’s methane emissions came from just three states—Texas, Oklahoma and Kansas. The estimates for CH4 emissions from these three states were 2.7 times higher than those of EDGAR. “Texas and Oklahoma were among the top five natural gas producing states in the country in 2007,” the researchers note in their paper. The team was able to trace the methane to oil and gas production not simply through coincidences of geography but also because their observations found propane in the atmosphere above certain areas of these states. Propane is not produced by methane sources such as livestock or landfills–rather, it is released during fossil fuel extraction. Thus, its presence indicates that some fraction of the methane over those regions must come from fossil fuels.
“This is the first study to quantify methane emissions at regional scales within the continental United States with enough spatial resolution to significantly criticize the official inventories,” study co-author Marc L. Fischer, of the University of California Berkeley, said in a statement. “Even if we made emissions from livestock several times higher than inventory estimates would suggest for the southwest, you still don’t get enough to cover what’s actually being observed. That’s why it looks like oil and gas are likely responsible for a large part of the remainder…Cows don’t produce propane; oil and gas does.”
Cow farts aren’t getting off the hook here, and clearly the oil and gas industry is already known to be a big contributor to climate change. But one of the selling points of natural gas has been that it is more climate-friendly–or at least less climate-damaging–than other forms of fossil fuels, such as coal. If producing that natural gas results in more methane emissions than currently assumed, then it might not be such a good choice after all.
November 20, 2013
The world’s inland waterways move more than just water; they play a pivotal role in the global carbon cycle, soaking up carbon from the land and releasing it into the atmosphere as carbon dioxide. But are rivers or lakes bigger greenhouse gas contributors? A study published today in Nature finds that, cumulatively, rivers and streams release about five times more carbon dioxide than all the world’s lakes and reservoirs, even though the latter cover far more of the Earth’s surface.
Figuring out how much carbon dioxide these water bodies contribute to the carbon cycle is a complex task. Scientists have to determine the global surface area of the world’s lakes, streams, rivers and other water bodies. Then, they have to figure out how much carbon dioxide those bodies hold, and how quickly that carbon is transferred from water to atmosphere, a factor called the gas-transfer velocity. Uncertainties and a lack of data in all three areas have hamstrung efforts to determine exactly how much carbon inland waters are releasing.
To get better estimates, a team led by biogeochemist Peter Raymond of the Yale School of Forestry and Environmental Studies had to create more detailed data sets for all three parameters. They revised a census of lakes and reservoirs, and drew on data from sources as varied as space-shuttle missions and U.S. river monitors to determine the extents of global waterways. Inland waters are generally supersaturated with carbon dioxide, but how much carbon the waters hold differed by type. Gas-transfer velocities had been determined in earlier experiments; factors such as turbulence and lake size played a role in how quickly carbon dioxide moved through the system.
The researchers calculated that all the planet’s inland waters contribute about 2.1 gigatonnes of carbon to the atmosphere each year. Rivers and streams, which cover some 241,000 square miles (624,000 square kilometers) of the Earth, release about 1.8 gigatonnes of carbon each year. Another 0.32 gigatonnes come from lakes and reservoirs, which account for 1,200,000 square miles (3,000,000 square kilometers). These estimates were about twice as high as any done previously, the researchers note. However, the results are in line with detailed studies that have been done of places like the Amazon and temperate regions. To put this all in perspective, humans are expected to contribute about 36 gigatonnes of carbon dioxide (roughly 9.8 gigatonnes of carbon) to the atmosphere in 2013.
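The figures above make the contrast between rivers and lakes concrete: dividing each flux by its surface area shows how much more carbon a square kilometer of river releases than a square kilometer of lake. This uses only numbers quoted in this article:

```python
# Per-area CO2 flux from the article's figures: rivers release far more
# carbon per square kilometer than lakes and reservoirs do.

river_flux_gt = 1.8          # Gt carbon per year from rivers and streams
river_area_km2 = 624_000     # km^2 of global river and stream surface

lake_flux_gt = 0.32          # Gt carbon per year from lakes and reservoirs
lake_area_km2 = 3_000_000    # km^2 of global lake and reservoir surface

river_per_area = river_flux_gt / river_area_km2   # Gt C / yr / km^2
lake_per_area = lake_flux_gt / lake_area_km2

total = river_flux_gt + lake_flux_gt              # ~2.1 Gt C / yr overall
ratio = river_per_area / lake_per_area

print(f"total inland-water flux: {total:.2f} Gt C/yr")
print(f"rivers release ~{ratio:.0f}x more CO2 per unit area than lakes")
```

The roughly 27-fold difference per unit area is consistent with the authors’ description of streams and rivers as hotspots for exchange.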
“Understanding the relative importance of these sources is crucial to the global carbon budget,” the researchers write. “A flux of 1.8 [gigatonnes of carbon per year] for streams and rivers is large considering their small surface area, reinforcing the concept that streams and rivers are hotspots for exchange.” In addition to giving researchers a better overall picture, the study highlights locations that are the biggest contributors of carbon dioxide released through rivers, such as Southeast Asia, the Amazon, Europe and southeast Alaska.
There are still uncertainties in these calculations, however. The researchers left out the world’s wetlands because, with their vegetation, they function in a very different manner than open bodies of water–a wetland’s canopy can alter the movement of carbon dioxide into the atmosphere. There’s also a need for even better data than is currently available. “Because tropical regions are seriously under-represented in global data sets, additional studies of carbon concentrations in the predicted hotspot areas in the tropics are urgently needed,” Bernhard Wehrli, a biogeochemist at the Swiss Federal Institute of Technology in Zurich, writes in an accompanying News & Views article.
Plus, Wehrli notes, humans have been altering waterways for hundreds of years—damming them, draining them, channeling them. Some of these constructions, such as turbine releases associated with dams, along with natural features such as waterfalls, can be places of high gas emission. Others, such as human-made channels and drained wetlands, have produced such altered systems that they act very differently from the natural systems on which models of carbon budgets are based.
These uncertainties, however, give much food for thought. Do certain agricultural practices promote the transfer of carbon to rivers, which then escapes into the atmosphere as carbon dioxide? How much does the unnatural alteration of our waterways contribute to the amount of carbon dioxide released by rivers? Answering these questions will help scientists understand the degree to which human behavior is increasing greenhouse gas emission rates, giving us a fuller picture of the causes of human-induced climate change and where efforts to reduce carbon emissions might have the greatest effect.
November 14, 2013
When it comes to deforestation, Brazil’s Amazon often tops the list of places to worry about. New maps of global forest loss, however, find plenty of other sites throughout the world that should be of even bigger concern. Angola, Zambia, Bolivia, Paraguay and Malaysia all have high rates of forest loss, but the situation is perhaps worst in Indonesia, where the rate of deforestation may soon exceed that in Brazil.
On a global scale, the planet lost 888,000 square miles of forest and gained 309,000 square miles of new forest between 2000 and 2012, a team of researchers led by remote sensing scientist Matthew Hansen of the University of Maryland, College Park reports today in Science. That’s a net forest loss equivalent to all the land in Alaska.
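The Alaska comparison follows from simple arithmetic on the study’s figures. Alaska’s land area below is a commonly cited approximate value, not a number from the paper:

```python
# Net global forest change, 2000-2012, from the figures quoted above.

forest_lost_mi2 = 888_000     # square miles of forest lost
forest_gained_mi2 = 309_000   # square miles of new forest gained

net_loss_mi2 = forest_lost_mi2 - forest_gained_mi2

alaska_land_mi2 = 570_000     # approximate land area of Alaska (assumption)

print(f"net loss: {net_loss_mi2:,} sq mi "
      f"(~{net_loss_mi2 / alaska_land_mi2:.2f}x Alaska's land area)")
```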
“Losses or gains in forest cover shape many important aspects of an ecosystem including climate regulation, carbon storage, biodiversity and water supplies, but until now there has not been a way to get detailed, accurate, satellite-based and readily available data on forest cover change from local to global scales,” Hansen said in a statement.
Hansen’s team began with a collection of more than 650,000 images taken by the Landsat 7 Earth-imaging satellite from 1999 to 2012 and housed in the Google Earth Engine, a cloud-computing platform that was created for just this kind of thing—planetary analyses of environmental characteristics, accomplished at amazing speeds. They tasked the engine to monitor vegetation taller than 16 feet (5 meters) across the globe as it appeared and disappeared through time. The result was a set of highly detailed maps showing forest extent, loss, gain and net change at a resolution of a mere 98 feet (30 meters).
The maps reveal a variety of stories taking place around the world. Tropical forests accounted for nearly a third of global deforestation, as humans stripped forest lands, both legally and illegally. Deforestation in those regions is a particular concern–tropical forests are home to many unique species that can be endangered or lost entirely when their forest homes are destroyed. What’s more, depending on the scale and patchiness of the tree loss, rainfall can either intensify or decrease, either of which can have devastating consequences, such as flood or drought. And the lost vegetation can no longer serve as a sink for atmospheric carbon–the carbon stays in the atmosphere and intensifies climate change.
The rate of deforestation recorded by the study varied from nation to nation. Indonesia witnessed a doubling of forest loss in just a decade. In Brazil, by contrast, deforestation slowed from a pace of more than 15,400 square miles per year in 2003 and 2004 to a rate less than half that in 2010 and 2011, confirming that efforts in that country to reduce forest loss, including the combating of illegal logging, are seeing success. Despite the decline, however, Brazil still suffers a lot of tree loss—the second highest total globally. And when combined with deforestation going on in other nations on that continent, such as Argentina, Bolivia and Paraguay, about half of tropical forest loss occurred in South America, Hansen’s team calculated.
Another way to look at the scope of tropical deforestation is to calculate loss as a percentage of a nation’s total land area. In that ranking, Brazil doesn’t look too bad since it’s a country with a large land area. Malaysia, Cambodia, Cote d’Ivoire, Tanzania, Argentina and Paraguay experienced a much greater loss of forest as a share of all their land.
Determining the extent of forest loss can be helpful for reducing it in the future, the researchers note. “Brazil’s use of Landsat data in documenting trends in deforestation was crucial to its policy formulation and implementation,” they write in their paper. “The maps and statistics we present can be used as an initial reference point for a number of countries lacking such data.”
The maps also reveal the small and large stories of forest growth and loss taking place in other regions around the world, highlighting places such as the American Southeast, where large portions of forest are lost and regrown in short periods of time; the region is a much bigger player in the timber industry than the more famous Northwest U.S. In Alaska, Canada and Russia—with Russia home to the world’s greatest total extent of forest loss, simply due to that nation’s size—one can see how slowly these high-latitude forests recover from events such as wildfires. The maps even allow the detection of smaller events, such as the mountain pine beetle infestation in British Columbia and a powerful windstorm that leveled forests in southwestern France.
“With our global mapping of forest changes every nation has access to this kind of information, for their own country and the rest of the world,” Hansen said. Whether they follow in Brazil’s footsteps and use the data to work towards conserving these important ecosystems will be a question for the future.
October 29, 2012
As much as we might not like to admit it, humans make snap judgments based on appearances all the time. And that’s true even when it comes to cats. White Persians are snooty. Black cats are evil or unlucky. Some shelters even suspend adoptions of black cats and white cats around Halloween in fear of what misguided people might do with the kitties.
In a new study published in Anthrozoos, researchers from California State University and the New College of Florida set out to discover our hidden kitty biases with an Internet-based survey of nearly 200 people. They asked the participants to associate 10 personality terms (active, aloof, bold, calm, friendly, intolerant, shy, stubborn, tolerant and trainable) with five cat colors–orange, tri-colored (tortoiseshells and calico cats), white, black and bi-colored (white and anything else).
Some trends appeared in the data. Orange kitties were perceived as friendly and rated low in the aloof and shy categories. (They were also considered more trainable than were white cats, although the idea that anyone considers a cat trainable is kind of funny. Or am I betraying my own bias here?) Tri-colored cats rated high in aloofness and intolerance, and white cats were also considered aloof, as well as shy and calm. And bi-colored cats–which could have been any color, really, in the participants’ minds–were thought to be friendly. The data for black cats, however, was a bit muddier and no clear trends emerged.
Despite people’s perceptions that there are links between coat color and how a cat will behave, there is little hard evidence that such a connection is real. “But there are serious repercussions for cats if people believe that some cat colors are friendlier than others,” Mikel Delgado, lead author of the study and a doctoral student in psychology at the University of California, Berkeley, said in a statement.
That’s because when people are choosing a cat, they may make assumptions based on coat color about how that cat will behave in the home. But when they take the kitty home and he isn’t as friendly or cuddly or sedentary as they had hoped, the cat may be returned to the shelter. At least a million cats end up in shelters each year; many of them are euthanized.
And these biases have repercussions for cats of certain colors. A 2002 study in the Journal of Applied Animal Welfare Science, for example, found that black cats and brown cats were the least likely to be adopted. Dark cats were also more likely to be euthanized. And despite there being little genetic evidence that the genes guiding the coloring and patterning of a cat’s coat also influence its behavior, the study found that people frequently believed that tortoiseshells had too much attitude (or “tortitude”), which may explain why they don’t get adopted quickly or get returned to the shelter.
But it’s difficult to cut through people’s biases. So shelters will have to work extra hard to educate prospective kitty adopters about cats and cat behavior. “You can’t judge a cat by its color,” Berkeley East Bay Humane Society cat coordinator Cathy Marden said in a statement. “If someone comes in to adopt, we encourage them to spend time with all the cats, because it’s the personality of that cat–not the color–that will let you know if the animal’s the right fit for you.”
And if a black cat crosses your path this week, don’t get frightened. He’s no more likely to be evil than the cat you have at home.