December 5, 2013
The magnitude 9.0 Tohoku-Oki earthquake that struck Japan on 11 March 2011, killing more than 15,000 people and setting off a devastating tsunami from which the nation is still recovering, raised troubling questions: What made such a powerful earthquake possible, and could it happen again, in Japan or somewhere else?
An international group of scientists who drilled miles beneath the Pacific Ocean and into the earthquake fault now has answers to these questions, reported in a trio of papers published today in Science.
The epicenter of the 2011 quake was in an unusual spot, about 130 kilometers east of Sendai, Japan, just off the nation’s northern coast. The area is a subduction zone, where the Pacific plate dives beneath the Eurasian plate. Strong earthquakes are possible here, but scientists hadn’t thought the fault stored enough energy to produce one larger than magnitude 7.5. They were wrong, and they’ve been eager to find out what made the fault capable of producing such a large quake.
A little over a year after the earthquake, the deep sea drilling vessel Chikyu was tasked with the mission to drill into the fault off the Japanese coast and install a temperature observatory. By taking the temperature of a fault after an earthquake, scientists can measure how much energy was released in the quake and calculate a fault’s friction—how easily the rocks rub against each other.
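The reasoning behind that measurement can be sketched with a standard energy balance, using illustrative notation rather than the papers’ exact formulation: the heat left in the rock constrains the shear stress that resisted slip, and normalizing that stress gives the friction.

```latex
% E: frictional heat per unit fault area; \bar{\tau}: average shear stress
% resisting slip; D: total slip; \sigma_n': effective normal stress.
E \approx \bar{\tau}\,D
\qquad\Longrightarrow\qquad
\mu \approx \frac{\bar{\tau}}{\sigma_n'} = \frac{E}{D\,\sigma_n'}
```

The residual temperature anomaly pins down E; with the slip D known independently from seismic and geodetic data, the apparent friction coefficient falls out.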
“One way to look at the friction of these big blocks is to compare them to cross-country skis on snow,” Robert Harris, a study co-author and geophysicist at Oregon State University, said in a statement. “At rest, the skis stick to the snow and it takes a certain amount of force to make them slide. Once you do, the ski’s movement generates heat and it takes much less force to continue the movement…. The same thing happens with an earthquake.”
Getting that temperature measurement was tricky. The Chikyu team had to drill 850 meters into the seafloor, which itself was 6,900 meters below the ocean’s surface. They had to deal with bad weather, and the fault itself was still shifting, putting the instruments at risk.
The difficult work paid off, though: the measurements revealed residual heat from the earthquake, from which the scientists calculated the fault’s friction. It turned out to be very low. Bottom line: “The Tohoku fault is more slippery than anyone expected,” Emily Brodsky, a study co-author and geophysicist at the University of California, Santa Cruz, said in another statement.
The slippery nature of the fault helps to explain some characteristics of the 2011 quake. The fault slipped an unprecedented 50 meters, and the rupture, which began deep underground, reached the surface, where it suddenly displaced the ocean and set off the tsunami.
The drilling and laboratory tests also revealed another characteristic of the fault that made it so dangerous. The low friction can be attributed to incredibly fine clay sediment within the fault. “It’s the slipperiest clay you can imagine,” Christie Rowe, a study co-author and geologist at McGill University, said in a statement. “If you rub it between your fingers, it feels like a lubricant.” Notably, the slipping zone between the Pacific and Eurasian plates is also very thin, less than five meters across, which would make it the thinnest known fault zone on the planet.
Measuring the earthquake’s thermal signal was a first for science. It “was a major accomplishment,” Harris said, “but there is still a lot we don’t yet know.” For example, researchers don’t yet know how generalizable these results are to other subduction zones across the world or what effect the thinness of fault zones has on earthquake hazards. Nonetheless, the drilling results “suggest that the shallow megathrust at the Japan Trench has special traits not seen in many other subduction zones,” Kelin Wang of Natural Resources Canada and Masataka Kinoshita of the Japan Agency for Marine-Earth Science and Technology—the agency that runs the Chikyu—wrote in an accompanying Perspectives article.
Similar conditions may be rare, but they do exist in some places in the north Pacific, such as the Kamchatka Peninsula in Russia and the Aleutian Islands in Alaska, Rowe notes. Deep-sea drilling shows that these regions have the same unusually slippery clay that lowered the friction in the Japan fault.
But even if the Japan fault’s unusual circumstances are rare, that shouldn’t put scientists, or the public, at ease, Wang and Kinoshita say. Such huge, shallow slip isn’t necessary for a devastating tsunami to form, and it wasn’t what caused either the 2010 Chile tsunami that destroyed 370,000 homes or the 2004 Indian Ocean tsunami that killed nearly 230,000 people. “It’s hard to say how generalizable these results are until we look at other faults,” Brodsky added. “But this lays the foundation for a better understanding of earthquakes and, ultimately, a better ability to identify earthquake hazards.”
November 14, 2013
When it comes to deforestation, Brazil’s Amazon often tops the list of places to worry about. New maps of global forest loss, however, find plenty of other sites around the world that warrant even greater concern. Angola, Zambia, Bolivia, Paraguay and Malaysia all have high rates of forest loss, but the situation is perhaps worst in Indonesia, where the rate of deforestation may soon exceed Brazil’s.
On a global scale, the planet lost 888,000 square miles of forest and gained 309,000 square miles of new forest between 2000 and 2012, a team of researchers led by remote-sensing scientist Matthew Hansen of the University of Maryland, College Park reports today in Science. That’s a net loss of about 579,000 square miles of forest, equivalent to all the land in Alaska.
“Losses or gains in forest cover shape many important aspects of an ecosystem including climate regulation, carbon storage, biodiversity and water supplies, but until now there has not been a way to get detailed, accurate, satellite-based and readily available data on forest cover change from local to global scales,” Hansen said in a statement.
Hansen’s team began with a collection of more than 650,000 images taken by the Landsat 7 Earth-imaging satellite from 1999 to 2012 and housed in the Google Earth Engine, a cloud-computing platform created for just this kind of work—planetary-scale analyses of environmental characteristics, accomplished at amazing speeds. They tasked the engine with monitoring vegetation taller than 16 feet (5 meters) across the globe as it appeared and disappeared through time. The result was a set of highly detailed maps showing forest extent, loss, gain and net change at a resolution of a mere 98 feet (30 meters).
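To give a feel for the kind of query the platform makes possible, here is a minimal sketch using the Earth Engine Python API. It is not code from the study: the dataset asset ID (the published Hansen maps are distributed through Earth Engine under versioned IDs) and the country-boundary collection are assumptions for illustration.

```python
# Minimal sketch: sum forest-loss area over one country in Earth Engine.
# Asset IDs below are illustrative assumptions; check the Earth Engine catalog.
import ee

ee.Initialize()

# Hansen et al. global forest change maps (versioned asset ID assumed).
gfc = ee.Image('UMD/hansen/global_forest_change_2015')

# The 'loss' band flags 30 m pixels where forest was lost during the period;
# multiplying by per-pixel area converts the flags to square meters.
loss_area = gfc.select('loss').multiply(ee.Image.pixelArea())

# Hypothetical region of interest: Indonesia, from a public boundary dataset.
indonesia = ee.FeatureCollection('USDOS/LSIB_SIMPLE/2017') \
              .filter(ee.Filter.eq('country_na', 'Indonesia'))

total = loss_area.reduceRegion(
    reducer=ee.Reducer.sum(),
    geometry=indonesia.geometry(),
    scale=30,          # native resolution of the maps
    maxPixels=1e13,
)

print('Forest loss, sq km:', total.getInfo()['loss'] / 1e6)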
The maps reveal a variety of stories taking place around the world. Tropical forests accounted for nearly a third of global deforestation as humans stripped forest lands, both legally and illegally. Deforestation in those regions is a particular concern–tropical forests are home to many unique species that can be endangered or lost entirely when their forest homes are destroyed. What’s more, depending on the scale and patchiness of the tree loss, rainfall can either intensify or decrease, and either change can have devastating consequences, such as flood or drought. And the lost vegetation can no longer act as a sink for atmospheric carbon–the carbon stays in the atmosphere and intensifies climate change.
The rate of deforestation recorded by the study varied from nation to nation. Indonesia saw its forest loss double in just a decade. In Brazil, by contrast, deforestation slowed from a pace of more than 15,400 square miles per year in 2003 and 2004 to less than half that rate in 2010 and 2011, confirming that the country’s efforts to reduce forest loss, including the combating of illegal logging, are seeing success. Despite the decline, however, Brazil still loses a great deal of forest—the second-highest total globally. And when that loss is combined with deforestation in other nations on the continent, such as Argentina, Bolivia and Paraguay, about half of all tropical forest loss occurred in South America, Hansen’s team calculated.
Another way to look at the scope of tropical deforestation is to calculate loss as a percentage of a nation’s total land area. In that ranking, Brazil fares better, simply because its land area is so large. Malaysia, Cambodia, Côte d’Ivoire, Tanzania, Argentina and Paraguay each lost a much greater share of their land to deforestation.
Determining the extent of forest loss can be helpful for reducing it in the future, the researchers note. “Brazil’s use of Landsat data in documenting trends in deforestation was crucial to its policy formulation and implementation,” they write in their paper. “The maps and statistics we present can be used as an initial reference point for a number of countries lacking such data.”
The maps also reveal the small and large stories of forest growth and loss taking place in other regions around the world, highlighting places such as the American Southeast, where large portions of forest are lost and regrown in short periods of time; the region is a much bigger player in the timber industry than the more famous Northwest U.S. In Alaska, Canada and Russia—the last home to the world’s greatest total extent of forest loss, simply because of that nation’s size—one can see how slowly these high-latitude forests recover from events such as wildfires. The maps even allow the detection of smaller events, such as the mountain pine beetle infestation in British Columbia and a powerful windstorm that leveled forests in southwestern France.
“With our global mapping of forest changes every nation has access to this kind of information, for their own country and the rest of the world,” Hansen said. Whether nations follow in Brazil’s footsteps and use the data to conserve these important ecosystems remains a question for the future.
October 1, 2013
Last Tuesday, a magnitude 7.7 earthquake hit Pakistan, causing widespread destruction, creating a new island off the country’s coastline and killing at least 515 people.
Of course, there’s nothing we can do to prevent such disasters—earthquakes result from the shifting and collision of enormous, continent-scale tectonic plates over which we have no control. If we know a massive quake is about to strike, though, there may be measures we can take to better protect ourselves.
But how could we possibly know when a quake is about to hit? Seismologists are extremely good at characterizing the overall hazards that those living in fault zones face, but they are far from being able to predict exactly when an earthquake will strike, and they may never be able to.
Undeterred, several different teams of scientists are hatching plans for a new kind of solution. And the key to their success may be the smartphone in your pocket.
Their idea takes advantage of the fact that most new smartphones include a tiny chip called an accelerometer. These chips measure the movement of the phone in three directions (up-down, left-right, and backward-forward) to customize your experience as you use the phone—for example, rotating the display if you turn the device.
As it happens, seismometers (the large, expensive instruments used by geologists to detect and measure earthquakes) do essentially the same thing, albeit with much more accuracy. Still, the tiny accelerometers we already carry around with us all the time could allow scientists to gather much more real-time data than is currently available—there are vastly more smartphones than seismometers, they’re much cheaper and they’re already deployed in a wide range of locations—if they can actually measure earthquake movement with sufficient precision.
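To see how raw accelerometer samples might be turned into an earthquake trigger, here is a minimal sketch of a short-term-average/long-term-average (STA/LTA) detector, a standard seismological technique. None of this is the researchers’ code; the window lengths and threshold are invented for illustration, and `samples` is assumed to be a stream of (x, y, z) readings in m/s².

```python
# Sketch: screen three-axis accelerometer samples for earthquake-like shaking
# with a classic STA/LTA trigger. Parameters are illustrative assumptions.
import math
from collections import deque

def shaking(x, y, z):
    """Net acceleration: vector magnitude of the three axes minus gravity."""
    return abs(math.sqrt(x*x + y*y + z*z) - 9.81)

def sta_lta_trigger(samples, sta_len=50, lta_len=1000, ratio=4.0):
    """Yield True for each sample where short-term shaking far exceeds background."""
    sta, lta = deque(maxlen=sta_len), deque(maxlen=lta_len)
    for x, y, z in samples:
        a = shaking(x, y, z)
        sta.append(a)
        lta.append(a)
        if len(lta) < lta_len:               # still learning the background level
            yield False
            continue
        background = sum(lta) / lta_len or 1e-9   # guard against divide-by-zero
        yield (sum(sta) / sta_len) / background > ratio
```

The design idea is that steady background jitter (walking, driving) raises the long-term average, so only a sudden, sustained jump in shaking trips the trigger.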
Recently, Antonino D’Alessandro and Giuseppe D’Anna, a pair of seismologists at Italy’s Istituto Nazionale di Geofisica e Vulcanologia, set out to resolve this question. To assess the accelerometers—specifically, the LIS331DLH MEMS accelerometer used in iPhones—the duo placed five iPhones on a vibrating table in a variety of positions (flat, angled on top of a wedge-shaped block, and vertical) and compared the phones’ recordings with data from a professional-quality earthquake sensor used as a reference.
Their results, published Sunday in the Bulletin of the Seismological Society of America, showed that the iPhone accelerometers performed even better than they expected. “When we compared the signals, we were pleasantly surprised by the result—the recordings were virtually identical,” D’Alessandro says. “An accelerometer that costs a few dollars was able to record acceleration with high fidelity, very similar to a professional accelerometer that costs a few thousand.”
There are some limitations: the iPhone accelerometers aren’t as sensitive to weak vibrations, so during the tests, they were only able to record movements corresponding to earthquakes of magnitude 5 or higher. But “these limits will be overcome in the near future,” says D’Alessandro. “Because these chips are widely used in laptops, games controllers and mobile phones, research into improving them is going on around the world.”
The next step would be developing software to allow normal users to harness these accelerometers’ capabilities, turning their smartphones into mobile earthquake sensing systems. Last December, Berkeley researchers announced plans to develop an app that would allow users to donate their accelerometer data to earthquake research. Stanford’s Quake-Catcher Network and Caltech’s Community Seismic Network—both of which use small purpose-built seismometers that are distributed to volunteers and plugged into their computers—could serve as a model for this sort of network.
Once in place, the network would be able to gather a huge amount of data from thousands of geographically dispersed users, allowing researchers to track in finer detail how quakes propagate. If enough phones join the network, emergency workers may be able to quickly gauge where to devote their time most efficiently after a quake hits.
But how do you go from documenting earthquakes to warning people before dangerous shaking occurs? As The Atlantic points out, the key is that earthquakes are actually made up of two types of waves that ripple through the earth: P-waves, which arrive first and are difficult for humans to sense, and S-waves, which typically arrive a few seconds later and cause the majority of the physical damage.
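That arrival gap is what an alert would exploit. As a rough worked example, with typical crustal speeds of about 6 km/s for P-waves and 3.5 km/s for S-waves (illustrative values), detecting the P-wave at the epicenter could buy a city 100 kilometers away roughly this much warning:

```latex
t_{\text{warn}} \approx d\left(\frac{1}{v_S} - \frac{1}{v_P}\right)
  = 100\,\text{km}\times\left(\frac{1}{3.5} - \frac{1}{6}\right)\frac{\text{s}}{\text{km}}
  \approx 12\ \text{s}
```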
If we had software installed on our phones that automatically detected strong P-waves and sounded an alarm, we might have a few scant seconds to take cover before the S-waves hit (officials recommend dropping to the ground, huddling under a stable table or desk and getting away from windows and doors). It’s not much, but in some cases, just a few crucial seconds of warning could make all the difference.
September 24, 2013
When it comes to calculating the likelihood of catastrophic weather, one group has an obvious and immediate financial stake in the game: the insurance industry. And in recent years, the industry researchers who attempt to determine the annual odds of catastrophic weather-related disasters—including floods and wind storms—say they’re seeing something new.
“Our business depends on us being neutral. We simply try to make the best possible assessment of risk today, with no vested interest,” says Robert Muir-Wood, the chief scientist of Risk Management Solutions (RMS), a company that creates software models to allow insurance companies to calculate risk. “In the past, when making these assessments, we looked to history. But in fact, we’ve now realized that that’s no longer a safe assumption—we can see, with certain phenomena in certain parts of the world, that the activity today is not simply the average of history.”
This pronounced shift can be seen in extreme rainfall events, heat waves and wind storms. The underlying reason, he says, is climate change, driven by rising greenhouse gas emissions. Muir-Wood’s company is responsible for figuring out just how much more risk the world’s insurance companies face as a result of climate change when homeowners buy policies to protect their property.
First, a brief primer on the concept of insurance: essentially, it’s a tool for spreading risk—say, the chance your house will be washed away by a hurricane—among a larger group of people, so that the cost of rebuilding a destroyed house is shared by everyone who pays in. To accomplish this, insurance companies sell flood policies to thousands of homeowners, collecting enough in premiums to cover the inevitable disaster and keep some revenue as profit afterward. To protect themselves, these insurance companies even buy their own policies from reinsurance companies, which make the same sorts of calculations, just one level up.
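As a toy illustration of the pooling arithmetic (all numbers invented): if each of 10,000 insured homes faces a 1-in-500 annual chance of a $300,000 flood loss, the pool’s expected annual payout, and the break-even premium per policy before expenses and profit, work out to:

```latex
\mathbb{E}[\text{annual payout}]
  = 10{,}000 \times \tfrac{1}{500} \times \$300{,}000
  = \$6\ \text{million}
\qquad
\text{break-even premium}
  = \frac{\$6\ \text{million}}{10{,}000}
  = \$600
```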
The tricky part, though, is determining just how much these companies need to charge to make sure they have enough to pay for disasters and to stay in business—and that’s where Muir-Wood’s work comes in. “If you think about it, it’s actually quite a difficult problem,” he says. “You’ve got to think about all the bad things that can happen, and then figure out how likely all those bad things are, and then work out ‘How much do I need to set aside per year to pay for all the catastrophic losses that can happen?’”
With natural disasters like floods, he notes, you can have many years in a row with no damage in one particular area, then have tens of thousands of houses destroyed at once. The fact that the frequency of some catastrophic weather events may be changing due to climate change makes the problem even more complex.
The best strategy for solving it is computer modeling: simulating thousands of the most extreme weather disasters—say, a record-setting hurricane slamming into the East Coast just as a heat wave overloads the power grid—to show insurance companies the worst-case scenarios, so they know just how much risk they’re taking on and how likely it is they’ll have to pay out.
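In spirit (though certainly not in detail) these models amount to a giant Monte Carlo simulation. A toy sketch, with frequency and severity parameters invented purely for illustration and no connection to RMS’s actual software:

```python
# Toy catastrophe model: simulate many hypothetical years of storm losses,
# then read off the capital needed to survive all but the worst 0.5% of years.
# All parameters below are invented for illustration.
import numpy as np

rng = np.random.default_rng(seed=42)
n_years = 100_000

# How many damaging storms strike per simulated year (Poisson frequency).
storm_counts = rng.poisson(lam=1.5, size=n_years)

# Loss per storm from a heavy-tailed lognormal severity distribution (dollars).
annual_losses = np.array([
    rng.lognormal(mean=16.0, sigma=1.2, size=n).sum() for n in storm_counts
])

print(f"Expected annual loss:       ${annual_losses.mean() / 1e6:,.1f}M")
print(f"99.5th-percentile bad year: ${np.percentile(annual_losses, 99.5) / 1e6:,.1f}M")
```

The gap between the average year and the 99.5th-percentile year is exactly why premiums must cover far more than the expected loss.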
“Catastrophes are complex, and the kinds of things that happen during them are complex, so we are constantly trying to improve our modeling to capture the full range of extreme events,” Muir-Wood says, noting that RMS employs more than 100 scientists and mathematicians towards this goal. “When Hurricane Sandy happened, for instance, we already had events like Sandy in our models—we had anticipated the complexity of having a really big storm driving an enormous storm surge, even with wind speeds that were relatively modest.”
These models are not unlike those used by scientists to estimate the long-term changes our climate will undergo as it warms over the next century, but there’s one important difference: Insurance companies care mainly about the next year, not the next 100 years, because they mostly sell policies one year at a time.
But even in the short term, Muir-Wood’s team has determined, the risk of a variety of disasters seems to have already shifted. “The first model in which we changed our perspective is on U.S. Atlantic hurricanes. Basically, after the 2004 and 2005 seasons, we determined that it was unsafe to simply assume that historical averages still applied,” he says. “We’ve since seen that today’s activity has changed in other particular areas as well—with extreme rainfall events, such as the recent flooding in Boulder, Colorado, and with heat waves in certain parts of the world.”
RMS isn’t alone. In June, the Geneva Association, an insurance industry research group, released a report (PDF) outlining evidence of climate change and describing the new challenges insurance companies will face as it progresses. “In the non-stationary environment caused by ocean warming, traditional approaches, which are solely based on analyzing historical data, increasingly fail to estimate today’s hazard probabilities,” it stated. “A paradigm shift from historic to predictive risk assessment methods is necessary.”
Moving forward, Muir-Wood’s group will keep gauging the shifting likelihood of a range of extreme weather events, so that insurers can set prices that are competitive yet still sufficient to keep them from being wiped out when disaster strikes. In particular, the group will look closely at updating its flood models for higher latitudes, such as Canada and Russia—where the climate is shifting more quickly—as well as its models of wildfire around the planet.
On the whole, it seems likely that insurance premiums for houses and buildings in flood-prone coastal regions will go up to account for the shifts Muir-Wood is seeing. On the other hand, because of the complex impacts of climate change, we might see risks—and premiums—go down in other areas. There’s evidence, for example, that snowmelt-driven springtime floods in Britain will become less frequent in the future.
For his own part, Muir-Wood puts his money where his mouth is. “I personally wouldn’t invest in beachfront property anymore,” he says, noting the steady increase in sea level we’re expecting to see worldwide in the coming century, on top of more extreme storms. “And if you’re thinking about it, I’d calculate quite carefully how far back you’d have to be in the event of a hurricane.”
June 7, 2013
Ocean plants produce some 50% of the planet’s oxygen. Seawater absorbs a quarter of the carbon dioxide we pump into the atmosphere. Ocean currents distribute heat around the globe, regulating weather patterns and climate. And, for those who take pleasure in life’s simple rewards, a seaweed extract keeps your peanut butter and ice cream at the right consistency!
Nonetheless, those of us who can’t see the ocean from our window still feel a disconnect—because the ocean feels far away, it’s easy to forget the critical role it plays in human life and to assume that ocean problems will only harm people who fish or otherwise make their living directly from the sea. But this isn’t true: the sea matters far beyond that.
Every year, scientists learn more about the top threats to the ocean and what we can do to counter them. So for tomorrow’s World Oceans Day, here’s a run-down of what we’ve learned just in the past 12 months.
This year, we got the news that the apparent “slowdown” in global warming may just be the ocean shouldering the load by absorbing more heat than usual. But this is no cause to celebrate: the extra heat may be out of sight, but it shouldn’t be out of mind. Ocean surface temperatures have been rising incrementally since the early 20th century, and the past three decades have been warmer than any we’ve observed before. In fact, waters off the U.S. East Coast were hotter in 2012 than at any point in the past 150 years. This increase is already affecting wildlife. For example, fish are shifting their ranges globally to stay in the cooler water they prefer, altering ecosystems and fisheries’ harvests.
Coral reefs are highly susceptible to warming: warm water (and other environmental changes) drives away the symbiotic algae that live inside coral animals and provide their food. This process, called bleaching, can kill corals outright by starving them, or leave them more likely to succumb to disease. A study out this year found that even if we reduce our emissions and keep the planet from warming beyond 2°C, the threshold considered safe for most ecosystems, around 70% of corals will degrade and die by 2030.
Although coral reefs can be quite resilient and can survive unimaginable disturbances, we need to get moving on reducing carbon dioxide emissions and creating protected areas where other stressors such as environmental pollutants are reduced.
More than a hit of acid
The ocean doesn’t just absorb heat from the atmosphere: it also absorbs carbon dioxide directly, which reacts with seawater to form carbonic acid, making the water more acidic. Since preindustrial times, the ocean has become 30% more acidic, and scientists are just starting to unravel the diverse responses of ecosystems and organisms to acidification.
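That “30% more acidic” figure follows from the logarithmic pH scale. Using the commonly cited values of roughly 8.2 for preindustrial surface seawater and 8.1 today, the hydrogen-ion concentration has risen by a factor of:

```latex
\frac{[\mathrm{H}^+]_{\text{today}}}{[\mathrm{H}^+]_{\text{preindustrial}}}
  = 10^{\,\mathrm{pH}_{\text{pre}} - \mathrm{pH}_{\text{today}}}
  = 10^{0.1} \approx 1.3
```

In other words, roughly 30 percent more hydrogen ions, even though the absolute pH change looks small.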
And the responses really do vary: some organisms (the “winners”) may not be harmed by acidification at all. Sea urchin larvae, for instance, develop just fine, despite having calcium carbonate skeletons that are susceptible to dissolving. Sponges that drill into shells and corals can even drill faster in acidic seawater, though to the detriment of the organisms they’re boring into.
Nonetheless, there will be plenty of losers. This year saw the first physical evidence of acidification in the wild: the shells of swimming snails called pteropods showed signs of dissolution in Antarctica. Researchers previously found that oyster larvae fail to develop under acidic conditions, potentially explaining recent oyster hatchery collapses and smaller oysters. Acidification may also harm other fisheries.
Plastic, plastic, everywhere
Americans produced 31 million tons of plastic trash in 2010, and only eight percent of that was recycled. Where does the remaining plastic go? A lot of it ends up in the ocean.
Since last World Oceans Day, trash has reached the deep sea and the remote Southern Ocean, two of the most pristine areas on Earth. Most of the plastic trash in the ocean is small—a few centimeters or less—and can easily be consumed by animals, with damaging consequences. Some animals get hit on two fronts: the plastic itself is dangerous, and as it degrades in their stomachs it leaches toxic chemicals into their systems. Laysan albatross chicks are fed bits of plastic by their parents in lieu of their typical diet, and one-third of fish in the English Channel have nibbled on plastic.
Where have all the fish gone?
A perennial problem for the ocean, overfishing has only gotten worse with the advent of increasingly sophisticated gear. Even as fishing fleets go farther and deeper, their catches are not keeping up with the increased effort.
Our brains can’t keep up either: as we catch fewer fish, we acclimate to the new normal, adjust to the shifting baseline and forget the abundance that used to be, even though most of the world’s fisheries (especially the small ones that aren’t regulated) are in decline.
Thankfully, those responsible for managing our fisheries are aware of what’s at stake. New knowledge about fish populations and their role in ecosystems can lead to recovery: a report from March 2013 shows that two-thirds of the U.S. fish species that are closely managed because of earlier declines are now considered rebuilt or on their way to recovery.
Learn more about the ocean from the Smithsonian’s Ocean Portal. This post was co-authored by Emily Frost and Hannah Waters.