April 18, 2013
If you weren’t on the East Coast during Hurricane Sandy, you likely experienced the disaster through electronic means: TV, radio, the internet or phone calls. As people across the country tracked the storm by listening to information broadcast through electromagnetic waves, a different kind of wave, produced by the storm itself, was traveling beneath their feet.
Keith Koper and Oner Sufri, a pair of geologists at the University of Utah, recently determined that the crashing of massive waves against Long Island, New York and New Jersey—as well as waves hitting each other offshore—generated measurable seismic waves across much of the U.S., as far away as Seattle. As Sufri will explain in presenting the team’s preliminary findings today during the Seismological Society of America’s annual meeting, they analyzed data from a nationwide network of seismometers to track microseisms, faint tremors that spread through the earth as a result of the storm waves’ force.
The team constructed a video (below) of the readings coming from 428 seismometers over the course of a few days before and after the storm hit. Initially, as the storm traveled roughly parallel to the East Coast, readings remained relatively stable. Then, “as the storm turned west-northwest,” Sufri said in a press statement, “the seismometers lit up.” Skip to about 40 seconds into the video to see the most dramatic seismic shift as the storm hooks toward shore:
The microseisms shown in the video differ from the waves generated by earthquakes. The latter arrive suddenly, in distinct waves, while the microseisms that resulted from Sandy arrived continuously over time, more like a subtle background vibration. That makes converting these waves to the moment magnitude scale used to measure earthquakes somewhat complicated, but Koper says that if the energy from these microseisms were compressed into a single wave, it would register as a 2 or 3 on the scale, comparable to a minor earthquake that can be felt by a few people but causes no damage to buildings.
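For a sense of scale, the standard Gutenberg-Richter energy relation, log10(E) = 1.5M + 4.8 (with E in joules), can be inverted to turn a total radiated seismic energy into an equivalent magnitude. This is only a back-of-the-envelope illustration of the comparison Koper is making, not the team’s actual calculation:

```python
import math

def equivalent_magnitude(energy_joules):
    """Moment-magnitude equivalent of a given seismic energy release,
    via the Gutenberg-Richter relation log10(E) = 1.5*M + 4.8.
    Rearranging gives M = (log10(E) - 4.8) / 1.5."""
    return (math.log10(energy_joules) - 4.8) / 1.5

# Roughly 10^8 joules of radiated energy corresponds to about magnitude 2:
print(round(equivalent_magnitude(1e8), 1))  # 2.1
```

Because the scale is logarithmic, spreading that same energy out over days of continuous background vibration, as Sandy did, produces no single detectable jolt.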
The seismic activity peaked when Sandy changed direction, the researchers say, triggering a sudden increase in the number of waves running into each other offshore. These collisions created massive standing waves, which sent significant amounts of pressure into the seafloor, shaking the ground.
It’s not uncommon for events other than earthquakes to generate seismic waves—Hurricane Katrina produced shaking that was felt in California, landslides are known to have distinct seismic signatures and the meteor that crashed in Russia in February produced waves as well. One of the reasons the readings from Sandy are scientifically interesting, though, is the potential that this type of analysis could someday be used to track a storm in real time, as a supplement to satellite data.
That possibility is enabled by the fact that a seismometer detects seismic motion in three directions: vertical (up-and-down shaking) as well as North-South and East-West movement. So, for example, if most of the shaking detected by a seismometer in one location is oriented North-South, it indicates that the source of the seismic energy (in this case, a storm) is located either North or South of the device, rather than East or West.
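That directional reasoning can be sketched in a few lines. This is an illustrative helper, not the study’s actual processing: given the relative amplitudes of the two horizontal components, it returns the compass bearing of the line along which the source must lie.

```python
import math

def back_azimuth(north, east):
    """Compass bearing (degrees clockwise from north) of the axis along
    which horizontal shaking is strongest. The seismic source lies
    somewhere along this line through the station."""
    return math.degrees(math.atan2(east, north)) % 360.0

# Shaking dominated by the north-south component points the "arrow"
# due north (or due south):
print(back_azimuth(north=1.0, east=0.0))  # 0.0
```

Note that a single station leaves a 180-degree ambiguity—amplitude alone can’t distinguish “toward” from “away”—which is one reason combining many stations matters.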
A nationwide network of seismometers—such as Earthscope, the system that was used for this research and is currently still being expanded—could eventually provide the capacity to pinpoint the center of a storm. “If you have enough seismometers, you can get enough data to get arrows to point at the source,” Koper said.
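With bearings from two or more stations, those “arrows” can be intersected to estimate the source location. Here is a minimal two-station sketch using hypothetical kilometer coordinates; a real network would combine hundreds of stations statistically rather than intersecting a single pair of lines.

```python
import math

def bearing_vec(bearing_deg):
    """Unit vector (east, north) for a compass bearing in degrees."""
    r = math.radians(bearing_deg)
    return (math.sin(r), math.cos(r))

def intersect_bearings(p1, b1, p2, b2):
    """Point where the bearing lines from two stations cross.
    Stations are (east, north) coordinates; bearings in degrees from
    north. Solves p1 + t1*d1 = p2 + t2*d2 by Cramer's rule."""
    d1, d2 = bearing_vec(b1), bearing_vec(b2)
    det = -d1[0] * d2[1] + d2[0] * d1[1]
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel; no unique crossing")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (-rx * d2[1] + d2[0] * ry) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# A station at the origin sighting northeast, and one 100 km east
# sighting northwest, cross approximately at (50, 50):
print(intersect_bearings((0, 0), 45, (100, 0), 315))
```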
Satellites, of course, can already locate a hurricane’s eye and outer bands. But locating the energetic center of the storm and combining it with satellite observations of the storm’s extent could eventually enable scientists to measure the energy being released by a hurricane in real time, as the storm evolves. Currently, the Saffir-Simpson scale is used to quantify hurricanes, but it has drawn several criticisms—it’s based solely on wind speed, so it overlooks the overall size of a storm and the amount of precipitation it produces. Including the raw seismic energy released by a storm could be a way of improving future hurricane classification schemes.
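That wind-only classification is simple enough to state in a few lines, using the scale’s standard sustained-wind thresholds in miles per hour:

```python
def saffir_simpson_category(wind_mph):
    """Saffir-Simpson category from maximum sustained wind speed,
    the scale's sole input."""
    if wind_mph >= 157:
        return 5
    if wind_mph >= 130:
        return 4
    if wind_mph >= 111:
        return 3
    if wind_mph >= 96:
        return 2
    if wind_mph >= 74:
        return 1
    return 0  # below hurricane strength

print(saffir_simpson_category(80))  # 1
```

Two storms with the same category can differ enormously in diameter and rainfall—which is exactly the criticism above, and what a seismic-energy measure might help capture.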
The prospect of seismometers (instruments typically used to detect earthquakes) being employed to supplement satellites in tracking storms is also interesting because of a recent trend in the exact opposite direction. Last month, satellite data were used for the first time to detect an earthquake by picking up extremely low-pitched sound waves that traveled from the epicenter through outer space. The fields of meteorology and geology, it seems, are quickly coming together, reflecting the real-world interaction between the Earth and the atmosphere that surrounds it.
February 25, 2013
If you feel sluggish and have difficulty getting physical work done on very hot, humid days, it’s not your imagination. Our bodies are equipped with an adaptation to handle high temperatures—perspiration—but sweating becomes ineffective at cooling us down when the air around us is extremely humid.
Add in the fact that climate change is projected to increase the average humidity of Earth as well as its temperature, and you could have a recipe for a rather unexpected consequence of greenhouse gas emissions: a reduced overall ability to get work done. According to a study published yesterday in Nature Climate Change, increased heat and humidity have already reduced our species’ work capacity by 10% in the warmest months, and that figure could rise to 20% by 2050 and 60% by the year 2200, given current projections.
The Princeton research team behind the study, led by John Dunne, came to the finding by combining the latest data on global temperature and humidity over the past few decades with American military and industrial guidelines for how much work a person can safely do under environmental heat stress. For their projections, they used two sets of climate regimes: a pessimistic scenario, in which greenhouse gas emissions rise unchecked through 2200, and an optimistic one, in which they begin to stabilize after 2060.
The team also considered a range of possible activities we might consider work: heavy labor (such as heavy lifting or digging) that burns 350-500 Calories per hour, moderate labor (such as continuous walking) that burns 200-350 Calories per hour and light labor (such as standing in place) that burns less than 200. For each of these levels of activity, there is a cut-off point of temperature and humidity past which the human body cannot safely work at full capacity.
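Those three bands amount to a simple threshold classification by metabolic rate. A sketch using the study’s cutoffs (the function name is mine, not the paper’s):

```python
def labor_category(calories_per_hour):
    """Bucket an activity by metabolic rate, using the study's bands."""
    if calories_per_hour >= 350:
        return "heavy"     # e.g., heavy lifting or digging
    if calories_per_hour >= 200:
        return "moderate"  # e.g., continuous walking
    return "light"         # e.g., standing in place

print(labor_category(300))  # moderate
```

Each band then gets its own heat-and-humidity cutoff: the harder the labor, the lower the environmental stress the body can safely tolerate at full capacity.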
Much of the reduced work capacity, the researchers say, will occur in tropical latitudes. In the map from the study below, shaded areas correspond to places where, over the course of a year, there are more than 30 days during which heat and humidity stresses reduce work capacity. Purple and blue cover areas where this is true only for heavy labor, while green and yellow indicate regions where even moderate labor is impacted:
Under the pessimistic emissions scenario, in 2100, the area of the globe for which humidity curtails work will expand dramatically, covering much of the U.S., and reducing total human work capacity by 37% overall worldwide during the hottest months. Red covers areas where heat and humidity reduce capacity for even light labor for more than 30 days per year:
The effect, they note, is that “heat stress in Washington DC becomes higher than present-day New Orleans, and New Orleans exceeds present-day Bahrain.” This doesn’t include other types of dynamics which could accelerate the consequences of climate change in highly populated areas, such as the urban heat island effect—it’s just a basic calculation given what we project will happen to the climate and what we know about how the human body works.
Looking at the map and thinking about how the study defines “work” can lead to a troubling conclusion: in 2100, throughout much of the U.S., simply taking an extended walk outdoors might not be possible for many people. The economic impacts—in terms of construction and other fields that rely upon heavy manual labor—are another issue entirely. Climate change is certain to bring a wide range of unpleasant consequences, but the effect of humidity on a person’s ability to work could be the one that impacts daily life the most.
January 26, 2013
The urban heat island effect—in which heat trapped by large-scale construction and paving causes a city to be several degrees warmer than the surrounding countryside—is a well-documented phenomenon that’s been studied for decades.
Now, though, a group of atmospheric researchers has discovered that, through a different mechanism, cities can also alter the weather over a much wider area—causing temperatures to rise or fall by nearly 2 degrees Fahrenheit thousands of miles away. As described in a paper published today in Nature Climate Change, they found that ambient heat generated by a city’s buildings and cars often gets lifted up into the jet stream, leading to weather changes over a massive area.
“What we found is that energy use from multiple urban areas collectively can warm the atmosphere remotely, thousands of miles away from the energy consumption regions,” said lead author Guang Zhang of the Scripps Institute of Oceanography. “This is accomplished through atmospheric circulation change.”
In studying the excess heat generated by daily activities in cities around the Northern Hemisphere, Zhang and colleagues from the National Center for Atmospheric Research and elsewhere found that a significant amount of the heat is lifted into the jet stream, causing the fast-moving current of air to widen. Overall, this causes an average of 1.8 degrees Fahrenheit warming during the winter for most of North America and Asia, and 1.8 degrees Fahrenheit cooling during the fall for Europe.
The explanation for this phenomenon is fairly simple: A disproportionate amount of the excess heat produced by human activity is concentrated in a few key areas, and many of these areas (the East and West coasts of the U.S., as well as Western Europe and East Asia) lie underneath the jet stream and other prominent air circulation belts. When the heat is taken up into the system, it disrupts the normal flow of energy and can cause surface temperatures to change in distant locales affected by the same air circulation patterns.
The overall effect of this trend on the climate, the researchers say, is negligible—it’s easily dwarfed by the effect of greenhouse gases in trapping heat and causing long-term climate change. It does, however, account for various anomalies in the difference between warming predicted by computer models and what’s actually been observed. Future models will need to take into account this phenomenon as they attempt to simulate the impact of climate change in various areas.
For residents of rural locales, the surprising finding means something more tangible: on an unexpectedly warm (or cold) day, they might have city-dwellers thousands of miles away to thank for the “waves” of warmth emanating from an urban heat island.
January 16, 2013
Compared to extreme drought, blistering heat, massive wildfires and tropical cyclones, the latest indicator of climate change is unexpectedly attractive: early spring flowers. According to a study published today in the journal PLOS ONE, unusually warm spring weather in 2010 and 2012 at a pair of notable sites in the eastern U.S. led to the earliest spring flowering times on record—earlier than any other time in the last 161 years.
The researchers involved, from Boston University, the University of Wisconsin and Harvard, examined the flowers at two sites well-known for their roles in the early environmental movement: Walden Pond, where Henry David Thoreau started keeping flowering records back in 1852, and Dane County, Wisc., where Aldo Leopold first recorded flowering data in 1935.
“We were amazed that wildflowers in Concord flowered almost a month earlier in 2012 than they did in Thoreau’s time or any other recent year, and it turns out the same phenomenon was happening in Wisconsin where Aldo Leopold was recording flowering times,” lead author Elizabeth Ellwood of Boston University said in a statement. “Our data shows that plants keep shifting their flowering times ever earlier as the climate continues to warm.”
In Massachusetts, the team studied 32 native spring flowering plant species—such as wild columbine, marsh marigold and pink lady’s slipper—for which average flowering dates had been fairly well-documented between Thoreau’s time and our own. They found that the plants’ flowering dates had steadily moved earlier as temperatures increased—Thoreau saw them flower on May 15, while they flowered on April 25 and 24 in 2010 and 2012, respectively. In the two years studied, 27 of the 32 species had their earliest flowering date ever.
In Wisconsin, they examined 23 species with similarly thorough records and found even more dramatic shifts. Between the 1930s and the present day, the plants’ average flowering date moved from May 7 to April 13, and 19 of the 23 species studied set records in either 2010 or 2012. On the whole, the researchers found that the plants examined in both locations flowered 4.1 days earlier for every 1 degree Celsius increase in average spring temperature.
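That last figure is a linear rule of thumb, which makes the arithmetic easy to check. A sketch, assuming the relationship stays linear (the study only demonstrates it over the observed temperature range):

```python
DAYS_EARLIER_PER_DEG_C = 4.1  # slope reported in the study

def flowering_shift(delta_temp_c):
    """Days earlier a plant flowers for a given rise in average
    spring temperature, under the study's linear relationship."""
    return DAYS_EARLIER_PER_DEG_C * delta_temp_c

# A spring that is 2 degrees Celsius warmer shifts flowering
# about 8 days earlier:
print(round(flowering_shift(2.0), 1))  # 8.2
```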
There’s little disagreement among scientists that climate change, as a whole, is a fearful proposition. But, interestingly, some botanists might actually see these findings as encouraging for the plants in particular. Those studied, at least, seem able to adapt to the warmer springs and shorter winters by flowering earlier, rather than missing out on crucial growing time—a flexibility that bodes well for their future in a warming climate.
Of course, this is only a stop-gap measure, as the scientists suspect that there is some flowering threshold the plants cannot pass. If winters get so short that these flowering plants have no time at all to go dormant, it would conceivably alter their annual growth cycle to an extent that threatens their survival—or allows plants from warmer areas to move in and outcompete the natives.
December 16, 2012
For years, most of us have envisioned climate change as a long-term problem that requires a long-term solution. But as the years pass—and with the calendar soon to flip over to 2013—without any substantial attempts to cut greenhouse gas emissions worldwide, this impression needs to change in a hurry.
According to a new paper published today in the journal Nature Climate Change, there’s a startlingly small number we need to keep in mind when dealing with climate change: 8. That’s as in 8 more years until 2020, a crucial deadline for reducing global carbon emissions if we intend to limit warming to 2°C, according to a team of researchers from a trio of research institutions—the International Institute for Applied Systems Analysis and ETH Zurich in Switzerland, along with the National Center for Atmospheric Research in Boulder, Colorado—who authored the paper.
They came to the finding by looking at a range of different scenarios for emissions levels in 2020 and projecting outward how much warming each one would cause for the planet as a whole by the year 2100. They found that in order to have a good chance at holding long-term warming to an average of 2°C worldwide—a figure often cited as the maximum we can tolerate without catastrophic impacts—annual emissions of carbon dioxide (or equivalent greenhouse gas) in 2020 can be no higher than 41 to 47 gigatons worldwide.
That’s a problem when you consider that we’re currently emitting 50 gigatons annually; if present trends continue, that number will rise to 55 gigatons by 2020. In other words, unless we want catastrophic levels of warming, we need to do something, quickly.
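The size of the gap is straightforward arithmetic, using the figures above:

```python
# The paper's 2020 budget for a two-degree path, in gigatons per year:
TARGET_RANGE_GT = (41, 47)
CURRENT_GT = 50          # roughly what we emit today
PROJECTED_2020_GT = 55   # where present trends lead by 2020

# How far trend emissions overshoot even the loosest 2020 budget:
gap = PROJECTED_2020_GT - TARGET_RANGE_GT[1]
print(gap)  # 8
```

Even staying flat at today’s 50 gigatons would miss the budget by 3 to 9 gigatons per year, which is why the researchers frame 2020 as a deadline rather than a milestone.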
The researchers also weighed a number of technological approaches that could help us bring this figure down by 2020: mass conversion to nuclear power generation, rapid adoption of energy-efficient appliances and buildings, electric vehicle usage and other means of reducing fossil fuel use. “We wanted to know what needs to be done by 2020 in order to be able to keep global warming below two degrees Celsius for the entire twenty-first century,” said Joeri Rogelj, the lead author of the paper, in a statement.
It turns out that some combination of all of these methods will be necessary. But lowering global energy demand—in large part, by increasing efficiency—is by far the easiest route to making a dent in emissions soon enough to hit the goal by 2020.
If the reduction target isn’t reached by 2020, avoiding catastrophic warming could theoretically still be possible, the researchers note, but the cost of doing so would only increase, and our options would narrow. If we start cutting emissions now, for example, we might be able to hit the goal without increasing nuclear power generation, but wait too long and it becomes a necessity.
Waiting past 2020 would also require more costly changes. In that case, “you would need to shut down a coal power plant each week for ten years if you still wanted to reach the two-degree Celsius target,” said Keywan Riahi, one of the co-authors. Waiting would also make us more reliant on as-yet unproven technologies, such as carbon capture and storage and the efficient conversion of crops into biofuels.
“Fundamentally, it’s a question of how much society is willing to risk,” said David McCollum, another co-author. “It’s certainly easier for us to push the climate problem off for a little while longer, but…continuing to pump high levels of emissions into the atmosphere over the next decade only increases the risk that we will overshoot the two-degree target.”
Given the continuing failures of negotiators to come to any sort of international climate agreement—most recently highlighted by the lack of progress at the COP 18 Conference in Doha—this “risk” seems to more closely resemble a certainty. 2020 might seem a long way off, but if we spend the next 8 years stalling like we have over the past 18 years of climate negotiations, it’ll get here faster than we can imagine.