October 9, 2013
Climate change is a global problem, but that doesn’t mean it’s going to hit us all at the same time.
If you live in Moscow, scientists estimate that your local climate will depart from the historical norm in the year 2063. In New York, that date is the year 2047. And if you happen to reside in Mexico City or Jakarta, those numbers are 2031 and 2029, respectively.
See a pattern here? These estimates, which all come from a new study published today in Nature by scientists from the University of Hawaii, reflect a concerning trend that some scientists believe will define the arrival of climate change’s effects on the planet: It’ll arrive in tropical, biodiverse areas first.
Most climate models simulate how changes in greenhouse gas concentrations will affect the worldwide climate in a given year (most often 2020, 2050 or 2100). But the Hawaii team, led by biologist and geographer Camilo Mora, took an alternate approach—they assumed that, in the absence of a global mitigation agreement, greenhouse gas levels will keep rising at a steady rate, and used climate models to track how long it would take for weather events that are currently considered extreme to become typical.
When they calculated which year this would occur for a range of cities—defining a departure from the historical record as the first year when a given month’s coldest day is hotter than any day of that month between 1860 and 2005—the dates of climate departure came far sooner than the researchers expected.
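To make that definition concrete, here’s a minimal sketch of the departure calculation in Python. It is not the study’s actual code: it works at annual rather than monthly resolution, uses a synthetic temperature series, and simply finds the first post-2005 year from which every subsequent year is hotter than the hottest year in the 1860–2005 record.

```python
import numpy as np

def departure_year(years, temps, hist_end=2005):
    """First year after hist_end from which every later year is hotter than
    the hottest year in the historical record; None if that never happens."""
    years = np.asarray(years)
    temps = np.asarray(temps)
    hist_max = temps[years <= hist_end].max()
    for i in np.where(years > hist_end)[0]:
        if np.all(temps[i:] > hist_max):
            return int(years[i])
    return None

# Toy demo on a synthetic series: steady warming plus year-to-year noise
rng = np.random.default_rng(0)
yrs = np.arange(1860, 2101)
temps = 0.03 * (yrs - 1860) + rng.normal(0, 0.3, yrs.size)
print(departure_year(yrs, temps))  # a mid-century-or-later year, for this series
```

The study applies the same idea month by month across the globe, using projected rather than synthetic temperatures.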
“The results shocked us. Regardless of the scenario, changes will be coming soon,” Mora said in a press statement. “Within my generation, whatever climate we were used to will be a thing of the past.”
For all locations on Earth, the average year of departure is 2047, but for some places concentrated in the tropics, that date will come much sooner, in the 2030s, or in some extreme cases, the 2020s. In just a few decades, in other words, the coldest day you experience in January will be hotter than the warmest days your parents had in January—and the hottest day you get in July (in the Northern Hemisphere) will simply be hotter than any day anyone has ever felt in your city to date.
The fact that these effects would be felt soonest in the tropics, according to the simulation, is also surprising. Thus far, most models have predicted that the most abrupt shifts in temperature will occur at the poles.
The new study actually agrees with that fact, but views it from a different perspective, looking at relative changes compared to the historical record rather than absolute changes in temperature. Because the tropics have less variability in temperature to start with, it takes less of a shift to push temperatures there beyond the norm. On the other hand, temperatures will indeed surge most in the Arctic and Antarctic, but there’s already more natural climate variability at those locales to begin with.
This is a huge concern, because wildlife biodiversity is consistently highest at the tropics, and most of the world’s biodiversity hotspots are located there (tropical rainforests, for instance, are estimated to cover less than 2 percent of the Earth’s surface area yet contain roughly 50 percent of its plant and animal species). If, historically, these ecosystems evolved in the presence of relatively little climatic variability, it follows that they might be less capable of coping with swings in temperature and adapting to survive.
It also happens that a disproportionate amount of the people living in poverty worldwide are located in the tropics. “Our results suggest that countries first impacted by unprecedented climates are the ones with the least capacity to respond,” study author Ryan Longman said. “Ironically, these are the countries that are least responsible for climate change in the first place.”
Despite the bad news, the researchers say they embarked on this alternate sort of climate modeling to empower people. “We hope that with this map, people can see and understand the progression of climate change in time where they live, hopefully connecting people more closely to the issue and increasing awareness about the urgency to act,” said co-author Abby Frazier.
Towards this goal, the group also put out an interactive map that lets you click on any location and see the projected increase in temperature over time, along with two different years: the one in which you can expect a consistently extreme climate if we keep emitting carbon dioxide at current rates, and the one in which you’ll experience an abnormal climate if we figure out a way to stop.
September 24, 2013
When it comes to calculating the likelihood of catastrophic weather, one group has an obvious and immediate financial stake in the game: the insurance industry. And in recent years, the industry researchers who attempt to determine the annual odds of catastrophic weather-related disasters—including floods and wind storms—say they’re seeing something new.
“Our business depends on us being neutral. We simply try to make the best possible assessment of risk today, with no vested interest,” says Robert Muir-Wood, the chief scientist of Risk Management Solutions (RMS), a company that creates software models to allow insurance companies to calculate risk. “In the past, when making these assessments, we looked to history. But in fact, we’ve now realized that that’s no longer a safe assumption—we can see, with certain phenomena in certain parts of the world, that the activity today is not simply the average of history.”
This pronounced shift can be seen in extreme rainfall events, heat waves and wind storms. The underlying reason, he says, is climate change, driven by rising greenhouse gas emissions. Muir-Wood’s company is responsible for figuring out just how much more risk the world’s insurance companies face as a result of climate change when homeowners buy policies to protect their property.
First, a brief primer on the concept of insurance: essentially, it’s a tool for spreading risk—say, the chance your house will be washed away by a hurricane—among a larger group of people, so that the cost of rebuilding the destroyed house is shared by everyone who pays premiums. To accomplish this, insurance companies sell flood policies to thousands of homeowners, collecting enough in payments to cover the inevitable disaster and still keep some extra revenue as profit. To protect themselves, these insurance companies even buy their own policies from reinsurance companies, which make the same sorts of calculations, just one level up.
The tricky part, though, is determining just how much these companies need to charge to make sure they have enough to pay for disasters and to stay in business—and that’s where Muir-Wood’s work comes in. “If you think about it, it’s actually quite a difficult problem,” he says. “You’ve got to think about all the bad things that can happen, and then figure out how likely all those bad things are, and then work out ‘How much do I need to set aside per year to pay for all the catastrophic losses that can happen?’”
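In its simplest form, that arithmetic is an expected-loss calculation with a margin on top. A back-of-envelope sketch (every figure below is invented for illustration, not drawn from RMS or any real insurer):

```python
# Back-of-envelope premium pricing; all numbers are made-up assumptions.
policies = 10_000      # homes insured in a flood-prone area
p_disaster = 0.01      # assumed chance a given home floods in a year
avg_claim = 250_000    # assumed average rebuilding cost, in dollars

expected_annual_loss = policies * p_disaster * avg_claim   # $25M
loading = 1.3          # markup for expenses, profit and a safety buffer

premium = expected_annual_loss * loading / policies
print(f"${premium:,.0f} per policy per year")  # -> $3,250
```

The hard part is the probability itself: as the next paragraph explains, catastrophe claims don’t arrive independently, but in huge correlated bursts.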
With natural disasters like floods, he notes, you can have many years in a row with no damage in one particular area, then have tens of thousands of houses destroyed at once. The fact that the frequency of some catastrophic weather events may be changing due to climate change makes the problem even more complex.
The best strategy for solving it is the use of computer models, which simulate thousands of the most extreme weather disasters—say, a record-setting hurricane slamming into the East Coast just when the power grid is overloaded due to a heat wave—to tell insurance companies the worst-case scenario, so they know just how much risk they’re taking on, and how likely it is they’ll have to pay out.
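A stripped-down version of that approach is a Monte Carlo simulation: draw how many catastrophes strike in each simulated year, draw a loss for each one, and read the worst cases off the resulting distribution. The toy model below (parameters invented for illustration; real catastrophe models simulate specific physical events, not just dollar amounts) shows the skeleton of the idea:

```python
import numpy as np

# Toy catastrophe model: Poisson event frequency, lognormal loss severity.
# All parameters are invented for illustration.
rng = np.random.default_rng(42)
n_years = 20_000        # number of simulated years
event_rate = 0.5        # average catastrophes per year (Poisson frequency)
mu, sigma = 17.0, 1.2   # lognormal severity: median loss of ~$24M per event

annual_loss = np.zeros(n_years)
for year in range(n_years):
    n_events = rng.poisson(event_rate)
    if n_events:
        annual_loss[year] = rng.lognormal(mu, sigma, n_events).sum()

print(f"mean annual loss:   ${annual_loss.mean():,.0f}")
print(f"1-in-100-year loss: ${np.quantile(annual_loss, 0.99):,.0f}")
```

That 1-in-100-year figure is the kind of number an insurer uses to decide how much capital to hold in reserve, and in a model like this it comes out roughly ten times larger than the mean, which is exactly why averages alone are not enough.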
“Catastrophes are complex, and the kinds of things that happen during them are complex, so we are constantly trying to improve our modeling to capture the full range of extreme events,” Muir-Wood says, noting that RMS employs more than 100 scientists and mathematicians towards this goal. “When Hurricane Sandy happened, for instance, we already had events like Sandy in our models—we had anticipated the complexity of having a really big storm driving an enormous storm surge, even with wind speeds that were relatively modest.”
These models are not unlike those used by scientists to estimate the long-term changes our climate will undergo as it warms over the next century, but there’s one important difference: Insurance companies care mainly about the next year, not the next 100 years, because they mostly sell policies one year at a time.
But even in the short term, Muir-Wood’s team has determined, the risk of a variety of disasters seems to have already shifted. “The first model in which we changed our perspective is on U.S. Atlantic hurricanes. Basically, after the 2004 and 2005 seasons, we determined that it was unsafe to simply assume that historical averages still applied,” he says. “We’ve since seen that today’s activity has changed in other particular areas as well—with extreme rainfall events, such as the recent flooding in Boulder, Colorado, and with heat waves in certain parts of the world.”
RMS isn’t alone. In June, the Geneva Association, an insurance industry research group, released a report (PDF) outlining evidence of climate change and describing the new challenges insurance companies will face as it progresses. “In the non-stationary environment caused by ocean warming, traditional approaches, which are solely based on analyzing historical data, increasingly fail to estimate today’s hazard probabilities,” it stated. “A paradigm shift from historic to predictive risk assessment methods is necessary.”
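What a shift from “historic” to “predictive” methods means in practice can be as simple as fitting a trend to event counts instead of averaging them. A toy contrast, on synthetic data (the upward drift is built in by construction, purely to illustrate the mechanics):

```python
import numpy as np

# Historic vs. trend-aware estimates of an event rate, on synthetic counts.
rng = np.random.default_rng(1)
years = np.arange(1980, 2014)
true_rate = 1.0 + 0.05 * (years - 1980)   # built-in upward drift
counts = rng.poisson(true_rate)           # observed annual event counts

historic_rate = counts.mean()             # stationary ("historic") estimate
slope, intercept = np.polyfit(years, counts, 1)
current_rate = slope * 2013 + intercept   # trend-aware ("predictive") estimate

print(f"historic average:    {historic_rate:.2f} events/yr")
print(f"trend-based in 2013: {current_rate:.2f} events/yr")
```

When the underlying rate is drifting upward, the historical average systematically understates today’s hazard, which is precisely the failure mode the report describes.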
Moving forward, Muir-Wood’s group will keep gauging the shifting likelihood of a range of extreme weather events, so that insurers can figure out how much to charge to stay competitive without being wiped out when disaster strikes. In particular, they’ll be looking closely at changing the model for flooding rates in higher latitudes, such as Canada and Russia—where climate is shifting more quickly—as well as wildfires around the planet.
On the whole, it seems likely that insurance premiums for houses and buildings in flood-prone coastal regions will go up to account for the shifts Muir-Wood is seeing. On the other hand, because of the complex impacts of climate change, we might see risks—and premiums—go down in other areas. There’s evidence, for example, that snowmelt-driven springtime floods in Britain will become less frequent in the future.
For his own part, Muir-Wood puts his money where his mouth is. “I personally wouldn’t invest in beachfront property anymore,” he says, noting the steady increase in sea level we’re expecting to see worldwide in the coming century, on top of more extreme storms. “And if you’re thinking about it, I’d calculate quite carefully how far back you’d have to be in the event of a hurricane.”
August 28, 2013
Over the past 15 years, a strange thing has happened. On one hand, carbon dioxide concentrations have kept on shooting up thanks to humans burning fossil fuels—in May, we passed 400 parts per million for the first time in human history.
On the other hand, despite certain regions experiencing drastically warmer weather, global average temperatures have stopped increasing. Climate change deniers have seized upon this fact to argue that, contrary to the conclusions reached by major science academies (PDF) around the world, greenhouse gas emissions do not cause global warming.
As it turns out, the truth is much grimmer. A pair of scientists from Scripps Institution of Oceanography have determined that the underlying process of global warming has merely been masked by natural decade-scale variations in the temperature of Pacific Ocean surface waters, related to the El Niño/La Niña cycle. Once that’s finished, our planet’s warming will march onward as usual.
Climate scientists have speculated for some time that ENSO (the El Niño-Southern Oscillation, the proper term for the cycle) might be behind the apparent hiatus in warming, but the scientists behind the new study—Yu Kosaka and Shang-Ping Xie—are the first to take a quantitative look at the role of Pacific surface temperatures in pausing global warming as a whole. Their paper, published today in Nature, uses climate models to show that the abnormally cool surface waters observed over the Pacific since 1998 can entirely account for the lack of recent warming.
Why has the Pacific been abnormally cool for the past 15 years? Naturally, as part of ENSO, a large swath of the ocean off the western coast of South America becomes notably warmer some years (called El Niño events) and cooler in others (La Niña events). Scientists still don’t fully understand why this occurs, but they do know that the warmer years are related to the formation of high air pressures over the Indian Ocean and Australia, and lower pressures over the eastern part of the Pacific.
Because winds move from areas of high pressure to low pressure, this causes the region’s normal trade winds to reverse in direction and move from west to east. As they move, they bring warm water with them, causing the El Niño events; roughly the reverse of this process happens in other years, bringing about La Niña. As it happens, colder surface temperatures in the Pacific—either official La Niña events or abnormally cool years that don’t quite qualify for that designation—have outweighed warm years since 1998.
That, say Kosaka and Xie, is the reason for the surprising lack of increase in global average temperatures. To come to this conclusion, they developed a climate model that, along with factors like the concentration of greenhouse gases over time and natural variations in the solar cycle, specifically takes the ENSO-related cycle of Pacific surface temperatures into account.
Typically, climate models mainly use radiative forcing—the difference between the amount of energy absorbed by the planet and the amount sent back out to space, which is affected by greenhouse gas emissions—as a data input. When their model relied on that input alone, it predicted that global average temperatures would increase much more over the past 15 years than they actually have. However, when the abnormally cool waters in the eastern Pacific were taken into account, the temperatures predicted by the model matched the observed temperatures nicely.
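The paper does this inside a full climate model, but the spirit of the exercise can be shown with a much simpler statistical stand-in: regress global temperature on both time and an ENSO-like index, and check whether accounting for the ENSO term recovers the underlying warming trend. The sketch below uses entirely synthetic data and is an illustration of the idea, not the authors’ method:

```python
import numpy as np

# Synthetic global-temperature series: a steady forced trend, an ENSO-like
# signal that leans cool (La Nina-ish) after 1998, and weather noise.
rng = np.random.default_rng(7)
years = np.arange(1970, 2013)
enso = rng.normal(0, 1, years.size)
enso[years >= 1998] -= 0.8                 # cool Pacific in recent years
temps = 0.017 * (years - 1970) + 0.1 * enso + rng.normal(0, 0.05, years.size)

# Regress temperature on time and the ENSO index simultaneously
A = np.column_stack([years - 1970, enso, np.ones(years.size)])
coef, *_ = np.linalg.lstsq(A, temps, rcond=None)
print(f"recovered warming trend: {coef[0]:.3f} deg/yr (built in: 0.017)")
print(f"ENSO coefficient:        {coef[1]:.3f} deg per index unit")
```

In this toy setup, ignoring the ENSO column flattens the fitted trend, while including it recovers the built-in warming: a miniature version of the masking the study describes.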
In models, the presence of these cooler waters over a huge area (a region within the Pacific that makes up about 8.2% of the Earth’s surface) serves to absorb heat from the atmosphere and thus slow down the underlying warming process. If the phenomenon is representative of reality, the team’s calculations show that it has caused the planet’s overall average temperature to dip by about 0.27°F over the past decade, combating the effects of rising carbon dioxide emissions and causing the apparent pause in warming.
This isn’t the first localized climate-related event to have effects on the progression of climate change as a whole. Last week, other researchers determined that in 2010 and 2011, massive floods in Australia slowed down the global rise in sea level that would have been expected from observed rates of glacier melting and the thermal expansion of sea water. In many cases, it seems, the subtle and complex dynamics of the planet’s climate systems can camouflage the background trend of warming, caused by human activity.
But that trend is continuing regardless, and so the most obvious impact of this new finding is a disconcerting one: the Pacific will eventually return to normal temperatures, and as a result, global warming will continue. The scientists don’t know exactly when this will happen, but records indicate that the Pacific goes through this longer-term cycle every decade or so, meaning that the era of an abnormally cool Pacific will probably soon be over.
Perhaps most distressing, the study implies that the extreme warming experienced in recent years in some areas—including much of the U.S.—is actually less warming than would be expected given the amount of carbon dioxide we’ve released. Other regions that haven’t seen much warming yet, meanwhile, are likely in line for some higher temperatures soon.
April 18, 2013
If you weren’t on the East Coast during Hurricane Sandy, you likely experienced the disaster through electronic means: TV, radio, the internet or phone calls. As people across the country tracked the storm by listening to information broadcast through electromagnetic waves, a different kind of wave, produced by the storm itself, was traveling beneath their feet.
Keith Koper and Oner Sufri, a pair of geologists at the University of Utah, recently determined that the crashing of massive waves against Long Island, New York and New Jersey—as well as waves hitting each other offshore—generated measurable seismic waves across much of the U.S., as far away as Seattle. As Sufri will explain in presenting the team’s preliminary findings today during the Seismological Society of America‘s annual meeting, they analyzed data from a nationwide network of seismometers to track microseisms, faint tremors that spread through the earth as a result of the storm waves’ force.
The team constructed a video (below) of the readings coming from 428 seismometers over the course of a few days before and after the storm hit. Initially, as it traveled up roughly parallel to the East Coast, readings remained relatively stable. Then, “as the storm turned west-northwest,” Sufri said in a press statement, “the seismometers lit up.” Skip to about 40 seconds into the video to see the most dramatic seismic shift as the storm hooks toward shore:
The microseisms shown in the video differ from the waves generated by earthquakes. The latter arrive suddenly, in distinct waves, while the microseisms that resulted from Sandy arrived continuously over time, more like a subtle background vibration. That makes converting these waves to the moment magnitude scale used to measure earthquakes somewhat complicated, but Koper says that if the energy from these microseisms was compressed into a single wave, it would register as a 2 or 3 on the scale, comparable to a minor earthquake that can be felt by a few people but causes no damage to buildings.
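That back-of-envelope conversion can be illustrated with the standard Gutenberg-Richter relation between radiated seismic energy and magnitude, log10(E) = 1.5M + 4.8, with E in joules. A sketch, using an invented energy figure rather than one from the study:

```python
import math

def magnitude_from_energy(energy_joules):
    """Invert the Gutenberg-Richter energy relation log10(E) = 1.5*M + 4.8."""
    return (math.log10(energy_joules) - 4.8) / 1.5

# Hypothetical cumulative microseism energy (an invented figure, not a
# measurement from the study), treated as if released in a single event
print(f"M {magnitude_from_energy(1e8):.1f}")  # -> M 2.1
```

An energy of about 10^8 joules lands at roughly magnitude 2, squarely in the “felt by a few people, damages nothing” range Koper describes.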
The seismic activity peaked when Sandy changed direction, the researchers say, triggering a sudden increase in the number of waves running into each other offshore. These created massive standing waves, which sent significant amounts of pressure into the seafloor, shaking the ground.
It’s not uncommon for events other than earthquakes to generate seismic waves—Hurricane Katrina produced shaking that was felt in California, landslides are known to have distinct seismic signatures and the meteor that crashed in Russia in February produced waves as well. One of the reasons the readings from Sandy are scientifically interesting, though, is the potential that this type of analysis could someday be used to track a storm in real time, as a supplement to satellite data.
That possibility is enabled by the fact that a seismometer detects seismic motion in three directions: vertical (up-and-down shaking) as well as North-South and East-West movement. So, for example, if most of the shaking detected by a seismometer in one location is oriented North-South, it indicates that the source of the seismic energy (in this case, a storm) is located either North or South of the device, rather than East or West.
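In code, that direction-finding idea amounts to asking which horizontal orientation carries the most ground motion. The sketch below, on synthetic waveforms, recovers that orientation as the leading eigenvector of the covariance between the two horizontal channels (a standard polarization analysis, offered here as an illustration rather than the researchers’ exact method):

```python
import numpy as np

# Synthetic two-channel record: a low-frequency wave arriving along an
# axis 20 degrees east of north, plus instrument noise.
rng = np.random.default_rng(3)
t = np.linspace(0, 60, 6000)
wave = np.sin(2 * np.pi * 0.15 * t) * rng.normal(1, 0.2, t.size)
theta = np.deg2rad(20)
north = wave * np.cos(theta) + rng.normal(0, 0.05, t.size)
east = wave * np.sin(theta) + rng.normal(0, 0.05, t.size)

# Principal direction of horizontal motion: leading eigenvector of the
# 2x2 covariance matrix of the (north, east) channels
cov = np.cov(np.vstack([north, east]))
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
n, e = eigvecs[:, -1]
azimuth = np.degrees(np.arctan2(e, n)) % 180
print(f"motion oriented ~{azimuth:.0f} degrees east of north")  # ~20
```

Note the built-in ambiguity: a single station only narrows the source to a line through itself, not to one end of that line, which is where a network of stations comes in.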
A nationwide network of seismometers—such as EarthScope, the system that was used for this research and is still being expanded—could eventually provide the capacity to pinpoint the center of a storm. “If you have enough seismometers, you can get enough data to get arrows to point at the source,” Koper said.
Satellites, of course, can already locate a hurricane’s eye and limbs. But locating the energetic center of the storm and combining it with satellite observations of the storm’s extent could eventually enable scientists to measure the energy being released by a hurricane in real time, as the storm evolves. Currently, the Saffir-Simpson scale is used to quantify hurricanes, but it has drawn several criticisms—it’s based solely on wind speed, so it overlooks the overall size of a storm and the amount of precipitation it produces. Including the raw seismic energy released by a storm could be a way of improving future hurricane classification schemes.
The prospect of seismometers (instruments typically used to detect earthquakes) being employed to supplement satellites in tracking storms is also interesting because of a recent trend in the exact opposite direction. Last month, satellite data was used for the first time to detect an earthquake, by picking up extremely low-pitched sound waves that traveled from the epicenter through outer space. The fields of meteorology and geology, it seems, are quickly coming together, reflecting the real-world interaction between the Earth and the atmosphere that surrounds it.
April 15, 2013
This year, prolonged extreme temperatures and seemingly never-ending snowstorms in the United States forced many inside, seeking shelter from what felt like an unusually long winter. This meant some of us were stuck in bed for a day or two clutching a box of Kleenex and downing cough syrup. That’s because viruses that cause the common cold love enclosed spaces with lots of people—the family room, the office, the gym.
And though spring has arrived, cold-causing microbes haven’t slowed down. More than 200 viruses can trigger a runny nose, sore throat, sneezing and coughing, and together they cause more than 1 billion cases of the common cold in the United States each year. The worst offenders (and the most common), known as human rhinoviruses, are most active in spring, summer and early fall.
While it’s difficult to pinpoint exactly when infected people cease to be contagious, they’re most likely to spread their cold when symptoms are at their worst, explains Dr. Teresa Hauguel of the National Institute of Allergy and Infectious Diseases. However, there’s another window of opportunity to be wary about. “A person can be infected before they actually develop symptoms, so they can be spreading it without even realizing it if they’re around people,” Hauguel writes in an email.
Surprised? Here are five more facts about the common cold.
Cold-causing viruses can be found in all corners of the world. Rhinoviruses (from the Greek word rhin, meaning “nose”) evolved from enteroviruses, which cause minor infections throughout the human body. They have been identified even in remote areas inside the Amazon. But it’s impossible to tell how long humans have been battling colds. Scientists can’t pinpoint when rhinoviruses evolved: they mutate too quickly and don’t leave a footprint behind in preserved human fossils. They could have been infecting hominids before our species appeared. Or they might have sprung up as small groups of humans moved out of isolation and into agricultural communities, where the pathogen became highly adapted to infecting them.
Cold-causing microbes can survive for up to two days outside of the body. Rhinoviruses, which cause 30 to 50 percent of colds, usually live for three hours on your skin or any touchable surface, but can sometimes survive for up to 48 hours. The list of touchable surfaces is a lengthy one: door knobs, computer keyboards, kitchen counters, elevator buttons, light switches, shopping carts, toilet paper rolls—the things we come in contact with on a regular basis. The number of microbes that can grow on these surfaces varies, but each spot can contain several different types of microbes.
You can calculate how far away to stand from someone who’s sick. When a sick person coughs, sneezes or talks, they expel virus-containing droplets into the air. These respiratory droplets can travel up to six feet to another person. A recent study found that the largest visible distance a sneeze travels is 0.6 meters (almost two feet), covering that distance at 4.5 meters per second (about 15 feet per second). A breath travels the same distance but much slower, at 1.4 meters (4.5 feet) per second. Moral of the story: remain six feet from infected people, and move quickly when they gear up to sneeze.
The weather plays a role in when and how we get sick—but not in the way you might think. Humidity levels can help those droplets whiz through the air more quickly: the lower the humidity, the more moisture evaporates from the droplet, shrinking it in size so it can stay airborne over longer distances. Cold weather is notoriously dry, which explains why we’re more likely to catch a cold while we huddle up inside when temperatures start sinking. This type of air can dry out the mucus lining in our nasal passages; without this protective barrier that traps microbes before they enter the body, we’re more vulnerable to infection. So we’re weakened by the air we breathe in when it’s chilly out, not the chilly weather itself.
Contrary to popular belief, stocking up on vitamin C won’t help. Linus Pauling, a Nobel Prize-winning chemist, popularized the idea of taking high doses of vitamin C to ward off colds. But when put to the test, this cold remedy doesn’t actually work. If you take at least 0.2 grams of vitamin C every day, you’re not likely to have any fewer colds, but you may have colds that are a day or two shorter. When symptoms start to appear, drizzling packets of Emergen-C into glass after glass of water won’t help either. The vitamin is no more effective than a placebo at reducing how long we suffer from cold symptoms.