How Human Echolocation Allows People to See Without Using Their Eyes

Mimicking bats and dolphins, some people have developed the ability to analyze bouncing sound waves to generate a picture of their environment

Much like bats and dolphins, some people have developed the ability to analyze bouncing sound waves to generate a picture of their environment. Image via Flickr user poolski

When a bat flies through the air, it rapidly emits a series of clicks, at times as many as 200 per second, pitched far too high for the human ear to hear. The bats, though, hear these sounds easily, and analyze the way they bounce off objects in their surroundings before returning to their ears. By following cues in the volume, direction and speed at which these sounds return, bats can effectively see in pitch-black darkness.

In recent years, a growing body of evidence has confirmed that humans, both sighted and vision-impaired, are capable of something similar. Unlike bats, dolphins, toothed whales and other species that echolocate, humans aren't born with the ability, but a number of experiments show that some people, at least, can teach themselves how to echolocate.

Many of the subjects of these studies have been vision-impaired people, who developed the ability over time as a necessity. The most famous is Daniel Kish, who lost his vision when he was a year old but has made headlines for climbing mountains, riding bikes and living alone in the wilderness. Kish, who’s been dubbed a “real-life Batman,” is able to perform these tasks because of his uncanny ability to “see” by echolocation.

Video: Daniel Kish's echolocation in action

How does he do it? Prompted in part by the high-profile coverage of Kish’s talent, a number of labs and research groups began investigating human echolocation in general a few years ago.

They’ve found that although we lack the specialized anatomical structures that evolved specifically for echolocation in species such as bats, the principles are largely the same. To start, a person must make a noise, analogous to the bat’s high-pitched click.

Most echolocators, including Kish, make the click by snapping the tip of the tongue against the roof of the mouth, temporarily creating a vacuum, which makes a sharp popping sound when the tongue is pulled away. A 2009 study by researchers from Spain, one of the first on human echolocation, found that Kish’s idiosyncratic click is particularly well-suited for echolocation: he pulls his tongue backward, away from the palate, instead of downward. Over time, practice can lead to a sharper, cleaner click, which makes echolocation easier.

We can’t match the 200 or so clicks per second achieved by bats and dolphins, but it’s not really necessary. Kish, for one, simply makes a clicking noise every few seconds, with interludes of silence when he doesn’t need to get a new picture of his surroundings.

From there, the sound waves produced by the click are broadcast into the environment at a speed of roughly 1,100 feet per second. Shot out in all directions, these waves bounce off the objects, structures and people around the echolocator and arrive back at his or her ears. The returning click is much quieter than the original, but those with proper training readily identify the subtle sound. And although it might seem amazing to be able to analyze these sound waves to generate a picture of the environment, some of the basic principles in play are concepts you already rely on every day.
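
For a sense of the arithmetic involved, here is a minimal sketch, assuming the article's rough figure of 1,100 feet per second and an invented 20-millisecond delay, of how a round-trip echo delay translates into distance: the click travels out and back, so the one-way distance is half of speed times delay. The function name, constant and example delays are all illustrative, not anything the echolocators themselves compute.

```python
# Toy illustration: converting a click's round-trip echo delay into a
# distance estimate, using the article's rough figure for the speed of
# sound. The delay values below are made up for the example.
SPEED_OF_SOUND_FT_PER_S = 1100.0

def distance_to_object(echo_delay_s: float) -> float:
    """One-way distance in feet to the reflecting object.

    The click travels out to the object and back, so the one-way
    distance is half of (speed of sound) * (delay).
    """
    return SPEED_OF_SOUND_FT_PER_S * echo_delay_s / 2

if __name__ == "__main__":
    # An echo arriving 20 milliseconds after the click implies an object
    # roughly 11 feet away; one arriving after 4 ms, about 2 feet away.
    print(f"{distance_to_object(0.020):.1f} ft")  # 11.0 ft
    print(f"{distance_to_object(0.004):.1f} ft")  # 2.2 ft
```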

For one, there’s the fact that we have two ears, one on either side of our head, and thus (barring any impairments) can hear in stereo, the same way our pair of eyes allows us to see in stereo. In practice, this means that you unconsciously compare the volume of a particular sound in each of your ears and assume that the louder side is the one the sound came from. When someone calls your name, for example, you typically know to turn in the right direction without much thought.

In the same way, echolocators can compare the volume of the returning sound waves at each ear to “see” their surroundings. If the echo reaching one ear is much louder than the one reaching the other, the sound on that side traveled a shorter path and lost less energy along the way, indicating the presence of an object or obstacle on that side.
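
As a rough illustration of that loudness comparison, the sketch below (a toy example, not anything from the studies mentioned) guesses which side a reflecting object is on from which ear receives the stronger echo. The signal values, decibel threshold and function names are all invented for the example.

```python
# Toy sketch of the loudness comparison described above: guess which
# side a reflecting object is on from which ear hears the stronger echo.
import math

def level_db(samples: list[float]) -> float:
    """Root-mean-square level of a signal, in decibels (arbitrary reference)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms + 1e-12)

def echo_side(left_ear: list[float], right_ear: list[float],
              threshold_db: float = 1.0) -> str:
    """Return the likely side of the obstacle based on the level difference."""
    diff = level_db(left_ear) - level_db(right_ear)
    if diff > threshold_db:
        return "left"
    if diff < -threshold_db:
        return "right"
    return "roughly straight ahead (or ambiguous)"

if __name__ == "__main__":
    # A louder echo in the left ear suggests an obstacle to the left.
    left = [0.02 * math.sin(0.3 * n) for n in range(200)]
    right = [0.01 * math.sin(0.3 * n) for n in range(200)]
    print(echo_side(left, right))  # -> left
```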

Additionally, to the trained ear, the returning click sounds slightly different depending on the particular object it bounced off of. You’ve probably noticed that your voice sounds different in a carpeted, furnished room than in an empty, tiled one. As Kish points out, a tennis ball bouncing off a wall sounds different than one bouncing off a bush. With enough practice, the same subtle distinctions can be made about the returning click sounds, painting a picture of the world at large.
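
To make that idea of echo "texture" slightly more concrete, here is a small, entirely made-up sketch: a hard, flat surface tends to reflect more high-frequency energy than a soft, leafy one, so its echo sounds sharper. The zero-crossing rate used below is just one crude proxy for that brightness, and the two simulated echoes are invented for illustration.

```python
# Invented illustration: a crude "brightness" measure for an echo.
# Hard, flat surfaces reflect more high-frequency energy than soft,
# irregular ones, so their echoes sound sharper; a higher zero-crossing
# rate is one rough proxy for that difference.
import math

def zero_crossing_rate(samples: list[float]) -> float:
    """Fraction of adjacent sample pairs whose signs differ."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a >= 0) != (b >= 0))
    return crossings / (len(samples) - 1)

if __name__ == "__main__":
    # Two synthetic echoes: one keeps more high-frequency content (a
    # stand-in for a wall), the other is much smoother (a stand-in for a bush).
    wall_echo = [math.sin(1.2 * n) for n in range(400)]
    bush_echo = [math.sin(0.2 * n) for n in range(400)]
    print(f"wall-like echo: {zero_crossing_rate(wall_echo):.2f}")  # ~0.38
    print(f"bush-like echo: {zero_crossing_rate(bush_echo):.2f}")  # ~0.06
```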

Doing this might actually be easier for those without vision. In 2011, a team from the University of Western Ontario used fMRI (functional magnetic resonance imaging) to probe the underlying brain activity that goes on during echolocation for the first time. Interestingly, they found that in two vision-impaired echolocators, the act generated activity in the visual cortex, an area of the brain largely devoted to interpreting visual information. When they tested two sighted people who were new to echolocating, though, they found no activity in that area, implying that the brains of the two vision-impaired echolocators compensated for their lack of eyesight by devoting extra processing capacity to sound instead.

Advanced echolocators have shown increased activity in parts of the brain usually devoted to vision. Image via Wikimedia Commons/Alan Thistle

Becoming an expert echolocator takes years of practice, but research has shown that even an hour or so of training can produce immediate results. In one study, published in May, blindfolded participants were asked to tell, using echolocation, which of two discs placed in front of them was larger. With practice, they were able to identify the correct disc at rates better than chance.

Both the Spanish research team and Kish, in his role as president of the World Access for the Blind organization, are working to help more people learn the art of echolocation. The researchers are developing a series of protocols to allow novices to start practicing, while Kish conducts workshops for the vision-impaired. “Two hours per day for a couple of weeks are enough to distinguish whether you have an object in front of you,” Juan Antonio Martínez, the lead author of the Spanish study, told Science Daily. “Within another two weeks, you can tell the difference between trees and pavement.”
