Echolocation Allows Blind Humans to ‘See’

Bats and dolphins aren’t the only animals that use sound to locate objects. Humans do it, too…

New research, which appears in a recent issue of the journal Acta Acustica united with Acustica, provides proof that humans can echolocate, a finding that is no surprise to members of the blind community.
It makes intuitive sense that the echoes of sounds can provide information about our location: our voices sound different in front of a wall than in an open area. But blind people like Daniel Kish, Executive Director of World Access for the Blind in Long Beach, Calif., have developed their own personal sonar technique: listening to the echoes from clicking sounds they make with their mouths to glean information about their surroundings.
Now, a team of Spanish researchers from the University of Alcala has begun to investigate human echolocation. They aim to find the best sounds for echolocation and understand the extent of the ability in humans. In the end, they hope their insights will provide ways to improve humans’ echolocation skills.
“When I read about echolocation (in humans), I was immediately hooked by its research potential,” study lead author Juan Antonio Martinez Rojas told Discovery News. “I was surprised by the scarcity of scientific data in humans and I was very impressed by the fascinating echolocating ability of Daniel Kish and Ben Underwood.”
Kish’s clicks allow him to engage in seemingly impossible tasks, like riding a bicycle. Underwood, who lost his eyes to cancer when he was two and died of a cancer recurrence earlier this year at age 16, was able to use echolocation to shoot baskets.
Martinez’s research examined several sounds that could be used for echolocation, including a “ch” sound, a click at the front of the mouth and a click at the back.
By recording the sounds and analyzing the shape of their sound waves, the team determined that the front click, or “palatal click,” was best suited for echolocation. It is a simple sound, so the brain can interpret the echo easily, Martinez noted, but it also contains many frequencies. “The more frequencies involved in the echo, the more information about the object,” he said.
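The study itself is not reproduced here, but the kind of waveform analysis Martinez describes, recording a click and examining how widely its energy is spread across frequencies, can be illustrated with a short Python sketch. Everything below is an assumption for illustration only: the file name palatal_click.wav and the cutoff used to measure bandwidth are hypothetical, not values taken from the research.

# Illustrative sketch (not from the study): inspect the frequency
# content of a recorded mouth click to see how broadband it is.
import numpy as np
from scipy.io import wavfile

# Hypothetical recording of a single palatal click (WAV file).
sample_rate, samples = wavfile.read("palatal_click.wav")
samples = samples.astype(float)
if samples.ndim > 1:              # mix stereo down to mono if needed
    samples = samples.mean(axis=1)

# Magnitude spectrum of the click.
spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)

# Count how much of the audible band carries appreciable energy;
# the -20 dB cutoff (one tenth of the peak) is an arbitrary choice.
cutoff = spectrum.max() / 10.0
active = freqs[spectrum >= cutoff]
print(f"Peak frequency: {freqs[spectrum.argmax()]:.0f} Hz")
print(f"Energy above cutoff spans {active.min():.0f}-{active.max():.0f} Hz")

On this picture, a “good” echolocation sound is one whose energy above the cutoff spans a wide range of frequencies, which is the property the researchers attribute to the palatal click.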
Kish agrees that many of the best echolocators use palatal clicks, but he noted that there is a role for both types. The rear-of-the-mouth click can be used to make very loud “power clicks,” he said, which are helpful for locating a building from a distance, even if the resolution of the information created by the sound is not as sharp as with the clearer palatal click.
Natural clicks are optimal for echolocation, Martinez added. According to research in progress, “if you don’t use natural clicks, your echolocation performance will be orders of magnitude worse.”
Others are less sure. “It’s not that we are optimized for sounds we create ourselves. It’s just that you get used to whatever click you’re using,” said Kish’s physicist colleague Derik DeVecchio.
Kish and DeVecchio have developed a system of artificial clicks, modeled on bat sounds but within the human range of hearing, that they say can allow for even better echolocation. “As near as we could tell, it provided between two and three times the resolution,” Kish said.
According to Kish, echolocation may have been a skill humans possessed all along. “I believe that echolocation is quite primal,” Kish said. “I think that the hardware in our brains is there. It’s just rusty.”
Taken from: www.dsc.discovery.com