Saturday, November 28, 2009

Findings that people can "hear" through their skin may lead to new technology for deaf people

From The Boston Globe:

Listening is more than a matter of being "all ears." People can also hear with their skin, according to new research that deepens our understanding of the senses, showing they can work together but also override one another.

Strange though it seems, scientists are finding that multiple senses contribute to the simplest perceptions. People can see with their ears, hear with their eyes, or hear with a touch.

In the work published today in the journal Nature, researchers found they could influence what people hear by delivering puffs of air to the back of the hand or the neck. By demonstrating that the perception of speech is affected by touch, the experiment raises the possibility that one sense could substitute for another, creating new ways for deaf people to hear. In fact, researchers at MIT are already using this basic idea to develop technology that could one day assist people with hearing impairment.

"This study is part of us ... reconsidering how humans perceive the world, how humans interact with the world," said co-author Bryan Gick, a phonetician at the University of British Columbia and senior scientist at Haskins Laboratories, a speech research think tank in New Haven, Conn.


For years, scientists have known that watching another person speak can affect what we hear. In a well-known phenomenon called the McGurk effect, a person who listens to audio of someone saying "ba ba ba" while watching another person's lips form "ga ga ga" hears something in between: "da da da."

Now, Gick is exploring whether touch also affects hearing. In the experiment, subjects heard the sounds "pa" or "ba" and "ta" or "da." Sometimes, they received a puff of air on the back of the hand or neck when the syllable was aspirated -- a sound like "pa" or "ta" that requires the speaker to expel a puff of air. (Hold your hand to your mouth and say "pa," then "ba," to feel the difference.) Other times, they got the reverse: a puff of air when they heard the non-aspirated "ba" or "da."
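
At bottom, the design is a simple congruence manipulation: each puff either matches or mismatches the heard syllable's aspiration. A minimal Python sketch of that pairing logic -- purely illustrative, since the study itself involved no such code:

    # Illustrative only: the four syllables used in the study, marked
    # True if aspirated (normally produced with a burst of air).
    SYLLABLES = {"pa": True, "ta": True, "ba": False, "da": False}

    def trial_condition(syllable, puff_delivered):
        # A trial is congruent when the felt puff matches the heard aspiration.
        aspirated = SYLLABLES[syllable]
        return "congruent" if puff_delivered == aspirated else "incongruent"

    print(trial_condition("pa", True))   # congruent
    print(trial_condition("ba", True))   # incongruent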

The researchers found that when the puff of air was paired with the aspirated word, people got better at identifying the sound. When the puff of air was paired with "ba" or "da," accuracy declined.

"This is a very intriguing finding, raising lots of theoretical possibilities and future studies," said Shinsuke Shimojo, head of a psychophysics laboratory at California Institute of Technology. He said it was interesting that the puff seemed to be an implicit signal for one sound over another, and worked at both spots on the body. The people were not told or taught that the puff was meant to signal a certain sound, and the researchers tried to choose spots on the body that likely did not feel aspirated puffs when people were speaking.

While it is not yet clear what is happening in the brains of people who "hear" puffs of air, the experiment supports the idea that people integrate information from touch with sounds they hear. In a study published in the journal NeuroImage three years ago, Finnish researchers used brain imaging to study 13 subjects and found that touch activated the auditory cortex, a part of the brain that is involved in hearing.

Charlotte Reed, a senior research scientist at the Massachusetts Institute of Technology, specializes in the ways touch can be used to interpret speech, studying deaf-blind people who learn the Tadoma method -- a way of learning to talk and hear by placing a hand on the neck and mouth of a speaker.

"We know the auditory and tactile senses interact," Reed said. She pointed out that the new study certainly shows that touch can give a conflicting signal, making someone hear a "ba" as a "pa," but also shows that a person's accuracy in understanding a sound can be enhanced. Using touch to enhance hearing is the basic idea behind her research, which uses a machine called the "tactuator" that involves three vibrating and moving prongs -- one for the thumb, forefinger, and middle finger to rest on -- to turn speech into something people feel.

The idea is to create an aid for lipreading, and eventually to use the information gleaned from the bulky, three-pronged device to create software that could one day turn a simple device like a cellphone into a prosthetic for deaf people. The cellphone's microphone would translate a speaker's voice into tactile signals that help a person understand the speaker while lip-reading.
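
As a rough illustration of the pipeline such software implies -- microphone in, tactile signal out -- here is a hypothetical Python sketch (not MIT's actual design) that quantizes a speech waveform's short-time loudness into a few vibration-intensity steps:

    # Hypothetical sketch, not MIT's system: map the short-time RMS energy
    # of a waveform onto coarse vibration levels a tactile device could render.
    import numpy as np

    def speech_to_vibration(samples, sample_rate, frame_ms=20.0, levels=8):
        frame_len = int(sample_rate * frame_ms / 1000)
        n_frames = len(samples) // frame_len
        frames = samples[:n_frames * frame_len].reshape(n_frames, frame_len)
        energy = np.sqrt((frames ** 2).mean(axis=1))   # RMS loudness per frame
        if energy.max() > 0:
            energy = energy / energy.max()             # normalize to 0..1
        return np.round(energy * (levels - 1)).astype(int)

    # Example: one second of "speech" -- a burst of noise amid silence.
    rate = 16000
    signal = np.zeros(rate)
    signal[4000:8000] = np.random.default_rng(0).normal(0.0, 0.5, 4000)
    print(speech_to_vibration(signal, rate))

A real aid would presumably encode richer cues than loudness -- the aspiration bursts exploited in the Nature study, for instance -- but the basic shape of the pipeline would be the same.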

Research on how the senses intersect in productive and conflicting ways goes beyond touch and speech.

Earlier this year, other researchers published work showing that people can hear with their face: stretching the skin of a person's face into positions normally associated with speech can change what words they hear. For example, when the researchers played an ambiguous word, people heard "head" when their facial skin was stretched upward and "had" when it was stretched downward.

Other researchers have described the parchment-skin illusion, in which people rub their palms together next to a microphone and hear the sound through earphones. When scientists enhanced the rubbing noises, people reported that their hands felt dry, like parchment paper -- an example of feeling with their ears. In another study, researchers found that simply tweaking the sound of people crunching on a potato chip led listeners to rate the chip as crisper.

Caltech's Shimojo studies vision and has demonstrated that people shown a single flash of light accompanied by two beeps will often report seeing two flashes.

Gick says his research has the potential to help people hear using touch, but for now he hopes to use brain imaging to reveal what happens when people "hear" an air puff.

Taken together, such work is more than just a collection of cross-sensory illusions: it suggests something fundamental about how our brains work.

"It's starting to look more and more like we're perceiving machines," Gick said. "Are we physically constructed to take in information, irrespective of where it's coming from or what form it's in?"