Study suggests humans naturally process tactile information to perceive speech sounds


Humans use their whole bodies, not just their ears, to understand speech, according to University of British Columbia linguistics research.

It is well known that humans naturally process facial expression along with what is being heard to fully understand what is being communicated. The UBC study is the first to show we also naturally process tactile information to perceive sounds of speech.

Prof. Bryan Gick of UBC's Dept. of Linguistics, along with PhD student Donald Derrick, found that air puffs directed at skin can bias perception of spoken syllables. "This study suggests we are much better at using tactile information than was previously thought," says Gick, also a member of Haskins Laboratories, an affiliate of Yale University.

The study, published in Nature today, offers findings that may be applied to telecommunications, speech science and hearing aid technology.

English speakers use aspiration - the tiny bursts of breath accompanying speech sounds - to distinguish sounds such as "pa" and "ta" from unaspirated sounds such as "ba" and "da." Study participants heard eight repetitions of these four syllables while inaudible air puffs - simulating aspiration - were directed at the back of the hand or the neck.

When the 66 participants, both men and women, were asked to identify the syllables, those heard simultaneously with air puffs were more likely to be perceived as aspirated: subjects misheard "ba" as the aspirated "pa" and "da" as the aspirated "ta." The brain associated the air puffs felt on the skin with aspirated syllables, interfering with perception of what was actually heard.

The researchers say it is unlikely that listeners normally feel aspiration on the skin. The phenomenon is more likely analogous to lip-reading, in which the brain's auditory cortex activates when the eyes see lips move, signaling speech. From the brain's point of view, you are "hearing" with your eyes.

"Our study shows we can do the same with our skin, 'hearing' a puff of air, regardless of whether it got to our brains through our ears or our skin," says Gick.

Future research may include studies of how audio, visual and tactile information interact to form the basis of a new multi-sensory speech perception paradigm. Additional studies may examine how many kinds of speech sounds are affected by air flow, offering important information about how people interact with their physical environment.

Source: University of British Columbia
