Ann Arbor Times

Saturday, September 21, 2024

Study finds lip reading activates brain regions similar to real speech

Laurie McCauley Provost and Executive Vice President for Academic Affairs | University of Michigan-Ann Arbor

Lip-read words can be decoded from the brain’s auditory regions similarly to heard speech, according to a new University of Michigan report that examined how vision supports verbal perception.

Researchers used functional magnetic resonance imaging and electrodes implanted in patients’ brains to show that watching someone speak when you can’t hear them (lip reading) activates auditory regions of the brain in ways similar to real speech.

David Brang, associate professor of psychology and the study’s senior author, said seeing a person’s facial movements often starts before sounds are produced. "The auditory system uses these early visual cues to prime auditory neurons before the sounds are heard," he said.

The study indicated that integrating visual and auditory cues gives a person more accurate and efficient access to speech information, significantly enhancing communication abilities.

Brang and colleagues sought to understand how the visual signals during lip reading are represented in the auditory system. They used fMRI data from healthy adults and intracranial recordings from electrodes implanted in patients with epilepsy during auditory and visual speech perception tasks.

The findings revealed that lip-read words could be classified at earlier time points than heard words. This suggests that lip reading may involve a predictive mechanism that facilitates speech processing before auditory information becomes available, Brang said.

The results support a model in which the auditory system combines the neural distributions evoked by heard and lip-read words to generate a more precise estimate of what was said. Brang noted these findings suggest that the auditory system quickly integrates lip-reading information to enhance hearing capabilities, especially in challenging environments like noisy restaurants. Observing a speaker’s lips can influence our auditory perception even before any sounds are produced.

For people with hearing loss, this rapid use of lip-reading information is likely even more pronounced. "As hearing abilities decline, people increasingly rely on visual cues to aid their understanding," Brang said. "The ability of visual speech to activate and encode information in the auditory cortex appears to be a crucial compensatory mechanism."

This helps people maintain their hearing capacities as they age, underscoring the value of face-to-face communication in supporting auditory comprehension.

The study, which appears in Current Biology, was co-authored by Karthik Ganesan, Cody Zhewei Cao, Michael Demidenko, Andrew Jahn, William Stacey, and Vibhangini Wasade.

###
