Tuesday, December 5, 2023

Listening In On Eye Movement




Every time our eyes move, so do our eardrums. That connection allows the auditory system to “listen” to the eyes, according to researchers at Duke University. Now, the researchers have eavesdropped on that signal to better understand how the brain connects what it sees with what it hears. They report their results in the Proceedings of the National Academy of Sciences.

Our ears can tell where a sound is coming from based on the difference in its arrival time at the left and right ears. But because the eyes can move while the head stays still, the alignment of the auditory and visual scenes is constantly changing. “Every time we move our eyes, we’re yanking that camera to look in a new direction. But unless you move your head, that timing difference is not going to change,” says senior author Jennifer Groh, a psychology and neuroscience professor at Duke University.
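To make that timing cue concrete, here is a minimal sketch of the interaural time difference using the textbook Woodworth approximation. The head radius and speed of sound are typical illustrative values, not figures from the study; the point is that the delay depends only on where the sound sits relative to the head, which is why moving the eyes alone leaves it unchanged.

```python
import math

# Interaural time difference (ITD): how much earlier a sound reaches the
# nearer ear. Woodworth approximation: ITD = (r / c) * (theta + sin(theta)).
HEAD_RADIUS_M = 0.0875      # typical adult head radius (illustrative)
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at room temperature

def itd_seconds(azimuth_deg: float) -> float:
    """ITD for a distant source at the given azimuth (0 = straight ahead)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND_M_S) * (theta + math.sin(theta))

# The delay grows with the sound's head-relative direction; moving only
# the eyes changes none of these numbers.
for az in (0, 30, 60, 90):
    print(f"source at {az:2d} deg -> ITD = {itd_seconds(az) * 1e6:5.1f} microseconds")
```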

To figure out how the brain coordinates the two systems, Groh and her co-authors placed small microphones in the ear canals of their research subjects. They then recorded the minute sounds produced at the eardrum while prompting the subjects to follow visual cues with their eyes. Previous work from the research group had shown that these sounds exist. Now, the team has shown that the sounds consist of horizontal and vertical components that precisely correspond to how the eyes move.

After averaging out the ambient noise, the researchers were able to use that correspondence to predict where the eyes were going to look. While the technique is not usable in noisier settings, gaining a better understanding of the mechanism behind these auditory signals could lead to advances in hearing aid technology, for example.
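The paper's actual analysis is more involved, but the underlying logic of averaging repeated trials to suppress noise and then inverting a fitted signal-to-gaze relationship can be sketched on synthetic data. Everything below is assumed for illustration: the waveform shape, the noise level, and the linear amplitude-versus-displacement relationship are stand-ins, not values from the study.

```python
import numpy as np

# Hypothetical sketch: an eardrum signal whose amplitude scales linearly
# with horizontal eye displacement, buried in ambient noise.
rng = np.random.default_rng(0)
n_repeats, n_samples = 50, 100
displacements = np.array([-18.0, -9.0, 0.0, 9.0, 18.0])      # degrees
template = np.sin(np.linspace(0, 2 * np.pi, n_samples))      # stand-in waveform

# Simulate noisy single trials, then average the repeats of each condition;
# ambient noise shrinks by roughly sqrt(n_repeats).
trials = (displacements[:, None, None] * 0.05 * template
          + rng.normal(0.0, 1.0, (len(displacements), n_repeats, n_samples)))
averaged = trials.mean(axis=1)

# Read each condition's amplitude by projecting onto the template,
# then fit the amplitude-versus-displacement line.
amplitudes = averaged @ template / (template @ template)
slope, intercept = np.polyfit(displacements, amplitudes, 1)

# Invert the fit to "predict" gaze from a new averaged recording.
new_recording = 12.0 * 0.05 * template + rng.normal(0, 1.0 / np.sqrt(n_repeats), n_samples)
amp = new_recording @ template / (template @ template)
print(f"predicted displacement: {(amp - intercept) / slope:.1f} deg (true: 12.0)")
```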

Spying on the brain

When the brain sends the eyes a signal to prompt movement in a certain direction, it simultaneously sends a copy of that signal to the ears, like a “report card,” Groh says. The same kind of coordination helps the brain keep track of the body’s movements in other contexts, such as recognizing the sound of your own footsteps. That processing was thought to happen within the brain, but Groh’s research shows that information about vision is present earlier in the processing of sound than previously thought. “We’re using this microphone to spy on the brain,” Groh says.

She thinks that middle-ear muscles and inner-ear hair cells are both likely involved in carrying that signal to such an early point in the auditory pathway. Because the two structures influence different aspects of hearing, Groh suspects that once scientists know more about the mechanism behind the signal, more precise hearing tests could be developed.

[Image: Research subjects wore earbud microphones while performing eye movement tasks. Duke University]

Another key application of this research could be in hearing aid technology.

Hearing aid developers have struggled to give the technology the ability to localize where sound is coming from, so the devices are made to amplify all sounds equally. That can be frustrating for users, especially in noisy environments. A current hearing aid, for example, will amplify the noise from an air conditioner as much as a person’s voice. Visual cues could help direct hearing aids to address this problem.

“If you could tell your hearing aid who you’re looking at, you could alter the hearing aid algorithm to pay attention to that person,” explains Sunil Puria, a research scientist at Mass Eye and Ear and associate professor at Harvard Medical School. Puria says there is “phenomenal” potential for the research to eventually be applied in this type of technology—though Groh and other researchers first must figure out the mechanisms involved.
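As a rough illustration of what “pay attention to that person” could mean in signal-processing terms, here is a toy delay-and-sum beamformer steered by a gaze angle. The two-microphone setup, spacing, sample rate, and gaze input are assumptions for the sketch, not a description of any real hearing aid.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
MIC_SPACING = 0.14       # m, roughly the distance between two ears (assumed)
SAMPLE_RATE = 16_000     # Hz (assumed)

def steer_toward_gaze(left, right, gaze_azimuth_deg):
    """Align the two channels so sound from the gaze direction adds in phase.

    Positive azimuth = looking to the right: that sound reaches the right
    mic first, so we delay the right channel to line it up with the left.
    Off-axis sounds stay misaligned and partially cancel when summed.
    """
    delay_s = MIC_SPACING * np.sin(np.radians(gaze_azimuth_deg)) / SPEED_OF_SOUND
    shift = int(round(delay_s * SAMPLE_RATE))
    return 0.5 * (left + np.roll(right, shift))

# Toy demo: a talker 30 degrees to the right, plus uncorrelated noise at each mic.
rng = np.random.default_rng(1)
n = SAMPLE_RATE  # one second of audio
talker = rng.normal(0, 1, n)
itd_samples = int(round(MIC_SPACING * np.sin(np.radians(30)) / SPEED_OF_SOUND * SAMPLE_RATE))
left = np.roll(talker, itd_samples) + rng.normal(0, 1, n)   # left ear hears the talker later
right = talker + rng.normal(0, 1, n)

out = steer_toward_gaze(left, right, gaze_azimuth_deg=30.0)
print(f"talker correlation, single mic:  {np.corrcoef(right, talker)[0, 1]:.2f}")
print(f"talker correlation, steered sum: {np.corrcoef(out, np.roll(talker, itd_samples))[0, 1]:.2f}")
```

Summing two aligned channels buys only about 3 dB of signal-to-noise ratio; real devices use many more microphones and adaptive filtering, but the steering cue is the piece a gaze signal could supply.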

Before the findings are applied in “smart” hearing devices, it’s also important to figure out whether the signal affects hearing behavior, says Christoph Kayser, a neuroscience professor at Bielefeld University in Germany. Kayser has not found any interference with hearing in his own research on eye movement-related eardrum oscillations, but he notes that this does not rule out effects on more complex listening tasks, such as localizing sound.

While more research is needed before these applications can be realized, Groh says there is an immediate lesson: “This is revealing how important it is to be able to link what you see and what you hear.”

Reference: https://ift.tt/w6kxIvP

