How Your Brain Tracks Moving Sounds
(Inside Science) -- When an object moves across your field of view, your eyes and brain are able to smoothly track its motion. But what about moving sounds? Until now, we didn’t really know how, or even if, the brain and ears worked together when facing that scenario.
"Researchers have neglected the capacity of humans to track moving sound sources," said John Van Opstal, a neuroscientist at Radboud University in Nijmegen, the Netherlands.
Scientists knew that we tend to be very good at determining the location of sounds that are not moving, but a small number of previous studies suggested that our brains did not actually track sounds in motion. Instead, it was thought that we relied more on visual cues, or that we took a series of static aural "snapshots" and used them to infer how the sound source was moving, rather than tracking the movement directly.
But a new study by Van Opstal and his colleagues, published today in the journal eNeuro, shows that our auditory system can smoothly track both the position and speed of a sound just as our visual system does with visible objects.
Van Opstal had subjects sit in a dark, soundproof room wearing special headgear that would record their head movements, and asked them to track the path of a sound that was moving randomly and unpredictably back and forth by turning their head to follow it. He found that the subjects could accurately and smoothly track the movement -- something that wouldn’t have been possible if they were just sampling static snapshots of the sound’s position. And they got better at it over the course of the experiment.
"That shows the system is also clever; it can extract information about the motion pattern and adapt its response," said Van Opstal.
The findings indicate that the brain likely has two neural circuits for tracking sound -- one dedicated to determining location and another complementary one dedicated to measuring velocity, said Van Opstal. The velocity is tracked by comparing how the differences in arrival time and sound level between the two ears change over time. The differences provide information about a sound’s location, while the changes in these differences relate to the sound’s velocity. What the researchers don’t yet know is whether it is changes in the timing or intensity of the sound that dominate the process.
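The cue the researchers describe can be sketched with a standard textbook approximation. The toy model below uses Woodworth's formula for the interaural time difference (ITD) of a distant source; the head radius and the split into a "position" readout (the ITD itself) and a "velocity" readout (its rate of change) are illustrative assumptions, not the study's actual model.

```python
import math

HEAD_RADIUS = 0.0875    # m, typical adult head radius (assumed value)
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def itd(azimuth_rad):
    """Woodworth's approximation of the interaural time difference (s)
    for a far-away source at the given azimuth. A static snapshot of
    this value is a cue to the sound's *location*."""
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth_rad + math.sin(azimuth_rad))

def itd_rate(azimuth_rad, angular_velocity):
    """Time derivative of the ITD: d(ITD)/dt = (r/c)(1 + cos θ) dθ/dt.
    How fast the interaural difference changes is a cue to the sound's
    *velocity* -- what the proposed second circuit would read out."""
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (1.0 + math.cos(azimuth_rad)) * angular_velocity

# A source straight ahead produces no interaural difference...
print(itd(0.0))
# ...but if it is sweeping past at 90 deg/s, the ITD is already changing,
# so a listener could in principle sense the motion before the position cue grows.
print(itd_rate(0.0, math.radians(90.0)))
```

An analogous derivative applies to the interaural level difference; as the article notes, it is not yet known which of the two cues dominates.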
The research also shows that moving the head is important to the accurate tracking of sounds, as the movement changes the sounds that enter the ears. In the same way that the retinas in your eyes have a spot of clearest focus, called the fovea, that you try to keep locked onto moving objects, your brain generates a sort of virtual "auditory fovea" directly in front of your head, and you try to keep the source of a moving sound in that area as you track it. "You’ll always try to aim your nose at a moving sound," said Van Opstal.
Nathan Van der Stoep, an experimental psychologist at Utrecht University in the Netherlands who was not involved in the research, said this is much needed work to help better understand dynamic auditory perception.
"Most of what we know about spatial hearing comes from studies in which static sounds are used, which is relevant, but perhaps quite different from the dynamic nature of the sounds we encounter in daily life," he said.
While it may not be surprising that our brains have neural circuits for measuring the direction and velocity of sound, Andrew King, a neuroscientist at Oxford University in the U.K., said he will be interested to see how the underlying neurological processes compare with those that govern how we track visual objects, given the very different challenges of tracking moving sounds.
The next steps for Van Opstal and his team are to repeat the experiment with subjects who have hearing difficulties or who are deaf in one ear, and with those who use hearing aids or cochlear implants. He expects that both groups will struggle to track velocity, but the work should help improve and refine hearing devices in the future.
"This will allow us to identify what is going on in the brain, to understand what information is extracted from the environment, and improve the algorithms in hearing devices to provide more of that information," he said.