Husbands' Voices Help Wives Hear in 'Cocktail Party' Noise
Listening to a single conversation amid a noisy environment challenges the brain's auditory system – yet somehow, people are able to tune into the sound of a single voice, understand it and follow a conversation. This phenomenon is called the "cocktail party effect."
In studying the effect, scientists have focused on the factors that help the brain tease out voices from a mixture of sounds arriving at the ear. Researchers have looked at the pitch of the voices and where the speakers are located relative to the listener, among other factors.
However, recent research shows that the brain does not rely only on the incoming sounds themselves to understand speech, but also draws on information from other senses, as well as past experience. Those findings come from research that Ingrid Johnsrude, a cognitive neuroscientist at Queen's University in Canada, presented last month at a meeting of the Canadian Association for Neuroscience in Montreal.
Who you choose to talk to matters
In their experiments, Johnsrude and colleagues had participants listen to speech amid interfering voices and noise, and examined how the participants' attention, familiarity with the voices and knowledge of what was being said helped them understand the speech.
In one of these studies, the researchers looked at middle-aged and elderly couples who had been married for at least 18 years. Published in the journal Psychological Science in 2013, the study found that people are more successful at focusing on a voice and blocking out noise if the voice belongs to their spouse.
"We found that older people really benefit a lot from having a familiar voice in the mix," Johnsrude told Live Science. "Not only do they hear that voice better than a matched stranger's voice, but they can also use the voice they know to ignore it so as to attend to another voice more easily."
Knowledge about the content of the conversation can also help people understand hard-to-hear speech. In a series of experiments, Johnsrude and colleagues had participants listen to a poor-quality, degraded recording of a sentence while also seeing the text of the sentence. Each word of the sentence appeared on a screen, one by one, 200 milliseconds before that word was heard.
Looking at brain scans, the researchers found that reading the sentence was linked to a greater change in activation in the brain's primary auditory cortex, which handles incoming auditory signals, compared with seeing a meaningless string of consonants shortly before hearing each word.
"The auditory cortex was sensitive to the difference between getting that meaningful information visually and not getting it. And we know the auditory cortex doesn't read," Johnsrude said. "So that modulation based on what you read must be coming from somewhere else in the brain." The findings were detailed in the journal Neuroimage in 2012.
Learning from the brain
Researchers have explored the role of knowledge and experience in influencing perception in other senses such as the visual and olfactory systems. For example, people can more easily figure out a scrambled picture if they know what to look for, and they take less time to identify odors they have smelled previously.
The auditory system seems to be no exception, and for older people, this can come in handy, Johnsrude said.
"If we are looking at ways to help older people, or if we want to understand how they are able to perceptually organize their world despite diminished hearing, we need to understand how knowledge and experience can influence their performance," Johnsrude said.