This post accompanies the paper “Adaptive Binaural Filtering for a Multiple-Talker Listening System Using Remote and On-Ear Microphones” presented at WASPAA 2021 (PDF).
Wireless assistive listening technology
Hearing aids and other listening devices can help people to hear better by amplifying quiet sounds. But amplification alone is not enough in loud environments like restaurants, where the sound from a conversation partner is buried in background noise, or when the talker is far away, like in a large classroom or a theater. To make sound easier to understand, we need to bring the sound source closer to the listener. While we often cannot physically move the talker, we can do the next best thing by placing a microphone on them.
When a remote microphone is placed on or close to a talker, it captures speech with lower noise than the microphones built into hearing aid earpieces. The sound also has less reverberation since it does not bounce around the room before reaching the listener. In clinical studies, remote microphones have been shown to consistently improve speech understanding in noisy environments. In our interviews with hearing technology users, we found that people who use remote microphones love them – but with the exception of K-12 schools, where remote microphones are often legally required accommodations, very few people bother to use them.
Drawbacks of remote microphone systems
Based on our interviews with hearing aid users and hearing health providers, and on our hands-on testing with commercial products, we have identified several drawbacks that make remote microphones less useful than they could be.
- Today, nearly all hearing aid wireless accessories are proprietary, so users have to buy a microphone from their hearing aid manufacturer. These first-party accessories tend to be expensive, and they lag far behind consumer electronics in performance and ease of use.
- To use remote microphones, users have to carry around an extra device. They also have to self-identify as having a disability and ask conversation partners to wear the microphone. Even when users feel comfortable asserting their needs, the process is often awkward or impractical.
- Most remote microphones only work with one talker at a time. A few classroom systems work with multiple microphones, but talkers have to take turns. Outside of school, most group discussions are not nearly that disciplined, so listeners would miss out on parts of the conversation.
- Remote microphones transmit the same sound to the left and right ear. Without interaural cues – that is, time differences and level differences between the two ears – it becomes harder for listeners to tell what direction sound is coming from; it seems to come from inside the head. These spatial cues are especially important in group conversations because the brain uses them to attribute speech sounds to different talkers.
Our team would like to address at least some of these drawbacks to make remote microphones more useful for people with hearing loss. We are hopeful that the new Bluetooth Low Energy Audio standard, which has been adopted by all major hearing aid companies, will allow third-party accessories to work with many different hearing aids, driving down cost and accelerating development. To address the inconvenience of carrying extra devices, we are developing cooperative listening systems so that listening devices can connect to microphones that are already installed in a room.
This project focuses on the last two points: group conversations and spatial cues. By combining classic signal processing techniques with modern wireless technologies, we can design an immersive listening system that works with multiple talkers, even if they talk over each other.
An immersive remote microphone system for group conversations
Our goal in this project was to design an assistive listening system with four features:
- It should have low background noise, like a conventional remote microphone.
- It should have realistic spatial cues, like listening through hearing aid earpieces.
- It should work with multiple simultaneous talkers.
- It should adapt as the listener and talkers move around.
To achieve these goals, we designed a system that uses both the remote microphones and the microphones in the earpieces. The remote mics have low noise, while the earpiece mics have correct spatial cues. We use a pair of adaptive filters – similar to those used for echo cancellation in video calls – to modify the remote microphone signals. The adaptive system tries to match the magnitudes and phases of the output signals to those from each of the two earpieces. Unlike some other proposed systems, the adaptive filter does not need to separate or localize the sounds in order to restore the spatial cues. Because the output signals match the sound at the two ears, they have the correct interaural time and level differences and also reproduce the acoustic effects of the room. The user will hear sound that seems to be coming through their earpieces, but with less noise. The adaptive algorithm automatically adjusts as the listener or talkers move.
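To make the filtering step concrete, here is a minimal time-domain NLMS sketch in Python. The function name, filter length, and step size are illustrative choices, not the paper's exact implementation, which may well operate on blocks or in the frequency domain. Each ear gets its own adaptive FIR filter driven by the remote microphone, with the corresponding earpiece microphone as the reference:

```python
import numpy as np

def nlms_binaural(remote, ear_left, ear_right, filt_len=256, mu=0.5, eps=1e-8):
    """Illustrative sketch: adapt two FIR filters so the filtered
    remote-microphone signal matches the signal at each earpiece."""
    n = len(remote)
    w_l = np.zeros(filt_len)               # left-ear filter coefficients
    w_r = np.zeros(filt_len)               # right-ear filter coefficients
    out_l = np.zeros(n)
    out_r = np.zeros(n)
    for t in range(filt_len, n):
        x = remote[t - filt_len:t][::-1]   # most recent remote samples first
        out_l[t] = w_l @ x                 # filtered remote signal, left ear
        out_r[t] = w_r @ x                 # filtered remote signal, right ear
        e_l = ear_left[t] - out_l[t]       # mismatch against left earpiece mic
        e_r = ear_right[t] - out_r[t]      # mismatch against right earpiece mic
        norm = eps + x @ x                 # input power for normalization
        w_l += mu * e_l * x / norm         # NLMS coefficient updates
        w_r += mu * e_r * x / norm
    return out_l, out_r
```

Because the background noise and reverberation picked up at the ears are largely uncorrelated with the remote microphone signal, the filters can only reproduce the talker's speech component. The outputs therefore carry the earpiece signals' magnitudes and phases – and with them the interaural cues – while keeping the remote microphone's low noise.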
In the paper, we consider two system architectures for different use cases. First, we could clip one microphone on each talker and have a separate pair of filters for each, then add the filtered signals together. The microphones are attached to the talkers, so they will have a good signal-to-noise ratio even when the talkers move around often. Second, we could use an array of microphones placed in the middle of a group of talkers. This system is more convenient if the user wants to hear everything in a certain area, rather than a certain set of individual talkers. A single multiple-input filter automatically focuses on the strongest nearby sound sources. The disadvantage of the array is that the user cannot choose which talkers to focus on, and it might pick up unwanted nearby noise.
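For the array use case, the same idea extends naturally to a multiple-input filter. Below is a rough single-ear sketch under the same illustrative assumptions as before (run it once per earpiece). For the clipped-on-microphone use case, one would instead run a filter pair like `nlms_binaural` above for each talker's microphone and sum the outputs at each ear, as indicated in the trailing comments:

```python
def nlms_array(mics, ear_ref, filt_len=256, mu=0.5, eps=1e-8):
    """Illustrative sketch of the array variant: one multiple-input
    adaptive filter (shown for a single ear) driven by all channels."""
    n_mics, n = mics.shape
    w = np.zeros((n_mics, filt_len))           # one FIR filter per array channel
    out = np.zeros(n)
    for t in range(filt_len, n):
        X = mics[:, t - filt_len:t][:, ::-1]   # recent samples, every channel
        out[t] = np.sum(w * X)                 # sum of all filtered channels
        e = ear_ref[t] - out[t]                # mismatch against the earpiece mic
        norm = eps + np.sum(X * X)
        w += mu * e * X / norm                 # joint NLMS update
    return out

# Per-talker architecture (illustrative): one filter pair per clipped-on mic,
# with the filtered outputs summed at each ear.
# outs = [nlms_binaural(m, ear_left, ear_right) for m in talker_mics]
# left_mix  = sum(o[0] for o in outs)
# right_mix = sum(o[1] for o in outs)
```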
We tested both systems in our laboratory using a dummy that simulates the acoustics of a human head. We found that the relatively simple adaptive filtering algorithm is able to track a moving talker with low noise while maintaining realistic spatial cues. In the video demonstration, you can compare the spatial cues with a conventional remote microphone and with the proposed binaural system. Be sure to wear headphones to get the full spatial effect. The paper includes additional experiments with microphone arrays and multiple “talkers” simulated by loudspeakers.
Hearing aids and other listening devices struggle in noisy environments and with faraway sound sources. By incorporating data from other microphones, they can zoom in on distant sounds, making it easier for listeners to hear even in challenging situations. The proposed adaptive filtering system, along with advanced wireless technologies and industry-wide compatibility standards, will enable a new generation of remote microphone systems for dynamic group conversations.