Augmented Listening at Engineering Open House 2019

Have you ever wondered what it would sound like to listen through sixteen ears? This past March, hundreds of Central Illinois children and families experienced microphone-array augmented listening technology firsthand at the annual Engineering Open House (EOH) sponsored by the University of Illinois College of Engineering. At the event, which attracts thousands of elementary-, middle-, and high-school students and local community members, visitors learned about technologies for enhancing human and machine listening.

Listen up (or down): The technology of directional listening

Our team’s award-winning exhibit introduced visitors to several directional listening technologies, which enhance audio by isolating sounds that come from a certain direction. Directional listening is important when the sounds we want to hear are far away, or when there are many different sounds coming from different directions—like at a crowded open house! There are two ways to focus on sounds from one direction: we can physically block sounds from directions we don’t want, or we can use the mathematical tools of signal processing to cancel out those unwanted sounds. At our exhibit in Engineering Hall, visitors could try both.

Ryan holds up an ear horn at EOH 2019

This carefully designed mechanical listening device is definitely not an oil funnel from the local hardware store.

The oldest and most intuitive listening technology is the ear horn, pictured above. This horn literally funnels sound waves from the direction in which it is pointed. The effect is surprisingly strong, and there is a noticeable difference in the acoustics of the two horns we had on display. The shape of the horn affects both its directional pattern and its response at different frequencies, which humans perceive as pitch. The toy listening dish shown below operates on the same principle, but also includes an electronic amplifier. The funnels work much better for directional listening, but the spy gadget is the clear winner for style.

This toy listening dish is not very powerful, but it certainly looks cool!

These mechanical hearing aids rely on physical acoustics to isolate sound from one direction; to listen somewhere else, the user has to physically point the device that way. Modern directional listening technology uses microphone arrays, which are groups of microphones spread apart from each other in space. Signal processing lets us compare and combine the signals recorded by the microphones to tell where a sound came from, or to listen in a chosen direction. We can change that direction in software, without physically moving the microphones. With sophisticated array signal processing, we can even listen in multiple directions at once and compensate for reflections and echoes in the room.
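To make the idea concrete, here is a minimal sketch of the simplest of these techniques, delay-and-sum beamforming, in Python. Everything in it (the array geometry, sample rate, and steering angle) is an illustrative assumption, not the processing our exhibit actually ran: the array is "steered" by delaying each microphone's signal so that sound from the chosen direction lines up across channels, then averaging.

```python
import numpy as np

def delay_and_sum(signals, mic_positions, angle_deg, fs, c=343.0):
    """Steer a linear microphone array toward angle_deg by time-aligning
    and averaging the channels (fractional delays applied in the
    frequency domain)."""
    angle = np.deg2rad(angle_deg)
    # Arrival time of a plane wave from angle_deg at each microphone,
    # relative to the array origin.
    delays = mic_positions * np.sin(angle) / c
    n = signals.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectra = np.fft.rfft(signals, axis=1)
    # Advance each channel by its delay so the target direction adds
    # coherently, then average; off-target sounds partially cancel.
    aligned = spectra * np.exp(2j * np.pi * freqs * delays[:, None])
    return np.fft.irfft(aligned.mean(axis=0), n=n)

# Hypothetical example: 8 microphones spaced 4 cm apart,
# steered 30 degrees off broadside.
fs = 16000
mics = np.arange(8) * 0.04
recordings = np.random.randn(8, fs)  # stand-in for real array recordings
enhanced = delay_and_sum(recordings, mics, angle_deg=30, fs=fs)
```

Changing `angle_deg` re-points the array entirely in software, which is exactly what the mechanical horns cannot do.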


Acoustic Impulse Responses for Wearable Audio Devices

This post describes our new wearable microphone impulse response data set, which is available for download from the Illinois Data Bank and is the subject of a paper at ICASSP 2019.

Acoustic impulse responses were measured from 24 source angles to 80 points across the body.

Have you ever been at a crowded party and struggled to hear the person next to you? Crowded, noisy places are some of the most difficult listening environments, especially for people with hearing loss. Noisy rooms are also a challenge for electronic listening systems, like teleconferencing equipment and smart speakers that recognize users’ voices. That’s why many conference-room systems and smart speakers use as many as eight microphones instead of just one or two. These microphone arrays, which are usually laid out in a regular pattern like a circle, let the device focus on sounds coming from one direction and block out other sounds. Arrays work like camera lenses: larger lenses can focus light more narrowly, and arrays with more microphones spread out over a larger area can better distinguish between sounds from different directions.
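The lens analogy is easy to check numerically. The sketch below is a hypothetical illustration (the microphone counts, spacings, and test frequency are my own choices, not taken from any particular product): it computes the directional response of a simple delay-and-sum array aimed straight ahead, for a tiny two-microphone pair and for a larger sixteen-microphone array.

```python
import numpy as np

def beam_pattern(num_mics, spacing, freq, angles_deg, c=343.0):
    """Magnitude response of a delay-and-sum beamformer steered to
    broadside (0 degrees), for a uniform linear array, as a function
    of the sound's arrival angle."""
    angles = np.deg2rad(angles_deg)
    positions = (np.arange(num_mics) - (num_mics - 1) / 2) * spacing
    # Phase of a plane wave from each test angle at each microphone.
    phases = np.exp(-2j * np.pi * freq
                    * np.outer(positions, np.sin(angles)) / c)
    # Steered to broadside, the beamformer simply averages the channels.
    return np.abs(phases.mean(axis=0))

angles = np.linspace(-90, 90, 361)
# Hearing-aid-like pair: 2 mics, 1 cm apart.
small = beam_pattern(num_mics=2, spacing=0.01, freq=2000, angles_deg=angles)
# Larger wearable-style array: 16 mics, 4 cm apart (60 cm aperture).
large = beam_pattern(num_mics=16, spacing=0.04, freq=2000, angles_deg=angles)
```

Plotting `small` and `large` against `angles` shows the point of the analogy: the two-microphone pair responds almost equally to every direction, while the sixteen-microphone array has a narrow main beam that "focuses" sharply on sounds arriving from straight ahead.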

Wearable microphone arrays

Microphone arrays are also sometimes used in listening devices, including hearing aids and the emerging product category of smart headphones. These array-equipped devices can help users to tune out annoying sounds and focus on what they want to hear. Unfortunately, most hearing aids only have two microphones spaced a few millimeters apart, so they aren’t very good at focusing in one direction. What if hearing aids—or smart headphones, or augmented reality headsets—had a dozen microphones instead of just two? What if they had one hundred microphones spread all over the user’s body, attached to their clothing and accessories? In principle, a large wearable array could provide far better sound quality than listening devices today.

Over the years, there have been several papers about wearable arrays: vests, necklaces, eyeglasses, helmets. It’s also a popular idea on crowdfunding websites. But there have been no commercially successful wearable microphone array products. Although several engineers have built these arrays, no one has rigorously studied their design tradeoffs. How many microphones do we need? How far apart should they be? Does it matter what clothes the user is wearing? How much better are they than conventional listening devices? We developed a new data set to help researchers answer these questions and to explore the possibilities of wearable microphone arrays.
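As one example of how such a data set can be used: a researcher can simulate what a wearable array would have recorded in a noisy room by convolving clean speech with the measured impulse responses and summing the contributions from each talker. The sketch below shows the general idea; the loader functions (`load_speech`, `load_irs`) and the sixteen-microphone layout are hypothetical placeholders, since the actual file format is documented with the data set on the Illinois Data Bank.

```python
import numpy as np
from scipy.signal import fftconvolve

def simulate_recording(sources, impulse_responses):
    """Simulate a wearable-array recording: convolve each source signal
    with its measured impulse responses (one per microphone) and sum
    the contributions across sources.

    sources: list of 1-D source signals
    impulse_responses: list of (num_mics, ir_length) arrays, one per source
    """
    num_mics = impulse_responses[0].shape[0]
    length = max(len(s) + irs.shape[1] - 1
                 for s, irs in zip(sources, impulse_responses))
    mix = np.zeros((num_mics, length))
    for s, irs in zip(sources, impulse_responses):
        for m in range(num_mics):
            y = fftconvolve(s, irs[m])  # source as heard at microphone m
            mix[m, :len(y)] += y
    return mix

# Hypothetical usage: two talkers at different angles, 16 body-worn mics.
# speech_a, speech_b = load_speech(...)   # any mono recordings
# irs_a = load_irs(angle_deg=0)           # shape (16, ir_length)
# irs_b = load_irs(angle_deg=90)
# mixture = simulate_recording([speech_a, speech_b], [irs_a, irs_b])
```

Because the impulse responses were measured on a real body, simulations like this capture the shadowing and reflection effects of the wearer, which is what makes the design-tradeoff questions above answerable without building every array by hand.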


What is augmented listening?

Augmented listening systems “remix” the sounds we perceive around us, making some louder and some quieter.

I am one of millions of people who suffer from hearing loss. For my entire life I’ve known the frustration of asking people to repeat themselves, struggling to communicate over the phone, and skipping social events because I know they’ll be too noisy. Hearing aids do help, but they don’t work well in the noisy, crowded situations where I need them the most. That’s why I decided to devote my PhD thesis to improving the performance of hearing aids in noisy environments.

As my research progressed, I realized that this problem is not limited to hearing aids, and that the technologies I am developing could also help people who don’t suffer from hearing loss. Over the last few years, there has been rapid growth in a product category that I call augmented listening (AL): technologies that enhance human listening abilities by modifying the sounds listeners hear in real time. Augmented listening devices include:

  • traditional hearing aids, which are prescribed by a clinician to patients with hearing loss;
  • low-cost personal sound amplification products (PSAPs), which are ostensibly for normal-hearing listeners;
  • advanced headphones, sometimes called “hearables,” that incorporate listening enhancement as well as features like heart-rate sensing; and
  • augmented- and mixed-reality headsets, which supplement real-world sound with extra information.

These product categories have been converging in recent years as hearing aids add consumer-technology features like Bluetooth, and as headphone products promise to enhance real-world sounds. Recent regulatory changes that allow hearing aids to be sold over the counter will also help to shake up the market.
