An interdisciplinary project led by McMaster Engineering’s Rong Zheng and Ian Bruce will receive $250,000 in federal funding to explore bringing assistive hearing technology and virtual and augmented reality auditory capabilities to personal devices.
As machine learning and the processing power of mobile devices continue to advance, McMaster Engineering researchers are examining how to transform personal devices into assistive ones.
Principal investigators Rong Zheng and Ian Bruce are leading a team of cross-disciplinary experts who will create a novel virtual and augmented acoustic reality (VAAR) framework with a customizable auditory experience on commodity devices.
The team will research and develop the methodology, hardware and software that will allow people to measure their own hearing capabilities with personal devices – like a cell phone and earbuds – and adjust those devices to compensate for their impairments.
The project, “Towards virtual and augmented acoustic reality on commodity devices,” has been awarded funding from the federal government’s New Frontiers in Research Fund.
The team will receive $250,000 over two years through the exploration stream for this high-risk/high-reward research.
“This is a great opportunity…to have innovation funding that allows us to do this kind of cutting-edge exploration in a high-risk, but also, we expect, high-yield area of research is really important,” Bruce said.
In 2021, the World Health Organization estimated that by 2050 over 700 million people worldwide would have a disabling hearing loss.
Zheng, Canada Research Chair in mobile computing, notes that there is great promise for machine learning to help make assistive technology more accessible for those with mild to moderate hearing difficulties.
“We see there’s an opportunity that we can bridge the gap between customized, specialized and expensive hardware and software and something we can implement. Advanced audio processing techniques would make it available to people who have hearing impairments,” she said.
Zheng added that it would also open doors for those with unperceived hearing loss to realize they could benefit from some assistance.
While an “over-the-counter” market has opened up in the United States and Canada to compete with traditional hearing aids – with earbuds sometimes known as “hearables” marketing personalized audio capabilities – Bruce notes that the technology isn’t yet advanced enough to deliver that functionality and personal customization.
The research project will bring together experts at an intersection of mobile computing, machine learning, acoustic signal processing, auditory perception and psychology.
“I think the research ideas we have of using these methodologies for people to be able to fit the hearing aids themselves… opens up the chance of some very novel signal processing approaches that haven’t been achievable previously,” Bruce said.
Researchers will use laboratory facilities for audio testing at McMaster University, as well as an anechoic chamber for 3D-environment testing at the University of Western Ontario with co-applicant Ewan Macpherson, associate professor at the School of Communication Sciences and Disorders.
Other collaborators include the following:
“Getting this project funded will really enable me to be able to work closely with experts in related fields,” Zheng said. “The hope is that after two years, we have built this team and deliverables along the project so that we can expand it and make it larger scale.”
Bruce, professor of electrical and computer engineering and associate chair (graduate), says that even advanced hearing aids are limited in how they can process and represent spatial audio – sounds coming from different places in space.
“To customize it to an individual person, you need to take detailed measurements on exactly how the ears are processing sounds coming from different directions,” he said.
He notes creating “more realistic sounding spatial audio” experiences rather than generic ones would be particularly crucial for future VAAR applications.
As Zheng dives into the possibilities for all users, including those without hearing impairments, she raises the idea of attending orchestra performances and being able to adjust commodity devices to focus more on the violin section.
“There is a possibility for new applications that can be enabled by what we are developing in this project,” Zheng said.
Other future stages, Zheng said, could explore ways to make the software more user-friendly, such as applying interface design principles and improving robustness.
The research is also an important vehicle for training the future workforce.
“Working in this interdisciplinary team will give students the exposure to understand the other side of things and to incorporate domain knowledge in our design. I think this would provide a unique training experience for our engineering students,” Zheng said.