Publication

EEG-based Auditory Attention Decoding: Towards Neuro-steered Hearing Devices

Book - Dissertation

People with hearing impairment often have difficulty understanding speech in noisy environments. This can be partly overcome by noise reduction algorithms in auditory prostheses such as hearing aids or cochlear implants. However, in a multi-speaker scenario, such algorithms do not know which speaker should be enhanced and which speaker(s) should be treated as noise. When a person listens to multiple speakers, neural (cortical) activity has been found to phase-lock to the attended speech stream, and decoders can be trained to reconstruct the attended speech envelope from recorded brain activity, e.g., from EEG. This makes it possible to design auditory attention decoding (AAD) algorithms that detect which speaker a person is attending to in a multi-speaker scenario, which can be exploited in hearing prostheses, e.g., in the form of an AAD-steered noise reduction algorithm. However, a deeper understanding of the neural processes behind auditory attention, the influence of acoustic listening conditions, speech intelligibility, and the hearing and cognitive abilities of the listener is necessary to eventually realize real-time (closed-loop) neuro-steered noise reduction algorithms for hearing prostheses.
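The decoding scheme described above is often implemented as a linear backward (stimulus-reconstruction) model: a decoder is trained to reconstruct the attended speech envelope from (time-lagged) EEG, and the speaker whose envelope correlates most strongly with the reconstruction is labelled as attended. The sketch below illustrates this idea on purely synthetic data; all signals, channel counts, lag settings, and the ridge regularization value are illustrative assumptions, not the method of the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all data synthetic): EEG is simulated as a noisy linear
# mixture of the attended speech envelope across channels.
n_samples, n_channels, n_lags = 2000, 8, 5
env_attended = np.abs(rng.standard_normal(n_samples))  # attended envelope
env_ignored = np.abs(rng.standard_normal(n_samples))   # competing speaker
mixing = rng.standard_normal(n_channels)
eeg = np.outer(env_attended, mixing) \
    + 0.5 * rng.standard_normal((n_samples, n_channels))

def lagged_features(eeg, n_lags):
    """Stack time-lagged copies of each channel (simple temporal context)."""
    return np.concatenate(
        [np.roll(eeg, lag, axis=0) for lag in range(n_lags)], axis=1
    )

X = lagged_features(eeg, n_lags)

# Train a linear backward model (ridge regression) that reconstructs the
# attended envelope from EEG -- the 'decoder' mentioned in the abstract.
lam = 1e-2
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ env_attended)
reconstruction = X @ w

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

# AAD decision: attend to the speaker whose envelope correlates more
# strongly with the reconstructed envelope.
r_attended = corr(reconstruction, env_attended)
r_ignored = corr(reconstruction, env_ignored)
decision = "speaker 1" if r_attended > r_ignored else "speaker 2"
print(decision)
```

In a real system, the correlations would be computed over short sliding windows, trading off decision speed against decoding accuracy.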
Publication year: 2020
Accessibility: Closed