Project

The information content of dynamic cues in human sound localization.

Understanding the workings of human sound localization, and in particular which acoustic cues we use to perceive our acoustic environment in three dimensions (3D), is not only of fundamental interest, but has become increasingly relevant in light of the recent advances in 3D audio displays through headphones. In the past, most research has focused on the role of static cues, i.e. cues available when the head and source are stationary, yet it is known that localization improves greatly if listeners are allowed to move their head during stimulus presentation. In this project, we investigate the role of dynamic cues provided by small movements of the head or source, within an information-theoretic framework. We use a proven ideal-observer model of static human sound localization and extend it to account for the dynamic acoustic cues involved. First, we study which head movements carry the most information and how this depends on the location of the source. Next, we consider the mirror situation and investigate how much information can be conveyed through small movements of the source. Finally, we study the effects on sound localization when actual head movements are not accounted for correctly, as is the case when a 3D audio display is presented through ordinary headphones. The predictions from the theoretical analysis are validated with psychoacoustic experiments.
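To make the role of dynamic cues concrete, the following minimal sketch illustrates the underlying idea under strongly simplified assumptions: a far-field two-receiver model of the interaural time difference (ITD), an illustrative ear spacing, and an assumed Gaussian ITD noise level, none of which are taken from the project itself. It shows that a static ITD is front-back ambiguous, that a small head rotation pulls the mirror pair apart, and how Fisher information quantifies how much a single cue says about source azimuth.

import numpy as np

# Illustrative parameters -- assumptions, not values from the project.
EAR_SPACING = 0.18   # m, assumed effective acoustic distance between the ears
SOUND_SPEED = 343.0  # m/s
NOISE_SD = 20e-6     # s, assumed std. dev. of the ITD measurement noise

def itd(azimuth_deg, head_turn_deg=0.0):
    """Far-field ITD for a source at the given azimuth (0 deg = straight
    ahead), after rotating the head by head_turn_deg."""
    rel = np.radians(azimuth_deg - head_turn_deg)
    return (EAR_SPACING / SOUND_SPEED) * np.sin(rel)

def fisher_information(azimuth_deg, head_turn_deg=0.0):
    """Fisher information (rad^-2) that one noisy ITD sample carries about
    source azimuth, under Gaussian measurement noise."""
    rel = np.radians(azimuth_deg - head_turn_deg)
    d_itd = (EAR_SPACING / SOUND_SPEED) * np.cos(rel)  # dITD/d(azimuth)
    return (d_itd / NOISE_SD) ** 2

front, back = 30.0, 150.0                 # mirror pair about the interaural axis
print(itd(front), itd(back))              # identical: the static cue is ambiguous
print(itd(front, 10.0), itd(back, 10.0))  # a 10 deg head turn moves them apart
print(fisher_information(0.0), fisher_information(90.0))  # informative in front, not to the side

Note that this toy model only conveys the information-theoretic idea; an ideal-observer model of the kind extended in the project would additionally operate on the full set of spatial cues, including spectral (head-related transfer function) cues.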
Date: 1 Jan 2019 → 31 Dec 2022
Keywords: EXPERIMENTAL STUDY
Disciplines: Signal processing