Use of personalized binaural audio and interactive distance cues in an auditory goal-reaching task
Geronazzo, Michele; Avanzini, Federico
2015
Abstract
While the angular spatialization of sound sources through individualized head-related transfer functions (HRTFs) has been extensively investigated in auditory display research, leading to effective real-time rendering of these functions, the interactive simulation of egocentric distance information has received less attention. The latter suffers from a lack of real-time rendering solutions, also because the literature on the perception of dynamic distance cues is still sparse. By adding a virtual environment based on a digital waveguide mesh (DWM) model simulating a small tubular shape to a binaural rendering system built on HRTF selection techniques, we obtained an auditory display affording interactive selection of absolute 3D spatial cues of angular spatialization as well as egocentric distance. The tube metaphor in particular minimizes loudness changes with distance, hence providing mainly direct-to-reverberant and spectral cues. A goal-reaching experiment assessed the proposed display: participants were asked to explore a virtual map with a pen tablet and to reach a sound source (the target) using only auditory information; participants' time to reach the target and traveled distance were then analyzed. Results suggest that participants achieved a first level of spatial knowledge, i.e., knowledge about a point in space, performing comparably to when they relied on more robust, although relative, loudness cues. Further work is needed to give the proposed auditory display full physical consistency.
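The record contains no code; as a rough illustration of the distance-rendering idea described in the abstract, the sketch below implements a generic rectilinear 2-D digital waveguide mesh with partially absorbing walls. It is not the authors' tubular DWM model: the mesh dimensions, the wall reflection coefficient, and all names (`run_dwm`, `src`, `pickup`, `r`) are illustrative assumptions. An impulse injected at a source junction is recorded at a pickup junction, and the resulting impulse response carries the direct-to-reverberant and spectral variation with source-listener distance that the display relies on.

```python
import numpy as np

def run_dwm(nx=20, ny=60, n_steps=2000, src=(10, 5), pickup=(10, 40), r=0.95):
    """Rectilinear 2-D digital waveguide mesh with partially absorbing walls.

    p_in[k, x, y] is the wave entering junction (x, y) through port k:
    0 = east (x+1), 1 = west (x-1), 2 = north (y+1), 3 = south (y-1).
    Returns the impulse response observed at the pickup junction.
    """
    p_in = np.zeros((4, nx, ny))
    out = np.zeros(n_steps)

    # Unit-like excitation: distribute an impulse over the four ports
    # of the source junction.
    p_in[:, src[0], src[1]] = 0.25

    for n in range(n_steps):
        # Scattering: junction pressure is (2/N) * sum of incoming waves
        # (N = 4 ports); each outgoing wave is p_j minus the incoming wave
        # on the same port.
        p_j = 0.5 * p_in.sum(axis=0)
        p_out = p_j[None, :, :] - p_in
        out[n] = p_j[pickup[0], pickup[1]]

        # Propagation: the wave leaving through the east port of (x, y)
        # arrives at the west port of (x+1, y) on the next step, and so on.
        new_in = np.zeros_like(p_in)
        new_in[1, 1:, :] = p_out[0, :-1, :]   # east -> neighbour's west port
        new_in[0, :-1, :] = p_out[1, 1:, :]   # west -> neighbour's east port
        new_in[3, :, 1:] = p_out[2, :, :-1]   # north -> neighbour's south port
        new_in[2, :, :-1] = p_out[3, :, 1:]   # south -> neighbour's north port

        # Partially absorbing walls: reflect outgoing boundary waves back
        # into the same port, scaled by r < 1.
        new_in[0, -1, :] = r * p_out[0, -1, :]
        new_in[1, 0, :] = r * p_out[1, 0, :]
        new_in[2, :, -1] = r * p_out[2, :, -1]
        new_in[3, :, 0] = r * p_out[3, :, 0]

        p_in = new_in

    return out

ir = run_dwm()
# Crude direct-to-reverberant estimate: energy before vs. after an
# arbitrary split point in the impulse response (split value is a guess).
split = 80
drr = 10 * np.log10(np.sum(ir[:split] ** 2) / np.sum(ir[split:] ** 2))
print(f"Direct-to-reverberant ratio estimate: {drr:.1f} dB")
```

Moving the pickup junction as the listener moves, and convolving the dry source signal with the resulting responses and with the selected HRTF pair, would approximate the kind of interactive rendering the abstract describes; the actual system and its parameters are those reported in the paper.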
File | Type | License | Size | Format
---|---|---|---|---
geronazzo_icad15.pdf (open access) | Postprint (accepted version) | Creative Commons | 4.44 MB | Adobe PDF