Interactive spatial sonification for non-visual exploration of virtual maps
Geronazzo, Michele; Bedin, Alberto; Avanzini, Federico
2016
Abstract
This paper presents a multimodal interactive system for non-visual (auditory-haptic) exploration of virtual maps. The system displays the height profile of a map haptically, through a tactile mouse. In addition, spatial auditory information is provided in the form of virtual anchor sounds located at specific points of the map and delivered through headphones using customized Head-Related Transfer Functions (HRTFs). The validity of the proposed approach is investigated through two experiments on non-visual exploration of virtual maps. The first experiment is preliminary in nature and assesses the effectiveness and complementarity of auditory and haptic information in a goal-reaching task. The second experiment investigates the potential of the system for providing subjects with spatial knowledge, specifically for supporting the construction of a cognitive map depicting simple geometrical objects. Results from both experiments show that the proposed concept, design, and implementation make it possible to effectively exploit the complementary natures of the “proximal” haptic modality and the “distal” auditory modality. Implications for orientation & mobility (O&M) protocols for visually impaired subjects are discussed.