Auditory Locomotion Guidance System for Spatial Localization

Abstract

Head direction has been shown to anticipate trajectory direction during human locomotion, independently of the visual condition. However, experiments so far have explored this phenomenon with visual instructions. To study head and gaze anticipation without visual influence, we describe a system for instructing and measuring auditorily guided locomotion. The objective of this work is to describe the system setup and validate its accuracy for scientific investigations in locomotion research. The system comprises an auralization server that plays virtual sound sources via wireless headphones. The auralization is fed with the head position and orientation angle measured by a Vicon motion tracking system. Communication between the auralization server and the motion tracking host is handled via the OSC protocol. Auditory-instructed locomotion was performed in two visual conditions, eyes open and eyes closed, for comparison. First, ten sighted participants localized static virtual sound sources at eight positions to confirm the accuracy of the provided virtual sound cues. Afterwards, they listened to and actively followed virtual sound sources moving along three trajectories: a figure of eight, a circle in the clockwise direction, and a circle in the anticlockwise direction. The virtual sound sources could be localized with an accuracy of 0.16 meters in the eyes-open condition and 0.12 meters in the eyes-closed condition. Participants were also able to follow the moving sound sources.
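
As a rough illustration of the tracking-to-auralization link described above, the sketch below shows how a motion tracking host might stream head pose to an auralization server over OSC, here using the python-osc library. The OSC address pattern, message layout, host address, and port are illustrative assumptions; the abstract does not specify the actual message schema used by the system.

    # Minimal sketch: forwarding head pose samples to an auralization server
    # via OSC. Assumes the python-osc package. The address "/head/pose", the
    # host, the port, and the message layout are hypothetical, not the
    # system's documented schema.
    from pythonosc.udp_client import SimpleUDPClient

    AURALIZATION_HOST = "192.168.0.10"  # hypothetical server address
    OSC_PORT = 9000                     # hypothetical OSC listening port

    client = SimpleUDPClient(AURALIZATION_HOST, OSC_PORT)

    def send_head_pose(x: float, y: float, z: float, yaw_deg: float) -> None:
        # One tracked head pose sample: position in meters, yaw in degrees,
        # as it might arrive from the Vicon tracking loop.
        client.send_message("/head/pose", [x, y, z, yaw_deg])

    # Example: a single pose sample.
    send_head_pose(1.25, 0.40, 1.70, 92.5)

Because OSC runs over UDP, a real setup would typically stream such messages at the tracker's frame rate so the auralization can update source positions with low latency.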

Publication
In International Symposium on Micro-NanoMechatronics and Human Science (MHS).