Evaluating the user in a sound localisation task in a virtual reality application. [Poster]
Date: 2020-06
Authors: Moraes, Adrielle Nazar; Flynn, Ronan; Murray, Niall
Abstract
Virtual reality (VR) technology is fast becoming the accepted platform for delivering high-quality immersive audio-visual experiences [1]. It has the potential to enable truly personalised experiences due to its adaptability and flexibility. Although much of the discussion on VR has focused on the visual aspect of the experience, audio also plays a decisive role in user immersion [2]. With the advancement of audio technologies, it is possible to render and spatialise audio at a level of quality that matches human resolution in discriminating sounds in terms of depth, elevation and lateralisation [3]. Spatialised audio is important in industries such as gaming, in communication tasks, and in environments that require humans to constantly attend to a specific sound in space [4][5][6]. In this paper, an immersive VR spatial audio application is presented. It enables us to evaluate the ability of users to localise the source of a sound. An integrated sensing system continuously collects relevant data from the user in order to understand how to quantify and evaluate spatial auditory skills from a quality of experience (QoE) perspective. QoE gives insight into a user's state and behaviour. To perform a detailed QoE evaluation of the listening task, implicit and explicit metrics were collected from the user. The data collected from this QoE evaluation give insight into a user's ability to localise sound sources in VR, and also provide information on behaviour and effort (workload) in performing the task.
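The abstract does not specify how localisation accuracy is scored. As a minimal, hypothetical sketch of one explicit metric such an evaluation could log, the snippet below computes the angular error between the true direction of a spatialised sound source and the direction the listener indicates; the function name, vector convention and example values are assumptions for illustration only and are not taken from the poster.

```python
import numpy as np

def angular_error_deg(source_dir, response_dir):
    """Unsigned angle (degrees) between the true source direction and the
    direction the listener pointed to, both given as 3-D direction vectors."""
    a = np.asarray(source_dir, dtype=float)
    b = np.asarray(response_dir, dtype=float)
    cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    # Clip guards against floating-point values just outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

# Example: source directly to the listener's right, response 10 degrees off.
true_dir = [1.0, 0.0, 0.0]
resp_dir = [np.cos(np.radians(10)), np.sin(np.radians(10)), 0.0]
print(round(angular_error_deg(true_dir, resp_dir), 1))  # ~10.0
```

Per-trial errors of this kind could then be aggregated alongside implicit measures (e.g. head movement or task time) to support the QoE analysis the abstract describes.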