Research Repository

Selectable directional audio for multiple telepresence in immersive intelligent environments

Torrejon, A and Callaghan, V and Hagras, H (2013) Selectable directional audio for multiple telepresence in immersive intelligent environments. In: Proceedings - 9th International Conference on Intelligent Environments, IE 2013, ? - ?.

Full text not available from this repository.

Abstract

The general focus of this paper concerns the development of telepresence within intelligent immersive environments. The overall aim is the development of a system that combines multiple audio and video feeds from geographically dispersed people into a single environment view, where sound appears to be linked to the appropriate visual source on a panoramic viewer based on the gaze of the user. More specifically, this paper describes a novel directional audio system for telepresence which seeks to reproduce sound sources (conversations) in a panoramic viewer in their correct spatial positions, to increase the realism associated with telepresence applications such as online meetings. The intention of this work is that external attendees at an online meeting would be able to move their head to focus on the video and audio stream from a particular person or group, so as to decrease the audio from all other streams (i.e. speakers) to a background level. The main contribution of this paper is a methodology that captures and reproduces these spatial audio and video relationships. In support of this we have created a multiple-camera recording scheme that emulates the behavior of a panoramic camera, or array of cameras, at such a meeting, using the Chroma key photographic effect to integrate all streams into a common panoramic video image, thereby creating a common shared virtual space. While this emulation is only implemented as an experiment, it opens the opportunity to create telepresence systems with selectable real-time video and audio streaming using multiple camera arrays. Finally, we report on the results of an evaluation of our spatial audio scheme that demonstrates that the techniques both work and improve the user experience, by comparing a traditional omnidirectional audio scheme against selectable directional binaural audio scenarios. © 2013 IEEE.
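The gaze-driven attenuation the abstract describes — the stream nearest the viewer's head orientation plays at full level while all others drop to a background level — can be sketched as a simple per-stream gain rule. This is a minimal illustrative sketch, not the authors' implementation; the function name, the focus width, and the background gain value are all hypothetical parameters chosen for the example.

```python
def stream_gains(gaze_azimuth_deg, source_azimuths_deg,
                 focus_width_deg=30.0, background_gain=0.2):
    """Return one gain per audio stream: sources within the focus
    window around the viewer's gaze play at full volume; all other
    streams are attenuated to a background level.

    All angle parameters and thresholds are hypothetical values,
    not taken from the paper."""
    gains = []
    for az in source_azimuths_deg:
        # Smallest angular difference between gaze and source,
        # wrapped into [-180, 180] degrees.
        diff = abs((az - gaze_azimuth_deg + 180.0) % 360.0 - 180.0)
        if diff <= focus_width_deg:
            gains.append(1.0)              # focused source: full volume
        else:
            gains.append(background_gain)  # other speakers fade back
    return gains
```

For example, with the viewer looking straight ahead (0°) at three speakers positioned at 10°, 120°, and 250° on the panorama, only the first speaker stays at full level. In a real system these gains would feed a binaural renderer so the attenuated streams remain spatially positioned rather than simply quieter.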

Item Type: Conference or Workshop Item (Paper)
Additional Information: Published proceedings: Proceedings - 9th International Conference on Intelligent Environments, IE 2013
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Faculty of Science and Health > Computer Science and Electronic Engineering, School of
Depositing User: Jim Jamieson
Date Deposited: 08 Jan 2015 16:17
Last Modified: 09 Apr 2018 13:15
URI: http://repository.essex.ac.uk/id/eprint/12217
