
A multi-modal human machine interface for controlling an intelligent wheelchair using face movements

Wei, L and Hu, H (2011) A multi-modal human machine interface for controlling an intelligent wheelchair using face movements. In: 2011 IEEE International Conference on Robotics and Biomimetics (ROBIO 2011).

Full text not available from this repository.

Abstract

This paper introduces a novel face-movement based human machine interface (HMI) that uses jaw-clenching and eye-closing movements to control an electric powered wheelchair (EPW). A multi-modal HMI derived from both facial EMG signals and face image information is developed and evaluated against a traditional joystick control in an indoor corridor environment. Ten repeated trials of a navigation task are carried out, with the EPW controlled either by face movements or by the joystick. Wheelchair trajectories and execution times during the task are recorded to evaluate the performance of the new face-movement HMI. © 2011 IEEE.
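The full text is not held in this repository, so the sketch below is only a rough illustration of how discrete face-movement events (jaw clench, eye closure) might be mapped to wheelchair motion commands in an interface of this kind. The event names, command values, and overall structure are assumptions for illustration, not the authors' implementation.

# Illustrative sketch only (not from the paper): mapping discrete
# face-movement events to simple wheelchair motion commands.
# Event names and speed values are assumptions for illustration.

from dataclasses import dataclass


@dataclass
class MotionCommand:
    linear: float   # forward speed, m/s
    angular: float  # turning rate, rad/s


# Hypothetical mapping from detected face-movement events to commands.
EVENT_TO_COMMAND = {
    "jaw_clench": MotionCommand(linear=0.4, angular=0.0),        # go forward
    "left_eye_close": MotionCommand(linear=0.0, angular=0.5),    # turn left
    "right_eye_close": MotionCommand(linear=0.0, angular=-0.5),  # turn right
    "both_eyes_close": MotionCommand(linear=0.0, angular=0.0),   # stop
}


def command_for_event(event: str) -> MotionCommand:
    """Return the motion command for a recognised event; stop otherwise."""
    return EVENT_TO_COMMAND.get(event, MotionCommand(0.0, 0.0))


if __name__ == "__main__":
    # Demo: print the command produced for a few example events.
    for evt in ["jaw_clench", "left_eye_close", "unknown"]:
        cmd = command_for_event(evt)
        print(f"{evt:16s} -> linear={cmd.linear:+.1f} m/s, "
              f"angular={cmd.angular:+.1f} rad/s")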

Item Type: Conference or Workshop Item (Paper)
Additional Information: Published proceedings: 2011 IEEE International Conference on Robotics and Biomimetics, ROBIO 2011
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Faculty of Science and Health > Computer Science and Electronic Engineering, School of
Date Deposited: 17 Dec 2014 11:55
Last Modified: 23 Jan 2019 00:17
URI: http://repository.essex.ac.uk/id/eprint/9180
