Research Repository

Head movement and facial expression-based human-machine interface for controlling an intelligent wheelchair

Rechy-Ramirez, E. J. and Hu, H. (2014) 'Head movement and facial expression-based human-machine interface for controlling an intelligent wheelchair.' International Journal of Biomechatronics and Biomedical Robotics, 3 (2), pp. 80-91. ISSN 1757-6792

Full text not available from this repository.

Abstract

This paper presents a human-machine interface (HMI) for hands-free control of an electric powered wheelchair (EPW) based on head movements and facial expressions, detected with the gyroscope and 'Cognitiv suite' of an Emotiv EPOC device, respectively. The proposed HMI provides two control modes: 1) control mode 1 uses four head movements to display the desired control commands in its graphical user interface and one facial expression to confirm their execution; 2) control mode 2 employs two facial expressions for turning and forward motion, and one head movement for stopping the wheelchair. Both control modes therefore offer hands-free operation of the wheelchair. Two subjects used the two control modes to operate a wheelchair in an indoor environment. Five facial expressions were tested to determine whether users can employ different facial expressions to execute the commands. The experimental results show that the proposed HMI is reliable for operating the wheelchair safely.
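The abstract describes two event-driven control modes. The sketch below is only an illustrative reconstruction of that dispatch logic, not the authors' implementation: the event names, the WheelchairCommand values, and the mapping functions are assumptions, and real input would come from the Emotiv EPOC gyroscope and Cognitiv suite rather than plain strings.

```python
"""Minimal sketch of the two control modes described in the abstract.

All event names and command values are hypothetical placeholders.
"""
from enum import Enum
from typing import Optional, Tuple


class WheelchairCommand(Enum):
    FORWARD = "forward"
    TURN = "turn"
    LEFT = "left"
    RIGHT = "right"
    STOP = "stop"


# Control mode 1: four head movements select a command shown in the GUI;
# a single facial expression confirms and executes the selection.
HEAD_TO_COMMAND = {              # hypothetical head-movement labels
    "head_up": WheelchairCommand.FORWARD,
    "head_down": WheelchairCommand.STOP,
    "head_left": WheelchairCommand.LEFT,
    "head_right": WheelchairCommand.RIGHT,
}
CONFIRM_EXPRESSION = "smile"     # hypothetical facial-expression label


def mode1_step(event: str,
               selected: Optional[WheelchairCommand]
               ) -> Tuple[Optional[WheelchairCommand], Optional[WheelchairCommand]]:
    """Return (new_selection, command_to_execute) for one input event."""
    if event in HEAD_TO_COMMAND:
        return HEAD_TO_COMMAND[event], None   # update GUI selection only
    if event == CONFIRM_EXPRESSION and selected is not None:
        return selected, selected             # confirmation executes the selection
    return selected, None


# Control mode 2: two facial expressions drive turning and forward motion;
# one head movement stops the wheelchair.
def mode2_step(event: str) -> Optional[WheelchairCommand]:
    if event == "expression_turn":            # hypothetical expression label
        return WheelchairCommand.TURN
    if event == "expression_forward":         # hypothetical expression label
        return WheelchairCommand.FORWARD
    if event == "head_down":                  # hypothetical head-movement label
        return WheelchairCommand.STOP
    return None
```

As a usage example, feeding mode1_step the events "head_left" and then "smile" would first select the LEFT command in the GUI and then return it for execution, matching the select-then-confirm behaviour the abstract attributes to control mode 1.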

Item Type: Article
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
R Medicine > R Medicine (General)
Divisions: Faculty of Science and Health > Computer Science and Electronic Engineering, School of
Depositing User: Jim Jamieson
Date Deposited: 20 Jul 2015 15:43
Last Modified: 17 Aug 2017 17:35
URI: http://repository.essex.ac.uk/id/eprint/14308
