Wang, Zehui and Chen, Chuangquan and Li, Junhua and Wan, Feng and Sun, Yu and Wang, Hongtao (2023) ST-CapsNet: Linking Spatial and Temporal Attention with Capsule Network for P300 Detection Improvement. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 31. pp. 991-1000. DOI https://doi.org/10.1109/tnsre.2023.3237319
Abstract
A brain-computer interface (BCI), which provides a direct channel for human-machine interaction, has attracted substantial research interest over the last decade for its great potential in applications including rehabilitation and communication. Among them, the P300-based BCI speller is a typical application capable of identifying the character a user attends to. However, the applicability of the P300 speller is hampered by its low recognition rate, partially attributable to the complex spatio-temporal characteristics of EEG signals. Here, we developed a deep-learning analysis framework named ST-CapsNet to improve P300 detection using a capsule network with both spatial and temporal attention modules. Specifically, we first employed spatial and temporal attention modules to obtain refined EEG signals by capturing event-related information. The refined signals were then fed into the capsule network for discriminative feature extraction and P300 detection. To quantitatively assess the performance of the proposed ST-CapsNet, two publicly available datasets (Dataset IIb of BCI Competition 2003 and Dataset II of BCI Competition III) were used. A new metric, averaged symbols under repetitions (ASUR), was adopted to evaluate the cumulative effect of symbol recognition under different numbers of repetitions. In comparison with several widely used methods (LDA, ERP-CapsNet, CNN, MCNN, SWFP, and MsCNN-TL-ESVM), the proposed ST-CapsNet framework significantly outperformed the state-of-the-art methods in terms of ASUR. More interestingly, the absolute values of the spatial filters learned by ST-CapsNet are higher over the parietal lobe and occipital region, which is consistent with the generation mechanism of the P300.
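The abstract evaluates symbol recognition as a cumulative effect across stimulus repetitions. The paper's exact ASUR formula is not given here, so the following is only a minimal sketch under one plausible reading: ASUR averages the number of correctly recognized symbols when the speller is allowed 1, 2, ..., R repetitions; the function name `asur` and its input format are assumptions for illustration.

```python
def asur(correct_per_budget):
    """Sketch of an averaged-symbols-under-repetitions score.

    correct_per_budget: list where element r-1 is the number of symbols
    correctly recognized when decoding uses only the first r stimulus
    repetitions (assumed interpretation, not the paper's definition).
    Returns the mean correct-symbol count over all repetition budgets,
    so methods that decode correctly with fewer repetitions score higher.
    """
    if not correct_per_budget:
        raise ValueError("need at least one repetition budget")
    return sum(correct_per_budget) / len(correct_per_budget)
```

For example, a classifier that recognizes 60, 80, and 95 of 100 symbols with 1, 2, and 3 repetitions would score higher than one reaching the same 95 only at 3 repetitions, which matches the stated goal of rewarding early, cumulative recognition.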
Item Type: | Article |
---|---|
Uncontrolled Keywords: | brain-computer interfaces (BCIs); capsule network; P300; attention |
Divisions: | Faculty of Science and Health Faculty of Science and Health > Computer Science and Electronic Engineering, School of |
SWORD Depositor: | Unnamed user with email elements@essex.ac.uk |
Depositing User: | Unnamed user with email elements@essex.ac.uk |
Date Deposited: | 20 Jan 2023 13:18 |
Last Modified: | 30 Oct 2024 15:51 |
URI: | http://repository.essex.ac.uk/id/eprint/34676 |
Available files
Filename: ST-CapsNet_Linking_Spatial_and_Temporal_Attention_with_Capsule_Network_for_P300_Detection_Improvement.pdf
Licence: Creative Commons: Attribution 4.0