Jin, Jing and Xiao, Wu and Daly, Ian and Chen, Weijie and He, Xinjie and Wang, Xingyu and Cichocki, Andrzej (2024) Squeeze and Excitation-Based Multiscale CNN for Classification of Steady-State Visual Evoked Potentials. IEEE Internet of Things Journal. p. 1. DOI https://doi.org/10.1109/jiot.2024.3488745 (In Press)
Abstract
Brain-computer interface (BCI) technology enables the control of external devices by recognizing user intentions. Steady-state visual evoked potential (SSVEP)-based BCIs have been widely applied to Internet of Things (IoT) device control, including smart healthcare, smart homes, and robotics, with significant results. However, as BCI-based IoT device control is still in its development stage, there remains considerable room for improvement in accuracy, efficiency, and cost. Enhancing the classification accuracy of SSVEP decoding within a short time window, reducing both human and material costs, and improving work efficiency are therefore crucial for both theoretical research and engineering applications of BCI technology in IoT device control. Based on this, we propose a novel approach to the challenge of high-accuracy feature extraction within brief time frames. Our approach integrates a multiscale convolutional neural network with a squeeze-and-excitation module (SEMSCNN). This fusion leverages the local feature-learning capacity of CNNs and the feature-importance weighting offered by the squeeze-and-excitation mechanism. First, the EEG signals are band-pass filtered into distinct frequency bands, and frequency-band and channel features are extracted by a two-layer convolution. Then, temporal features are extracted via a multi-branch convolution at different scales. Finally, the squeeze-and-excitation (SE) module is introduced to learn the interdependence between features and improve the quality of the extracted features. The first stage of training exploits statistical commonalities across research participants by learning a global model; the second stage fine-tunes the model for each participant separately, exploiting participant-specific differences in features.
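The multi-branch temporal step described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the fixed moving-average kernels below stand in for learned convolution filters, and the kernel sizes are hypothetical; the point is only that each branch filters the same (channels × time) feature map at a different temporal scale and the branch outputs are concatenated.

```python
import numpy as np

def multiscale_temporal_features(x, kernel_sizes=(3, 7, 11)):
    # x: (channels, time) EEG feature map after band-pass filtering.
    # Each branch convolves every channel with a temporal kernel of a
    # different length (here, simple averaging kernels as placeholders
    # for learned filters), capturing features at several time scales.
    branches = []
    for k in kernel_sizes:
        kernel = np.ones(k) / k  # placeholder for a learned filter
        out = np.stack([np.convolve(ch, kernel, mode="same") for ch in x])
        branches.append(out)
    # Concatenate branch outputs along the feature (channel) axis.
    return np.concatenate(branches, axis=0)
```

With three branches, an input of shape (4, 50) yields a (12, 50) feature map, which the SE module can then reweight.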
We evaluate our SEMSCNN model on two large public datasets, Benchmark and BETA, comparing it with other state-of-the-art models to assess the effectiveness of the proposed network. Our experimental results indicate that our method effectively improves target-recognition accuracy and information transfer rate under short-duration stimuli, showing a significant advantage over the baseline methods. This offers broad prospects for the practical application of BCIs in the IoT field.
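The squeeze-and-excitation step can be sketched as follows. This is a minimal NumPy rendering of a generic SE block, not the authors' code: `w1` and `w2` are stand-ins for learned bottleneck weights, and the squeeze is global average pooling over time, as is standard for SE modules.

```python
import numpy as np

def squeeze_excitation(features, w1, w2):
    # features: (channels, time) feature map from the multiscale CNN.
    # Squeeze: global average pooling over time -> one scalar per channel.
    z = features.mean(axis=1)
    # Excitation: bottleneck MLP (ReLU, then sigmoid) produces a
    # per-channel importance gate in (0, 1).
    s = np.maximum(0.0, w1 @ z)
    gate = 1.0 / (1.0 + np.exp(-(w2 @ s)))
    # Rescale: reweight each channel by its learned importance.
    return features * gate[:, None]
```

Channels the gate deems informative pass through nearly unchanged, while less informative ones are attenuated, which is the feature-importance weighting the abstract refers to.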
| Item Type: | Article |
| --- | --- |
| Uncontrolled Keywords: | multiscale fusion; convolutional neural network (CNN); squeeze-and-excitation module (SEM); brain-computer interface (BCI); steady-state visual evoked potentials (SSVEP) |
| Divisions: | Faculty of Science and Health; Faculty of Science and Health > Computer Science and Electronic Engineering, School of |
| SWORD Depositor: | Unnamed user with email elements@essex.ac.uk |
| Depositing User: | Unnamed user with email elements@essex.ac.uk |
| Date Deposited: | 28 Oct 2024 16:55 |
| Last Modified: | 01 Nov 2024 23:33 |
| URI: | http://repository.essex.ac.uk/id/eprint/39499 |
Available files
Filename: Squeeze and Excitation-Based Multiscale CNN for Classification of Steady-State Visual Evoked Potentials.pdf