Abstract
Recent advances in brain-computer interface (BCI) techniques have led to increasingly refined interactions between users and external devices. Accurately decoding kinematic information from brain signals is one of the main challenges in controlling human-like robots. In particular, although the forearm of the upper extremity is frequently used for high-level tasks in daily life, only a few studies have addressed decoding of forearm movement. In this study, we focus on classifying forearm movements according to elaborated rotation angles using electroencephalogram (EEG) signals. To this end, we propose a hierarchical flow convolutional neural network (HF-CNN) model for robust classification. We evaluate the proposed model not only on our experimental dataset but also on a public dataset (BNCI Horizon 2020). The grand-average classification accuracies for the three rotation angles were 0.73 (±0.04) for the motor execution (ME) task and 0.65 (±0.09) for the motor imagery (MI) task across ten subjects in our experimental dataset. In the public dataset, the grand-average classification accuracies were 0.52 (±0.03) and 0.51 (±0.04) for the ME and MI tasks, respectively, across fifteen subjects. These results demonstrate the feasibility of decoding complex kinematic information from EEG signals. This study will contribute to the development of a brain-controlled robotic arm system capable of performing high-level tasks.
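The abstract does not specify the HF-CNN architecture, so no attempt is made to reproduce it here. As a rough, hypothetical illustration of the general approach (single-trial EEG classification into three rotation-angle classes with a CNN), the sketch below shows a minimal PyTorch model; the channel count, trial length, and layer sizes are assumptions for illustration only and are not taken from the paper.

```python
import torch
import torch.nn as nn

# Minimal convolutional classifier for single-trial EEG, used as a generic
# stand-in for the paper's HF-CNN (whose exact architecture the abstract
# does not describe). All sizes below are hypothetical.
N_CHANNELS = 64   # EEG electrodes (assumed)
N_SAMPLES = 500   # time samples per trial, e.g. 2 s at 250 Hz (assumed)
N_CLASSES = 3     # three forearm rotation angles

class SimpleEEGNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            # temporal convolution along the time axis
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(16),
            nn.ELU(),
            # spatial convolution across all EEG channels
            nn.Conv2d(16, 32, kernel_size=(N_CHANNELS, 1)),
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
            nn.Dropout(0.5),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (N_SAMPLES // 4), N_CLASSES),
        )

    def forward(self, x):
        # x: (batch, 1, channels, samples)
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = SimpleEEGNet()
    trials = torch.randn(8, 1, N_CHANNELS, N_SAMPLES)  # dummy batch of trials
    logits = model(trials)
    print(logits.shape)  # torch.Size([8, 3]) -> scores for the three angles
```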
| Original language | English |
| --- | --- |
| Article number | 9046799 |
| Pages (from-to) | 66941-66950 |
| Number of pages | 10 |
| Journal | IEEE Access |
| Volume | 8 |
| DOIs | |
| Publication status | Published - 2020 |
Bibliographical note
Funding Information: This work was supported in part by the Institute of Information and Communications Technology Planning and Evaluation (IITP) funded by the Korea Government under Grant 2017-0-00432 (Development of Non-Invasive Integrated BCI SW Platform to Control Home Appliances and External Devices by User’s Thought via AR/VR Interface), in part by the IITP funded by the Korea Government under Grant 2017-0-00451 (Development of BCI based Brain and Cognitive Computing Technology for Recognizing User’s Intentions using Deep Learning), and in part by the IITP funded by the Korea Government under Grant 2019-0-00079 (Department of Artificial Intelligence, Korea University).
Publisher Copyright:
© 2013 IEEE.
Keywords
- Brain-computer interface (BCI)
- convolutional neural network (CNN)
- electroencephalogram (EEG)
- forearm motor execution and motor imagery
ASJC Scopus subject areas
- General Computer Science
- General Materials Science
- General Engineering