Abstract
Brain-machine interfaces (BMIs) can be used to decode brain activity into commands for controlling external devices. This paper presents the decoding of intuitive upper extremity imagery for multi-directional arm reaching tasks in three-dimensional (3D) environments. We designed and implemented an experimental environment in which electroencephalogram (EEG) signals can be acquired during both movement execution and motor imagery. Fifteen subjects participated in our experiments. We proposed a multi-directional convolutional neural network-bidirectional long short-term memory network (MDCBN)-based deep learning framework. Decoding performance for six directions in 3D space was measured by the correlation coefficient (CC) and the normalized root mean square error (NRMSE) between predicted and baseline velocity profiles. The grand-averaged CCs across the six directions were 0.47 and 0.45 for the execution and imagery sessions, respectively, across all subjects. The NRMSE values were below 0.2 for both sessions. Furthermore, the proposed MDCBN was evaluated in two online experiments for real-time robotic arm control, and the grand-averaged success rates were approximately 0.60 (±0.14) and 0.43 (±0.09), respectively. Hence, we demonstrate the feasibility of intuitive robotic arm control based on EEG signals in real-world environments.
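The abstract describes a CNN-BiLSTM regression decoder that maps EEG segments to velocity profiles and evaluates predictions with CC and NRMSE. The sketch below is a minimal, hypothetical PyTorch illustration of such a pipeline, not the authors' implementation: the class name `CNNBiLSTMDecoder`, layer sizes, channel count (64), and window length (250 samples) are assumptions made only for the example.

```python
# A minimal, hypothetical sketch of a CNN-BiLSTM velocity decoder in PyTorch.
# All hyperparameters are illustrative; none are taken from the paper.
import torch
import torch.nn as nn


class CNNBiLSTMDecoder(nn.Module):
    """Maps an EEG window (channels x time) to a velocity sample (vx, vy, vz)."""

    def __init__(self, n_channels=64, n_filters=32, hidden_size=64, n_outputs=3):
        super().__init__()
        # Temporal convolution with spatial (channel) mixing, then downsampling.
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, n_filters, kernel_size=11, padding=5),
            nn.BatchNorm1d(n_filters),
            nn.ELU(),
            nn.AvgPool1d(kernel_size=4),
        )
        # Bidirectional LSTM over the downsampled feature sequence.
        self.lstm = nn.LSTM(
            input_size=n_filters,
            hidden_size=hidden_size,
            batch_first=True,
            bidirectional=True,
        )
        self.head = nn.Linear(2 * hidden_size, n_outputs)

    def forward(self, x):
        # x: (batch, channels, time)
        feats = self.conv(x)               # (batch, filters, time')
        feats = feats.permute(0, 2, 1)     # (batch, time', filters)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1, :])    # velocity estimate at window end


def correlation_coefficient(pred, target):
    """Pearson correlation between predicted and baseline velocity profiles."""
    pred = pred - pred.mean()
    target = target - target.mean()
    return (pred * target).sum() / (pred.norm() * target.norm() + 1e-8)


def nrmse(pred, target):
    """Root mean square error normalized by the target's range."""
    rmse = torch.sqrt(torch.mean((pred - target) ** 2))
    return rmse / (target.max() - target.min() + 1e-8)


if __name__ == "__main__":
    model = CNNBiLSTMDecoder()
    eeg = torch.randn(8, 64, 250)   # e.g. 8 windows, 64 channels, 1 s at 250 Hz
    vel = model(eeg)                # (8, 3) predicted velocity samples
    print(vel.shape)
```

In a setup like this, the decoder would be trained to regress velocity samples (for example with a mean-squared-error loss), and CC and NRMSE would be computed per direction on held-out trials, matching the evaluation reported in the abstract.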
| Original language | English |
|---|---|
| Article number | 9040397 |
| Pages (from-to) | 1226-1238 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Neural Systems and Rehabilitation Engineering |
| Volume | 28 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - 2020 May |
Bibliographical note
Funding Information: Manuscript received July 11, 2019; revised November 19, 2019 and February 19, 2020; accepted March 12, 2020. Date of publication March 18, 2020; date of current version May 8, 2020. This work was supported in part by the Institute of Information & Communications Technology Planning & Evaluation (IITP) funded by the Korea Government through the Development of Non-Invasive Integrated BCI SW Platform to Control Home Appliances and External Devices by User's Thought via AR/VR Interface under Grant 2017-0-00432, in part by the IITP funded by the Korea Government through the Development of BCI based Brain and Cognitive Computing Technology for Recognizing User's Intentions using Deep Learning under Grant 2017-0-00451, and in part by the IITP funded by the Korea Government through the Department of Artificial Intelligence, Korea University, under Grant 2019-0-00079. (Corresponding author: Seong-Whan Lee.) Ji-Hoon Jeong, Kyung-Hwan Shim, and Dong-Joo Kim are with the Department of Brain and Cognitive Engineering, Korea University, Seoul 02841, South Korea (e-mail: jh_jeong@korea.ac.kr; kh_shim@korea.ac.kr; dongjookim@korea.ac.kr).
Publisher Copyright:
© 2001-2011 IEEE.
Keywords
- Brain-machine interface (BMI)
- deep learning
- electroencephalogram (EEG)
- intuitive robotic arm control
- motor imagery
ASJC Scopus subject areas
- Internal Medicine
- Neuroscience (all)
- Biomedical Engineering
- Rehabilitation