Abstract
In this study, we adopted visual motion imagery, a more intuitive brain-computer interface (BCI) paradigm, to decode intuitive user intention. We developed a 3-dimensional BCI training platform and applied it to help users perform more intuitive imagination in the visual motion imagery experiment. The experimental tasks were selected from movements commonly performed in daily life, such as picking up a phone, opening a door, eating food, and pouring water. Nine subjects participated in our experiment. We present statistical evidence that visual motion imagery is highly correlated with activity in the prefrontal and occipital lobes. In addition, we selected the most appropriate electroencephalography channels using a functional connectivity approach for visual motion imagery decoding and proposed a convolutional neural network architecture for classification. The averaged classification performance of the proposed architecture on 4 classes from 16 channels was 67.50 (±1.52)% across all subjects. This result is encouraging, and it shows the possibility of developing a BCI-based device control system for practical applications such as neuroprostheses and robotic arms.
Original language | English |
---|---|
Title of host publication | 2020 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2020 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 2966-2971 |
Number of pages | 6 |
ISBN (Electronic) | 9781728185262 |
DOIs | |
Publication status | Published - 2020 Oct 11 |
Event | 2020 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2020 - Toronto, Canada Duration: 2020 Oct 11 → 2020 Oct 14 |
Publication series
Name | Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics |
---|---|
Volume | 2020-October |
ISSN (Print) | 1062-922X |
Conference
Conference | 2020 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2020 |
---|---|
Country/Territory | Canada |
City | Toronto |
Period | 20/10/11 → 20/10/14 |
Bibliographical note
Funding Information: This work was partly supported by Institute of Information & Communications Technology Planning & Evaluation (IITP) grants funded by the Korea government (MSIT) (No. 2017-0-00432, Development of Non-Invasive Integrated BCI SW Platform to Control Home Appliances and External Devices by User’s Thought via AR/VR Interface; No. 2017-0-00451, Development of BCI based Brain and Cognitive Computing Technology for Recognizing User’s Intentions using Deep Learning; No. 2019-0-00079, Artificial Intelligence Graduate School Program (Korea University)).
Publisher Copyright:
© 2020 IEEE.
Keywords
- brain-computer interface
- electroencephalography
- functional connectivity
- visual motion imagery
ASJC Scopus subject areas
- Electrical and Electronic Engineering
- Control and Systems Engineering
- Human-Computer Interaction