TY - GEN
T1 - Motor Imagery Classification Based on CNN-GRU Network with Spatio-Temporal Feature Representation
AU - Bang, Ji Seon
AU - Lee, Seong Whan
N1 - Funding Information:
This work was partly supported by Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2015-0-00185, Development of Intelligent Pattern Recognition Softwares for Ambulatory Brain Computer Interface, No. 2017-0-00451, Development of BCI based Brain and Cognitive Computing Technology for Recognizing User’s Intentions using Deep Learning, No. 2019-0-00079, Artificial Intelligence Graduate School Program (Korea University)).
Publisher Copyright:
© 2022, Springer Nature Switzerland AG.
PY - 2022
Y1 - 2022
N2 - Recently, various deep neural networks have been applied to classify electroencephalogram (EEG) signals. EEG is a brain signal that can be acquired non-invasively and has a high temporal resolution. It can be used to decode the intentions of users. As the EEG signal has a high-dimensional feature space, appropriate feature extraction methods are needed to improve classification performance. In this study, we obtained a spatio-temporal feature representation and classified it with a combined convolutional neural network (CNN)-gated recurrent unit (GRU) model. To this end, we obtained covariance matrices for each temporal band and then concatenated them along the temporal axis to obtain the final spatio-temporal feature representation. In the classification model, the CNN is responsible for spatial feature extraction and the GRU for temporal feature extraction. Classification performance was improved by separating spatial and temporal data processing. The average accuracy of the proposed model was 77.70% (±15.39) on the BCI competition IV_2a data set. The proposed method outperformed all of the baseline methods used for comparison.
AB - Recently, various deep neural networks have been applied to classify electroencephalogram (EEG) signals. EEG is a brain signal that can be acquired non-invasively and has a high temporal resolution. It can be used to decode the intentions of users. As the EEG signal has a high-dimensional feature space, appropriate feature extraction methods are needed to improve classification performance. In this study, we obtained a spatio-temporal feature representation and classified it with a combined convolutional neural network (CNN)-gated recurrent unit (GRU) model. To this end, we obtained covariance matrices for each temporal band and then concatenated them along the temporal axis to obtain the final spatio-temporal feature representation. In the classification model, the CNN is responsible for spatial feature extraction and the GRU for temporal feature extraction. Classification performance was improved by separating spatial and temporal data processing. The average accuracy of the proposed model was 77.70% (±15.39) on the BCI competition IV_2a data set. The proposed method outperformed all of the baseline methods used for comparison.
KW - Brain-computer interface (BCI)
KW - Convolutional neural network (CNN)
KW - Electroencephalography (EEG)
KW - Gated recurrent unit (GRU)
KW - Motor imagery (MI)
UR - http://www.scopus.com/inward/record.url?scp=85130395433&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-02375-0_8
DO - 10.1007/978-3-031-02375-0_8
M3 - Conference contribution
AN - SCOPUS:85130395433
SN - 9783031023743
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 104
EP - 115
BT - Pattern Recognition - 6th Asian Conference, ACPR 2021, Revised Selected Papers
A2 - Wallraven, Christian
A2 - Liu, Qingshan
A2 - Nagahara, Hajime
PB - Springer Science and Business Media Deutschland GmbH
T2 - 6th Asian Conference on Pattern Recognition, ACPR 2021
Y2 - 9 November 2021 through 12 November 2021
ER -