Abstract
In recent years, practical myoelectric interfaces based on electromyography (EMG) have been developed to improve the quality of daily life for people with physical disabilities. For these interfaces, accurately decoding a user's movement intention is essential for properly controlling external devices. However, improving decoding performance is difficult because EMG signal patterns vary widely within a single user (intra-user variability). This paper therefore proposes a novel subject-transfer framework for decoding hand movements that is robust to intra-user variability. In the proposed framework, supportive convolutional neural network (CNN) classifiers, pre-trained on the EMG data of several subjects, are selected and fine-tuned for the target subject via single-trial analysis. The target subject's hand movements are then classified by voting over the outputs of the supportive CNN classifiers. The feasibility of the proposed framework is validated on NinaPro databases 2 and 3, which comprise 49 hand movements performed by 40 healthy and 11 amputee subjects, respectively. The experimental results indicate that, compared to a self-decoding framework that uses only the target subject's data, the proposed framework decodes hand movements with improved performance in both healthy and amputee subjects. These results suggest that the proposed subject-transfer framework is a useful tool for EMG-based practical myoelectric interfaces that control external devices.
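The final decoding step described above, combining the outputs of the fine-tuned supportive classifiers by voting, can be sketched as follows. This is a minimal illustration of hard-label majority voting, not the paper's actual implementation; the function name `majority_vote` and the example prediction matrix are assumptions for demonstration only.

```python
import numpy as np

def majority_vote(predictions: np.ndarray) -> np.ndarray:
    """Combine classifier outputs by majority vote.

    predictions: (n_classifiers, n_trials) array of integer class labels,
    one row per supportive classifier (hypothetical hard-label setup).
    Returns the most-voted class label for each trial.
    """
    n_classes = predictions.max() + 1
    n_trials = predictions.shape[1]
    counts = np.zeros((n_classes, n_trials), dtype=int)
    for clf_preds in predictions:
        # Tally one vote per trial for this classifier's predicted class.
        counts[clf_preds, np.arange(n_trials)] += 1
    return counts.argmax(axis=0)

# Three supportive classifiers voting on four trials (illustrative data):
preds = np.array([[0, 2, 1, 1],
                  [0, 2, 2, 1],
                  [3, 2, 1, 0]])
print(majority_vote(preds))  # → [0 2 1 1]
```

In practice, averaging the classifiers' softmax probabilities before taking the argmax is a common alternative to hard-label voting; the choice depends on how the ensemble outputs are exposed.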
| Original language | English |
|---|---|
| Article number | 8865669 |
| Pages (from-to) | 94-103 |
| Number of pages | 10 |
| Journal | IEEE Transactions on Neural Systems and Rehabilitation Engineering |
| Volume | 28 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 2020 Jan |
Bibliographical note
Funding Information:Manuscript received November 7, 2018; revised April 12, 2019, August 22, 2019, and October 1, 2019; accepted October 4, 2019. Date of publication October 11, 2019; date of current version January 8, 2020. This work was supported in part by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea Government (Development of Non-Invasive Integrated BCI SW Platform to Control Home Appliances and External Devices by User’s Thought via AR/VR Interface) under Grant 2017-0-00432 and in part by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea Government (Development of BCI based Brain and Cognitive Computing Technology for Recognizing User’s Intentions using Deep Learning) under Grant 2017-0-00451. (Corresponding author: Seong-Whan Lee.) K.-T. Kim and S.-W. Lee are with the Department of Brain and Cognitive Engineering, Korea University, Seoul 02841, South Korea (e-mail: [email protected]; [email protected]).
Publisher Copyright:
© 2001-2011 IEEE.
Keywords
- subject-transfer framework
- convolutional neural networks
- electromyography
- myoelectric interfaces
ASJC Scopus subject areas
- Internal Medicine
- General Neuroscience
- Biomedical Engineering