Brain-machine interfaces (BMIs) provide a new control strategy for both patients and healthy people. Endogenous paradigms such as motor imagery (MI) are commonly used in BMIs to detect user intention without external stimuli. However, manipulating a dexterous robotic arm with a limited set of MI commands remains a challenging issue. In this paper, we design a shared robotic arm control system that combines intuitive MI with vision guidance. To convey the user's intention to the robotic arm, we use arm-reach MI (left, right, and forward), hand-grasp MI, and wrist-twist MI decoded from electroencephalogram (EEG) signals. A Kinect sensor matches the decoded user intention with the detected object based on its location in the workspace. In addition, to decode intuitive MI successfully, we propose a novel convolutional neural network (CNN)-based user intention decoding model. Ten subjects participated in our experiments, and five of them were selected to perform the online tasks. The proposed method decoded the various user intentions (five intuitive MI classes and the resting state) with a grand-averaged classification accuracy of 55.91% in offline analysis. To ensure sufficient control quality, the online shared robotic arm control system was started only when a subject achieved an accuracy above 60% in the offline analysis. For the online drinking tasks, we confirmed an average success rate of 78%. Hence, we confirmed the feasibility of shared robotic arm control based on an intuitive BMI and vision guidance with high performance.
|Title of host publication||2019 IEEE International Conference on Systems, Man and Cybernetics, SMC 2019|
|Publisher||Institute of Electrical and Electronics Engineers Inc.|
|Number of pages||6|
|Publication status||Published - 2019 Oct|
|Event||2019 IEEE International Conference on Systems, Man and Cybernetics, SMC 2019 - Bari, Italy|
Duration: 2019 Oct 6 → 2019 Oct 9
|Name||Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics|
|Conference||2019 IEEE International Conference on Systems, Man and Cybernetics, SMC 2019|
|Period||19/10/6 → 19/10/9|
Bibliographical note
Funding Information:
Research was partly supported by an Institute of Information & Communication Technology Planning & Evaluation (IITP) grant funded by the Korea government (No. 2017-0-00432, Development of Non-Invasive Integrated BCI SW Platform to Control Home Appliances and External Devices by User's Thought via AR/VR Interface) and partly funded by an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (No. 2017-0-00451, Development of BCI based Brain and Cognitive Computing Technology for Recognizing User's Intentions using Deep Learning).
ASJC Scopus subject areas
- Electrical and Electronic Engineering
- Control and Systems Engineering
- Human-Computer Interaction