Assistive robotic arm control based on brain-machine interface with vision guidance using convolution neural network

Kyung Hwan Shim, Ji Hoon Jeong, Byoung Hee Kwon, Byeong Hoo Lee, Seong Whan Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Citations (Scopus)

Abstract

Brain-machine interfaces (BMIs) provide a new control strategy for both patients and healthy people. An endogenous paradigm such as motor imagery (MI) is commonly used in BMI to detect user intention without external stimuli. However, manipulating a dexterous robotic arm with a limited set of MI commands is a challenging issue. In this paper, we designed a shared robotic arm control system that combines intuitive MI with vision guidance. To carry out the user's intention on the robotic arm, we used arm-reach MI (left, right, and forward), hand-grasp MI, and wrist-twist MI decoded from electroencephalogram (EEG) signals. A Kinect sensor matches the decoded user intention with a detected object based on its location in the workspace. In addition, to decode intuitive MI successfully, we propose a novel convolutional neural network (CNN) based user-intention decoding model. Ten subjects participated in our experiments, and five of them were selected to perform online tasks. The proposed method decoded the various user intentions (five intuitive MI classes and the resting state) with a grand-averaged classification accuracy of 55.91% in offline analysis. To ensure sufficient control in the online shared robotic arm task, the online system was started only when a subject achieved offline performance higher than 60%. For the online drinking tasks, we confirmed an average success rate of 78%. Hence, we confirmed the feasibility of shared robotic arm control based on intuitive BMI and vision guidance with high performance.
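
The abstract gives no implementation details, but the pipeline it describes (a CNN mapping EEG trials to five MI classes plus rest, a 60% offline-accuracy gate before online use, and matching the decoded intention to an object detected by the vision system) can be sketched as follows. This is a minimal illustration in PyTorch; the channel count, window length, layer sizes, and the run_online_session helper are assumptions made for the sketch, not the authors' implementation.

# Minimal sketch of a CNN-based MI decoder and the shared-control gating
# described in the abstract. The paper does not specify the architecture;
# the values below (64 EEG channels, 1000-sample windows, 6 classes) are
# illustrative assumptions.

import torch
import torch.nn as nn

N_CHANNELS = 64      # assumed number of EEG electrodes
N_SAMPLES = 1000     # assumed samples per trial window
N_CLASSES = 6        # five intuitive MI classes + resting state (from the abstract)


class MIDecoderCNN(nn.Module):
    """Shallow temporal/spatial CNN, a common pattern for EEG decoding."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            # temporal convolution along the time axis
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(16),
            nn.ELU(),
            # spatial convolution across all EEG channels
            nn.Conv2d(16, 32, kernel_size=(N_CHANNELS, 1)),
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 15), stride=(1, 15)),
            nn.Dropout(0.5),
        )
        self.classifier = nn.Linear(32 * (N_SAMPLES // 15), N_CLASSES)

    def forward(self, x):
        # x: (batch, 1, channels, samples)
        h = self.features(x)
        return self.classifier(h.flatten(start_dim=1))


def run_online_session(offline_accuracy, decode_fn, detected_objects):
    """Gate the online shared-control stage on offline performance (>60%),
    then map the decoded reach direction to an object located by the
    vision system (a Kinect sensor in the paper)."""
    if offline_accuracy <= 0.60:
        return None                          # subject does not proceed to the online task
    intention = decode_fn()                  # e.g. "left", "right", "forward"
    return detected_objects.get(intention)   # object location in the workspace


if __name__ == "__main__":
    model = MIDecoderCNN()
    dummy_trial = torch.randn(1, 1, N_CHANNELS, N_SAMPLES)
    logits = model(dummy_trial)
    print("class scores:", logits.shape)     # torch.Size([1, 6])

The temporal-then-spatial convolution pattern is a common choice for EEG decoding models; the network actually used in the paper may differ substantially in depth and layer configuration.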

Original language: English
Title of host publication: 2019 IEEE International Conference on Systems, Man and Cybernetics, SMC 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2785-2790
Number of pages: 6
ISBN (Electronic): 9781728145693
DOIs
Publication status: Published - 2019 Oct
Event: 2019 IEEE International Conference on Systems, Man and Cybernetics, SMC 2019 - Bari, Italy
Duration: 2019 Oct 6 - 2019 Oct 9

Publication series

Name: Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics
Volume: 2019-October
ISSN (Print): 1062-922X

Conference

Conference: 2019 IEEE International Conference on Systems, Man and Cybernetics, SMC 2019
Country/Territory: Italy
City: Bari
Period: 19/10/6 - 19/10/9

Bibliographical note

Funding Information:
Research was partly supported by an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (No. 2017-0-00432, Development of Non-Invasive Integrated BCI SW Platform to Control Home Appliances and External Devices by User’s Thought via AR/VR Interface) and partly by an IITP grant funded by the Korea government (No. 2017-0-00451, Development of BCI based Brain and Cognitive Computing Technology for Recognizing User’s Intentions using Deep Learning).

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Control and Systems Engineering
  • Human-Computer Interaction

