Intention recognition method for sit-to-stand and stand-to-sit from electromyogram signals for overground lower-limb rehabilitation robots

Sang Hun Chung, Jong Min Lee, Seung-Jong Kim, Yoha Hwang, Jinung An

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

This paper presents a framework for classifying sit-to-stand and stand-to-sit transitions from just two channels of EMG signals taken from the left leg. Our proposed framework uses linear discriminant analysis (LDA) as the classifier and a multi-window feature extraction approach termed Consecutive Time-Windowed Feature Extraction (CTFE). We present preliminary results from two healthy subjects as a proof of concept. For both tested subjects, we obtained predictive accuracies above 90%. The results show promise for a framework capable of recognizing the user's intention to perform sit-to-stand and stand-to-sit transitions. Potential applications include rehabilitation robots for hemiparesis patients and exoskeleton control.
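The paper itself does not include code, but the abstract describes a pipeline of windowed feature extraction from two EMG channels followed by LDA classification. The sketch below illustrates that general idea using scikit-learn; the window count, window length, the choice of mean-absolute-value and waveform-length features, and all function names are illustrative assumptions, not the authors' implementation of CTFE.

```python
# Illustrative sketch only: window length, feature choices, and function names
# are assumptions; the paper does not specify these details.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def ctfe_features(emg, n_windows=4, win_len=200):
    """Concatenate simple time-domain features from consecutive windows.

    emg: array of shape (n_samples, 2) -- two EMG channels from the left leg.
    Returns one feature vector covering n_windows consecutive windows.
    """
    feats = []
    for w in range(n_windows):
        seg = emg[w * win_len:(w + 1) * win_len]
        mav = np.mean(np.abs(seg), axis=0)                   # mean absolute value per channel
        wl = np.sum(np.abs(np.diff(seg, axis=0)), axis=0)    # waveform length per channel
        feats.extend(np.concatenate([mav, wl]))
    return np.asarray(feats)


def train_classifier(segments, labels):
    """Fit an LDA classifier on CTFE feature vectors.

    segments: list of EMG arrays, each shaped (n_samples, 2).
    labels: class per segment, e.g. 0 = sit-to-stand, 1 = stand-to-sit.
    """
    X = np.stack([ctfe_features(s) for s in segments])
    clf = LinearDiscriminantAnalysis()
    clf.fit(X, labels)
    return clf


def predict_intention(clf, new_segment):
    """Classify a new two-channel EMG segment."""
    return clf.predict(ctfe_features(new_segment).reshape(1, -1))[0]
```

The multi-window concatenation is the key point: rather than classifying a single snapshot, features from several consecutive time windows are stacked into one vector so the classifier sees the temporal progression of muscle activity during the transition.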

Original language: English
Title of host publication: AIM 2015 - 2015 IEEE/ASME International Conference on Advanced Intelligent Mechatronics
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 418-421
Number of pages: 4
ISBN (Electronic): 9781467391078
DOIs
Publication status: Published - 2015 Aug 25
Externally published: Yes
Event: IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM 2015 - Busan, Korea, Republic of
Duration: 2015 Jul 7 to 2015 Jul 11

Publication series

Name: IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM
Volume: 2015-August

Other

Other: IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM 2015
Country/Territory: Korea, Republic of
City: Busan
Period: 15/7/7 to 15/7/11

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Computer Science Applications
  • Electrical and Electronic Engineering
