Human-robot interaction by whole body gesture spotting and recognition

Hee Deok Yang, A. Yeon Park, Seong Whan Lee

Research output: Contribution to journal › Conference article › peer-review

12 Citations (Scopus)


An intelligent robot is required for natural interaction with humans. Visual interpretation of gestures can be useful for achieving natural Human-Robot Interaction (HRI). Previous HRI research has focused on issues such as hand gesture, sign language, and command gesture recognition. For HRI to operate naturally, however, automatic recognition of whole-body gestures is required. This presents a challenging problem, because describing and modeling meaningful gesture patterns from whole-body motion is a complex task. This paper presents a new method for recognizing whole-body key gestures in HRI. A human subject is first described by a set of features encoding the angular relationships between a dozen body parts in 3D. Each feature vector is then mapped to a codeword for the gesture HMMs. To spot key gestures accurately, a sophisticated method for designing a garbage gesture model is proposed: model reduction, which merges similar states based on data-dependent statistics and relative entropy. The proposed method was tested on samples from 20 persons and 200 synthetic data samples. It achieved a reliability rate of 94.8% in the spotting task and a recognition rate of 97.4% on isolated gestures.
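The two core steps the abstract describes, quantizing a pose feature vector to a codeword for discrete-output HMMs and merging similar states via relative entropy when building a garbage model, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the codebook contents, the emission distributions, and the merge threshold are all hypothetical, and the symmetric KL divergence is used as one plausible reading of "relative entropy" between states.

```python
import math

# Hedged sketch (not the paper's code): feature vectors are quantized to the
# nearest codeword before being fed to discrete-output gesture HMMs, and a
# garbage (non-gesture) model is built by merging similar HMM states whose
# emission distributions are close in symmetric relative entropy.

def quantize(feature, codebook):
    """Index of the codeword nearest to `feature` (Euclidean distance)."""
    def dist(codeword):
        return math.sqrt(sum((f - c) ** 2 for f, c in zip(feature, codeword)))
    return min(range(len(codebook)), key=lambda i: dist(codebook[i]))

def symmetric_kl(p, q, eps=1e-12):
    """Symmetric relative entropy KL(p||q) + KL(q||p) of discrete dists."""
    p = [x + eps for x in p]        # smooth to avoid log(0)
    q = [x + eps for x in q]
    sp, sq = sum(p), sum(q)
    p = [x / sp for x in p]         # renormalize after smoothing
    q = [x / sq for x in q]
    return sum(a * math.log(a / b) + b * math.log(b / a)
               for a, b in zip(p, q))

def maybe_merge(p, q, threshold=0.5):
    """Merge two state emission distributions if they are similar enough;
    the threshold is illustrative, not a value from the paper."""
    if symmetric_kl(p, q) >= threshold:
        return None                 # too dissimilar: keep states separate
    merged = [(a + b) / 2.0 for a, b in zip(p, q)]
    total = sum(merged)
    return [x / total for x in merged]
```

In this sketch, merging is an unweighted average of the two emission distributions; a data-dependent variant, as the abstract suggests, would weight each state by its occupancy statistics.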

Original language: English
Article number: 1699955
Pages (from-to): 774-777
Number of pages: 4
Journal: Proceedings - International Conference on Pattern Recognition
Publication status: Published - 2006
Event: 18th International Conference on Pattern Recognition, ICPR 2006 - Hong Kong, China
Duration: 2006 Aug 20 - 2006 Aug 24

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition


