Gesture spotting and recognition for human-robot interaction

Hee Deok Yang, A. Yeon Park, Seong Whan Lee

Research output: Contribution to journal › Article › peer-review

117 Citations (Scopus)


Visual interpretation of gestures can be useful in accomplishing natural human-robot interaction (HRI). Previous HRI research has focused on issues such as hand gestures, sign language, and command gesture recognition. For HRI to operate naturally, however, automatic recognition of whole-body gestures is required. This is a challenging problem, because describing and modeling meaningful patterns in whole-body gestures is a complex task. This paper presents a new method for recognizing whole-body key gestures in HRI. A human subject is first described by a set of features encoding the angular relationships between a dozen body parts in 3-D. Each feature vector is then mapped to a codeword of hidden Markov models (HMMs). To spot key gestures accurately, a sophisticated method for designing a transition gesture model is proposed. To reduce the number of states in the transition gesture model, a model-reduction step merges similar states based on data-dependent statistics and relative entropy. The experimental results demonstrate that the proposed method can be efficient and effective in HRI for automatic recognition of whole-body key gestures from motion sequences.
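The pipeline the abstract describes — quantizing per-frame pose features to codewords and scoring the resulting symbol sequence with discrete HMMs — can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the codebook, model parameters, and function names are all assumptions, and real gesture spotting would additionally compare each key-gesture model's score against the transition (non-gesture) model's score.

```python
import numpy as np

def quantize(features, codebook):
    """Map each per-frame feature vector to the index of its nearest
    codeword (simple nearest-centroid vector quantization).

    features: (T, D) array of pose features, one row per frame
    codebook: (K, D) array of codeword centroids
    returns:  (T,) array of codeword indices
    """
    dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
    return dists.argmin(axis=1)

def log_likelihood(symbols, log_pi, log_A, log_B):
    """Forward-algorithm log-likelihood of a codeword sequence under a
    discrete HMM, computed in log space for numerical stability.

    log_pi: (N,)   initial state log-probabilities
    log_A:  (N, N) state transition log-probabilities
    log_B:  (N, K) emission log-probabilities over the K codewords
    """
    alpha = log_pi + log_B[:, symbols[0]]
    for s in symbols[1:]:
        alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[:, s]
    return np.logaddexp.reduce(alpha)
```

In a spotting setting, a window of frames would be labeled as a key gesture when the best key-gesture model's log-likelihood exceeds the transition-gesture model's log-likelihood by some threshold; the state-merging step in the paper keeps that transition model compact.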

Original language: English
Pages (from-to): 256-270
Number of pages: 15
Journal: IEEE Transactions on Robotics
Issue number: 2
Publication status: Published - 2007 Apr


Keywords

  • Gesture spotting
  • Hidden Markov model (HMM)
  • Human-robot interaction (HRI)
  • Mobile robot
  • Transition gesture model
  • Whole-body gesture recognition

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science Applications
  • Electrical and Electronic Engineering


