TY - GEN
T1 - Automatic gesture recognition for intelligent human-robot interaction
AU - Lee, Seong-Whan
N1 - Copyright:
Copyright 2008 Elsevier B.V., All rights reserved.
PY - 2006
Y1 - 2006
N2 - An intelligent robot requires natural interaction with humans. Visual interpretation of gestures can be useful in accomplishing natural Human-Robot Interaction (HRI). Previous HRI research focused on issues such as hand gesture, sign language, and command gesture recognition. However, automatic recognition of whole-body gestures is required for HRI to operate naturally. This is a challenging problem because describing and modeling meaningful gesture patterns from whole-body motion are complex tasks. This paper presents a new method for simultaneously spotting and recognizing whole-body key gestures on a mobile robot. Our method is used together with other HRI approaches such as speech recognition and face recognition, so both execution speed and recognition performance must be considered. For efficient and natural operation, we used several approaches at each step of gesture recognition: learning and extraction of articulated joint information, representing a gesture as a sequence of clusters, and spotting and recognizing a gesture with a hidden Markov model (HMM). In addition, we constructed a large gesture database, with which we verified our method. As a result, our method was successfully integrated and operated on a mobile robot.
AB - An intelligent robot requires natural interaction with humans. Visual interpretation of gestures can be useful in accomplishing natural Human-Robot Interaction (HRI). Previous HRI research focused on issues such as hand gesture, sign language, and command gesture recognition. However, automatic recognition of whole-body gestures is required for HRI to operate naturally. This is a challenging problem because describing and modeling meaningful gesture patterns from whole-body motion are complex tasks. This paper presents a new method for simultaneously spotting and recognizing whole-body key gestures on a mobile robot. Our method is used together with other HRI approaches such as speech recognition and face recognition, so both execution speed and recognition performance must be considered. For efficient and natural operation, we used several approaches at each step of gesture recognition: learning and extraction of articulated joint information, representing a gesture as a sequence of clusters, and spotting and recognizing a gesture with a hidden Markov model (HMM). In addition, we constructed a large gesture database, with which we verified our method. As a result, our method was successfully integrated and operated on a mobile robot.
UR - http://www.scopus.com/inward/record.url?scp=33750815390&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=33750815390&partnerID=8YFLogxK
U2 - 10.1109/FGR.2006.25
DO - 10.1109/FGR.2006.25
M3 - Conference contribution
AN - SCOPUS:33750815390
SN - 0769525032
SN - 9780769525037
T3 - FGR 2006: Proceedings of the 7th International Conference on Automatic Face and Gesture Recognition
SP - 645
EP - 650
BT - FGR 2006
T2 - FGR 2006: 7th International Conference on Automatic Face and Gesture Recognition
Y2 - 10 April 2006 through 12 April 2006
ER -