Abstract
This paper introduces a face robot named 'Buddy' that can perform facial motions such as eye tracking and lip synchronization, as well as emotional expressions, by moving its facial elements (i.e., eyeballs, eyebrows, eyelids, and lips). Buddy has 14 degrees of freedom. To produce realistic motion, we built a 'Reactive Behavior Decision Model' that decides not only the rotation angles and speeds of the facial elements but also which emotions to exhibit, so that the robot's personality is expressed. In this model, Buddy's personality is formed by the accumulation of external stimuli and internal status. The process of automatically generating reactive behavior consists of three steps: (1) analyze the external stimuli and identify the resulting changes in Buddy's internal status; (2) decide the type and degree of emotion based on the robot's personality; and (3) generate specific facial expressions and gestures by combining appropriate primitive behaviors chosen from emotion databases. Using this model, we have demonstrated that Buddy can display various facial expressions and behaviors that are at times very reasonable yet quite unexpected.
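The three-step process outlined in the abstract can be sketched as a simple decision pipeline. The following Python sketch is purely illustrative: all class and method names (e.g., `ReactiveBehaviorDecisionModel`, `decide_emotion`, the stimulus keys, and the numeric weights) are assumptions for exposition, not the authors' implementation; it only renders the stimulus-to-status, status-to-emotion, and emotion-to-behavior steps described above.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# All names and constants below are hypothetical; this is a minimal sketch
# of the three-step pipeline described in the abstract, not the paper's code.

@dataclass
class PrimitiveBehavior:
    """One facial-element motion: target rotation angle and speed for a single DOF."""
    element: str          # e.g. "eyebrow_left", "eyelid_right", "lip_corner_left"
    angle_deg: float      # target rotation angle
    speed_dps: float      # rotation speed, degrees per second

@dataclass
class ReactiveBehaviorDecisionModel:
    # Personality accumulates from the history of external stimuli and internal status.
    personality: Dict[str, float] = field(
        default_factory=lambda: {"extraversion": 0.5})
    internal_status: Dict[str, float] = field(
        default_factory=lambda: {"arousal": 0.0, "valence": 0.0})
    # Emotion database: emotion type -> primitive behaviors to combine.
    emotion_db: Dict[str, List[PrimitiveBehavior]] = field(default_factory=dict)

    # Step 1: analyze external stimuli and update the internal status.
    def update_internal_status(self, stimuli: Dict[str, float]) -> None:
        self.internal_status["arousal"] += stimuli.get("sound_level", 0.0)
        self.internal_status["valence"] += stimuli.get("face_detected", 0.0)
        # Personality slowly accumulates the effect of repeated stimuli.
        self.personality["extraversion"] += 0.01 * stimuli.get("face_detected", 0.0)

    # Step 2: decide the type and degree of emotion from status and personality.
    def decide_emotion(self) -> Tuple[str, float]:
        valence = self.internal_status["valence"]
        arousal = self.internal_status["arousal"]
        emotion = "happiness" if valence >= 0 else "sadness"
        degree = min(1.0, (abs(valence) + arousal) * (0.5 + self.personality["extraversion"]))
        return emotion, degree

    # Step 3: combine primitive behaviors from the emotion database into a
    # concrete facial expression, scaled by the decided emotion degree.
    def generate_expression(self, emotion: str, degree: float) -> List[PrimitiveBehavior]:
        primitives = self.emotion_db.get(emotion, [])
        return [PrimitiveBehavior(p.element, p.angle_deg * degree, p.speed_dps)
                for p in primitives]

    def react(self, stimuli: Dict[str, float]) -> List[PrimitiveBehavior]:
        self.update_internal_status(stimuli)               # step 1
        emotion, degree = self.decide_emotion()            # step 2
        return self.generate_expression(emotion, degree)   # step 3


if __name__ == "__main__":
    model = ReactiveBehaviorDecisionModel(emotion_db={
        "happiness": [PrimitiveBehavior("eyebrow_left", 15.0, 30.0),
                      PrimitiveBehavior("lip_corner_left", 10.0, 20.0)],
    })
    # A detected face with moderate sound yields a scaled "happiness" expression.
    for command in model.react({"face_detected": 1.0, "sound_level": 0.3}):
        print(command)
```

The sketch keeps the same separation the abstract describes: stimuli change the internal status and gradually shape personality, personality and status jointly select an emotion and its intensity, and the intensity scales primitive behaviors drawn from a per-emotion database before they are combined into motor commands.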
Original language | English |
---|---|
Pages (from-to) | 769-774 |
Number of pages | 6 |
Journal | Journal of Mechanical Science and Technology |
Volume | 24 |
Issue number | 3 |
DOIs | |
Publication status | Published - 2010 Mar |
Externally published | Yes |
Bibliographical note
Funding Information: This work was carried out as part of the Intelligent Robotics Development Program, one of the 21st Century Frontier R&D Programs funded by the Ministry of Knowledge Economy of Korea.
Copyright: 2010 Elsevier B.V. All rights reserved.
Keywords
- Emotional expression
- Face robot
- Human-robot interaction (HRI)
- Intelligent robot
ASJC Scopus subject areas
- Mechanics of Materials
- Mechanical Engineering