TY - GEN
T1 - Automatic physiognomic analysis by classifying facial component features
AU - Yang, Hee Deok
AU - Lee, Seong Whan
PY - 2006
Y1 - 2006
N2 - This paper presents a method for generating physiognomic information from facial images by analyzing the features of facial components. The physical personality of the face can be modeled as a combination of facial feature components. The facial region is detected in an input image in order to analyze the various facial feature components. The gender of the subject is then classified, and the facial components are extracted. The Active Appearance Model (AAM) is used to extract facial feature points. From these feature points, 16 measures are computed to assign each facial component to defined classes, such as large eye or small mouth. After classifying the facial components according to each classification criterion and the subject's gender, physiognomic information is generated by combining the classified results of all criteria. The proposed method was tested on samples from 200 persons and achieved a classification rate of 85.5% over all facial component features.
AB - This paper presents a method for generating physiognomic information from facial images by analyzing the features of facial components. The physical personality of the face can be modeled as a combination of facial feature components. The facial region is detected in an input image in order to analyze the various facial feature components. The gender of the subject is then classified, and the facial components are extracted. The Active Appearance Model (AAM) is used to extract facial feature points. From these feature points, 16 measures are computed to assign each facial component to defined classes, such as large eye or small mouth. After classifying the facial components according to each classification criterion and the subject's gender, physiognomic information is generated by combining the classified results of all criteria. The proposed method was tested on samples from 200 persons and achieved a classification rate of 85.5% over all facial component features.
UR - http://www.scopus.com/inward/record.url?scp=34047223598&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=34047223598&partnerID=8YFLogxK
U2 - 10.1109/ICPR.2006.1196
DO - 10.1109/ICPR.2006.1196
M3 - Conference contribution
AN - SCOPUS:34047223598
SN - 0769525210
SN - 9780769525211
T3 - Proceedings - International Conference on Pattern Recognition
SP - 1212
EP - 1215
BT - Proceedings - 18th International Conference on Pattern Recognition, ICPR 2006
T2 - 18th International Conference on Pattern Recognition, ICPR 2006
Y2 - 20 August 2006 through 24 August 2006
ER -