TY - JOUR
T1 - Learning to Feel Textures
T2 - Predicting Perceptual Similarities From Unconstrained Finger-Surface Interactions
AU - Richardson, Benjamin A.
AU - Vardar, Yasemin
AU - Wallraven, Christian
AU - Kuchenbecker, Katherine J.
N1 - Funding Information:
The work of Christian Wallraven was supported in part by the Institute for Information and Communications Technology Promotion (IITP), funded by the Korea government, under Grants 2019-0-00079 and 2017-0-00451, and in part by the National Research Foundation of Korea under Grant NRF-2017M3C7A1041824. This work was also supported by the German Federal Ministry of Education and Research (BMBF) through the Tübingen AI Center under Grant FKZ 01IS18039B.
Publisher Copyright:
© 2008-2011 IEEE.
PY - 2022/10/1
Y1 - 2022/10/1
N2 - Whenever we touch a surface with our fingers, we perceive distinct tactile properties that arise from the underlying dynamics of the interaction. However, little is known about how the brain aggregates the sensory information from these dynamics to form abstract representations of textures. Earlier studies of surface perception all used general surface descriptors measured under controlled conditions instead of considering the unique dynamics of specific interactions, reducing the comprehensiveness and interpretability of the results. Here, we present an interpretable modeling method that predicts the perceptual similarity of surfaces by comparing probability distributions of features calculated from short time windows of specific physical signals (finger motion, contact force, fingernail acceleration) elicited during unconstrained finger-surface interactions. The results show that our method can predict the similarity judgments of individual participants with a maximum Spearman's correlation of 0.7. Furthermore, we found evidence that different participants weight interaction features differently when judging surface similarity. Our findings provide new perspectives on human texture perception during active touch, and our approach could benefit haptic surface assessment, robotic tactile perception, and haptic rendering.
KW - texture perception
KW - finger-surface interaction
KW - machine learning
KW - predicting human tactile perception
KW - probabilistic representation
UR - http://www.scopus.com/inward/record.url?scp=85139831748&partnerID=8YFLogxK
U2 - 10.1109/TOH.2022.3212701
DO - 10.1109/TOH.2022.3212701
M3 - Article
C2 - 36215359
AN - SCOPUS:85139831748
SN - 1939-1412
VL - 15
SP - 705
EP - 717
JO - IEEE Transactions on Haptics
JF - IEEE Transactions on Haptics
IS - 4
ER -