In this paper, we introduce "HandyTool," a method and interface for virtual object manipulation based on a metaphorical/structural mapping of everyday tools onto the hand and fingers. The basic idea is to virtually transform the hand/fingers into an appropriate tool (e.g., a fist becoming a hammer head) and to apply it gesturally (e.g., hammering to insert) so as to manipulate the target object (e.g., a nail) directly. The main objective of HandyTool is to enhance the tool-usage experience by having the user (or the user's body part) become the tool itself, thereby also potentially improving task performance. A usability experiment was carried out to assess these projected merits, comparing HandyTool to an as-is emulation of tool usage (i.e., the tracked hand/fingers controlling a separate tool applied to the target object) and to the use of a hand-held controller. The experiment could not demonstrate the full potential of HandyTool, owing to the performance limitations of the current hand/finger tracking sensor and to the simplicity of the structural mapping between the tool and the hand/fingers. Nevertheless, the structural metaphor itself was shown to be helpful when the controller (i.e., stable sensing) was used.
Title of host publication: HCI International 2019 - Posters - 21st International Conference, HCII 2019, Proceedings
Number of pages: 8
Publication status: Published - 2019
Event: 21st International Conference on Human-Computer Interaction, HCI International 2019 - Orlando, United States
Duration: 2019 Jul 26 → 2019 Jul 31
Series name: Communications in Computer and Information Science
Conference: 21st International Conference on Human-Computer Interaction, HCI International 2019
Period: 2019 Jul 26 → 2019 Jul 31
Bibliographical note (Funding Information):
Acknowledgments. This work was partially supported by the Global Frontier R&D Program on <Human-centered Interaction for Coexistence> funded by the National Research Foundation of Korea grant funded by the Korean Government (MEST) (NRF-2015M3A6A3076490), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (NRF-2017M3C1B6070980).
© Springer Nature Switzerland AG 2019.
Keywords
- Interactive learning environments
- Mixed/augmented reality
- Virtual reality
ASJC Scopus subject areas
- Computer Science (all)