HapMotion: motion-to-tactile framework with wearable haptic devices for immersive VR performance experience

Kyungeun Jung, Sangpil Kim, Seungjae Oh, Sang Ho Yoon

Research output: Contribution to journal › Article › peer-review

Abstract

We present a novel haptic rendering framework that translates a performer's motions into wearable vibrotactile feedback for an immersive virtual reality (VR) performance experience. We employ a rendering pipeline that extracts meaningful vibrotactile parameters, including intensity and location. We compute these parameters from the performer's upper-body movements, which play a significant role in dance performance. Accordingly, we customize a haptic vest and sleeves to deliver vibrotactile feedback on the front and back of the torso as well as the shoulders. To capture essential movements from the VR performance, we propose a method called motion salient triangle (MST). MST utilizes the movements of key skeleton joints to compute the associated haptic parameters. Our method supports translating both choreographic and communicative motions into vibrotactile feedback. Through a series of user studies, we validate users' preference for our method over conventional motion-to-tactile and audio-to-tactile methods.
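To make the abstract's idea concrete, the following is a minimal, hypothetical sketch of a motion-to-tactile mapping in the spirit of a motion salient triangle: three upper-body joints form a triangle whose centroid drives the feedback location and whose motion speed drives the intensity. The joint choice (head and both wrists), the speed normalization, and the 4×4 torso actuator grid are all assumptions for illustration, not the published MST formulation.

```python
import math

def mst_haptics(head, l_wrist, r_wrist, prev_centroid, dt):
    """Hypothetical motion-salient-triangle mapping (illustration only).

    head, l_wrist, r_wrist, prev_centroid: (x, y, z) positions in meters;
    dt: frame interval in seconds. Returns (intensity, (row, col), centroid).
    """
    # Centroid of the triangle spanned by three upper-body joints,
    # used here as the locus of salient motion (assumption).
    cx = (head[0] + l_wrist[0] + r_wrist[0]) / 3.0
    cy = (head[1] + l_wrist[1] + r_wrist[1]) / 3.0
    cz = (head[2] + l_wrist[2] + r_wrist[2]) / 3.0
    centroid = (cx, cy, cz)

    # Intensity: centroid speed clamped to [0, 1]; the 2 m/s full-scale
    # value is an arbitrary placeholder, not from the paper.
    speed = math.dist(centroid, prev_centroid) / dt
    intensity = min(1.0, speed / 2.0)

    # Location: map the lateral/vertical centroid offset onto an assumed
    # 4x4 actuator grid covering +/-0.5 m around the torso origin.
    col = min(3, max(0, int((cx + 0.5) / 0.25)))
    row = min(3, max(0, int((cy + 0.5) / 0.25)))
    return intensity, (row, col), centroid
```

In practice such a mapping would run per tracking frame, with the previous centroid carried over between frames; the paper's actual parameter extraction should be consulted for the real formulation.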

Original language: English
Article number: 13
Journal: Virtual Reality
Volume: 28
Issue number: 1
DOIs
Publication status: Published - Mar 2024

Bibliographical note

Publisher Copyright:
© 2024, The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature.

Keywords

  • Haptics
  • Performance
  • Virtual reality
  • Wearables

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design
