Abstract
We present a novel haptic rendering framework that translates a performer's motions into wearable vibrotactile feedback for an immersive virtual reality (VR) performance experience. Our rendering pipeline extracts meaningful vibrotactile parameters, including intensity and location, from the performer's upper-body movements, which play a significant role in dance performance. Accordingly, we customize a haptic vest and sleeves that deliver vibrotactile feedback to the front and back of the torso as well as the shoulders. To capture essential movements in the VR performance, we propose a method called the motion salient triangle (MST), which uses the movements of key skeleton joints to compute the associated haptic parameters. Our method translates both choreographic and communicative motions into vibrotactile feedback. Through a series of user studies, we show that users prefer our method over conventional motion-to-tactile and audio-to-tactile methods.
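The abstract describes MST only at a high level: a triangle spanned by key upper-body skeleton joints whose motion is mapped to vibrotactile intensity and location. The sketch below is a hypothetical illustration of that idea, not the paper's actual formulation; the choice of joints (both hands and the head), the area-rate intensity mapping, the torso actuator grid, and the function name `mst_haptic_params` are assumptions introduced here for illustration only.

```python
# Illustrative sketch only: the MST formulation is defined in the paper, not in
# this abstract. Joint choices, the intensity mapping, and the actuator grid
# below are hypothetical assumptions.
import numpy as np

def mst_haptic_params(left_hand, right_hand, head, prev_area, dt,
                      actuator_grid=(4, 3)):
    """Map an upper-body "salient triangle" to vibrotactile intensity and location.

    left_hand, right_hand, head: 3D joint positions, np.ndarray of shape (3,).
    prev_area: triangle area from the previous frame (for a rate-of-change term).
    dt: frame time step in seconds.
    actuator_grid: (rows, cols) layout of torso actuators (assumed).
    """
    # Triangle spanned by three key upper-body joints (assumed joint choice).
    v1, v2 = right_hand - head, left_hand - head
    area = 0.5 * np.linalg.norm(np.cross(v1, v2))

    # Intensity: normalized rate of change of the triangle area (assumed mapping).
    intensity = float(np.clip(abs(area - prev_area) / (dt + 1e-6) / 5.0, 0.0, 1.0))

    # Location: project the triangle centroid onto the torso actuator grid
    # (assumed normalization of the torso-relative workspace).
    centroid = (left_hand + right_hand + head) / 3.0
    rows, cols = actuator_grid
    x_norm = np.clip(centroid[0] + 0.5, 0.0, 1.0)   # assume x-range ~ [-0.5, 0.5] m
    y_norm = np.clip(1.6 - centroid[1], 0.0, 1.0)   # assume height range ~ [0.6, 1.6] m
    col = min(int(x_norm * cols), cols - 1)
    row = min(int(y_norm * rows), rows - 1)
    return intensity, (row, col), area

# Example frame: a fast right-hand raise yields high intensity near the shoulder row.
lh = np.array([-0.3, 1.0, 0.2])
rh = np.array([0.3, 1.5, 0.2])
hd = np.array([0.0, 1.6, 0.0])
intensity, cell, area = mst_haptic_params(lh, rh, hd, prev_area=0.05, dt=1 / 90)
```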
| Original language | English |
| --- | --- |
| Article number | 13 |
| Journal | Virtual Reality |
| Volume | 28 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 2024 Mar |
Bibliographical note
Publisher Copyright: © 2024, The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature.
Keywords
- Haptics
- Performance
- Virtual reality
- Wearables
ASJC Scopus subject areas
- Software
- Human-Computer Interaction
- Computer Graphics and Computer-Aided Design