Abstract
Interacting within the relatively small volume of a mobile, hand-held auto-stereoscopic display (the 3D "phone" space) can be difficult: there is no accurate tracking of an interaction proxy, the user must maintain a fixed viewpoint, and the user must adapt to a different level of depth-perception sensitivity. In this work, we first introduce an articulated mechanical stylus whose joint sensors enable 3D tracking of the interaction point in the phone space. We also investigate assisting the user in selecting objects in the phone space through supplementary multimodal feedback, such as sound and tactile cues. We carried out experiments under two conditions, stationary and moving, comparing the effects of various combinations of multimodal feedback on object-selection performance. We found that multimodal feedback was generally significantly helpful for auto-stereoscopic 3D object selection, and even more so when the user was moving, considering the added difficulty of interacting while in motion.
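The abstract does not detail the stylus geometry, sensor interface, or feedback thresholds, so the following is only a minimal sketch of the idea: joint-sensor angles of an articulated stylus are converted to a 3D interaction point via forward kinematics, and supplementary feedback is triggered when that point comes close to a target in the phone space. The link lengths, joint layout, and threshold below are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Hypothetical stylus geometry: a base yaw joint followed by three pitch joints,
# with assumed link lengths in meters. The actual device parameters are not given
# in the paper.
LINK_LENGTHS = [0.06, 0.05, 0.04]

def stylus_tip_position(yaw, joint_angles, link_lengths=LINK_LENGTHS):
    """Forward kinematics: map joint-sensor angles to the stylus tip position."""
    reach, height = 0.0, 0.0   # position in the arm's vertical plane
    cumulative = 0.0           # accumulated pitch along the kinematic chain
    for angle, length in zip(joint_angles, link_lengths):
        cumulative += angle
        reach += length * np.cos(cumulative)
        height += length * np.sin(cumulative)
    # Rotate the planar reach by the base yaw to get x/y; keep height as z.
    return np.array([reach * np.cos(yaw), reach * np.sin(yaw), height])

def selection_feedback(tip, target, threshold=0.01):
    """Trigger supplementary feedback (sound/tactile) when the tracked tip is
    within an assumed threshold distance of the target object."""
    distance = np.linalg.norm(tip - target)
    return distance <= threshold  # caller would play a sound / fire a vibration

# Example: read (hypothetical) joint-sensor angles and test proximity to a target.
tip = stylus_tip_position(yaw=0.2, joint_angles=[0.3, -0.1, 0.4])
print(tip, selection_feedback(tip, target=np.array([0.10, 0.02, 0.03])))
```

In this sketch the feedback decision is a simple distance test; the paper's experiments compare combinations of feedback modalities (e.g., sound and tactility) rather than prescribing a particular triggering rule.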
Original language | English |
---|---|
Pages (from-to) | 4647-4663 |
Number of pages | 17 |
Journal | International Journal of Innovative Computing, Information and Control |
Volume | 9 |
Issue number | 12 |
Publication status | Published - 2013 Dec |
Keywords
- 3D tracking
- Auto-stereoscopy
- Depth perception
- Mobile interaction
- Multimodal interaction
- Selection
ASJC Scopus subject areas
- Software
- Theoretical Computer Science
- Information Systems
- Computational Theory and Mathematics