3D object selection with multimodal feedback in mobile auto-stereoscopic display

Euijai Ahn, Hyunseok Yang, Gerard J. Kim

Research output: Contribution to journal › Article › peer-review


Interacting within the relatively small display volume of a mobile/hand-held auto-stereoscopic display (the 3D "phone" space) can be difficult: accurate tracking of an interaction proxy is hard to achieve, and the user must maintain a fixed viewpoint and adapt to a different level of depth-perception sensitivity. In this work, we first introduce an articulated mechanical stylus with joint sensors for 3D tracking of the interaction point in the phone space. We also investigate ways to assist the user in selecting an object in the phone space through supplementary multimodal feedback, such as sound and tactility. We carried out experiments in two conditions, stationary and moving, comparing the effects of various combinations of multimodal feedback on object selection performance. We found that multimodal feedback was generally and significantly helpful for auto-stereoscopic 3D object selection, particularly so when the user was moving, given the added difficulty of that condition.
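The abstract does not detail the kinematic model of the articulated stylus; as an illustration only, tracking the interaction point from joint sensors amounts to forward kinematics over the stylus linkage. The sketch below is a hypothetical, simplified planar (2D) version with made-up link lengths and angles, not the authors' actual method:

```python
import math

def stylus_tip(base, link_lengths, joint_angles):
    """Hypothetical forward kinematics for a planar articulated stylus.

    base: (x, y) attachment point of the stylus on the device
    link_lengths: length of each rigid segment
    joint_angles: relative joint rotations in radians, as a joint
        sensor might report them
    Returns the (x, y) position of the stylus tip.
    """
    x, y = base
    theta = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle  # accumulate rotations along the chain
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y

# Example: two unit-length links, second joint bent 90 degrees.
tip = stylus_tip((0.0, 0.0), [1.0, 1.0], [0.0, math.pi / 2])
```

A real implementation for the phone space would extend this to 3D with the joint axes and segment lengths of the actual mechanical stylus.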

Original language: English
Pages (from-to): 4647-4663
Number of pages: 17
Journal: International Journal of Innovative Computing, Information and Control
Issue number: 12
Publication status: Published - 2013 Dec


Keywords

  • 3D tracking
  • Auto-stereoscopy
  • Depth perception
  • Mobile interaction
  • Multimodal interaction
  • Selection

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Information Systems
  • Computational Theory and Mathematics


