Abstract
Few HMD-based virtual environment systems display a rendering of the user's own body. Subjectively, this often leads to a sense of disembodiment in the virtual world. We explore the effect of being able to see one's own body in such systems on an objective measure of the accuracy of one form of space perception. Using an action-based response measure, we found that participants who explored near space while seeing a fully articulated and tracked visual representation of themselves subsequently made more accurate judgments of absolute egocentric distance to locations ranging from 4 m to 6 m away from where they were standing than did participants who saw no avatar. A non-animated avatar also improved distance judgments, but by a lesser amount. Participants who viewed either animated or static avatars positioned 3 m in front of their own position made subsequent distance judgments with accuracy similar to that of participants who viewed the equivalent animated or static avatar positioned at their own location. We discuss the implications of these results for theories of embodied perception in virtual environments.
| Original language | English |
|---|---|
| Pages (from-to) | 230-242 |
| Number of pages | 13 |
| Journal | Presence: Teleoperators and Virtual Environments |
| Volume | 19 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 2010 Jun |
ASJC Scopus subject areas
- Software
- Control and Systems Engineering
- Human-Computer Interaction
- Computer Vision and Pattern Recognition