We present a mobile system for tracking the gaze of an observer in real time as they move around freely and interact with a wall-sized display. The system combines a head-mounted eye tracker with a motion-capture system that tracks markers attached to the eye tracker. Our open-source software library, libGaze, provides routines for calibrating the system and computing the viewer's position and gaze direction in real time. The system's modular architecture allows each main component to be replaced with alternative technology. We use the system to perform a psychophysical user study designed to measure how users visually explore large displays. We find that observers use head movements during gaze shifts, even when the shifts are well within the range that can be comfortably reached by eye movements alone. This suggests that free movement is important in normal gaze behaviour, motivating further applications in which the tracked user is free to move.
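The core computation such a system performs is a coordinate-frame composition: the motion-capture system supplies the head pose (position and orientation in world coordinates), the eye tracker supplies a gaze direction relative to the head, and the two combine into a world-space gaze ray. The sketch below illustrates this composition only; the function names are hypothetical and do not reflect libGaze's actual API.

```python
import math

def mat_vec(R, v):
    """Multiply a 3x3 rotation matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def rot_z(theta):
    """Rotation matrix for an angle theta (radians) about the z axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def world_gaze_ray(head_pos, head_rot, gaze_in_head):
    """Combine the tracked head pose with the head-relative gaze
    direction into a world-space gaze ray: the ray's origin is the
    head position and its direction is the head rotation applied to
    the eye-in-head gaze vector."""
    return head_pos, mat_vec(head_rot, gaze_in_head)

# Head at (1.0, 0.0, 1.7) m, rotated 90 degrees about the vertical
# axis, with the eyes looking straight ahead in head coordinates.
origin, direction = world_gaze_ray(
    [1.0, 0.0, 1.7], rot_z(math.pi / 2), [1.0, 0.0, 0.0])
```

Intersecting the resulting ray with the plane of the wall-sized display then yields the on-screen point of regard.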