Learning View Graphs for Robot Navigation

Matthias O. Franz, Bernhard Schölkopf, Hanspeter A. Mallot, Heinrich H. Bülthoff

Research output: Contribution to journal › Article › peer-review

155 Citations (Scopus)


We present a purely vision-based scheme for learning a topological representation of an open environment. The system represents selected places by local views of the surrounding scene, and finds traversable paths between them. The set of recorded views and their connections are combined into a graph model of the environment. To navigate between views connected in the graph, we employ a homing strategy inspired by findings of insect ethology. In robot experiments, we demonstrate that complex visual exploration and navigation tasks can thus be performed without using metric information.
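The core idea of the abstract can be illustrated with a minimal sketch of such a topological view graph: places are stored as snapshot views, edges mark connections verified by visual homing, and routes are found by graph search rather than metric planning. The class and method names below are assumptions for illustration, not the authors' implementation.

```python
from collections import deque

class ViewGraph:
    """Illustrative sketch of a topological view graph: nodes store
    local views of places, edges mark traversable connections.
    (Hypothetical structure, not the paper's actual code.)"""

    def __init__(self):
        self.views = {}   # view_id -> stored snapshot (e.g. image features)
        self.edges = {}   # view_id -> set of connected view_ids

    def add_view(self, view_id, snapshot):
        self.views[view_id] = snapshot
        self.edges.setdefault(view_id, set())

    def connect(self, a, b):
        # A successful homing run between two views marks the edge traversable.
        self.edges[a].add(b)
        self.edges[b].add(a)

    def route(self, start, goal):
        # Breadth-first search yields a sequence of intermediate views;
        # each hop would then be executed by visual homing, with no
        # metric information required.
        queue, parent = deque([start]), {start: None}
        while queue:
            v = queue.popleft()
            if v == goal:
                path = []
                while v is not None:
                    path.append(v)
                    v = parent[v]
                return path[::-1]
            for w in self.edges[v]:
                if w not in parent:
                    parent[w] = v
                    queue.append(w)
        return None
```

For example, after recording views A, B, and C and connecting A–B and B–C during exploration, `route("A", "C")` returns the view sequence `["A", "B", "C"]`, which the robot would traverse hop by hop via homing.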

Original language: English
Pages (from-to): 111-125
Number of pages: 15
Journal: Autonomous Robots
Issue number: 1
Publication status: Published - 1998

Bibliographical note

Funding Information:
The present work has profited from discussions and technical support by Philipp Georg, Susanne Huber, and Titus Neumann. We thank Fiona Newell and our reviewers for helpful comments on the manuscript. Financial support was provided by the Max-Planck-Gesellschaft and the Studienstiftung des deutschen Volkes.


Keywords

  • Cognitive maps
  • Environment modeling
  • Exploration
  • Mobile robots
  • Omnidirectional sensor
  • Topological maps
  • Visual navigation

ASJC Scopus subject areas

  • Artificial Intelligence

