Towards a Realistic Indoor World Reconstruction: Preliminary Results for an Object-Oriented 3D RGB-D Mapping

Chang Hyun Jun, Jaehyeon Kang, Suyong Yeon, Hyunga Choi, Tae Young Chung, Nakju Lett Doh

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)


Real world reconstruction, which generates cyberspace not from a computer graphics tool but from the real world, has been one of the main issues in the two communities of robotics and computer vision, under the different names of Simultaneous Localization And Mapping (SLAM) and Structure from Motion (SfM). However, there have been few attempts to actively integrate SLAM and SfM for possible synergy. This paper shows that real world reconstruction can be enabled through this integration. As a result, a preliminary map has been generated that meets five subgoals: realistic view (RGB), accurate geometry (depth), applicability to multi-floor indoor buildings, initial classification of a possible set of objects, and full automation. To this end, an engineering framework of “Acquire-Build-Comprehend (ABC)” is proposed, through which a sensor system acquires an RGB-Depth point cloud from the real world, builds a three-dimensional map, and comprehends this map to yield the possible set of objects. Its performance is demonstrated by building a map of three floors of an indoor building whose volume is 1,408 m³.

Original language: English
Pages (from-to): 207-218
Number of pages: 12
Journal: Intelligent Automation and Soft Computing
Issue number: 2
Publication status: Published - 2017 Apr 3

Bibliographical note

Publisher Copyright:
© 2016 TSI Press.


Keywords

  • 3-dimensional map
  • RGB-D
  • Real world reconstruction
  • SLAM
  • SfM

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Theoretical Computer Science
  • Computational Theory and Mathematics
