Generative multiview inpainting for object removal in large indoor spaces

Joohyung Kim, Janghun Hyeon, Nakju Doh

    Research output: Contribution to journal › Article › peer-review

    5 Citations (Scopus)

    Abstract

    As interest in image-based rendering grows, the need for multiview inpainting is emerging. Despite rapid progress in deep-learning-based single-image inpainting, such methods impose no constraint that enforces color consistency across multiple inpainted images. We target object removal in large-scale indoor spaces and propose a novel multiview inpainting pipeline that achieves color consistency and boundary consistency across multiple images. The first step of the pipeline creates color prior information on the masks by coloring point clouds from multiple images and projecting the colored point clouds onto the image planes. Next, a generative inpainting network accepts a masked image, a color prior image, an imperfect guideline, and two different masks as inputs, and yields a refined guideline and an inpainted image as outputs. The color prior and guideline inputs ensure color and boundary consistency across multiple images. We validate our pipeline on real indoor data sets both quantitatively, using consistency distance and similarity distance (metrics we define for comparing multiview inpainting results), and qualitatively.
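    The color-prior step described in the abstract (coloring a point cloud from multiple views, then projecting it onto each image plane) can be sketched roughly as below. The pinhole camera model, variable names, and z-buffer occlusion handling are illustrative assumptions, not the authors' actual implementation.

    ```python
    # Hypothetical sketch: project a colored 3D point cloud onto an image
    # plane to form a sparse color-prior image. Not the paper's code.
    import numpy as np

    def project_color_prior(points, colors, K, R, t, height, width):
        """Render colored points into an image; nearest point wins per pixel."""
        # Transform world-frame points into the camera frame.
        cam = points @ R.T + t
        in_front = cam[:, 2] > 0
        cam, cols = cam[in_front], colors[in_front]
        # Pinhole projection with intrinsic matrix K.
        uv = cam @ K.T
        uv = uv[:, :2] / uv[:, 2:3]
        u = np.round(uv[:, 0]).astype(int)
        v = np.round(uv[:, 1]).astype(int)
        valid = (u >= 0) & (u < width) & (v >= 0) & (v < height)
        u, v, z, cols = u[valid], v[valid], cam[valid, 2], cols[valid]
        prior = np.zeros((height, width, 3), dtype=np.uint8)
        # Per-pixel z-buffer so occluded points do not overwrite nearer ones.
        depth = np.full((height, width), np.inf)
        for i in range(len(z)):
            if z[i] < depth[v[i], u[i]]:
                depth[v[i], u[i]] = z[i]
                prior[v[i], u[i]] = cols[i]
        return prior
    ```

    The resulting sparse prior image would then be masked to the object-removal region and fed to the inpainting network alongside the masked image, guideline, and masks.
    
    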

    Original language: English
    Journal: International Journal of Advanced Robotic Systems
    Volume: 18
    Issue number: 2
    DOIs
    Publication status: Published - 2021

    Bibliographical note

    Publisher Copyright:
    © The Author(s) 2021.

    Keywords

    • Multiview inpainting
    • boundary consistency
    • color consistency
    • generative adversarial network
    • object removal

    ASJC Scopus subject areas

    • Software
    • Computer Science Applications
    • Artificial Intelligence
