Panoptic blind image inpainting

Hyungjoon Kim, Chung Il Kim, Hyeonwoo Kim, Seongkuk Cho, Eenjun Hwang

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

In autonomous driving, scene understanding is a critical task for recognizing the driving environment and dangerous situations. However, a variety of factors, including foreign objects on the lens, cloudy weather, and light blur, often reduce the accuracy of scene recognition. In this paper, we propose a new blind image inpainting model that accurately reconstructs images in a real environment where no ground truth is available for restoration. To this end, we first introduce a panoptic map to represent content information in detail and design an encoder–decoder structure to predict the panoptic map and the corrupted-region mask. Then, we construct an image inpainting model that utilizes the information of the predicted map. Lastly, we present a mask refinement process to improve the accuracy of map prediction. To evaluate the effectiveness of the proposed model, we compared the restoration results of various inpainting methods on the Cityscapes and COCO datasets. Experimental results show that the proposed model outperforms other blind image inpainting models in terms of L1/L2 losses, PSNR, and SSIM, and achieves performance comparable to image inpainting techniques that utilize additional information.
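The quantitative metrics cited in the abstract (L1/L2 losses and PSNR) are standard image-restoration measures and can be sketched as below. This is an illustrative NumPy implementation under the common convention of images normalized to [0, 1], not the authors' evaluation code; SSIM is omitted for brevity since it involves windowed local statistics.

```python
import numpy as np

def l1_loss(a, b):
    # Mean absolute error between restored image and ground truth
    return np.mean(np.abs(a - b))

def l2_loss(a, b):
    # Mean squared error (MSE)
    return np.mean((a - b) ** 2)

def psnr(a, b, max_val=1.0):
    # Peak signal-to-noise ratio in dB, derived from the MSE
    mse = l2_loss(a, b)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy example: ground truth vs. a restoration with one wrong pixel
gt = np.zeros((8, 8))
pred = gt.copy()
pred[0, 0] = 0.5

print(l1_loss(gt, pred))  # small mean absolute error
print(psnr(gt, pred))     # high PSNR indicates a close restoration
```

Lower L1/L2 and higher PSNR indicate a restoration closer to the ground truth, which is the sense in which the abstract reports improvement.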

Original language: English
Pages (from-to): 208-221
Number of pages: 14
Journal: ISA Transactions
Volume: 132
DOIs
Publication status: Published - 2023 Jan

Bibliographical note

Publisher Copyright:
© 2022 ISA

Keywords

  • Blind image inpainting
  • Contextual information
  • Generative Adversarial Networks
  • Image restoration
  • Panoptic segmentation

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Instrumentation
  • Computer Science Applications
  • Electrical and Electronic Engineering
  • Applied Mathematics

