Image generation with self pixel-wise normalization

Yoon Jae Yeo, Min Cheol Sagong, Seung Park, Sung Jea Ko, Yong Goo Shin

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

Region-adaptive normalization (RAN) methods have been widely used in generative adversarial network (GAN)-based image-to-image translation. However, since these approaches require a mask image to infer pixel-wise affine transformation parameters, they are not applicable to general image generation models that have no paired mask images. To resolve this problem, this paper presents a novel normalization method, called self pixel-wise normalization (SPN), which effectively boosts generative performance by carrying out a pixel-adaptive affine transformation without an external guidance map. In our method, the transformation parameters are derived from a self-latent mask that divides the feature map into foreground and background regions. Visualization of the self-latent masks shows that SPN effectively captures a single object to be generated as the foreground. Since the proposed method produces the self-latent mask without external data, it is easily adaptable to existing generative models. Extensive experiments on various datasets reveal that SPN significantly improves image generation performance in terms of Fréchet inception distance (FID) and Inception score (IS).
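The mechanism described in the abstract — a mask derived from the feature map itself, which then produces per-pixel scale and shift parameters applied after normalization — can be sketched as follows. This is a minimal numpy illustration, not the authors' implementation: the weight matrices `w_mask`, `w_gamma`, and `w_beta` stand in for the learned convolutional layers the paper would use, and the normalization step is a simple per-channel spatial normalization.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_pixelwise_norm(x, w_mask, w_gamma, w_beta, eps=1e-5):
    """Sketch of self pixel-wise normalization (SPN).

    x: feature map of shape (C, H, W).
    w_mask, w_gamma, w_beta: hypothetical 1x1-conv weights standing in
    for the learned layers in the paper, shapes (1, C), (C, 1), (C, 1).
    """
    C, H, W = x.shape
    flat = x.reshape(C, -1)                        # (C, H*W)

    # Per-channel normalization over spatial positions.
    mu = flat.mean(axis=1, keepdims=True)
    var = flat.var(axis=1, keepdims=True)
    x_hat = (flat - mu) / np.sqrt(var + eps)

    # Self-latent mask: inferred from the feature map itself (no external
    # guidance map), softly separating foreground from background per pixel.
    mask = 1.0 / (1.0 + np.exp(-(w_mask @ flat)))  # (1, H*W), values in (0, 1)

    # Pixel-adaptive affine parameters derived from the self-latent mask.
    gamma = w_gamma @ mask                         # (C, H*W) per-pixel scale
    beta = w_beta @ mask                           # (C, H*W) per-pixel shift

    out = x_hat * (1.0 + gamma) + beta
    return out.reshape(C, H, W), mask.reshape(H, W)

# Toy usage: a 4-channel 8x8 feature map with random stand-in weights.
x = rng.standard_normal((4, 8, 8))
w_mask = rng.standard_normal((1, 4)) * 0.1
w_gamma = rng.standard_normal((4, 1)) * 0.1
w_beta = rng.standard_normal((4, 1)) * 0.1
y, mask = self_pixelwise_norm(x, w_mask, w_gamma, w_beta)
```

Because the mask is computed from the features rather than supplied externally, a layer like this can be dropped into an existing generator without paired mask data, which is the adaptability the abstract claims.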

Original language: English
Pages (from-to): 9409-9423
Number of pages: 15
Journal: Applied Intelligence
Volume: 53
Issue number: 8
DOIs
Publication status: Published - April 2023

Bibliographical note

Publisher Copyright:
© 2022, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.

Keywords

  • Generative adversarial networks
  • Image generation
  • Normalization
  • Region-adaptive normalization

ASJC Scopus subject areas

  • Artificial Intelligence
