Abstract
A deep texture-adaptive denoising method is proposed to achieve high perceptual image quality. Texture information is learned through a designed loss function that uses a pre-generated texture map to distinguish texture areas from flat areas. During training, the proposed network internally identifies texture and flat regions and applies different denoising strengths to the two regions. Unlike existing DNN-based denoising methods, the proposed method retains high-frequency texture information while removing residual noise in flat regions as much as possible. The gradient distributions of the image before and after denoising were compared. The proposed method outperformed existing methods, achieving higher PSNR and SSIM scores and better visual quality. In addition, the strength of texture-noise removal was controllable with a single parameter. Thus, the proposed method is practically feasible as a denoising apparatus.
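The texture-map-weighted loss described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual loss: the weighting scheme, the function name `texture_adaptive_loss`, and the single strength parameter `alpha` are all assumptions chosen to mirror the idea of down-weighting the denoising penalty in texture regions while keeping full pressure in flat regions.

```python
import numpy as np

def texture_adaptive_loss(denoised, clean, texture_map, alpha=0.5):
    """Illustrative texture-adaptive L1 loss (assumed form, not the
    paper's exact loss).

    texture_map: per-pixel map in [0, 1]; ~1 in texture areas, ~0 in
    flat areas (pre-generated, as in the abstract).
    alpha: hypothetical single parameter controlling how strongly the
    penalty is reduced in texture regions, i.e., how much texture
    detail is preserved versus smoothed.
    """
    # Lower weight on texture pixels -> weaker denoising pressure there,
    # full weight on flat pixels -> residual noise removed aggressively.
    weights = 1.0 - alpha * texture_map
    return float(np.mean(weights * np.abs(denoised - clean)))
```

With `alpha = 0` the loss reduces to a plain L1 loss; increasing `alpha` progressively relaxes the penalty in texture regions, which is one plausible way a single parameter could trade off texture preservation against noise removal.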
| Original language | English |
|---|---|
| Pages (from-to) | 412-420 |
| Number of pages | 9 |
| Journal | IEIE Transactions on Smart Processing and Computing |
| Volume | 11 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 2022 |
Bibliographical note
Funding Information: This paper was supported by the Education and Research promotion program of KOREATECH in 2020.
Publisher Copyright:
Copyright © 2022 The Institute of Electronics and Information Engineers.
Keywords
- DNN-based
- Image denoising
- Perceptual image quality
- Texture segmentation
- Texture-adaptive denoising
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering