Abstract
This paper presents a method for procedurally synthesizing a texture that preserves the features of the input exemplar. The exemplar is analyzed in both the spatial and frequency domains and decomposed into feature and non-feature parts. The non-feature part is then reproduced as procedural noise, whereas the features are synthesized independently. The two are combined to produce a non-repetitive texture that preserves the exemplar's features. The proposed method allows the user to control the extent of the extracted features and also enables a texture to be edited quite effectively.
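The abstract only sketches the pipeline, so the snippet below is a purely illustrative aid rather than the paper's actual algorithm: it approximates a feature/non-feature split by thresholding the exemplar's Fourier spectrum, keeps the dominant peaks as the "feature" part, re-synthesizes the residual as random-phase noise, and recombines the two. The function name `decompose_and_resynthesize`, the `peak_quantile` parameter, and the quantile-based threshold are all assumptions introduced here for illustration.

```python
# Illustrative sketch only (NOT the paper's method): a Fourier-domain split of
# an exemplar into "feature" (dominant spectral peaks) and "non-feature"
# (residual) parts, with the residual reproduced as random-phase noise.
import numpy as np

def decompose_and_resynthesize(exemplar, peak_quantile=0.99, rng=None):
    """exemplar: 2-D grayscale array (floats); peak_quantile is hypothetical."""
    rng = np.random.default_rng() if rng is None else rng

    spectrum = np.fft.fft2(exemplar)
    magnitude = np.abs(spectrum)

    # Treat the strongest Fourier coefficients as "features"
    # (structured content); everything else is the noise-like residual.
    threshold = np.quantile(magnitude, peak_quantile)
    feature_mask = magnitude >= threshold

    feature_part = np.fft.ifft2(spectrum * feature_mask).real
    residual_spectrum = spectrum * ~feature_mask

    # Reproduce the residual as procedural noise: keep its power spectrum
    # but randomize the phase (random-phase / Gaussian noise by example).
    random_phase = np.exp(2j * np.pi * rng.random(exemplar.shape))
    noise_part = np.fft.ifft2(np.abs(residual_spectrum) * random_phase).real

    # Recombine the preserved features with the freshly synthesized noise.
    return feature_part + noise_part
```

Applied to a grayscale exemplar loaded as a float array, the output keeps the exemplar's dominant structure while the fine-scale content is freshly randomized; the quantile loosely mirrors the user control over the extent of extracted features mentioned in the abstract.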
Field | Value
---|---
Original language | English
Pages (from-to) | 761-768
Number of pages | 8
Journal | Visual Computer
Volume | 33
Issue number | 6-8
DOIs |
Publication status | Published - 2017 Jun 1
Bibliographical note
Funding Information: This work was supported by the National Research Foundation of Korea (NRF) Grant funded by the Korea government (MSIP) (No. NRF-2016R1A2B3014319) and by Institute for Information and Communications Technology Promotion (IITP) Grant funded by the Korea government (MSIP) (No. R0115-16-1011).
Publisher Copyright:
© 2017, Springer-Verlag Berlin Heidelberg.
Keywords
- Feature preservation
- Noise by example
- Procedural texturing
- Texture analysis
ASJC Scopus subject areas
- Software
- Computer Vision and Pattern Recognition
- Computer Graphics and Computer-Aided Design