TY - JOUR
T1 - Real-time depth-of-field rendering using anisotropically filtered mipmap interpolation
AU - Lee, Sungkil
AU - Kim, Gerard Jounghyun
AU - Choi, Seungmoon
N1 - Funding Information:
The authors would like to thank the anonymous reviewers for their insightful comments. This work was supported in part by grants (R01-2006-000-11142-0 and R0A-2008-000-20087-0) funded by KOSEF and by an ITRC support program (IITA-2008-C1090-0804-0002) from IITA, all from the Korean Government. The “Sponza Atrium,” “Can and Corn box,” “Forest,” “Dragon,” and “Neptune” models are provided courtesy of Marko Dabrovic, 3d02.com, admin2 (at share-cg.com), the Stanford 3D Scanning Repository, and IMATI and INRIA via the AIM@SHAPE Shape Repository, respectively. The “Elephant and Zebra” models were taken from [39]. Correspondence concerning this article should be addressed to Seungmoon Choi.
PY - 2009/5
Y1 - 2009/5
AB - This article presents a real-time GPU-based post-filtering method for rendering acceptable depth-of-field effects suited for virtual reality. Blurring is achieved by nonlinearly interpolating mipmap images generated from a pinhole image. Major artifacts common in post-filtering techniques, such as bilinear magnification artifacts, intensity leakage, and blurring discontinuity, are practically eliminated via magnification with a circular filter, anisotropic mipmapping, and smoothing of blurring degrees. The whole framework is accelerated with GPU programs to achieve the constant and scalable real-time performance required for virtual reality. We also compare our method with recent GPU-based methods in terms of image quality and rendering performance.
KW - Anisotropic filtering
KW - Depth of field
KW - Mipmap interpolation
KW - Virtual reality
UR - http://www.scopus.com/inward/record.url?scp=62949166410&partnerID=8YFLogxK
U2 - 10.1109/TVCG.2008.106
DO - 10.1109/TVCG.2008.106
M3 - Article
C2 - 19282551
AN - SCOPUS:62949166410
SN - 1077-2626
VL - 15
SP - 453
EP - 464
JO - IEEE Transactions on Visualization and Computer Graphics
JF - IEEE Transactions on Visualization and Computer Graphics
IS - 3
M1 - 4641923
ER -