TY - GEN
T1 - Deep Color Constancy Using Spatio-Temporal Correlation of High-Speed Video
AU - Lee, Dong Jae
AU - Lee, Kang Kyu
AU - Kim, Jong Ok
N1 - Funding Information:
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2019R1A2C1005834). This research was also supported by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2021-2018-0-01421) supervised by the IITP (Institute of Information & Communications Technology Planning & Evaluation).
Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - Since the invention of the electric bulb, most of the lights in our world have been powered by alternating current (AC). The resulting periodic intensity variation can be captured with a high-speed camera, and the intensity differences between consecutive video frames can be exploited for various vision tasks. For color constancy, conventional methods usually focus on exploiting only spatial features. To overcome their limitations, a few methods that utilize AC flickering have been proposed; previous work employed the temporal correlation between high-speed video frames. To further enhance this approach, we propose a deep spatio-temporal color constancy method that uses both spatial and temporal correlations. To extract temporal features for illuminant estimation, we calculate the temporal correlation between feature maps in which global as well as local features are learned. By learning global features through spatio-temporal correlation, the proposed method estimates illumination more accurately and is particularly robust in noisy practical environments. Experimental results demonstrate that the proposed method outperforms existing methods.
AB - Since the invention of the electric bulb, most of the lights in our world have been powered by alternating current (AC). The resulting periodic intensity variation can be captured with a high-speed camera, and the intensity differences between consecutive video frames can be exploited for various vision tasks. For color constancy, conventional methods usually focus on exploiting only spatial features. To overcome their limitations, a few methods that utilize AC flickering have been proposed; previous work employed the temporal correlation between high-speed video frames. To further enhance this approach, we propose a deep spatio-temporal color constancy method that uses both spatial and temporal correlations. To extract temporal features for illuminant estimation, we calculate the temporal correlation between feature maps in which global as well as local features are learned. By learning global features through spatio-temporal correlation, the proposed method estimates illumination more accurately and is particularly robust in noisy practical environments. Experimental results demonstrate that the proposed method outperforms existing methods.
KW - AC light
KW - High-speed video
KW - Spatio-temporal correlation
KW - Temporal color constancy
UR - http://www.scopus.com/inward/record.url?scp=85125287531&partnerID=8YFLogxK
U2 - 10.1109/VCIP53242.2021.9675406
DO - 10.1109/VCIP53242.2021.9675406
M3 - Conference contribution
AN - SCOPUS:85125287531
T3 - 2021 International Conference on Visual Communications and Image Processing, VCIP 2021 - Proceedings
BT - 2021 International Conference on Visual Communications and Image Processing, VCIP 2021 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 International Conference on Visual Communications and Image Processing, VCIP 2021
Y2 - 5 December 2021 through 8 December 2021
ER -