Adaptive smoothness constraints for efficient stereo matching using texture and edge information

Kyung Rae Kim, Chang-Su Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

52 Citations (Scopus)

Abstract

An efficient stereo matching algorithm, which applies adaptive smoothness constraints using texture and edge information, is proposed in this work. First, we determine non-textured regions, in which the input image yields flat pixel values. In the non-textured regions, we penalize depth discontinuities and complement the primary CNN-based matching cost with a color-based cost. Second, by combining two edge maps, obtained from the input image and a pre-estimated disparity map, we extract denoised edges that correspond to depth discontinuities with high probability. Near these denoised edges, we penalize small differences between neighboring disparities. Based on these adaptive smoothness constraints, the proposed algorithm significantly outperforms conventional methods and achieves state-of-the-art performance on the Middlebury stereo benchmark.
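The sketch below illustrates the two cues described in the abstract: flagging non-textured regions from local intensity variance, and keeping only image edges that also appear in a pre-estimated disparity map as "denoised" depth-discontinuity edges. It is a minimal, hypothetical interpretation; the window size, thresholds, and weight values are illustrative assumptions, not the authors' settings, and the actual paper combines these weights with a CNN-based matching cost inside an optimization that is not reproduced here.

```python
import cv2
import numpy as np

def adaptive_smoothness_weights(image, disparity_init,
                                var_thresh=4.0, canny_lo=50, canny_hi=150):
    """Hypothetical sketch of texture- and edge-adaptive smoothness weights.

    All parameter values are assumptions for illustration only.
    """
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY).astype(np.float32)

    # Step 1: flag non-textured regions via local intensity variance
    # (regions where the image yields nearly flat pixel values).
    mean = cv2.blur(gray, (7, 7))
    mean_sq = cv2.blur(gray * gray, (7, 7))
    local_var = mean_sq - mean * mean
    non_textured = local_var < var_thresh

    # Step 2: combine edge maps from the image and the pre-estimated
    # disparity map; edges present in both are treated as "denoised"
    # edges that likely correspond to depth discontinuities.
    img_edges = cv2.Canny(gray.astype(np.uint8), canny_lo, canny_hi) > 0
    disp_u8 = cv2.normalize(disparity_init, None, 0, 255,
                            cv2.NORM_MINMAX).astype(np.uint8)
    disp_edges = cv2.Canny(disp_u8, canny_lo, canny_hi) > 0
    denoised_edges = img_edges & disp_edges

    # Adaptive smoothness weight: strong penalty on disparity jumps in
    # non-textured regions, relaxed penalty near denoised edges so that
    # depth discontinuities are allowed there.
    weight = np.ones_like(gray)
    weight[non_textured] = 4.0     # assumed strong smoothness penalty
    weight[denoised_edges] = 0.25  # assumed relaxed penalty at depth edges
    return weight, non_textured, denoised_edges
```

In use, such a weight map would scale the pairwise smoothness term of a stereo energy (e.g., in semi-global or graph-based optimization), while a color-based cost would supplement the CNN-based matching cost inside the flagged non-textured regions, as described in the abstract.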

Original language: English
Title of host publication: 2016 IEEE International Conference on Image Processing, ICIP 2016 - Proceedings
Publisher: IEEE Computer Society
Pages: 3429-3433
Number of pages: 5
Volume: 2016-August
ISBN (Electronic): 9781467399616
DOIs
Publication status: Published - 2016 Aug 3
Event: 23rd IEEE International Conference on Image Processing, ICIP 2016 - Phoenix, United States
Duration: 2016 Sept 25 - 2016 Sept 28

Other

Other: 23rd IEEE International Conference on Image Processing, ICIP 2016
Country/Territory: United States
City: Phoenix
Period: 16/9/25 - 16/9/28

Keywords

  • Adaptive smoothness constraint
  • Edge analysis
  • Stereo matching
  • Texture analysis

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
  • Signal Processing
