TY - GEN
T1 - Mobile robot localization using fusion of object recognition and range information
AU - Yim, Byung Doo
AU - Lee, Yong Ju
AU - Song, Jae Bok
AU - Chung, Woojin
PY - 2007
Y1 - 2007
N2 - Most present localization algorithms are either range-based or vision-based. In many environments, a single type of sensor often cannot ensure successful localization; furthermore, using low-priced range sensors instead of expensive but accurate laser scanners often leads to poor performance. This paper proposes an MCL-based localization method that robustly estimates the robot pose by fusing the range information from a low-cost IR scanner with the SIFT-based visual information gathered by a mono camera. With sensor fusion, the rough pose estimation from the range-based sensor is compensated for by the vision-based sensor, and slow object recognition is overcome by the frequent updates of the range information. To synchronize the two sensors, which have different bandwidths, the encoder information gathered during object recognition is exploited. This paper also suggests a method for evaluating localization performance based on the normalized probability of a vision sensor model. Various experiments show that the proposed algorithm estimates the robot pose reasonably well and accurately evaluates the localization performance.
AB - Most present localization algorithms are either range-based or vision-based. In many environments, a single type of sensor often cannot ensure successful localization; furthermore, using low-priced range sensors instead of expensive but accurate laser scanners often leads to poor performance. This paper proposes an MCL-based localization method that robustly estimates the robot pose by fusing the range information from a low-cost IR scanner with the SIFT-based visual information gathered by a mono camera. With sensor fusion, the rough pose estimation from the range-based sensor is compensated for by the vision-based sensor, and slow object recognition is overcome by the frequent updates of the range information. To synchronize the two sensors, which have different bandwidths, the encoder information gathered during object recognition is exploited. This paper also suggests a method for evaluating localization performance based on the normalized probability of a vision sensor model. Various experiments show that the proposed algorithm estimates the robot pose reasonably well and accurately evaluates the localization performance.
UR - http://www.scopus.com/inward/record.url?scp=36348935543&partnerID=8YFLogxK
U2 - 10.1109/ROBOT.2007.364019
DO - 10.1109/ROBOT.2007.364019
M3 - Conference contribution
AN - SCOPUS:36348935543
SN - 1424406021
SN - 9781424406029
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 3533
EP - 3538
BT - 2007 IEEE International Conference on Robotics and Automation, ICRA'07
T2 - 2007 IEEE International Conference on Robotics and Automation, ICRA'07
Y2 - 10 April 2007 through 14 April 2007
ER -