This paper addresses slippage detection and pose recovery during the SLAM process of mobile robot navigation. Mobile robots lack a reliable means of recovering when localization fails due to slippage: unexpected disturbances such as wheel slippage lead to false predictions during the SLAM process. In this paper, we propose minimizing the risk of localization failure by applying optical flow to ceiling image sequences as a slippage detector. The optical flow-based motion estimate is applied to the prediction step of EKF-SLAM. From the optical flow we compute a homogeneous 2D affine transformation matrix, from which the relative pose between two consecutive frames is recovered. This reliable motion estimate from the vision sensor enables slip detection during the prediction phase of EKF-SLAM. The proposed method was verified by several experiments with deliberate slippage in real environments.
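The step from optical-flow correspondences to a relative pose can be illustrated with a short sketch. The following is not the authors' implementation; it assumes that sparse optical flow between two ceiling frames yields matched point pairs, and fits a rigid 2D transform (a restricted form of the affine matrix mentioned above) by least squares, recovering the rotation and translation between the frames:

```python
import numpy as np

def estimate_rigid_motion(pts_prev, pts_curr):
    """Fit a 2D similarity transform [[a, -b, tx], [b, a, ty]]
    mapping pts_prev to pts_curr (point pairs as optical flow
    would provide between two ceiling frames), in a least-squares
    sense. Returns (theta, tx, ty, scale): the relative rotation,
    translation, and scale between the frames."""
    A = np.zeros((2 * len(pts_prev), 4))
    y = np.asarray(pts_curr, dtype=float).reshape(-1)
    for i, (x, yv) in enumerate(pts_prev):
        # x' = a*x - b*y + tx ;  y' = b*x + a*y + ty
        A[2 * i]     = [x, -yv, 1, 0]
        A[2 * i + 1] = [yv,  x, 0, 1]
    a, b, tx, ty = np.linalg.lstsq(A, y, rcond=None)[0]
    return np.arctan2(b, a), tx, ty, np.hypot(a, b)

# Hypothetical usage: points displaced by a known rotation and
# translation, as if tracked across two ceiling images.
rng = np.random.default_rng(0)
prev = rng.uniform(0, 100, (20, 2))
th, t = 0.1, np.array([5.0, -3.0])
R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
curr = prev @ R.T + t
theta, tx, ty, scale = estimate_rigid_motion(prev, curr)
```

In a slip detector of this kind, the recovered (theta, tx, ty) would be compared against the odometry-based motion prediction; a large discrepancy flags slippage before the EKF update corrupts the state estimate.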