Fusing vision and odometry for accurate indoor robot localization
2012
For service robotics, localization is an essential component required in many applications, e.g. indoor robot navigation. Today, accurate localization relies mostly on high-end devices such as A.R.T. DTrack, VICON systems, or laser scanners. These systems are often expensive and thus require substantial investments. In this paper, our focus is on the development of a localization method that uses low-priced devices, such as cameras, while remaining sufficiently accurate in tracking performance. Vision data contains much information and potentially yields high tracking accuracy. However, due to high computational requirements, vision-based localization can only be performed at a low frequency. In order to speed up the visual localization and increase its accuracy, we combine vision information with the robot's odometry using a Kalman filter. The resulting approach enables sufficiently accurate tracking (errors in the range of a few cm) at a frequency of about 35 Hz. To evaluate the proposed method, we compare our tracking performance against the high-precision A.R.T. DTrack localization as ground truth. The evaluations on a real robot show that our low-priced localization approach is competitive for indoor robot localization tasks.
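The abstract describes fusing high-rate odometry with lower-rate visual pose fixes through a Kalman filter. The sketch below illustrates the general predict/update pattern for such a fusion on a planar pose [x, y, theta]; all matrices, noise values, and the extended-Kalman-style motion model are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

class PoseKalmanFilter:
    """Illustrative fusion of odometry increments and absolute vision poses."""

    def __init__(self, initial_pose, initial_cov):
        self.x = np.asarray(initial_pose, dtype=float)  # state [x, y, theta]
        self.P = np.asarray(initial_cov, dtype=float)   # state covariance
        self.Q = np.diag([0.01, 0.01, 0.005])           # odometry (process) noise, assumed
        self.R = np.diag([0.02, 0.02, 0.01])            # vision (measurement) noise, assumed

    def predict(self, odom_delta):
        """High-rate prediction from an odometry increment [dx, dy, dtheta]
        expressed in the robot frame."""
        dx, dy, dtheta = odom_delta
        c, s = np.cos(self.x[2]), np.sin(self.x[2])
        # Propagate the pose with the odometry motion model.
        self.x += np.array([c * dx - s * dy, s * dx + c * dy, dtheta])
        # Jacobian of the motion model w.r.t. the state (linearization).
        F = np.array([[1.0, 0.0, -s * dx - c * dy],
                      [0.0, 1.0,  c * dx - s * dy],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, vision_pose):
        """Low-rate correction from an absolute vision pose [x, y, theta]."""
        H = np.eye(3)                                    # vision observes the full pose
        y = np.asarray(vision_pose, dtype=float) - H @ self.x   # innovation
        y[2] = (y[2] + np.pi) % (2 * np.pi) - np.pi      # wrap angle residual to [-pi, pi]
        S = H @ self.P @ H.T + self.R                    # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)              # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K @ H) @ self.P


# Usage: predict at the odometry rate, correct whenever a vision fix arrives.
kf = PoseKalmanFilter([0.0, 0.0, 0.0], np.eye(3) * 0.1)
kf.predict([0.05, 0.0, 0.01])      # fast odometry step
kf.update([0.052, 0.001, 0.012])   # occasional visual localization fix
```

The split into a cheap prediction step and an occasional vision correction is what allows the combined estimate to run faster than the vision pipeline alone while keeping its accuracy.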