Navigational Drift Analysis for Visual Odometry

Authors

  • Hongtao Liu, School of Mechanical Engineering, Shanghai Jiao Tong University, Dongchuan Road 800, Shanghai
  • Ruyi Jiang, School of Mechanical Engineering, Shanghai Jiao Tong University, Dongchuan Road 800, Shanghai
  • Weng Hu, School of Mechanical Engineering, Shanghai Jiao Tong University, Dongchuan Road 800, Shanghai
  • Shigang Wang, School of Mechanical Engineering, Shanghai Jiao Tong University, Dongchuan Road 800, Shanghai

Keywords

Visual odometry, error analysis, motion concatenation, ego-motion, navigational drift

Abstract

Visual odometry estimates a robot's ego-motion from cameras mounted on the robot. Owing to the advantages of cameras as sensors, visual odometry has been widely adopted in robotics and navigation. Because visual odometry relies on relative measurements, drift (error accumulation) from concatenating relative motions is an intrinsic problem in long-range navigation. A general error analysis using the mean and covariance of the positional error along each axis cannot fully describe the behavior of this drift, and no theoretical drift analysis has been available for performance evaluation and algorithm comparison. This paper establishes the drift distribution as a function of the covariance matrix obtained from a positional error propagation model. An experiment with a specific setting is conducted to validate the drift model.
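The mechanism the abstract refers to, error accumulation through concatenation of relative motions, can be illustrated with a small numerical sketch. The snippet below is not the paper's model; it is a minimal first-order example, assuming a hypothetical planar robot whose pose is updated by noisy relative motions (dx, dy, dtheta) and whose covariance is propagated through the Jacobians of the pose-composition function.

```python
# A minimal sketch (not the paper's model): first-order covariance propagation
# for a planar robot whose pose is built by concatenating noisy relative motions.
import numpy as np

def compose(pose, motion):
    """Compose a global pose (x, y, theta) with a relative motion (dx, dy, dtheta)."""
    x, y, th = pose
    dx, dy, dth = motion
    return np.array([x + dx * np.cos(th) - dy * np.sin(th),
                     y + dx * np.sin(th) + dy * np.cos(th),
                     th + dth])

def propagate(pose, cov, motion, motion_cov):
    """One step of first-order covariance propagation for the composed pose."""
    _, _, th = pose
    dx, dy, _ = motion
    # Jacobian of the composed pose with respect to the previous pose.
    F = np.array([[1.0, 0.0, -dx * np.sin(th) - dy * np.cos(th)],
                  [0.0, 1.0,  dx * np.cos(th) - dy * np.sin(th)],
                  [0.0, 0.0,  1.0]])
    # Jacobian of the composed pose with respect to the relative motion.
    G = np.array([[np.cos(th), -np.sin(th), 0.0],
                  [np.sin(th),  np.cos(th), 0.0],
                  [0.0,         0.0,        1.0]])
    return compose(pose, motion), F @ cov @ F.T + G @ motion_cov @ G.T

# Hypothetical noise levels: 1 cm translation and 0.1 deg rotation per 1 m step.
Q = np.diag([0.01**2, 0.01**2, np.deg2rad(0.1)**2])
pose, cov = np.zeros(3), np.zeros((3, 3))
for step in range(1, 101):                      # drive 100 m in 1 m increments
    pose, cov = propagate(pose, cov, np.array([1.0, 0.0, 0.0]), Q)
    if step % 25 == 0:
        print(f"{step:3d} m: positional std (x, y) = "
              f"{np.sqrt(cov[0, 0]):.3f} m, {np.sqrt(cov[1, 1]):.3f} m")
```

In such a sketch the lateral uncertainty grows faster than linearly because accumulated heading error is repeatedly projected into position; this coupled behavior is the kind of drift that a simple per-axis mean/covariance summary does not capture, which is the motivation the abstract gives for deriving a drift distribution.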


How to Cite

Liu, H., Jiang, R., Hu, W., & Wang, S. (2015). Navigational Drift Analysis for Visual Odometry. Computing and Informatics, 33(3), 685–706. Retrieved from http://147.213.75.17/ojs/index.php/cai/article/view/2796