Visual-Inertial Navigation System (VINS) using a Rolling-Shutter Camera

  • Motivation
  • In order to develop Vision-aided Inertial Navigation Systems (VINS) on mobile devices, such as cell phones and tablets, one needs to consider two important issues, both due to the underlying commercial-grade hardware:
    • The rolling-shutter effect caused by CMOS sensors
    • The unknown and varying time offset between the camera and IMU clocks
    Without appropriately modelling these effects and compensating for them online, the navigation accuracy degrades significantly.


    A comparison between global-shutter and rolling-shutter camera images
    (from https://graphics.stanford.edu/papers/stabilization/karpenko_gyro.pdf)


    Rolling shutter and imperfect time synchronization cause a time-varying offset between the arrival times of the IMU and camera measurements
  • Summary
  • In this work, we present a high-precision VINS that explicitly considers and accounts for both the rolling-shutter and time synchronization issues of an IMU-camera system. In particular, our main contributions are:
    • We introduce an interpolation model that expresses the camera pose of each visual measurement as a function of the neighboring IMU poses, yielding a significant speedup compared to alternative constant-velocity-based models.
    • We analytically determine the system’s unobservable directions under our interpolation measurement model, and use them to improve the VINS consistency and accuracy by employing the observability-constrained extended Kalman filter (OC-EKF).
    We experimentally validated our system on a wide range of mobile devices, including the Samsung Galaxy S4 cell phone, Google Glass, and the Bebop quadrotor.
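    The core interpolation idea above can be sketched in a few lines. The snippet below is a minimal illustration (not the paper's implementation): each image row of a rolling-shutter camera is captured at a slightly different time, so its pose is obtained by interpolating between the two neighboring IMU poses. The function names (`slerp`, `interpolate_pose`, `row_timestamp`) and the simple readout-time model are illustrative assumptions.

    ```python
    import numpy as np

    def slerp(q0, q1, tau):
        """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
        q0 = q0 / np.linalg.norm(q0)
        q1 = q1 / np.linalg.norm(q1)
        dot = float(np.dot(q0, q1))
        if dot < 0.0:            # take the shorter arc on the quaternion sphere
            q1, dot = -q1, -dot
        if dot > 0.9995:         # nearly parallel: fall back to normalized lerp
            q = q0 + tau * (q1 - q0)
            return q / np.linalg.norm(q)
        theta = np.arccos(dot)
        return (np.sin((1.0 - tau) * theta) * q0
                + np.sin(tau * theta) * q1) / np.sin(theta)

    def interpolate_pose(t, t0, p0, q0, t1, p1, q1):
        """Pose at time t expressed via the neighboring poses at t0 and t1:
        linear interpolation for position, slerp for orientation."""
        tau = (t - t0) / (t1 - t0)
        return (1.0 - tau) * p0 + tau * p1, slerp(q0, q1, tau)

    def row_timestamp(t_img, t_d, row, n_rows, t_readout):
        """Illustrative capture time of one image row: image timestamp, plus an
        unknown camera-IMU time offset t_d, plus the rolling-shutter readout
        delay proportional to the row index."""
        return t_img + t_d + (row / n_rows) * t_readout
    ```

    Because each measurement's pose is a function of two estimated IMU poses (rather than extra per-image states or a constant-velocity assumption), the filter state stays small, which is the source of the speedup mentioned above.
    
    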
  • Videos

  • Relevant Publications
  • C1. C.X. Guo, D.G. Kottas, R.C. DuToit, A. Ahmed, R. Li, and S.I. Roumeliotis, "Efficient Visual-Inertial Navigation using a Rolling-Shutter Camera with Inaccurate Timestamps," Robotics: Science and Systems Conference (RSS), Berkeley, CA, Jul. 12-16, 2014 (pdf).