Map-based localization

  • Summary
  • The goal of this work is to create the Google Maps experience indoors; that is, to enable a person or a robot to find their position on a blueprint in real time using visual and inertial data from a mobile device. The algorithm comprises three steps:
    • 1. Map building - A map of the area of interest, consisting of 3D landmarks with associated feature descriptors (e.g., ORB or FREAK), is constructed using the cooperative mapping (CM) algorithm.
    • 2. Blueprint-map alignment - For visualization purposes, the point cloud of landmarks is aligned to the area's blueprint.
    • 3. Real-time, map-based localization - Initially, the mobile device uses the multi-state constrained Kalman filter (MSCKF) to track its 3D pose with respect to its starting point. In parallel, each acquired image is compared against those used for constructing the area's map, so as to find correspondences between features detected in the user's surroundings and those appearing in the map. The MSCKF then processes these correspondences to improve the positioning accuracy and to localize the device against the map/blueprint.
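The method used for the blueprint alignment in step 2 is not specified above; one common choice for registering a landmark point cloud to a 2D blueprint, once a handful of landmark-to-blueprint correspondences are available, is a least-squares similarity transform (Umeyama's method). The sketch below is an illustration under that assumption, not the project's actual implementation; the function name and inputs are hypothetical.

```python
import numpy as np

def align_similarity_2d(src, dst):
    """Least-squares similarity transform (scale s, rotation R,
    translation t) mapping 2D points src onto dst, via Umeyama's
    closed-form solution. src/dst: (N, 2) arrays of corresponding
    points (e.g., projected map landmarks and blueprint points).
    Hypothetical helper for illustration only."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    # Cross-covariance of the centered point sets.
    cov = dst_c.T @ src_c / len(src)
    U, S, Vt = np.linalg.svd(cov)
    # Guard against a reflection in the recovered rotation.
    d = np.sign(np.linalg.det(U @ Vt))
    D = np.diag([1.0, d])
    R = U @ D @ Vt
    var_s = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(S) @ D) / var_s
    t = mu_d - s * (R @ mu_s)
    return s, R, t
```

Applying `s * (R @ p) + t` to each landmark `p` then overlays the point cloud on the blueprint for visualization.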
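The feature-to-map matching in step 3 relies on comparing binary descriptors such as ORB or FREAK. As a minimal sketch of that comparison (not the system's actual matcher; the function name and the 0.8 ratio threshold are assumptions), brute-force Hamming matching with a ratio test to reject ambiguous matches looks like this:

```python
import numpy as np

def match_descriptors(query, mapped, ratio=0.8):
    """Brute-force Hamming matching with a ratio test.
    query, mapped: (N, 32) uint8 arrays of 256-bit binary
    descriptors (the layout used by ORB). Returns a list of
    (query_idx, map_idx) pairs whose best match is clearly
    better than the second best. Illustrative sketch only."""
    # Pairwise Hamming distances via XOR + bit count.
    x = np.bitwise_xor(query[:, None, :], mapped[None, :, :])
    dist = np.unpackbits(x, axis=2).sum(axis=2)
    matches = []
    for i, row in enumerate(dist):
        order = np.argsort(row)
        best, second = order[0], order[1]
        # Keep the match only if it is unambiguous.
        if row[best] < ratio * row[second]:
            matches.append((i, int(best)))
    return matches
```

The resulting 2D-to-3D correspondences (image feature to mapped landmark) are the measurements the MSCKF processes to localize against the map.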
  • Images

    • An image of a map produced by CM (see onionmaps.info for an interactive visualization)

    • Features found in the current image (left) corresponding to mapped features (right)

    • Screenshot of real-time localization (the lower-left inset shows the current image; the arrow denotes position and heading)
  • Videos

    • CVPR'15 demo (Boston, MA)
  • Relevant Publications