The spacecraft's 3D position and orientation, obtained by integrating IMU measurements, are updated using camera observations of ground features with known global coordinates. These coordinates are obtained from satellite orthoimagery combined with digital elevation maps. Feature extraction and matching can be performed by a number of approaches, e.g., crater detection algorithms, SIFT keys, or 2D correlation between the map and the camera image. Techniques such as RANSAC, Mahalanobis distance gating, state augmentation, and iterated extended Kalman filter updates address outlier rejection, image-processing delays, and the nonlinear measurement model.
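As a minimal sketch of one of these techniques, the snippet below shows how Mahalanobis distance gating can reject an outlier feature measurement before it corrupts a Kalman filter update. This is an illustrative linear example, not the filter used in the paper: the function names, the chi-square threshold (99th percentile for a 2D residual), and the identity measurement model are all assumptions made for the sketch.

```python
import numpy as np

def mahalanobis_gate(residual, H, P, R, chi2_threshold=9.21):
    """Return True if the measurement residual passes the chi-square gate.

    S = H P H^T + R is the innovation covariance; the squared Mahalanobis
    distance r^T S^{-1} r follows a chi-square distribution for inliers.
    """
    S = H @ P @ H.T + R
    d2 = float(residual @ np.linalg.solve(S, residual))
    return d2 <= chi2_threshold

def gated_kf_update(x, P, z, H, R, chi2_threshold=9.21):
    """Standard Kalman update, applied only if the measurement passes the gate."""
    r = z - H @ x  # innovation (linear measurement model assumed here)
    if not mahalanobis_gate(r, H, P, R, chi2_threshold):
        return x, P  # outlier: discard the measurement, keep the prior
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ r
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```

An inlier observation shifts the state toward the measurement, while a grossly inconsistent one leaves the state untouched, which is the behavior the gating step is meant to guarantee.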
Experimental Results: The algorithm has been successfully tested on actual NASA datasets, e.g., those used to validate the Descent Image Motion Estimation System (DIMES) for the Mars Exploration Rover (MER) missions, and one acquired during the Mars Science Laboratory (MSL) subsonic parachute drop test. On both datasets, our algorithm achieved final position and orientation uncertainties of less than 3 m and 0.5 deg in magnitude, respectively.
- Parachute Drop Test
Examples of matched SIFT features from this experiment are shown below (left), between an aerial image from the Parachute Drop Test and a map of the landing area, given as an 11×12 km patch of grayscale orthoimagery. The image was taken at an altitude of approximately 3.5 km above ground. The parachute trajectory is shown on the right: the green line is based on pure IMU integration (dead reckoning), and the red line shows the estimate of our proposed filter.
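To illustrate how image-to-map SIFT matches like these are typically accepted or discarded, the sketch below applies Lowe's ratio test to two descriptor sets using a brute-force nearest-neighbor search. This is a generic illustration in plain numpy, not the pipeline used in the experiment; the function name, the 0.8 ratio, and the toy descriptors are assumptions.

```python
import numpy as np

def ratio_test_match(desc_img, desc_map, ratio=0.8):
    """Match image descriptors to map descriptors with Lowe's ratio test.

    desc_img: (N, D) array of descriptors from the camera image.
    desc_map: (M, D) array of descriptors from the orthoimagery map (M >= 2).
    Returns a list of (image_index, map_index) pairs for accepted matches.
    """
    matches = []
    for i, d in enumerate(desc_img):
        # Euclidean distance from this descriptor to every map descriptor
        dists = np.linalg.norm(desc_map - d, axis=1)
        j1, j2 = np.argsort(dists)[:2]
        # accept only if the best match is clearly better than the runner-up;
        # ambiguous matches (similar best and second-best) are discarded
        if dists[j1] < ratio * dists[j2]:
            matches.append((i, j1))
    return matches
```

Surviving matches would then typically be passed through RANSAC and Mahalanobis gating, as described above, before being used in the filter update.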
- DIMES Field Test
- Simulation Results
- Relevant Publications
- C1. N. Trawny, A. I. Mourikis, S. Roumeliotis, A. E. Johnson, and J. Montgomery, "Vision-aided inertial navigation for pin-point landing using observations of mapped landmarks," Journal of Field Robotics, Special Issue on Space Robotics, vol. 24, no. 5, pp. 357–378, Apr. 2007.