Vision-Based Navigation System for Mobile Platforms

Ref-Nr:

[Image: processed image with craters detected]

Technology Abstract

NGC Aerospace is a Canadian SME specialising in the design and deployment of artificial vision and guidance, navigation, and control (GNC) systems for space vehicles. These GNC systems enable autonomous operation for satellites, landers, and rovers, and are now being used to increase autonomy for terrestrial vehicles such as UAVs. The Vision-Based Navigation system uses state estimation software to provide terrain-relative navigation, covering both absolute and relative navigation.

Technology Description

To meet the accuracy and autonomy requirements of future planetary exploration missions, NGC provides state-of-the-art image processing software for terrain-relative navigation, covering both absolute navigation (determination of absolute inertial position, velocity, and orientation) and relative navigation (determination of surface-relative motion).
 
The Vision-Based Navigation software system includes functions such as:

  • Image processing for stereo vision (see the depth-map sketch after this list)
  • Feature detection and tracking
  • Crater detection and matching for orbiter and lander applications
  • Visual odometry
  • Sensor fusion
  • State estimation
  • Sensor calibration

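To make the stereo-vision function above more concrete, the following is a minimal sketch, in Python with OpenCV, of turning a rectified left/right image pair into a per-pixel depth map by block matching. The focal length and baseline values are placeholder assumptions for illustration only and do not describe NGC's implementation or parameters.

    # Hedged sketch of the stereo-vision step: a depth map from a rectified
    # left/right image pair using OpenCV block matching. Focal length and
    # baseline are placeholder values, not NGC parameters.
    import cv2
    import numpy as np

    FOCAL_PX = 700.0     # assumed focal length in pixels
    BASELINE_M = 0.12    # assumed stereo baseline in metres

    def stereo_depth(left_gray, right_gray):
        """Return a per-pixel depth map (metres) from rectified grayscale images."""
        matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
        disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
        disparity[disparity <= 0] = np.nan          # mask invalid or unmatched pixels
        return FOCAL_PX * BASELINE_M / disparity    # depth = f * B / d

A depth map of this kind is the sort of range information a rover-mounted stereo pair can feed into obstacle detection and visual odometry.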
 
The software system processes images from cameras and data from other available sensors (IMU, wheel odometry, radar, star tracker, etc.) in real time. It is applicable to autonomous orbiter, lander, or rover platforms and is of particular relevance in GPS-denied environments.
 
Absolute navigation is realised by matching the perceived environment with an on-board map, while relative navigation is performed by comparing information from successive images. The image processing software is complemented with extended Kalman filtering algorithms to provide real-time state estimation.
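As a rough illustration of that fusion step (an illustration only, not NGC's filter design), the sketch below propagates a small position/velocity state with IMU accelerations and corrects it with vision-derived position fixes. The state vector, motion model, and noise values are assumptions chosen for brevity; a flight filter would linearise richer nonlinear attitude and dynamics models, which is where the "extended" in extended Kalman filtering comes in.

    # Minimal Kalman-filter-style sketch for fusing an IMU-propagated state
    # with vision-derived position fixes. State, models, and noise values are
    # illustrative assumptions, not NGC's actual filter design.
    import numpy as np

    class SimpleEKF:
        def __init__(self, dt):
            self.dt = dt
            self.x = np.zeros(4)            # state: [px, py, vx, vy]
            self.P = np.eye(4)              # state covariance
            self.Q = np.eye(4) * 0.01       # process noise (assumed)
            self.R = np.eye(2) * 0.5        # measurement noise (assumed)

        def predict(self, accel):
            """Propagate the state using an IMU acceleration measurement."""
            dt = self.dt
            F = np.array([[1, 0, dt, 0],
                          [0, 1, 0, dt],
                          [0, 0, 1, 0],
                          [0, 0, 0, 1]])
            self.x = F @ self.x
            self.x[2:] += accel * dt        # velocity update from acceleration
            self.P = F @ self.P @ F.T + self.Q

        def update(self, vision_pos):
            """Correct the state with a position fix from the vision system."""
            H = np.array([[1, 0, 0, 0],
                          [0, 1, 0, 0]])    # measure position only
            y = vision_pos - H @ self.x     # innovation
            S = H @ self.P @ H.T + self.R
            K = self.P @ H.T @ np.linalg.inv(S)   # Kalman gain
            self.x = self.x + K @ y
            self.P = (np.eye(4) - K @ H) @ self.P

In operation, predict() would run at the IMU rate while update() runs whenever the vision pipeline delivers a new position measurement.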
 
The absolute navigation function is designed for orbiting and descent applications. It detects features in the camera image in real time and matches them against an on-board map to provide a measurement of the orbiter's location over the viewed surface.
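As an illustration of the map-matching idea, the sketch below pairs detected crater centroids with entries of an on-board crater map by nearest-neighbour search and turns the matches into a position correction. The k-d tree, the gating distance, and the pure-translation offset are assumptions made for the example, not NGC's matching algorithm.

    # Illustrative sketch of absolute navigation by matching detected crater
    # positions against an on-board map. The nearest-neighbour matcher and
    # gating distance are assumptions for the example, not NGC's algorithm.
    import numpy as np
    from scipy.spatial import cKDTree

    def match_craters_to_map(detected_xy, map_xy, gate=0.5):
        """Return (detected, map) index pairs whose separation is within `gate`."""
        tree = cKDTree(map_xy)
        dist, idx = tree.query(detected_xy)
        return [(i, j) for i, (d, j) in enumerate(zip(dist, idx)) if d <= gate]

    def estimate_offset(detected_xy, map_xy, matches):
        """Estimate the translation between image-derived and map coordinates."""
        det = np.array([detected_xy[i] for i, _ in matches])
        ref = np.array([map_xy[j] for _, j in matches])
        return (ref - det).mean(axis=0)    # mean offset = position correction

The resulting offset is the kind of absolute position measurement that can be fed to the state-estimation filter described above.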
 
The relative navigation function is designed for descent and landing applications as well as rover applications. It can operate with a single camera (for landers) or with stereo cameras (for rovers).
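For the single-camera case, a hedged sketch of the frame-to-frame comparison follows: it tracks features between successive images with OpenCV and recovers the relative rotation and a scale-free translation from the essential matrix. The camera intrinsics are placeholder values, and monocular motion is recovered only up to an unknown scale, which a lander would typically resolve with an additional sensor such as an altimeter.

    # Minimal monocular visual-odometry sketch (OpenCV), illustrating the
    # relative-navigation idea of comparing successive images. The camera
    # intrinsics K are assumed placeholders, not NGC values.
    import cv2
    import numpy as np

    K = np.array([[700.0, 0.0, 320.0],
                  [0.0, 700.0, 240.0],
                  [0.0, 0.0, 1.0]])        # assumed pinhole intrinsics

    def relative_motion(prev_gray, curr_gray):
        """Estimate rotation R and unit-scale translation t between two frames."""
        # Detect corners in the previous frame and track them into the current one
        p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                     qualityLevel=0.01, minDistance=7)
        p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, p0, None)
        good0 = p0[status.ravel() == 1]
        good1 = p1[status.ravel() == 1]
        # Recover camera motion from the tracked feature pairs
        E, mask = cv2.findEssentialMat(good1, good0, K, method=cv2.RANSAC,
                                       prob=0.999, threshold=1.0)
        _, R, t, _ = cv2.recoverPose(E, good1, good0, K)
        return R, t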
 
The Vision-Based Navigation system is ideal for Unmanned Ground Vehicles (UGVs) in GPS-denied areas, and is designed to work effectively at a wide range of operational speeds and in varying illumination conditions.

Innovations & Advantages

  • Provides full state estimation in GPS-denied or GPS-degraded environments
  • Designed to be robust for a wide range of operational speeds and illumination conditions
  • Can be integrated over a range of different platforms and system architectures due to its modular design
  • Flight-proven system 

Current and Potential Domains of Application

Current:

  • Autonomous landing systems for planetary exploration
  • Asteroid exploration missions
  • Rover systems

 
Potential:

  • Unmanned Ground Vehicles (UGVs)