Stereo vision for autonomous ferries

Background & motivation

  • An autonomous ferry can use a variety of sensors to keep track of its surroundings. Active sensors (radar, lidar) measure the distance to objects. A passive sensor (camera) cannot measure distance on its own and therefore only provides angle measurements.
  • In a stereo configuration, however, passive sensors can measure distance (as sketched below). The full-scale autonomous ferry milliAmpere2 (mA2) is therefore expected to use optical stereo vision to a large extent as part of its sensor fusion system.
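
The sketch below illustrates the underlying principle: for a rectified stereo pair, the range to an object follows directly from the disparity between the two images. The focal length, baseline and disparity values are illustrative placeholders, not actual mA2 camera parameters.

    # Minimal sketch of stereo triangulation: depth from disparity.
    # Focal length, baseline and disparity below are illustrative
    # placeholders, not mA2 camera parameters.

    def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
        """Depth Z = f * B / d for a rectified stereo pair."""
        return focal_px * baseline_m / disparity_px

    # Example: 1500 px focal length, 1.8 m baseline and 12 px disparity
    # give a range of 225 m. Halving the disparity doubles the range,
    # which is why range accuracy degrades quadratically with distance.
    print(depth_from_disparity(12.0, 1500.0, 1.8))  # -> 225.0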

Problem description

  • Research papers from the autonomous driving community have argued that stereo vision can reach sufficient accuracy to replace lidars.
  • The student will follow up previous work on stereo vision for milliAmpere 1 and transition it to mA2.
  • The main goal is to investigate how the performance of stereo vision compares with that of lidar.

Work proposal

  • Integrate the stereo equipment on milliAmpere 2, including integration with the navigation system.
  • Verify the intrinsic calibration and perform extrinsic calibration against the navigation system (see the calibration sketch after this list).
  • Experiment: Record video of other boats with ground truth through GPS and/or lidar.
  • Compare the performance of different stereo vision algorithms against the ground truth (see the comparison sketch after this list).
  • Write report.
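
One possible way to verify the intrinsic calibration is to re-estimate it from checkerboard images and inspect the reprojection error, for example with OpenCV as sketched below. The image path, board dimensions and square size are assumptions, not values from the mA2 setup.

    # Sketch: verify intrinsic calibration by re-estimating it from
    # checkerboard images and reporting the RMS reprojection error.
    # Paths, board size and square size are placeholders.
    import glob
    import cv2
    import numpy as np

    board = (9, 6)    # inner corners of the checkerboard (assumed)
    square = 0.10     # square size in metres (assumed)

    objp = np.zeros((board[0] * board[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square

    obj_pts, img_pts = [], []
    for path in glob.glob("calib/left_*.png"):   # placeholder path
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        ok, corners = cv2.findChessboardCorners(gray, board)
        if ok:
            obj_pts.append(objp)
            img_pts.append(corners)

    rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)
    print(f"RMS reprojection error: {rms:.3f} px")  # well below 1 px normally indicates a sound calibration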
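
The comparison against ground truth could, for example, take the form sketched below: compute a disparity map with a standard semi-global block matching (SGBM) algorithm, convert the disparity inside a bounding box around the target boat into a range, and compare it with the GPS/lidar range. All file names, calibration values, the bounding box and the ground-truth range are placeholders.

    # Sketch: estimate the range to a target boat with a stereo matching
    # algorithm and compare it against a ground-truth range (e.g. from
    # GPS or lidar). Images, ROI and ground truth are placeholders.
    import cv2
    import numpy as np

    left = cv2.imread("rect_left.png", cv2.IMREAD_GRAYSCALE)    # rectified pair (placeholder)
    right = cv2.imread("rect_right.png", cv2.IMREAD_GRAYSCALE)

    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=7)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # SGBM output is fixed-point

    focal_px, baseline_m = 1500.0, 1.8      # assumed calibration values
    roi = disparity[400:450, 600:700]       # assumed bounding box around the target boat
    d = np.median(roi[roi > 0])             # robust disparity inside the box
    range_est = focal_px * baseline_m / d

    range_gt = 180.0                        # placeholder ground-truth range in metres
    print(f"stereo: {range_est:.1f} m, ground truth: {range_gt:.1f} m, "
          f"error: {range_est - range_gt:+.1f} m")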

Figure 1: Two synchronised cameras provide the distance to objects.