Vertical Landing for Micro Air Vehicles using Event-Based Optical Flow

Kirk Scheper, Guido de Croon, B.J. Pijnacker Hordijk

Research output: Contribution to journal › Special issue › Scientific › peer-review

16 Citations (Scopus)

Abstract

Small flying robots can perform landing maneuvers using bio-inspired optical flow by maintaining a constant divergence. However, optical flow is typically estimated from frame sequences recorded by standard miniature cameras. This requires processing full images onboard, which limits the update rate of divergence measurements and thus the speed of the control loop and the robot. Event-based cameras overcome these limitations by only measuring pixel-level brightness changes at microsecond temporal accuracy, hence providing an efficient mechanism for optical flow estimation. This paper presents, to the best of our knowledge, the first work integrating event-based optical flow estimation into the control loop of a flying robot. We extend an existing “local plane fitting” algorithm to obtain an improved and more computationally efficient optical flow estimation method, which is valid for a wide range of optical flow velocities. This method is validated for real event sequences. In addition, a method for estimating the divergence from event-based optical flow is introduced that accounts for the aperture problem. The developed algorithms are implemented in a constant divergence landing controller onboard a quadrotor. Experiments show that, using event-based optical flow, accurate divergence estimates can be obtained over a wide range of speeds. This enables the quadrotor to perform very fast landing maneuvers.
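The "local plane fitting" approach mentioned in the abstract treats events as samples of a spatiotemporal surface: within a small neighborhood, the event timestamps t are approximated by a plane t = a·x + b·y + c, and the optical flow is recovered by inverting the gradient of that time surface. The sketch below is an illustrative reconstruction under that general idea, not the paper's actual implementation; the function name and tolerance are hypothetical.

```python
import numpy as np

def plane_fit_flow(events, eps=1e-9):
    """Estimate local optical flow (pixels/second) from a neighborhood of
    events, each row being (x, y, t).

    Fits the plane t = a*x + b*y + c by least squares, then inverts the
    time-surface gradient: flow = (a, b) / (a^2 + b^2).
    Illustrative sketch only -- not the authors' implementation.
    """
    events = np.asarray(events, dtype=float)
    # Design matrix [x, y, 1] for the plane fit.
    xy1 = np.column_stack([events[:, 0], events[:, 1],
                           np.ones(len(events))])
    t = events[:, 2]
    (a, b, _c), *_ = np.linalg.lstsq(xy1, t, rcond=None)
    g2 = a * a + b * b
    if g2 < eps:
        # Nearly flat time surface: flow direction is undetermined
        # (the aperture problem the abstract refers to).
        return np.zeros(2)
    return np.array([a, b]) / g2

# Example: events generated by an edge sweeping in +x at 10 px/s,
# so each pixel column fires at t = x / 10.
xs, ys = np.meshgrid(np.arange(10.0), np.arange(10.0))
ev = np.column_stack([xs.ravel(), ys.ravel(), xs.ravel() / 10.0])
print(plane_fit_flow(ev))  # → approximately [10.  0.]
```

In a constant-divergence landing scheme, per-patch flow vectors like these are aggregated into a divergence estimate (the relative rate of expansion, proportional to vertical speed over height), which the controller then holds constant to produce an exponentially decaying descent.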
Original language: English
Pages (from-to): 69-90
Journal: Journal of Field Robotics
Volume: 35
Issue number: 1
DOIs
Publication status: Published - 15 Dec 2017

Keywords

  • Aerial robotics
  • Bio-inspired methods
  • Optical flow
  • Perception
  • Sensors
