Enhancing optical-flow-based control by learning visual appearance cues for flying robots

Research output: Contribution to journal › Article › Scientific › peer-review

32 Citations (Scopus)

Abstract

Flying insects employ elegant optical-flow-based strategies to solve complex tasks such as landing or obstacle avoidance. Roboticists have mimicked these strategies on flying robots with only limited success, because optical flow (1) cannot disentangle distance from velocity and (2) is less informative in the highly important flight direction. Here, we propose a solution to these fundamental shortcomings by having robots learn to estimate distances to objects from their visual appearance. The learning process obtains supervised targets from a stability-based distance estimation approach. We have successfully implemented the process on a small flying robot. For the task of landing, it results in faster, smoother landings. For the task of obstacle avoidance, it results in higher success rates at higher flight speeds. Our results yield improved robotic visual navigation capabilities and lead to a novel hypothesis on insect intelligence: behaviours that were described as optical-flow-based and hardwired actually benefit from learning processes.
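To make the two ideas in the abstract concrete, the sketch below illustrates (1) why optical-flow divergence alone entangles distance and velocity, and (2) how distance labels obtained separately (for example, from a stability-based estimation procedure) could supervise a simple appearance-to-distance regressor. This is a minimal, hypothetical Python example under assumed features, noise levels, and function names; it is not the authors' implementation.

```python
# Illustrative sketch only: all features, constants, and function names are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# --- 1. Optical-flow divergence only yields the ratio velocity / distance ---
def flow_divergence(v_z, z):
    """Divergence D = v_z / z: doubling speed and doubling distance look identical."""
    return v_z / z

assert np.isclose(flow_divergence(1.0, 2.0), flow_divergence(2.0, 4.0))

# --- 2. Hypothetical supervised targets: distances labelled by a separate
#        (e.g. stability-based) estimation procedure on earlier flights ---
n = 500
true_distance = rng.uniform(0.5, 5.0, n)                  # metres (synthetic)
appearance = np.stack([1.0 / true_distance,               # e.g. apparent object size
                       1.0 / true_distance**2,            # e.g. texture density
                       np.ones(n)], axis=1)               # bias term
appearance[:, :2] += 0.05 * rng.standard_normal((n, 2))   # sensor noise
labels = true_distance + 0.1 * rng.standard_normal(n)     # noisy supervised targets

# --- 3. Fit a least-squares regressor from appearance cues to distance ---
w, *_ = np.linalg.lstsq(appearance, labels, rcond=None)
pred = appearance @ w
print("mean absolute error [m]:", np.mean(np.abs(pred - true_distance)))
```

In such a scheme, the appearance-based distance estimate resolves the scale that divergence-only control lacks, which is the role the learned estimator plays for landing and obstacle avoidance in the article.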
Original language: English
Pages (from-to): 33-41
Number of pages: 9
Journal: Nature Machine Intelligence
Volume: 3
Issue number: 1
DOIs
Publication status: Published - 2021
