Self-Supervised Neuromorphic Perception for Autonomous Flying Robots

Research output: Thesis › Dissertation (TU Delft)


Abstract

In the ever-evolving landscape of robotics, the quest for advanced synthetic machines that seamlessly integrate with human lives and society becomes increasingly paramount. At the heart of this pursuit lies the intrinsic need for these machines to perceive, understand, and navigate their surroundings autonomously. Among the senses, vision emerges as a cornerstone of human perception, providing a wealth of information about the world we inhabit. Thus, it comes as no surprise that equipping robots with vision-based perception capabilities, or computer vision, has captivated researchers for decades. Recent breakthroughs, fueled by the advent of deep learning, have propelled computer vision to new heights. However, challenges persist in leveraging the power of deep learning, as its hunger for computational resources poses hurdles in the realm of robotics, particularly for small flying robots with their inherent limitations of payload and power consumption.

This dissertation embarks on a journey that begins at the intersection of two groundbreaking technologies with the potential to revolutionize computer vision and enhance its accessibility to small robots: event-based cameras and neuromorphic processors. These two technologies draw inspiration from the information processing mechanisms employed by biological brains. Event-based cameras output sparse events encoding pixel-level brightness changes at microsecond resolution, while neuromorphic processors leverage the power of spiking neural networks to realize a sparse and asynchronous processing pipeline.
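To make the sparse, asynchronous processing style described above concrete, the following is a minimal toy sketch of a leaky integrate-and-fire (LIF) neuron, the basic unit of the spiking neural networks the thesis builds on. The function name and parameter values are illustrative, not taken from the dissertation's actual models.

```python
def lif_step(v, input_current, leak=0.9, threshold=1.0):
    """One discrete-time update of a leaky integrate-and-fire neuron.

    Returns the new membrane potential and whether the neuron spiked.
    (Illustrative parameters, not the thesis's configuration.)
    """
    v = leak * v + input_current   # leaky integration of the input
    spike = v >= threshold         # fire when the threshold is crossed
    if spike:
        v = 0.0                    # reset the potential after a spike
    return v, spike

# Drive the neuron with a sparse, event-like input train: the neuron only
# fires when enough input accumulates before the leak dissipates it.
inputs = [0.0, 0.6, 0.0, 0.7, 0.0, 0.0, 0.5]
v, spikes = 0.0, []
for i in inputs:
    v, s = lif_step(v, i)
    spikes.append(s)
print(spikes)  # spike only on the fourth input
```

The appeal for small flying robots is that, unlike a conventional network layer, such a neuron produces output (and hence downstream computation) only when its inputs actually accumulate past threshold.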

Throughout this dissertation, comprehensive investigations have been conducted, presenting innovative solutions and advancements in the fields of computer vision and robotics. The thesis begins by presenting the winning solution of the 2019 AIRR autonomous drone racing competition, which showcases a monocular vision-based navigation approach specifically designed to address the limitations of conventional sensing and processing methods. Moreover, it explores the bridging of the gap between event-based and frame-based domains, enabling the application of conventional computer vision algorithms on event-camera data. Building upon these achievements, the thesis introduces a pioneering spiking architecture that enables the estimation of event-based optical flow, with emergent selectivity to local and global motion through unsupervised learning. Additionally, the thesis presents a framework that addresses the practicality and deployability of the models by training spiking neural networks to estimate low-latency, event-based optical flow with self-supervised learning. Finally, this dissertation culminates with a demonstration of the integration of neuromorphic computing in autonomous flight. By utilizing an event-based camera and neuromorphic processor in the control loop of a small flying robot for optical-flow-based navigation, this research showcases the practical implementation of neuromorphic systems in real-world scenarios. Overall, our studies demonstrate the benefits of incorporating neuromorphic technology into the vision-based state estimation pipeline of autonomous flying robots, paving the way for the development of more power-efficient and faster neuromorphic vision systems.
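One common way to bridge the event-based and frame-based domains, as the abstract mentions, is to accumulate events into a 2D image that conventional computer-vision algorithms can consume. The sketch below assumes a simplified event format of (x, y, polarity) tuples; the function name and representation are hypothetical and not necessarily the pipeline used in the thesis.

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate signed events (x, y, polarity) into a brightness-change frame.

    Each event adds +1 or -1 at its pixel, yielding an image-like array
    that frame-based algorithms can process. (Illustrative sketch only;
    real event streams also carry microsecond timestamps.)
    """
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, polarity in events:
        frame[y, x] += 1.0 if polarity > 0 else -1.0
    return frame

# Three toy events: two positive at pixel (x=2, y=1), one negative at (0, 0).
events = [(2, 1, +1), (2, 1, +1), (0, 0, -1)]
frame = events_to_frame(events, height=4, width=4)
print(frame[1, 2], frame[0, 0])
```

More elaborate representations (e.g. keeping per-polarity channels or weighting events by timestamp) follow the same accumulation idea.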
Original language: English
Awarding Institution
  • Delft University of Technology
Supervisors/Advisors
  • de Croon, G.C.H.E., Supervisor
  • de Wagter, C., Advisor
Award date: 17 Nov 2023
Print ISBNs: 978-94-6366-755-5
DOIs
Publication status: Published - 2023

Keywords

  • Artificial neural networks
  • Autonomous drone racing
  • Deep learning
  • Event-based cameras
  • Flying robots
  • Neuromorphic computing
  • Optical flow
  • Self-supervised learning
  • Spiking neural networks
  • Unsupervised learning
