
Data underlying the publication: All Eyes, no IMU: Learning Flight Attitude from Vision Alone

Dataset

Description

Datasets used for training, validation, and testing of the networks deployed on the drone for the publication: All Eyes, no IMU: Learning Flight Attitude from Vision Alone, as published in npj Robotics (link will be provided when it is available).

In this work, the authors demonstrate for the first time that a quadrotor can be controlled based solely on vision inputs. A recurrent network was trained to estimate both attitude and angular rate, and its estimates were used in the control loop of a flying drone.

The output of the event-based camera, together with sensor data from the flight controller and the control commands, is provided here as rosbags, .ulg files from the PX4 autopilot, .csv files from the motion-capture system, and videos (.MOV and .mp4) showing the flight performance.
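As a starting point for working with the motion-capture .csv files, the sketch below parses a quaternion attitude log and converts it to roll and pitch angles, which could then be compared against the network's attitude estimates. This is a minimal sketch: the column names (`timestamp`, `qw`, `qx`, `qy`, `qz`) and the inline sample data are assumptions for illustration, not the dataset's actual schema.

```python
import csv
import io
import math

# Hypothetical motion-capture export; the column layout is an assumption,
# not the actual schema of the dataset's .csv files.
SAMPLE = """timestamp,qw,qx,qy,qz
0.00,1.0,0.0,0.0,0.0
0.01,0.9990,0.0436,0.0,0.0
"""

def quat_to_roll_pitch(qw, qx, qy, qz):
    """Convert a unit quaternion to roll and pitch angles in radians."""
    roll = math.atan2(2.0 * (qw * qx + qy * qz),
                      1.0 - 2.0 * (qx * qx + qy * qy))
    # Clamp the asin argument to guard against numerical drift.
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (qw * qy - qz * qx))))
    return roll, pitch

# Parse the log and extract roll/pitch per timestamped row.
rows = list(csv.DictReader(io.StringIO(SAMPLE)))
angles = [quat_to_roll_pitch(*(float(r[k]) for k in ("qw", "qx", "qy", "qz")))
          for r in rows]
for r, (roll, pitch) in zip(rows, angles):
    print(f"t={r['timestamp']}: roll={math.degrees(roll):.2f} deg, "
          f"pitch={math.degrees(pitch):.2f} deg")
```

The second sample row encodes an approximately 5-degree roll, so the printed roll for that row should be close to 5.00 deg. For the actual dataset, the same conversion would be applied to whatever quaternion columns the motion-capture export uses.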
Date made available: 15 Dec 2025
Publisher: TU Delft - 4TU.ResearchData
