TY - JOUR
T1 - FAITH: Fast Iterative Half-Plane Focus of Expansion Estimation Using Optic Flow
AU - Dinaux, Raoul
AU - Wessendorp, Nikhil
AU - Dupeyroux, Julien
AU - de Croon, Guido C.H.E.
PY - 2021
Y1 - 2021
N2 - Course estimation is a key component in the development of autonomous navigation systems for robots. While state-of-the-art methods widely use vision-based algorithms, most fail to cope with the complexity of the real world. They often require obstacles to be highly textured to perform well, particularly when the obstacle lies within the focus of expansion (FOE), where the optic flow (OF) is almost null. This study proposes the FAst ITerative Half-plane (FAITH) method to determine the course of a micro air vehicle (MAV). This is achieved by means of an event-based camera, along with a fast RANSAC-based algorithm that uses event-based OF to determine the FOE. The performance is validated through a benchmark in a simulated environment and then tested on a dataset collected for indoor obstacle avoidance. Our results show that the computational efficiency outperforms state-of-the-art methods while maintaining a high level of accuracy. This has been further demonstrated onboard an MAV equipped with an event-based camera, showing that the FAITH algorithm can run online and is suitable for autonomous obstacle avoidance and navigation onboard MAVs.
KW - Aerial systems: perception and autonomy
KW - vision-based navigation
UR - http://www.scopus.com/inward/record.url?scp=85112642623&partnerID=8YFLogxK
U2 - 10.1109/LRA.2021.3100153
DO - 10.1109/LRA.2021.3100153
M3 - Article
AN - SCOPUS:85112642623
VL - 6
SP - 7627
EP - 7634
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
SN - 2377-3766
IS - 4
M1 - 9497701
ER -