A large number of traffic accidents, especially those involving vulnerable road users such as pedestrians and cyclists, are caused by blind spots for the driver, for example when a vehicle turns a corner with poor visibility or when a pedestrian crosses from behind a parked vehicle. In these accidents, the consequences for the vulnerable road users are often severe. Autonomous cars have the potential to drastically reduce traffic accidents thanks to high-performance sensing and reasoning. However, their perception capabilities are still limited to the field of view of their sensors. We propose to extend the perception capabilities of a vehicle, autonomous or human-driven, with a small Unmanned Aerial Vehicle (UAV) capable of taking off from the car, flying around corners to gather additional data from blind spots, and landing back on the car after a mission. We present a holistic framework to detect blind spots in the map built by the car, plan an informative path for the drone, and detect potential threats occluded from the car. We have tested our approach with an autonomous car equipped with a drone.
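The first step of the framework, detecting blind spots in the car's map, can be illustrated with a minimal sketch. The abstract does not specify the map representation or detection method; the occupancy grid, the ray-casting approach, and all names below (`blind_spot_cells`, `n_rays`, `max_range`) are assumptions made purely for illustration, not the authors' actual algorithm.

```python
import numpy as np

def blind_spot_cells(grid, car_rc, n_rays=360, max_range=50):
    """Flag cells in an occupancy grid that are occluded from the car.

    Hypothetical sketch: cast straight rays from the car's cell; any
    cell no ray reaches before hitting an obstacle is marked as a
    candidate blind spot for the UAV to inspect.

    grid:   2D array, 1 = obstacle, 0 = free space.
    car_rc: (row, col) index of the car in the grid.
    Returns a boolean mask, True where the cell is unobserved.
    """
    rows, cols = grid.shape
    seen = np.zeros_like(grid, dtype=bool)
    r0, c0 = car_rc
    for theta in np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False):
        dr, dc = np.sin(theta), np.cos(theta)
        for step in range(max_range):
            r = int(round(r0 + dr * step))
            c = int(round(c0 + dc * step))
            if not (0 <= r < rows and 0 <= c < cols):
                break  # ray left the map
            seen[r, c] = True
            if grid[r, c] == 1:
                break  # ray blocked by an obstacle
    return ~seen  # unobserved cells are potential blind spots
```

In this toy model, the cells behind a wall relative to the car come out as blind spots, which would then seed the drone's informative path planner.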
|Name|Springer Proceedings in Advanced Robotics (SPAR)|
|Conference|FSR 2017: 11th International Conference on Field and Service Robotics|
|Period|12/09/17 → 15/09/17|