A Framework for Fast Prototyping of Photo-realistic Environments with Multiple Pedestrians

S. Casao, Andrés Otero, A. Serra Gomez, Ana C. Murillo, J. Alonso Mora, Eduardo Montijano

Research output: Conference contribution (chapter in book/conference proceedings), scientific, peer-reviewed


Robotic applications involving people often require advanced perception systems to better understand complex real-world scenarios. To address this challenge, photo-realistic and physics simulators are gaining popularity as a means of generating accurately labeled data and of designing scenarios for evaluating generalization capabilities, e.g., lighting changes, camera movements, or different weather conditions. We develop a photo-realistic framework built on Unreal Engine and AirSim to easily generate scenarios with pedestrians and mobile robots. The framework is capable of generating random and customized trajectories for each person and provides up to 50 ready-to-use people models along with an API for retrieving their metadata. We demonstrate the usefulness of the proposed framework with a use case of multi-target tracking, a popular problem in real pedestrian scenarios. We present and evaluate the notable feature variability in the resulting perception data.
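The abstract mentions that the framework can generate random trajectories for each pedestrian. The record itself includes no code, so the following is only a minimal Python sketch of the random-waypoint idea under stated assumptions: the function and pedestrian names are hypothetical and do not reflect the framework's actual API, and trajectories are taken to be sequences of 2D waypoints sampled uniformly inside a rectangular scene.

```python
import random

def random_trajectory(bounds, n_waypoints, seed=None):
    """Sample a random 2D waypoint trajectory inside rectangular bounds.

    bounds: (x_min, x_max, y_min, y_max), e.g. in metres.
    Illustrative only; not the framework's real trajectory generator.
    """
    rng = random.Random(seed)
    x_min, x_max, y_min, y_max = bounds
    return [(rng.uniform(x_min, x_max), rng.uniform(y_min, y_max))
            for _ in range(n_waypoints)]

# Assign one random trajectory per pedestrian model (names hypothetical).
pedestrians = {
    f"person_{i}": random_trajectory((-10.0, 10.0, -10.0, 10.0),
                                     n_waypoints=5, seed=i)
    for i in range(3)
}
```

A customized trajectory would simply replace the sampled waypoint list with a user-supplied one, keeping the same per-pedestrian mapping.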
Original language: English
Title of host publication: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2023)
ISBN (Print): 979-8-3503-2365-8
Publication status: Published - 2023
Event: ICRA 2023: International Conference on Robotics and Automation - London, United Kingdom
Duration: 29 May 2023 - 2 Jun 2023



Bibliographical note

Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' - Taverne project https://www.openaccess.nl/en/you-share-we-take-care
Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make this work public.
