Driver and Pedestrian Mutual Awareness for Path Prediction and Collision Risk Estimation

Research output: Contribution to journal › Article › Scientific › peer-review


We present a novel method for vehicle-pedestrian path prediction that takes into account the awareness of the driver and the pedestrian towards each other. The method jointly models the paths of the vehicle and the pedestrian within a single Dynamic Bayesian Network (DBN). In this DBN, sub-graphs model the environment and entity-specific context cues of the vehicle and pedestrian (including awareness), which affect their future motion and enable a longer prediction horizon. These sub-graphs share a latent state which models whether vehicle and pedestrian are on a collision course; this accounts for a certain degree of motion coupling. The method was validated with real-world data obtained by onboard vehicle sensing (stereo vision, GNSS, and proprioceptive sensors). The data consist of 93 vehicle-pedestrian encounters, spanning various awareness conditions and dynamic characteristics of the participants. In ablation studies, we quantify the benefits of various components of our proposed DBN model for path prediction and collision risk estimation. Results show that at a prediction horizon of 1.5 s, context-aware models outperform context-agnostic models in path prediction for scenarios with a dynamics change, while performing similarly otherwise. Results further indicate that driver-attention-aware models improve collision risk estimation compared to driver-agnostic models.
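To make the coupling idea concrete, the following is a minimal sketch, not the paper's actual DBN: it approximates the latent collision-course state with a soft indicator over constant-velocity extrapolations, and lets a pedestrian-awareness cue switch the pedestrian's predicted dynamics between "continue" and "stop". All function names, the sigmoid gating, and the numeric thresholds are illustrative assumptions.

```python
import numpy as np

def collision_course_prob(veh_pos, veh_vel, ped_pos, ped_vel,
                          horizon=1.5, gap=2.0, n_steps=16):
    """Soft stand-in for the latent collision-course state: probability that
    constant-velocity extrapolations of vehicle and pedestrian come within
    `gap` meters during the prediction horizon (illustrative, not the DBN)."""
    t = np.linspace(0.0, horizon, n_steps)[:, None]
    veh = np.asarray(veh_pos)[None] + t * np.asarray(veh_vel)[None]
    ped = np.asarray(ped_pos)[None] + t * np.asarray(ped_vel)[None]
    d_min = np.min(np.linalg.norm(veh - ped, axis=1))
    # Sigmoid of the gap margin: d_min << gap -> prob near 1.
    return 1.0 / (1.0 + np.exp(4.0 * (d_min - gap)))

def predict_pedestrian(ped_pos, ped_vel, p_coll, ped_aware, horizon=1.5):
    """Mixture of two dynamics hypotheses: 'continue' (constant velocity)
    vs. 'stop' (pedestrian yields). An aware pedestrian on a collision
    course is assumed to stop -- a toy version of context-dependent motion."""
    p_stop = p_coll * ped_aware          # awareness cue gates the stop mode
    cont = np.asarray(ped_pos) + horizon * np.asarray(ped_vel)
    stop = np.asarray(ped_pos)           # stays in place
    return p_stop * stop + (1.0 - p_stop) * cont
```

For example, a vehicle at (0, 0) driving at 10 m/s toward a pedestrian crossing from (15, -3) at 1.5 m/s yields a high collision-course probability; the aware-pedestrian prediction then stays near the current position, while the unaware one continues into the vehicle's path. This mirrors the abstract's finding that context-aware models help mainly in scenarios with a dynamics change.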

Original language: English
Number of pages: 12
Journal: IEEE Transactions on Intelligent Vehicles
Publication status: Accepted/In press - 2022


Keywords

  • Collision Risk Estimation
  • Context modeling
  • Driver Awareness
  • Dynamics
  • Estimation
  • Intelligent vehicles
  • Path Prediction
  • Pedestrian Awareness
  • Predictive models
  • Roads
  • Vehicle dynamics


