Human detection from a mobile robot using fusion of laser and vision information

Efstathios P. Fotiadis, Mario Garzón, Antonio Barrientos

Research output: Contribution to journal › Article › Scientific › peer-review

29 Citations (Scopus)


This paper presents a human detection system that can be employed on board a mobile platform for use in autonomous surveillance of large outdoor infrastructures. The prediction is based on the fusion of two detection modules, one for the laser and another for the vision data. In the laser module, a novel feature set is proposed that better encapsulates variations due to noise, distance and human pose. This enhances the generalization of the system while at the same time increasing its outdoor performance in comparison with current methods. The vision module uses the combination of the histogram of oriented gradients (HOG) descriptor and a linear support vector machine (SVM) classifier. Current approaches use a fixed-size projection to define regions of interest on the image data, using the range information from the laser range finder. When applied to small-size unmanned ground vehicles, these techniques suffer from misalignment due to platform vibrations and terrain irregularities. This is effectively addressed in this work by a novel adaptive projection technique, which is based on a probabilistic formulation of the classifier performance. Finally, a probability calibration step is introduced in order to optimally fuse the information from both modules. Experiments in real-world environments demonstrate the robustness of the proposed method.
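The calibration-and-fusion step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the sigmoid parameters of the Platt-style calibration are placeholders (in practice they are fitted on validation scores), and the product-of-odds fusion rule is an assumed stand-in for the paper's fusion method.

```python
import math

def platt_calibrate(score, a=-1.5, b=0.0):
    """Map a raw classifier margin to a probability via Platt scaling.

    The sigmoid parameters (a, b) are normally fitted on held-out
    validation scores; the values used here are illustrative only.
    """
    return 1.0 / (1.0 + math.exp(a * score + b))

def fuse(p_laser, p_vision, eps=1e-9):
    """Fuse two calibrated detection probabilities.

    Independent-opinion-pool rule (product of odds); an assumed
    stand-in for the paper's fusion step.
    """
    num = p_laser * p_vision
    den = num + (1.0 - p_laser) * (1.0 - p_vision)
    return num / max(den, eps)

# A confident vision detection reinforces a weaker laser one:
p_fused = fuse(platt_calibrate(2.0), platt_calibrate(0.5))
```

Note that under this rule a probability of 0.5 from one module is neutral: the fused result then equals the other module's probability, so an uninformative sensor neither strengthens nor weakens the detection.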

Original language: English
Pages (from-to): 11603-11635
Number of pages: 33
Journal: Sensors (Switzerland)
Issue number: 9
Publication status: Published - 1 Jan 2013
Externally published: Yes


  • Human detection
  • Laser range finder
  • Monocular vision
  • Outdoor surveillance
  • Sensor fusion
  • Unmanned ground vehicle


