Automated Surveillance Systems of Smart Cameras in Trains

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

Abstract

In recent years we have observed an enormous growth of surveillance systems in the public domain. In many countries surveillance systems are under discussion because of people's privacy concerns. In the nineties, many multimodal systems were developed for surveillance in Dutch trains. The use of camera systems was complicated by poor lighting conditions, occlusion and posture. Only recently has it become possible to use smart cameras installed in the public domain to localize and track faces and the facial expressions they display. Most systems are based on the Active Appearance Model. In this paper we present a working version of a surveillance system, called Image Processing System, developed in the vision lab of TU Delft. In an experiment we recorded the behavior of stand-up comedians acting out several scenarios in trains. The facial expressions in the resulting video footage were analyzed.

Original language: English
Title of host publication: Computer Systems and Technologies
Subtitle of host publication: 21st International Conference, CompSysTech 2020 - Proceedings
Editors: Tzvetomir Vassilev, Roumen Trifonov
Place of Publication: New York
Publisher: Association for Computing Machinery (ACM)
Pages: 116-121
Number of pages: 6
ISBN (Print): 978-1-4503-7768-3
DOIs
Publication status: Published - 2020
Event: 21st International Conference on Computer Systems and Technologies, CompSysTech 2020 - Ruse, Online, Bulgaria
Duration: 19 Jun 2020 - 20 Jun 2020

Conference

Conference: 21st International Conference on Computer Systems and Technologies, CompSysTech 2020
Country: Bulgaria
City: Ruse, Online
Period: 19/06/20 - 20/06/20

Keywords

  • Active Appearance model
  • Emotion Recognition
  • Face recognition
  • Surveillance Systems
