Temporal Attention-Gated Model for Robust Sequence Classification

Wenjie Pei, Tadas Baltrusaitis, David Tax, Louis-Philippe Morency

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

42 Citations (Scopus)


Typical techniques for sequence classification are designed for well-segmented sequences that have been edited to remove noisy or irrelevant parts. Such methods therefore cannot be easily applied to the noisy sequences expected in real-world applications. In this paper, we present the Temporal Attention-Gated Model (TAGM), which integrates ideas from attention models and gated recurrent networks to better deal with noisy or unsegmented sequences. Specifically, we extend the concept of the attention model to measure the relevance of each observation (time step) of a sequence. We then use a novel gated recurrent network to learn the hidden representation for the final prediction. An important advantage of our approach is interpretability, since the temporal attention weights provide a meaningful value for the salience of each time step in the sequence. We demonstrate the merits of our TAGM approach, both for prediction accuracy and interpretability, on three different tasks: spoken digit recognition, text-based sentiment analysis, and visual event recognition.
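The core mechanism the abstract describes can be sketched as a recurrent update in which a per-time-step attention weight gates how much a new observation changes the hidden state. The following is a minimal illustrative sketch of that gating idea in NumPy, not the authors' exact formulation; the function name `tagm_step` and the specific candidate-state parameterization are assumptions for illustration.

```python
import numpy as np

def tagm_step(h_prev, x_t, a_t, W, U, b):
    """One attention-gated recurrent update (illustrative sketch).

    a_t in [0, 1] is the temporal attention weight for this time step:
    a_t near 0 (an irrelevant/noisy frame) leaves the hidden state almost
    unchanged; a_t near 1 (a salient frame) lets the candidate state
    largely replace it. The candidate parameterization here is a generic
    tanh RNN cell, assumed for illustration.
    """
    candidate = np.tanh(W @ x_t + U @ h_prev + b)
    return (1.0 - a_t) * h_prev + a_t * candidate

rng = np.random.default_rng(0)
d_in, d_h = 4, 3
W = rng.standard_normal((d_h, d_in)) * 0.1
U = rng.standard_normal((d_h, d_h)) * 0.1
b = np.zeros(d_h)
h0 = np.zeros(d_h)

# A low attention weight (noisy observation) barely moves the state;
# a high attention weight (salient observation) moves it substantially.
x_t = rng.standard_normal(d_in)
delta_noisy = np.linalg.norm(tagm_step(h0, x_t, 0.01, W, U, b) - h0)
delta_salient = np.linalg.norm(tagm_step(h0, x_t, 0.95, W, U, b) - h0)
print(delta_noisy < delta_salient)
```

This also shows where the interpretability claim comes from: the scalar `a_t` is directly readable as the salience assigned to each time step.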
Original language: English
Title of host publication: 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Editors: L. O'Conner
Place of publication: Piscataway
Number of pages: 10
ISBN (Electronic): 978-1-5386-0457-1
ISBN (Print): 978-1-5386-0458-8
Publication status: Published - 2017
Event: 30th IEEE Conference on Computer Vision and Pattern Recognition - Honolulu, United States
Duration: 21 Jul 2017 - 26 Jul 2017


Conference: 30th IEEE Conference on Computer Vision and Pattern Recognition
Abbreviated title: CVPR 2017
Country: United States


Keywords:
  • Hidden Markov models
  • Logic gates
  • Noise measurement
  • Mathematical model
  • Computational modeling
  • Data models


