A Support Tensor Train Machine

Cong Chen, Kim Batselier, Ching Yun Ko, Ngai Wong

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

15 Citations (Scopus)
89 Downloads (Pure)

Abstract

There has been growing interest in extending traditional vector-based machine learning techniques to their tensor forms. The support tensor machine (STM) and the support Tucker machine (STuM) are two typical tensor generalizations of the conventional support vector machine (SVM). However, the expressive power of STM is restricted by its rank-one tensor constraint, and STuM is not scalable because of its exponentially sized Tucker core tensor. To overcome these limitations, we introduce a novel and effective support tensor train machine (STTM) that employs a general and scalable tensor train as the parameter model. Experiments confirm the superiority of STTM over SVM, STM and STuM.
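In all three models the classifier score typically takes the form f(X) = <W, X> + b, where the weight tensor W is constrained to rank one in STM, to Tucker format in STuM, and to tensor train (TT) format in STTM. The following minimal sketch (an illustration under assumed shapes and TT ranks, not the authors' implementation) shows how a TT-format weight tensor can be stored as small third-order cores and contracted with an input tensor, which is what keeps the parameter count low.

import numpy as np

def tt_random_cores(dims, ranks, rng):
    # TT cores G_k of shape (r_{k-1}, n_k, r_k); boundary ranks must equal 1.
    assert len(ranks) == len(dims) + 1 and ranks[0] == ranks[-1] == 1
    return [rng.standard_normal((ranks[k], dims[k], ranks[k + 1]))
            for k in range(len(dims))]

def tt_inner_product(cores, x):
    # <W, X> with W given by its TT cores and X a full ndarray of matching shape.
    t = x[None, ...]                    # prepend the dummy rank index r_0 = 1
    for core in cores:
        # contract away the current rank index and the current tensor mode
        t = np.einsum('anb,an...->b...', core, t)
    return float(t.squeeze())           # final rank r_d = 1 leaves a scalar

def decision_value(cores, bias, x):
    # SVM-style score f(X) = <W, X> + b; classify by its sign.
    return tt_inner_product(cores, x) + bias

# Example (assumed shapes): a 32x32 image reshaped into a 4x8x8x4 tensor, TT ranks 3.
rng = np.random.default_rng(0)
dims, ranks = [4, 8, 8, 4], [1, 3, 3, 3, 1]
cores = tt_random_cores(dims, ranks, rng)
x = rng.standard_normal(dims)
label = np.sign(decision_value(cores, bias=0.0, x=x))
# Storage: sum_k r_{k-1}*n_k*r_k = 12 + 72 + 72 + 12 = 168 parameters,
# versus 4*8*8*4 = 1024 entries for the full weight tensor of the same size.

The same contraction pattern would be evaluated repeatedly by any STTM-style training routine; the TT ranks trade expressive power against parameter count, which is the scalability argument made in the abstract.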

Original language: English
Title of host publication: Proceedings of the 2019 International Joint Conference on Neural Networks (IJCNN 2019)
Place of Publication: Piscataway, NJ, USA
Publisher: IEEE
Number of pages: 8
ISBN (Electronic): 978-1-7281-1985-4
ISBN (Print): 978-1-7281-2009-6
DOIs
Publication status: Published - 2019

Bibliographical note

Green Open Access added to TU Delft Institutional Repository ‘You share, we take care!’ – Taverne project https://www.openaccess.nl/en/you-share-we-take-care

Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make this work public.

Keywords

  • classification
  • support vector machine
  • tensor train

