Optimistic semi-supervised least squares classification

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

4 Citations (Scopus)

Abstract

The goal of semi-supervised learning is to improve supervised classifiers by using additional unlabeled training examples. In this work we study a simple self-learning approach to semi-supervised learning applied to the least squares classifier. We show that a soft-label and a hard-label variant of self-learning can be derived by applying block coordinate descent to two related but slightly different objective functions. The resulting soft-label approach is related to an idea about dealing with missing data that dates back to the 1930s. We show that the soft-label variant typically outperforms the hard-label variant on benchmark datasets and partially explain this behaviour by studying the relative difficulty of finding good local minima for the corresponding objective functions.
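The soft-label self-learning idea described in the abstract can be sketched as alternating block coordinate descent: refit the least squares classifier on labeled plus currently imputed data, then re-impute the unlabeled targets, constrained to the interval spanned by the label encoding. The sketch below is illustrative only; the data, names, and update details are assumptions and follow the paper's setup only loosely (a hard-label variant would threshold the predictions at 0.5 instead of clipping).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (illustrative, not from the paper): two Gaussian classes in 2D,
# targets encoded as 0/1; a larger pool of unlabeled points.
X_lab = np.vstack([rng.normal(-1, 1, (20, 2)), rng.normal(1, 1, (20, 2))])
y_lab = np.concatenate([np.zeros(20), np.ones(20)])
X_unl = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])

def fit_ls(X, y):
    """Ordinary least squares with an intercept term."""
    Xa = np.hstack([np.ones((len(X), 1)), X])
    w, *_ = np.linalg.lstsq(Xa, y, rcond=None)
    return w

def predict(w, X):
    Xa = np.hstack([np.ones((len(X), 1)), X])
    return Xa @ w

# Initialize the soft labels from the purely supervised solution.
w = fit_ls(X_lab, y_lab)
y_soft = np.clip(predict(w, X_unl), 0.0, 1.0)

for _ in range(50):
    # Block 1: refit the classifier on labeled + soft-labeled examples.
    w = fit_ls(np.vstack([X_lab, X_unl]),
               np.concatenate([y_lab, y_soft]))
    # Block 2: update the soft labels; clipping keeps them in [0, 1].
    y_new = np.clip(predict(w, X_unl), 0.0, 1.0)
    if np.max(np.abs(y_new - y_soft)) < 1e-8:
        break  # labels have converged
    y_soft = y_new
```

Each block minimizes the shared objective over one set of variables with the other held fixed, which is what makes the procedure a block coordinate descent rather than an ad hoc retraining loop.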
Original language: English
Title of host publication: 2016 23rd International Conference on Pattern Recognition (ICPR)
Publisher: IEEE
Pages: 1677-1682
Number of pages: 6
ISBN (Electronic): 978-1-5090-4847-2
ISBN (Print): 978-1-5090-4848-9
DOIs
Publication status: Published - 2016
Event: ICPR 2016: 23rd International Conference on Pattern Recognition - Cancún, Mexico
Duration: 4 Dec 2016 - 8 Dec 2016
Conference number: 23

Conference

Conference: ICPR 2016
Country: Mexico
City: Cancún
Period: 4/12/16 - 8/12/16

Keywords

  • Linear programming
  • Semisupervised learning
  • Labeling
  • Training
  • Encoding
  • Optimization
  • Convergence
