The Pessimistic Limits and Possibilities of Margin-based Losses in Semi-supervised Learning

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review


Abstract

Consider a classification problem in which both labeled and unlabeled data are available. We show that for linear classifiers defined by convex margin-based surrogate losses that are decreasing, it is impossible to construct any semi-supervised approach that is guaranteed to improve over the supervised classifier, as measured by this surrogate loss on the labeled and unlabeled data. For convex margin-based loss functions that also increase, we demonstrate that safe improvements are possible.
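The distinction the abstract draws is between surrogate losses that only decrease in the margin and those that increase again for large margins. A minimal sketch of that difference (the function names and sample margins below are illustrative, not from the paper): the hinge loss is convex and decreasing, so it assigns zero loss to every margin beyond 1, while the quadratic surrogate rises again for over-confident predictions.

```python
import numpy as np

# Margin m = y * f(x) for label y in {-1, +1} and linear decision value f(x).
# A margin-based surrogate loss is a function of the margin alone.

def hinge(m):
    # Convex and monotonically decreasing: flat at zero for m >= 1.
    return np.maximum(0.0, 1.0 - m)

def quadratic(m):
    # Convex but not decreasing: increases again for m > 1,
    # penalizing over-confident predictions.
    return (1.0 - m) ** 2

margins = np.array([-1.0, 0.0, 1.0, 2.0])
print(hinge(margins))      # [2. 1. 0. 0.]
print(quadratic(margins))  # [4. 1. 0. 1.]
```

It is this extra curvature on the large-margin side that gives a semi-supervised learner something to gain on unlabeled points, which is the setting in which the paper shows safe improvements are possible.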
Original language: English
Title of host publication: NIPS'18
Subtitle of host publication: Proceedings of the 32nd International Conference on Neural Information Processing Systems
Editors: S. Bengio, H. M. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi
Publisher: Curran Associates, Inc.
Pages: 1793-1802
Number of pages: 10
Publication status: Published - 2018
Event: NIPS 2018: 32nd Conference on Neural Information Processing Systems - Montréal, Canada
Duration: 3 Dec 2018 - 8 Dec 2018
Conference number: 32


