Minimizers of the empirical risk and risk monotonicity

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

Abstract

Plotting a learner’s average performance against the number of training samples results in a learning curve. Studying such curves on one or more data sets is a way to gain a better understanding of the generalization properties of this learner. Learning curves are, however, not well understood and can display behavior that most researchers would find quite unexpected. Our work introduces the formal notion of risk monotonicity, which requires that the risk, in expectation over the training samples, does not deteriorate as the training set grows. We then present the surprising result that various standard learners, specifically those that minimize the empirical risk, can behave nonmonotonically irrespective of the training sample size. We provide a theoretical underpinning for specific instantiations from classification, regression, and density estimation. Altogether, the proposed monotonicity notion opens up a whole new direction of research.
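
The central definition can be made precise. As one plausible formalization consistent with the abstract's wording (the paper's exact statement may differ in its details), a learner that maps a training sample $S_n$ of size $n$ to a hypothesis $h_{S_n}$ is risk monotonic when, for every $n$,

$$\mathbb{E}_{S_{n+1}}\!\left[R\!\left(h_{S_{n+1}}\right)\right] \;\le\; \mathbb{E}_{S_n}\!\left[R\!\left(h_{S_n}\right)\right],$$

where $R$ denotes the true (expected) risk and the expectation is over the draw of the training sample. Nonmonotonicity means this inequality fails for some $n$.

The sketch below is one illustrative way to observe such nonmonotonic behavior empirically; it is not the paper's construction, and all names and parameter values are assumptions made for the example. It estimates the learning curve of minimum-norm least squares, a standard empirical risk minimizer whose expected risk is known to rise as the training size approaches the input dimension (the "peaking" phenomenon).

```python
# Minimal sketch (not the paper's construction): Monte Carlo estimate of the
# learning curve for minimum-norm least squares, an empirical risk minimizer
# whose expected test risk typically rises near n = d before falling again.
import numpy as np

rng = np.random.default_rng(0)
d = 20                       # input dimension (illustrative choice)
sigma = 0.5                  # label noise level (illustrative choice)
w_true = rng.standard_normal(d) / np.sqrt(d)

n_test = 2000
X_test = rng.standard_normal((n_test, d))
y_test = X_test @ w_true + sigma * rng.standard_normal(n_test)

def expected_risk(n, trials=200):
    """Average test risk of the ERM solution over many training samples of size n."""
    risks = []
    for _ in range(trials):
        X = rng.standard_normal((n, d))
        y = X @ w_true + sigma * rng.standard_normal(n)
        # lstsq returns the minimum-norm least-squares solution,
        # which is the ERM solution (min-norm when underdetermined).
        w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
        risks.append(np.mean((X_test @ w_hat - y_test) ** 2))
    return float(np.mean(risks))

for n in range(5, 41, 5):
    print(f"n = {n:2d}  estimated risk = {expected_risk(n):.3f}")
# The estimated risk typically *increases* as n approaches d = 20 and then
# decreases again, i.e. this learner is not risk monotonic.
```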
Original language: English
Title of host publication: Neural Information Processing Systems
Number of pages: 11
Publication status: Published - 2019
Event: 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019 - Vancouver, Canada
Duration: 8 Dec 2019 - 14 Dec 2019

Publication series

Name: Advances in Neural Information Processing Systems

Conference

Conference: 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019
Country/Territory: Canada
City: Vancouver
Period: 8/12/19 - 14/12/19
