Adaptive learning with robust generalization guarantees

Rachel Cummings, Katrina Ligett, Kobbi Nissim, Aaron Roth, Zhiwei Steven Wu

Research output: Contribution to journal › Conference article › peer-review

27 Scopus citations

Abstract

The traditional notion of generalization (i.e., learning a hypothesis whose empirical error is close to its true error) is surprisingly brittle. As has recently been noted (Dwork et al., 2015c), even if several algorithms have this guarantee in isolation, the guarantee need not hold if the algorithms are composed adaptively. In this paper, we study three notions of generalization, increasing in strength, that are robust to postprocessing and amenable to adaptive composition, and examine the relationships between them. We call the weakest such notion Robust Generalization. A second, intermediate, notion is the stability guarantee known as differential privacy. The strongest guarantee we consider we call Perfect Generalization. We prove that every hypothesis class that is PAC learnable is also PAC learnable in a robustly generalizing fashion, with almost the same sample complexity. It was previously known that differentially private algorithms satisfy robust generalization. In this paper, we show that robust generalization is a strictly weaker concept, and that there is a learning task that can be carried out subject to robust generalization guarantees, yet cannot be carried out subject to differential privacy. We also show that perfect generalization is a strictly stronger guarantee than differential privacy, but that, nevertheless, many learning tasks can be carried out subject to the guarantees of perfect generalization.
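As background for the abstract, the following LaTeX sketch records the standard generalization condition and the standard (ε, δ)-differential privacy definition that the abstract refers to. The symbols M, S, S', h, and the error notation are conventional choices assumed here; the paper's own formalizations of Robust Generalization and Perfect Generalization are not reproduced.

% Background definitions (standard conventions, not the paper's exact notation).
% Generalization: the empirical error of a learned hypothesis h on sample S
% is close to its true error on the underlying distribution D:
\[
  \bigl| \mathrm{err}_S(h) - \mathrm{err}_D(h) \bigr| \le \varepsilon .
\]
% (epsilon, delta)-differential privacy: a randomized algorithm M is
% differentially private if, for all datasets S, S' differing in one record
% and all events T over the output space,
\[
  \Pr[M(S) \in T] \le e^{\varepsilon} \, \Pr[M(S') \in T] + \delta .
\]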

Original language: English (US)
Pages (from-to): 772-814
Number of pages: 43
Journal: Proceedings of Machine Learning Research
Volume: 49
Issue number: June
State: Published - Jun 6 2016
Externally published: Yes
Event: 29th Conference on Learning Theory, COLT 2016 - New York, United States
Duration: Jun 23 2016 - Jun 26 2016

Bibliographical note

Publisher Copyright:
© 2016 R. Cummings, K. Ligett, K. Nissim, A. Roth & Z.S. Wu.

Keywords

  • Adaptive learning
  • Composition
  • Compression schemes
  • Generalizations
