
PAC-Bayesian Generic Chaining

2004

Conference Paper


There exist many different generalization error bounds for classification, and each improves on the others in certain situations. Our goal is to combine these different improvements into a single bound. In particular, we combine the PAC-Bayes approach introduced by McAllester, which is well suited to averaging classifiers, with the optimal union bound provided by the generic chaining technique developed by Fernique and Talagrand. This combination is quite natural, since generic chaining is based on the notion of majorizing measures, which can be viewed as priors on the set of classifiers, and such priors also arise in the PAC-Bayesian setting.
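For context, a hedged sketch of the classical PAC-Bayesian bound in the style of McAllester, which this work refines via generic chaining; the exact constants and the form of the bound proved in the paper differ. Here $\pi$ is a prior and $\rho$ a posterior over classifiers, $R(h)$ and $r(h)$ denote the true and empirical errors of classifier $h$ on $n$ i.i.d. samples, and $\delta \in (0,1)$ is a confidence level. With probability at least $1-\delta$, simultaneously for all posteriors $\rho$:

% Classical PAC-Bayes bound (McAllester-style); constants are indicative only
\[
\mathbb{E}_{h \sim \rho}\!\left[R(h)\right]
\;\le\;
\mathbb{E}_{h \sim \rho}\!\left[r(h)\right]
+ \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\frac{2\sqrt{n}}{\delta}}{2n}}.
\]

The KL divergence term plays the role of a complexity measure relative to the prior; the paper's contribution is to replace the crude union bound implicit in such results with the tighter chaining argument based on majorizing measures.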

Author(s): Audibert, J-Y. and Bousquet, O.
Book Title: Advances in Neural Information Processing Systems 16
Pages: 1125-1132
Year: 2004
Month: June
Editors: Thrun, S., Saul, L. K., Schölkopf, B.
Publisher: MIT Press

Department(s): Empirical Inference
Bibtex Type: Conference Paper (inproceedings)

Event Name: Seventeenth Annual Conference on Neural Information Processing Systems (NIPS 2003)
Event Place: Vancouver, BC, Canada

Address: Cambridge, MA, USA
ISBN: 0-262-20152-6
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik


BibTex

@inproceedings{2341,
  title = {PAC-Bayesian Generic Chaining},
  author = {Audibert, J-Y. and Bousquet, O.},
  booktitle = {Advances in Neural Information Processing Systems 16},
  pages = {1125--1132},
  editor = {Thrun, S. and Saul, L. K. and Sch{\"o}lkopf, B.},
  publisher = {MIT Press},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  address = {Cambridge, MA, USA},
  month = jun,
  year = {2004}
}