Regularizing AdaBoost

Gunnar Rätsch, Takashi Onoda, Klaus R. Müller

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

20 Citations (Scopus)


Boosting methods maximize a hard classification margin and are known as powerful techniques that do not exhibit overfitting in low-noise cases. On noisy data, however, boosting still tries to enforce a hard margin and thereby gives too much weight to outliers, which leads to the dilemma of non-smooth fits and overfitting. We therefore propose three algorithms that allow soft-margin classification by introducing regularization with slack variables into the boosting concept: (1) AdaBoost_Reg and regularized versions of (2) linear and (3) quadratic programming AdaBoost. Experiments show the usefulness of the proposed algorithms in comparison to another soft-margin classifier: the support vector machine.
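The soft-margin idea in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical, simplified reconstruction in the spirit of the regularized update (not the paper's exact AdaBoost_Reg algorithm): each sample's weight is damped by an accumulated influence term `mu`, scaled by a regularization constant `C`, so persistent outliers are no longer forced to a hard margin. Setting `C = 0` recovers the plain AdaBoost weight update; the function and variable names are this sketch's own.

```python
import numpy as np

def fit_stump(X, y, w):
    """Exhaustive decision stump minimizing weighted 0-1 error."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1.0, -1.0):
                pred = np.where(X[:, j] <= thr, sign, -sign)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best

def adaboost_soft(X, y, rounds=30, C=0.05):
    """Sketch of soft-margin boosting: the weight update exponentiates
    the normalized margin plus C times an accumulated influence term,
    so samples that keep soaking up weight (outliers) are penalized."""
    n = len(y)
    w = np.full(n, 1.0 / n)      # sample weights
    F = np.zeros(n)              # unnormalized ensemble output on X
    mu = np.zeros(n)             # accumulated influence per sample
    alpha_sum = 0.0
    ensemble = []
    for _ in range(rounds):
        err, j, thr, sign = fit_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(X[:, j] <= thr, sign, -sign)
        F += alpha * pred
        mu += alpha * w          # influence: weight this sample has received
        alpha_sum += alpha
        ensemble.append((alpha, j, thr, sign))
        rho = y * F / alpha_sum  # normalized margins in [-1, 1]
        w = np.exp(-(rho + C * mu))   # soft-margin update (C=0: plain AdaBoost)
        w /= w.sum()
    def predict(Xq):
        out = np.zeros(len(Xq))
        for alpha, j, thr, sign in ensemble:
            out += alpha * np.where(Xq[:, j] <= thr, sign, -sign)
        return np.sign(out)
    return predict

# Toy usage on a linearly separable problem
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
predict = adaboost_soft(X, y)
acc = (predict(X) == y).mean()
```

With stumps as base learners, the ensemble approximates the diagonal decision boundary, and the `C * mu` term keeps any single hard-to-fit point from dominating the weight distribution.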

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 11 - Proceedings of the 1998 Conference, NIPS 1998
Publisher: Neural Information Processing Systems Foundation
Number of pages: 7
ISBN (Print): 0262112450, 9780262112451
Publication status: Published - 1999
Externally published: Yes
Event: 12th Annual Conference on Neural Information Processing Systems, NIPS 1998 - Denver, CO, United States
Duration: 1998 Nov 30 - 1998 Dec 5

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258


Other: 12th Annual Conference on Neural Information Processing Systems, NIPS 1998
Country/Territory: United States
City: Denver, CO

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing


