We show via an equivalence of mathematical programs that a support vector (SV) algorithm can be translated into an equivalent boosting-like algorithm and vice versa. We exemplify this translation procedure for a new algorithm, one-class leveraging, starting from the one-class support vector machine (1-SVM). This is a first step toward unsupervised learning in a boosting framework. Building on so-called barrier methods known from the theory of constrained optimization, it returns a function, written as a convex combination of base hypotheses, that characterizes whether a given test point is likely to have been generated from the distribution underlying the training data. Simulations on one-class classification problems demonstrate the usefulness of our approach.
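The abstract's central object, a scoring function written as a convex combination of base hypotheses, can be illustrated with a minimal sketch. This is not the paper's one-class leveraging algorithm; the Gaussian-bump base hypotheses, the uniform weights, and all names below are illustrative assumptions standing in for learned weak hypotheses and barrier-optimized coefficients.

```python
import numpy as np

# Hedged sketch, NOT the paper's algorithm: score(x) = sum_t alpha_t * h_t(x)
# with convex weights (alpha_t >= 0, sum alpha_t = 1). A high score suggests x
# was generated from the distribution underlying the training data.

rng = np.random.default_rng(0)
X_train = rng.normal(loc=0.0, scale=1.0, size=(50, 2))  # "normal"-class sample

def base_hypothesis(center, width=1.0):
    # h_t(x): an illustrative radial bump, a stand-in for a weak hypothesis
    return lambda x: np.exp(-np.sum((x - center) ** 2) / (2.0 * width ** 2))

hypotheses = [base_hypothesis(c) for c in X_train]
alphas = np.full(len(hypotheses), 1.0 / len(hypotheses))  # uniform convex weights

def score(x):
    # convex combination of base hypotheses
    return sum(a * h(x) for a, h in zip(alphas, hypotheses))

# A point near the training sample scores higher than a distant outlier.
inlier, outlier = np.zeros(2), np.array([10.0, 10.0])
print(score(inlier) > score(outlier))
```

In the paper this construction is arrived at differently: the weights come from optimizing a barrier-method relaxation of the 1-SVM program, not from the uniform averaging used here.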
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Published - September 2002
Bibliographical note
Funding Information:
The authors would like to thank Manfred Warmuth, Alex Smola, Bob Williamson, and Ayhan Demiriz for valuable discussions. They would also like to thank the anonymous referees for thorough reviews, valuable comments, and suggestions that significantly improved this work. This work was partially funded by the DFG under contracts JA 379/9-1, JA 379/7-1, and MU 987/1-1, and by the EU in the NeuroColt2 project. Furthermore, G. Rätsch would like to thank CRIEPI, ANU, and the University of California Santa Cruz for their warm hospitality.
Keywords
- Novelty detection
- One-class classification
- Unsupervised learning
ASJC Scopus subject areas
- Computer Vision and Pattern Recognition
- Computational Theory and Mathematics
- Artificial Intelligence
- Applied Mathematics