Abstract
We describe recent developments and results of statistical learning theory. In the framework of learning from examples, two factors control generalization ability: how well the training data are explained, and the complexity of the learning machine used to explain them. We describe kernel algorithms in feature spaces as elegant and efficient methods of realizing such machines. Examples thereof are Support Vector Machines (SVM) and Kernel PCA (Principal Component Analysis). More important than any individual example of a kernel algorithm, however, is the insight that any algorithm that can be cast in terms of dot products can be generalized to a nonlinear setting using kernels. Finally, we illustrate the significance of kernel algorithms by briefly describing industrial and academic applications, including ones where we obtained record benchmark results.
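Since the abstract's central insight is that any dot-product algorithm can be kernelized, the following is a minimal sketch of that idea applied to PCA (Kernel PCA, as named in the abstract). It assumes NumPy; the Gaussian (RBF) kernel, the `gamma` parameter, and the function names are illustrative assumptions, not the paper's specific setup.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel k(x, x') = exp(-gamma * ||x - x'||^2).
    # Replacing the plain dot product X @ X.T with this matrix is the "kernel trick":
    # the algorithm below only ever sees dot products, here evaluated implicitly
    # in a nonlinear feature space.
    sq_norms = np.sum(X**2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2 * X @ X.T
    return np.exp(-gamma * sq_dists)

def kernel_pca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Center the data in feature space, expressed purely through the Gram matrix.
    one_n = np.ones((n, n)) / n
    K_c = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecompose the centered Gram matrix (eigh returns ascending eigenvalues).
    eigvals, eigvecs = np.linalg.eigh(K_c)
    idx = np.argsort(eigvals)[::-1][:n_components]
    lambdas = np.maximum(eigvals[idx], 1e-12)  # guard against numerical negatives
    alphas = eigvecs[:, idx]
    # Projections of the training points onto the leading feature-space components.
    return K_c @ alphas / np.sqrt(lambdas)

# Hypothetical usage: project 100 five-dimensional points onto two nonlinear components.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)  # (100, 2)
```

Swapping `rbf_kernel` for the identity Gram matrix `X @ X.T` recovers ordinary linear PCA, which is exactly the generalization-by-kernels point the abstract makes.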
| Translated title of the contribution | Learning with kernels: Support vector methods for the analysis of high-dimensional data |
| --- | --- |
| Original language | German |
| Pages (from-to) | 154-163 |
| Number of pages | 10 |
| Journal | Informatik - Forschung und Entwicklung |
| Volume | 14 |
| Issue number | 3 |
| Publication status | Published - Sept 1999 |
| Externally published | Yes |
Keywords
- Classification
- Data mining
- Digit recognition
- Feature extraction
- Kernel methods
- Machine learning
- Neural networks
- Pattern recognition
- Regression
- Time series prediction
ASJC Scopus subject areas
- General Computer Science