Lernen mit Kernen: Support-Vektor-Methoden zur Analyse hochdimensionaler Daten

Translated title of the contribution: Learning with kernels: Support vector methods for the analysis of high-dimensional data

Bernhard Schölkopf, Klaus-Robert Müller, Alexander J. Smola

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

We describe recent developments and results of statistical learning theory. In the framework of learning from examples, two factors control generalization ability: how well the training data are explained, and the complexity (capacity) of the learning machine used to explain them. We describe kernel algorithms in feature spaces as elegant and efficient methods of realizing such machines. Examples thereof are Support Vector Machines (SVM) and Kernel PCA (Principal Component Analysis). More important than any individual example of a kernel algorithm, however, is the insight that any algorithm that can be cast in terms of dot products can be generalized to a nonlinear setting using kernels. Finally, we illustrate the significance of kernel algorithms by briefly describing industrial and academic applications, including ones where we obtained benchmark record results.
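The kernelization insight mentioned in the abstract can be illustrated with Kernel PCA: linear PCA depends on the data only through dot products, so replacing each dot product with a kernel evaluation yields a nonlinear variant. The following is a minimal NumPy sketch (toy data, RBF kernel, and the `gamma` value are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))  # toy data: 50 points in 3 dimensions

def rbf_kernel(X, gamma=0.5):
    """RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2).

    This replaces the plain dot product x . y of linear PCA."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

K = rbf_kernel(X)
n = K.shape[0]

# Center the kernel matrix, i.e. center the data in feature space:
# Kc = K - 1n K - K 1n + 1n K 1n, with 1n the matrix of entries 1/n.
One = np.full((n, n), 1.0 / n)
Kc = K - One @ K - K @ One + One @ K @ One

# Eigendecomposition of the centered Gram matrix gives the
# nonlinear principal components (eigh returns ascending order).
eigvals, eigvecs = np.linalg.eigh(Kc)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

# Projections of the training points onto the first two components.
Z = eigvecs[:, :2] * np.sqrt(np.maximum(eigvals[:2], 0.0))
```

Note that no coordinates in the (possibly infinite-dimensional) feature space are ever computed; only the n-by-n kernel matrix is needed.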

Original language: German
Pages (from-to): 154-163
Number of pages: 10
Journal: Informatik - Forschung und Entwicklung
Volume: 14
Issue number: 3
Publication status: Published - September 1999
Externally published: Yes

Keywords

  • Classification
  • Data mining
  • Digit recognition
  • Feature extraction
  • Kernel methods
  • Machine learning
  • Neural networks
  • Pattern recognition
  • Regression
  • Time series prediction

ASJC Scopus subject areas

  • General Computer Science
