Analyzing local structure in kernel-based learning: Explanation, complexity, and reliability assessment

Grégoire Montavon, Mikio L. Braun, Tammo Krüger, Klaus-Robert Müller

Research output: Contribution to journal › Article › peer-review

37 Citations (Scopus)


Over the last decade, nonlinear kernel-based learning methods have been widely used in the sciences and in industry for solving, e.g., classification, regression, and ranking problems. While their users are more than happy with the performance of this powerful technology, there is an emerging need to additionally gain a better understanding of both the learning machine and the data analysis problem to be solved. Opening the nonlinear black box, however, is a notoriously difficult challenge. In this review, we report on a set of recent methods that can be universally used to make kernel methods more transparent. In particular, we discuss relevant dimension estimation (RDE), which allows one to assess the underlying complexity and noise structure of a learning problem and thus to distinguish between scenarios of high or low noise and high or low complexity. Moreover, we introduce a novel local technique based on RDE for quantifying the reliability of the learned predictions. Finally, we report on techniques that can explain individual nonlinear predictions. In this manner, our novel methods not only help to gain further knowledge about the nonlinear signal processing problem itself, but they also broaden the general usefulness of kernel methods in practical signal processing applications.
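To illustrate the idea behind relevant dimension estimation, the following is a minimal, hypothetical sketch (not the authors' published implementation): the labels are projected onto the eigenbasis of the kernel matrix, and the relevant dimension is chosen as the split point that best separates large "signal" projections from a small "noise" floor under a simple two-variance model. The function name `rde` and the model-selection criterion are illustrative assumptions.

```python
import numpy as np

def rde(K, y):
    """Illustrative sketch of relevant dimension estimation (RDE).

    Projects the label vector y onto the eigenbasis of the kernel
    matrix K and selects the dimension d at which the squared
    projections drop from a signal level to a noise level, using a
    simple two-variance negative log-likelihood criterion. This is
    a didactic approximation; the published method may differ.
    """
    n = len(y)
    # Eigendecomposition of the symmetric kernel matrix; flip so
    # eigenvectors are ordered by decreasing eigenvalue.
    _, U = np.linalg.eigh(K)
    U = U[:, ::-1]
    # Squared projections of the labels onto each kernel eigenvector.
    s2 = (U.T @ y) ** 2
    # Pick the split d minimizing the two-variance model's
    # negative log-likelihood (up to constants).
    best_d, best_nll = 1, np.inf
    for d in range(1, n):
        nll = d * np.log(s2[:d].mean()) + (n - d) * np.log(s2[d:].mean())
        if nll < best_nll:
            best_d, best_nll = d, nll
    return best_d
```

On a toy problem whose labels lie in a few leading kernel eigendirections plus small noise, the estimated relevant dimension is small, signalling a low-complexity, low-noise learning problem.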

Original language: English
Article number: 6530740
Pages (from-to): 62-74
Number of pages: 13
Journal: IEEE Signal Processing Magazine
Issue number: 4
Publication status: Published - 2013

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
  • Applied Mathematics


