The subspace information criterion for infinite dimensional hypothesis spaces

Masashi Sugiyama, Klaus Robert Müller

Research output: Contribution to journal › Article › peer-review

15 Citations (Scopus)


This paper extends the range of applicability of the subspace information criterion (SIC). It is shown that even if the reproducing kernels centered on the training sample points do not span the whole space, SIC is an unbiased estimator of an essential part of the generalization error. This extension allows the use of any reproducing kernel Hilbert space (RKHS), including infinite-dimensional ones.
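As a rough illustration of the criterion the abstract refers to, the following is a minimal sketch of computing SIC to select the ridge parameter in kernel ridge regression. The Gaussian kernel, the use of the interpolant as the unbiased reference estimator, and the assumption of known noise variance are illustrative simplifications, not the paper's exact construction.

```python
import numpy as np

def gaussian_kernel(x, y, width=1.0):
    """Gaussian RBF kernel matrix between 1-D point sets x and y."""
    d2 = (x[:, None] - y[None, :]) ** 2
    return np.exp(-d2 / (2 * width ** 2))

def sic_kernel_ridge(x, y, alpha, sigma2, width=1.0):
    """SIC score for kernel ridge regression with ridge parameter `alpha`.

    Uses the hat matrix H_a = K (K + alpha I)^{-1} and, as the unbiased
    reference estimator, the interpolant H_u = I (valid when the kernel
    matrix K is invertible). `sigma2` is the (assumed known) noise variance.
    """
    n = len(x)
    K = gaussian_kernel(x, x, width)
    H = K @ np.linalg.inv(K + alpha * np.eye(n))
    y_hat = H @ y
    # SIC = ||f_a - f_u||^2 - s2 * tr((H_a - H_u)(H_a - H_u)^T)
    #       + s2 * tr(H_a H_a^T),
    # whose expectation equals E||f_a - f||^2 at the sample points.
    D = H - np.eye(n)
    return (np.sum((y_hat - y) ** 2)
            - sigma2 * np.trace(D @ D.T)
            + sigma2 * np.trace(H @ H.T))

# Toy model-selection run: pick the ridge parameter minimizing SIC.
rng = np.random.default_rng(0)
x = np.linspace(0, np.pi, 30)
sigma2 = 0.05
y = np.sin(x) + rng.normal(0.0, np.sqrt(sigma2), size=len(x))

alphas = [1e-4, 1e-3, 1e-2, 1e-1, 1.0]
scores = [sic_kernel_ridge(x, y, a, sigma2) for a in alphas]
best = alphas[int(np.argmin(scores))]
```

The key design point, as the abstract indicates, is that SIC trades a data term (distance to an unbiased reference estimator) against trace corrections that remove the noise contribution, yielding an unbiased estimate of the generalization error rather than the training error.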

Original language: English
Pages (from-to): 323-359
Number of pages: 37
Journal: Journal of Machine Learning Research
Issue number: 2
Publication status: Published - 2003 Feb 15


Keywords

  • Cross-validation
  • Finite sample statistics
  • Gaussian processes
  • Generalization error
  • Kernel regression
  • Model selection
  • Reproducing kernel Hilbert space
  • Subspace information criterion
  • Unbiased estimators

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence


