Selecting ridge parameters in infinite dimensional hypothesis spaces

Masashi Sugiyama, Klaus-Robert Müller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Previously, an unbiased estimator of the generalization error called the subspace information criterion (SIC) was proposed for finite-dimensional reproducing kernel Hilbert spaces (RKHSs). In this paper, we extend SIC so that it can be applied to any RKHS, including infinite-dimensional ones. Computer simulations show that the extended SIC works well in ridge parameter selection.
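The setting of the paper is kernel ridge regression, where a regularization ("ridge") parameter must be chosen. The sketch below, in Python with NumPy, shows that estimator and a simple held-out-error grid search as a stand-in for a model-selection criterion; it does not reproduce SIC itself, and the kernel, grid, and toy data are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, width=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 width^2))."""
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * width ** 2))

def krr_fit(X, y, lam, width=1.0):
    """Solve (K + lam * n * I) alpha = y for the kernel expansion coefficients."""
    n = len(X)
    K = rbf_kernel(X, X, width)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def krr_predict(X_train, alpha, X_test, width=1.0):
    """Predict f(x) = sum_i alpha_i k(x, x_i) at the test points."""
    return rbf_kernel(X_test, X_train, width) @ alpha

# Toy illustration: pick lambda from a grid by held-out squared error
# (a simple surrogate for a generalization-error criterion such as SIC).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)
X_tr, y_tr, X_val, y_val = X[:40], y[:40], X[40:], y[40:]

errors = {}
for lam in [1e-4, 1e-2, 1.0]:
    alpha = krr_fit(X_tr, y_tr, lam)
    pred = krr_predict(X_tr, alpha, X_val)
    errors[lam] = float(np.mean((pred - y_val) ** 2))

best_lam = min(errors, key=errors.get)
```

SIC's contribution is to replace the held-out-error step with an unbiased estimate of the generalization error computed from the training data alone; this paper extends that criterion beyond finite-dimensional RKHSs.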

Original language: English
Title of host publication: Artificial Neural Networks, ICANN 2002 - International Conference, Proceedings
Editors: Jose R. Dorronsoro
Publisher: Springer Verlag
Pages: 528-534
Number of pages: 7
ISBN (Print): 9783540440741
DOIs
Publication status: Published - 2002
Externally published: Yes
Event: 2002 International Conference on Artificial Neural Networks, ICANN 2002 - Madrid, Spain
Duration: 2002 Aug 28 - 2002 Aug 30

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 2415 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 2002 International Conference on Artificial Neural Networks, ICANN 2002
Country/Territory: Spain
City: Madrid
Period: 02/8/28 - 02/8/30

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

