Abstract
This paper extends the range of applicability of the subspace information criterion (SIC). It is shown that even if the reproducing kernels centered on the training sample points do not span the whole space, SIC remains an unbiased estimator of an essential part of the generalization error. This extension allows the use of any reproducing kernel Hilbert space (RKHS), including infinite-dimensional ones.
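As context for the setting the abstract describes (not the paper's SIC formula), the sketch below fits kernel regression in the subspace spanned by Gaussian kernels centered on the training inputs and computes a leave-one-out cross-validation estimate of the generalization error, one of the standard baselines named in the keywords. The kernel width, ridge parameter, and toy data are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of the setting only, not the paper's SIC:
# kernel ridge regression in the subspace spanned by Gaussian kernels
# centered on the training inputs. sigma, lam, and the toy data are
# assumptions chosen for illustration.

def gauss_kernel(a, b, sigma=1.0):
    d = a[:, None] - b[None, :]
    return np.exp(-d ** 2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 30)                      # training inputs
y = np.sin(x) + 0.1 * rng.standard_normal(30)   # noisy targets

lam = 1e-2                                      # ridge (regularization) parameter
K = gauss_kernel(x, x)                          # Gram matrix of kernels centered on x
alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)  # expansion coefficients

# Predictions are linear combinations of kernels centered on training points.
x_test = np.linspace(-3, 3, 5)
y_pred = gauss_kernel(x_test, x) @ alpha

# Leave-one-out cross-validation error: a common generalization-error
# estimate against which criteria such as SIC are typically compared.
H = K @ np.linalg.inv(K + lam * np.eye(len(x)))  # hat (smoother) matrix
loo = np.mean(((y - H @ y) / (1 - np.diag(H))) ** 2)
print(y_pred, loo)
```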
Original language | English
---|---
Pages (from-to) | 323-359
Number of pages | 37
Journal | Journal of Machine Learning Research
Volume | 3
Issue number | 2
Publication status | Published - 2003 Feb 15
Keywords
- Cross-validation
- Finite sample statistics
- Gaussian processes
- Generalization error
- Kernel regression
- Model selection
- Reproducing kernel Hilbert space
- Subspace information criterion
- Unbiased estimators
ASJC Scopus subject areas
- Software
- Control and Systems Engineering
- Statistics and Probability
- Artificial Intelligence