Abstract
We present an exact analytic expression for the contributions of the kernel principal components to the relevant information in a nonlinear regression problem. A related study by Braun, Buhmann, and Müller (2008) gave an upper bound on these contributions for a general supervised learning problem, but with 'uncentered' kernel PCA. Our analysis clarifies that, under an explicit centering operation, the relevant information of a kernel regression is contained in a finite number of leading kernel principal components, as in the 'uncentered' kernel PCA case, provided the kernel matches the underlying nonlinear function so that the eigenvalues of the centered kernel matrix decay quickly. We compare the regression performance of least-squares-based methods with the centered and uncentered kernel PCAs through simulations.
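The pipeline the abstract describes, regressing on the leading components of a centered kernel matrix, can be made concrete with a short sketch: form the Gram matrix K, center it as Kc = HKH with H = I - (1/n)11ᵀ, keep the top eigenvectors, and fit least squares on those scores. Below is a minimal NumPy sketch under these assumptions; the Gaussian kernel choice and all function names are illustrative and not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of a Gaussian (RBF) kernel; illustrative kernel choice.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def centered_kpca_regression(X, y, X_test, n_components, gamma=1.0):
    """Least-squares regression on the leading *centered* kernel PCs."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kc = H @ K @ H                           # centered kernel matrix
    # Eigendecomposition; keep the leading components (eigh is ascending).
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_components]
    w, V = w[idx], V[:, idx]
    # Training scores on the leading kernel PCs: Kc V / sqrt(w) = V sqrt(w).
    Z = V * np.sqrt(np.maximum(w, 0.0))
    # Ordinary least squares in the reduced feature space (centered target).
    beta, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)
    # Center the test kernel rows consistently with the training centering.
    K_test = rbf_kernel(X_test, X, gamma)
    Kc_test = (K_test - K.mean(axis=0)) @ H
    Z_test = Kc_test @ V / np.sqrt(np.maximum(w, 1e-12))
    return Z_test @ beta + y.mean()
```

With a kernel matched to the underlying function, the eigenvalues in `w` decay quickly and a small `n_components` already captures the relevant information; the 'uncentered' variant is obtained by skipping the `H`-centering steps and regressing on `y` directly.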
Original language | English |
---|---|
Title of host publication | 2018 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2018 - Proceedings |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 2841-2845 |
Number of pages | 5 |
ISBN (Print) | 9781538646588 |
DOIs | |
Publication status | Published - 2018 Sept 10 |
Event | 2018 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2018 - Calgary, Canada |
Duration | 2018 Apr 15 → 2018 Apr 20 |
Publication series
Name | ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings |
---|---|
Volume | 2018-April |
ISSN (Print) | 1520-6149 |
Other
Other | 2018 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2018 |
---|---|
Country/Territory | Canada |
City | Calgary |
Period | 2018 Apr 15 → 2018 Apr 20 |
Bibliographical note
Publisher Copyright: © 2018 IEEE.
Keywords
- Kernel PCA
- Nonlinear regression
- Reproducing kernel Hilbert space
- Spectral decomposition
ASJC Scopus subject areas
- Software
- Signal Processing
- Electrical and Electronic Engineering