Abstract
In this paper a correspondence is derived between regularization operators used in regularization networks and support vector kernels. We prove that the Green's functions associated with regularization operators are suitable support vector kernels with equivalent regularization properties. Moreover, the paper analyzes currently used support vector kernels from the viewpoint of regularization theory and identifies the regularization operators associated with both polynomial kernels and translation-invariant kernels. The latter are also analyzed on periodic domains. As a by-product we show that a large class of radial basis functions, namely conditionally positive definite functions, may be used as support vector kernels.
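The admissibility condition alluded to in the abstract can be illustrated numerically: a function serves as a support vector kernel when it satisfies Mercer's condition, i.e. every Gram matrix it produces is positive semi-definite. The sketch below (not taken from the paper; the data, width parameter `gamma`, and tolerance are illustrative assumptions) checks this for the Gaussian RBF kernel, a standard translation-invariant example.

```python
import numpy as np

# Illustrative sketch: a translation-invariant kernel k(x, y) = k(x - y)
# is an admissible support vector kernel when its Gram matrices are
# positive semi-definite (Mercer's condition). We verify this numerically
# for the Gaussian RBF kernel on random data.

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))   # 50 random points in R^3 (assumed test data)
gamma = 0.5                    # assumed kernel width parameter

# Gaussian RBF Gram matrix: K_ij = exp(-gamma * ||x_i - x_j||^2)
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-gamma * sq_dists)

# For a Mercer kernel, all eigenvalues of the (symmetric) Gram matrix
# are non-negative, up to floating-point round-off.
eigvals = np.linalg.eigvalsh(K)
print(eigvals.min())  # should be >= 0 up to numerical noise
```

A non-admissible candidate (e.g. a sine of the distance) would generally produce negative eigenvalues in the same check, which is why the eigenvalue test is a convenient numerical proxy for Mercer's condition.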
Original language | English |
---|---|
Pages (from-to) | 637-649 |
Number of pages | 13 |
Journal | Neural Networks |
Volume | 11 |
Issue number | 4 |
DOIs | |
Publication status | Published - 1998 Jun |
Bibliographical note
Funding Information: The authors thank Chris Burges, Federico Girosi, Leo van Hemmen, Takashi Onoda, John Shawe-Taylor, Vladimir Vapnik, Grace Wahba, and Alan Yuille for helpful discussions and comments. A.J. Smola is supported by a grant from the DFG (JA 379/71), and B. Schölkopf is supported by a grant from the Studienstiftung des deutschen Volkes.
Keywords
- Conditionally positive definite functions
- Green's functions
- Mercer kernel
- Polynomial kernels
- Radial basis functions
- Regularization networks
- Ridge regression
- Support vector machines
ASJC Scopus subject areas
- Cognitive Neuroscience
- Artificial Intelligence