Abstract
Instance-based learning (IBL), also called memory-based reasoning (MBR), is a widely used non-parametric learning algorithm, and k-nearest neighbor (k-NN) learning is its most popular realization. Owing to its usability and adaptability, k-NN has been successfully applied to a wide range of applications. In practice, however, two important model parameters must be set purely empirically: the number of neighbors (k) and the weights assigned to those neighbors. In this paper, we propose structured ways to set these parameters based on locally linear reconstruction (LLR). We then employed sequential minimal optimization (SMO) to solve the quadratic programming step that LLR involves for classification, in order to reduce the computational complexity. Experimental results from 11 classification and eight regression tasks were promising enough to merit further investigation: not only did LLR outperform conventional weight allocation methods without much additional computational cost, but it was also found to be robust to changes in k.
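As a rough illustration of the weight-allocation idea summarized above, the sketch below reconstructs a query point from its k nearest neighbors under sum-to-one, nonnegativity constraints and uses the resulting weights for voting. This is only an assumed reading of the abstract, not the paper's algorithm: it substitutes a generic SLSQP solver for the SMO routine the authors describe, and all function and variable names are illustrative.

```python
# Minimal sketch of LLR-style weight allocation for k-NN (assumptions noted above).
import numpy as np
from scipy.optimize import minimize

def llr_weights(query, neighbors):
    """Solve min_w ||query - neighbors.T @ w||^2  s.t.  sum(w) = 1, w >= 0."""
    k = neighbors.shape[0]

    def objective(w):
        return np.sum((query - neighbors.T @ w) ** 2)

    w0 = np.full(k, 1.0 / k)  # start from uniform weights
    res = minimize(objective, w0, method="SLSQP",
                   bounds=[(0.0, None)] * k,
                   constraints=[{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}])
    return res.x

def llr_knn_classify(query, X_train, y_train, k=10):
    """Weighted k-NN vote where the weights come from local linear reconstruction."""
    dists = np.linalg.norm(X_train - query, axis=1)
    idx = np.argsort(dists)[:k]          # indices of the k nearest neighbors
    w = llr_weights(query, X_train[idx])
    classes = np.unique(y_train[idx])
    votes = [w[y_train[idx] == c].sum() for c in classes]
    return classes[int(np.argmax(votes))]
```

For regression, the analogous step would be a weighted average of the neighbors' target values using the same reconstruction weights.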
Original language | English |
---|---|
Pages (from-to) | 3507-3518 |
Number of pages | 12 |
Journal | Pattern Recognition |
Volume | 41 |
Issue number | 11 |
DOIs | |
Publication status | Published - 2008 Nov |
Externally published | Yes |
Bibliographical note
Funding Information: This work was supported by the Korea Science and Engineering Foundation (KOSEF) grant funded by the Korea government (R01-2005-000-103900-0) and the Brain Korea 21 program in 2007, and was partially supported by the Engineering Research Institute of SNU.
Keywords
- Instance-based learning
- Local reconstruction
- Memory-based reasoning
- Weight allocation
- k-nearest neighbor
ASJC Scopus subject areas
- Software
- Signal Processing
- Computer Vision and Pattern Recognition
- Artificial Intelligence