TY - GEN
T1 - Importance-weighted cross-validation for covariate shift
AU - Sugiyama, Masashi
AU - Blankertz, Benjamin
AU - Krauledat, Matthias
AU - Dornhege, Guido
AU - Müller, Klaus-Robert
PY - 2006
Y1 - 2006
N2 - A common assumption in supervised learning is that the input points in the training set follow the same probability distribution as the input points used for testing. However, this assumption is not satisfied, for example, when extrapolating outside the training region. The situation where the training input points and test input points follow different distributions is called covariate shift. Under covariate shift, standard machine learning techniques such as empirical risk minimization or cross-validation do not work well, since their unbiasedness is no longer maintained. In this paper, we propose a new method called importance-weighted cross-validation, which remains unbiased even under covariate shift. The usefulness of the proposed method is successfully tested on toy data and further demonstrated in a brain-computer interface, where strong non-stationarity effects can be seen between calibration and feedback sessions.
AB - A common assumption in supervised learning is that the input points in the training set follow the same probability distribution as the input points used for testing. However, this assumption is not satisfied, for example, when extrapolating outside the training region. The situation where the training input points and test input points follow different distributions is called covariate shift. Under covariate shift, standard machine learning techniques such as empirical risk minimization or cross-validation do not work well, since their unbiasedness is no longer maintained. In this paper, we propose a new method called importance-weighted cross-validation, which remains unbiased even under covariate shift. The usefulness of the proposed method is successfully tested on toy data and further demonstrated in a brain-computer interface, where strong non-stationarity effects can be seen between calibration and feedback sessions.
UR - http://www.scopus.com/inward/record.url?scp=33750229534&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=33750229534&partnerID=8YFLogxK
U2 - 10.1007/11861898_36
DO - 10.1007/11861898_36
M3 - Conference contribution
AN - SCOPUS:33750229534
SN - 3540444122
SN - 9783540444121
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 354
EP - 363
BT - Pattern Recognition - 28th DAGM Symposium, Proceedings
PB - Springer Verlag
T2 - 28th Symposium of the German Association for Pattern Recognition, DAGM 2006
Y2 - 12 September 2006 through 14 September 2006
ER -