A K-fold averaging cross-validation procedure

Yoonsuh Jung, Jianhua Hu

Research output: Contribution to journal › Article › peer-review

221 Citations (Scopus)


Cross-validation (CV)-type methods have been widely used to facilitate model estimation and variable selection. In this work, we suggest a new K-fold CV procedure that selects a candidate 'optimal' model from each hold-out fold and averages the K candidate 'optimal' models to obtain the ultimate model. Due to the averaging effect, the variance of the proposed estimates can be significantly reduced. This new procedure results in more stable and efficient parameter estimation than the classical K-fold CV procedure. In addition, we show the asymptotic equivalence between the proposed and classical CV procedures in the linear regression setting. We also demonstrate the broad applicability of the proposed procedure via two examples: parameter sparsity regularisation and quantile smoothing splines modelling. We illustrate the promise of the proposed method through simulations and a real data example.
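A minimal sketch of the averaging idea described in the abstract, using ridge regression as the per-fold candidate model family. This is an illustrative assumption, not the paper's implementation: the paper's examples use sparsity regularisation and quantile smoothing splines, and the function names below are hypothetical. Each hold-out fold selects its own 'optimal' tuning parameter, and the K selected coefficient vectors are averaged.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge estimate: (X'X + lam*I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def averaging_kfold_cv(X, y, lambdas, K=5, seed=0):
    """K-fold averaging CV (sketch): select one 'optimal' model per
    hold-out fold, then average the K selected coefficient vectors."""
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(n), K)
    betas = []
    for k in range(K):
        test_idx = folds[k]
        train_idx = np.setdiff1d(np.arange(n), test_idx)
        # Pick the tuning parameter minimising error on hold-out fold k
        errs = []
        for lam in lambdas:
            b = ridge_fit(X[train_idx], y[train_idx], lam)
            errs.append(np.mean((y[test_idx] - X[test_idx] @ b) ** 2))
        best = lambdas[int(np.argmin(errs))]
        # The candidate 'optimal' model for fold k
        betas.append(ridge_fit(X[train_idx], y[train_idx], best))
    # Average the K candidate 'optimal' models into the ultimate model
    return np.mean(betas, axis=0)
```

In contrast, classical K-fold CV would pick a single tuning parameter minimising the error aggregated over all folds and refit once on the full data; the averaging variant instead combines K fitted models, which is where the claimed variance reduction comes from.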

Original language: English
Pages (from-to): 167-179
Number of pages: 13
Journal: Journal of Nonparametric Statistics
Issue number: 2
Publication status: Published - 2015 Apr 3
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2015, © 2015 American Statistical Association and Taylor & Francis.


Keywords

  • cross-validation
  • model averaging
  • model selection

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

