Variable-selection consistency of linear quantile regression by validation set approach

  • Suin Kim
  • Sarang Lee
  • Nari Shin
  • Yoonsuh Jung*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the problem of variable selection in the quantile regression model by cross-validation. Although cross-validation is commonly used for model selection in quantile regression, its theoretical justification has not yet been established. In this work, we prove that cross-validation with the check loss function can lead to variable-selection consistency in quantile regression. Specifically, we investigate its asymptotic properties in linear quantile regression and its penalized version under both fixed and diverging numbers of parameters. For penalized models, penalties with the oracle property combined with cross-validation are shown to provide variable-selection consistency. In general, one of the crucial requirements for this consistency to hold is that the validation set size should be asymptotically equivalent to the total number of observations, a condition also required in conditional mean linear regression.
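The selection procedure described in the abstract can be illustrated with a small sketch: candidate models are fit on a training split by minimizing the empirical check loss ρ_τ(u) = u(τ − 1{u < 0}), and the model with the smallest check loss on a (large) validation split is selected. This is a minimal illustration, not the paper's implementation; the data-generating model, the split sizes, and the generic optimizer are assumptions chosen for clarity.

```python
import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    # Check (pinball) loss: rho_tau(u) = u * (tau - 1{u < 0}).
    return u * (tau - (u < 0))

def fit_qr(X, y, tau):
    # Fit linear quantile regression by minimizing the empirical check loss.
    # A generic optimizer is used here for simplicity; dedicated LP-based
    # solvers are the standard choice in practice.
    obj = lambda b: check_loss(y - X @ b, tau).sum()
    return minimize(obj, np.zeros(X.shape[1]), method="Powell").x

rng = np.random.default_rng(0)
n = 400
x1 = rng.normal(size=n)   # relevant predictor
x2 = rng.normal(size=n)   # irrelevant predictor
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

# Per the consistency requirement, the validation set should dominate
# asymptotically, so the training fraction is kept small here.
n_train = 100
Xfull = np.column_stack([np.ones(n), x1, x2])
candidates = {"x1 only": [0, 1], "x1 and x2": [0, 1, 2]}

tau = 0.5
val_err = {}
for name, cols in candidates.items():
    Xtr, Xva = Xfull[:n_train][:, cols], Xfull[n_train:][:, cols]
    b = fit_qr(Xtr, y[:n_train], tau)
    val_err[name] = check_loss(y[n_train:] - Xva @ b, tau).mean()

# Select the candidate with the smallest validation check loss.
print(min(val_err, key=val_err.get), val_err)
```

With the true model sparse in x1, the validation check loss tends to favor the smaller candidate as the validation set grows, which is the behavior the paper's consistency result formalizes.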

Original language: English
Article number: 110431
Journal: Statistics and Probability Letters
Volume: 223
DOIs
Publication status: Published - 2025 Aug
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2025 Elsevier B.V.

Keywords

  • Check loss
  • Cross-validation
  • High-dimensional quantile regression
  • Model selection

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
