Principal quantile regression for sufficient dimension reduction with heteroscedasticity

Chong Wang, Seung Jun Shin, Yichao Wu

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

Sufficient dimension reduction (SDR) is a successful tool for reducing data dimensionality without stringent model assumptions. In practice, data often display heteroscedasticity, which is generally of scientific importance but frequently overlooked, since the primary goal of most existing statistical methods is to identify the conditional mean relationship among variables. In this article, we propose a new SDR method called principal quantile regression (PQR) that efficiently tackles heteroscedasticity. PQR extends naturally to a nonlinear version via the kernel trick. Asymptotic properties are established and an efficient solution path-based algorithm is provided. Numerical examples based on both simulated and real data demonstrate PQR's advantageous performance over existing SDR methods. PQR remains very competitive even in the absence of heteroscedasticity.
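To make the idea concrete, below is a minimal, hypothetical sketch of a linear "principal quantile regression"-style estimator, assuming the reduction directions are obtained from the leading eigenvectors of a working matrix built from quantile-regression slope vectors over a grid of quantile levels. The function name `pqr_directions`, the quantile grid `taus`, and the toy data are illustrative assumptions, not the paper's exact algorithm (which also covers a kernelized version and a solution-path solver).

```python
# Illustrative sketch only (assumption): aggregate quantile-regression slopes
# across a grid of quantile levels, then take leading eigenvectors of the
# accumulated outer products as estimated dimension-reduction directions.
import numpy as np
import statsmodels.api as sm

def pqr_directions(X, y, taus=np.linspace(0.1, 0.9, 9), d=1):
    """Estimate d SDR directions from quantile-regression slope vectors."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)            # center the predictors
    M = np.zeros((p, p))               # working matrix accumulating slope info
    for tau in taus:
        fit = sm.QuantReg(y, sm.add_constant(Xc)).fit(q=tau)
        beta = np.asarray(fit.params)[1:]   # slope vector at level tau
        M += np.outer(beta, beta)
    # leading eigenvectors of M span the estimated reduction subspace
    eigval, eigvec = np.linalg.eigh(M)
    order = np.argsort(eigval)[::-1][:d]
    return eigvec[:, order]

# toy heteroscedastic example: mean driven by x1, variance driven by x2
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X[:, 0] + np.exp(X[:, 1]) * rng.normal(size=500)
print(pqr_directions(X, y, d=2))
```

Because the slope vectors vary with the quantile level when the conditional variance depends on the predictors, directions that act only through the variance can be picked up here, which is the intuition behind using quantile regression rather than mean regression for heteroscedastic SDR.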

Original language: English
Pages (from-to): 2114-2140
Number of pages: 27
Journal: Electronic Journal of Statistics
Volume: 12
Issue number: 2
DOIs
Publication status: Published - 2018

Keywords

  • Heteroscedasticity
  • Kernel quantile regression
  • Principal quantile regression
  • Sufficient dimension reduction

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
