Abstract
Sufficient dimension reduction (SDR) is a successful tool for reducing data dimensionality without stringent model assumptions. In practice, data often display heteroscedasticity, which is of scientific importance in general but frequently overlooked, since a primary goal of most existing statistical methods is to identify the conditional mean relationship among variables. In this article, we propose a new SDR method called principal quantile regression (PQR) that efficiently tackles heteroscedasticity. PQR extends naturally to a nonlinear version via the kernel trick. Asymptotic properties are established, and an efficient solution-path-based algorithm is provided. Numerical examples based on both simulated and real data demonstrate PQR's advantageous performance over existing SDR methods. PQR remains very competitive even in the absence of heteroscedasticity.
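The abstract only summarizes the approach, but the core idea of quantile-based SDR can be illustrated with a toy sketch. The following is a minimal illustration, not the paper's PQR estimator: it fits linear quantile regressions over a grid of quantile levels and takes the leading eigenvectors of the aggregated slope vectors as estimated reduction directions. The data-generating model, the grid of quantile levels, and the structural dimension `d` are all assumptions made for this example.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Hypothetical heteroscedastic model: y = x1 + (1 + 0.5*x2) * noise,
# so both the first and second coordinate directions matter.
n, p = 500, 6
X = rng.standard_normal((n, p))
y = X[:, 0] + (1.0 + 0.5 * X[:, 1]) * rng.standard_normal(n)

# Fit linear quantile regressions on a grid of quantile levels and
# collect the slope vectors; under heteroscedasticity the slopes
# vary with the quantile level, exposing variance-related directions
# that a conditional-mean fit would miss.
taus = np.linspace(0.1, 0.9, 9)
slopes = []
for tau in taus:
    fit = sm.QuantReg(y, sm.add_constant(X)).fit(q=tau)
    slopes.append(np.asarray(fit.params)[1:])  # drop the intercept
B = np.column_stack(slopes)                    # p x len(taus)

# "Principal" step: leading eigenvectors of the aggregated slope
# outer products serve as an estimated basis of the reduction subspace.
M = B @ B.T
eigval, eigvec = np.linalg.eigh(M)
d = 2                                          # assumed structural dimension
basis = eigvec[:, ::-1][:, :d]                 # top-d directions
print(np.round(basis, 2))
```

In this toy setting the two recovered directions concentrate on the first two coordinates; the paper's actual estimator, its kernelized nonlinear version, and its solution-path algorithm are developed in the article itself.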
Original language | English |
---|---|
Pages (from-to) | 2114-2140 |
Number of pages | 27 |
Journal | Electronic Journal of Statistics |
Volume | 12 |
Issue number | 2 |
DOIs | |
Publication status | Published - 2018 |
Bibliographical note
Funding Information: We thank two reviewers, an associate editor, and the editor for their most helpful comments. Shin is partially supported by National Research Foundation of Korea (NRF) grant No. 2015R1C1A1A01054913. Wu is partially supported by National Science Foundation grants DMS-1055210 and DMS-1812354.
Publisher Copyright:
© 2018, Institute of Mathematical Statistics. All rights reserved.
Keywords
- Heteroscedasticity
- Kernel quantile regression
- Principal quantile regression
- Sufficient dimension reduction
ASJC Scopus subject areas
- Statistics and Probability
- Statistics, Probability and Uncertainty