Abstract
One benefit of the bias-variance tradeoff is that regression estimators need not be strictly unbiased. To take full advantage of allowing bias, however, shrinkage regression estimators require that the level of bias be chosen carefully. Because the conventional grid search over the shrinkage parameters is computationally heavy, it is practically difficult to incorporate more than two shrinkage parameters. In this paper, we propose a class of shrinkage regression estimators that shrink each regression parameter differently. For this purpose, we set the number of shrinkage parameters equal to the number of regression coefficients. An ideal amount of shrinkage for each parameter is suggested, so a burdensome tuning process is not required for each parameter. The √n-consistency and oracle property of the suggested estimators are established. Application of the proposed methods to simulated and real data sets demonstrates the favorable performance of the suggested regression shrinkage methods without the need for a grid search over the entire parameter space.
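The core idea of assigning a separate shrinkage parameter to each regression coefficient can be illustrated with a generalized ridge estimator, which shrinks each coefficient by its own amount in closed form. The sketch below is illustrative only and assumes a ridge-type quadratic penalty; it is not the paper's estimator, whose per-parameter shrinkage levels are derived rather than hand-picked.

```python
import numpy as np

def generalized_ridge(X, y, lambdas):
    """Generalized ridge regression with a separate shrinkage
    parameter for each coefficient (illustrative sketch).

    Solves  beta_hat = argmin ||y - X beta||^2 + sum_j lambdas[j] * beta_j^2,
    whose closed form is (X'X + diag(lambdas))^{-1} X'y.
    """
    XtX = X.T @ X
    Lam = np.diag(lambdas)  # one shrinkage parameter per coefficient
    return np.linalg.solve(XtX + Lam, X.T @ y)

# Toy usage: heavier shrinkage on the third (noise) coefficient.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
y = X @ np.array([2.0, -1.0, 0.0]) + 0.5 * rng.standard_normal(100)
beta_hat = generalized_ridge(X, y, lambdas=np.array([0.1, 0.1, 10.0]))
print(beta_hat)
```

With per-coefficient penalties, coefficients believed to be noise can be shrunk aggressively toward zero while signal coefficients are left nearly untouched, which is the sparsity behavior the abstract's oracle property refers to.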
Original language | English |
---|---|
Pages (from-to) | 4490-4505 |
Number of pages | 16 |
Journal | Communications in Statistics - Theory and Methods |
Volume | 49 |
Issue number | 18 |
DOIs | |
Publication status | Published - 2020 Sept 16 |
Bibliographical note
Publisher Copyright: © 2019 Taylor & Francis Group, LLC.
Keywords
- Bias-variance tradeoff
- oracle property
- shrinkage estimator
- sparsity
- tuning parameter
ASJC Scopus subject areas
- Statistics and Probability