Abstract
In high-dimensional data analysis, sufficient dimension reduction (SDR) has been considered an attractive tool for reducing the dimensionality of predictors while preserving regression information. The principal support vector machine (PSVM) (Li et al., 2011) offers a unified approach to both linear and nonlinear SDR. This article comprehensively explores a variety of SDR methods based on the PSVM, which we call principal machines (PM) for SDR. The PM achieves SDR by solving a sequence of convex optimization problems akin to those of popular supervised learning methods such as the support vector machine, logistic regression, and quantile regression. This makes the PM straightforward to handle and extend, both theoretically and computationally, as we show throughout this article.
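As a concrete illustration of the convex-optimization recipe the abstract describes, the following sketch (not taken from the article; the data, cutoffs, and tuning constants are all hypothetical choices) mimics the PSVM idea for linear SDR: binarize the response at several cutoffs, fit a soft-margin linear SVM for each cutoff, and eigendecompose the accumulated outer products of the SVM normal vectors to estimate a basis of the central subspace.

```python
import numpy as np

# Hypothetical sketch of a PSVM-style procedure for linear SDR; parameter
# choices (lam, lr, epochs, cutoffs) are illustrative assumptions only.

def fit_linear_svm(X, y, lam=0.01, lr=0.2, epochs=300):
    """Soft-margin linear SVM via subgradient descent on the hinge loss."""
    n, p = X.shape
    w, c = np.zeros(p), 0.0
    for t in range(epochs):
        margins = y * (X @ w + c)
        active = margins < 1                        # margin-violating points
        gw = lam * w - (y[active, None] * X[active]).sum(axis=0) / n
        gc = -y[active].sum() / n
        step = lr / (1.0 + 0.01 * t)                # decaying step size
        w, c = w - step * gw, c - step * gc
    return w

rng = np.random.default_rng(0)
n, p = 500, 6
X = rng.standard_normal((n, p))
b = np.zeros(p); b[:2] = 1.0                        # true single-index direction
y = X @ b + 0.2 * rng.standard_normal(n)

Xc = X - X.mean(axis=0)                             # center the predictors
M = np.zeros((p, p))                                # candidate kernel matrix
for q in np.quantile(y, [0.25, 0.5, 0.75]):         # a few response cutoffs
    yb = np.where(y > q, 1.0, -1.0)                 # binarized response
    w = fit_linear_svm(Xc, yb)
    M += np.outer(w, w)                             # accumulate normal vectors

# The leading eigenvector of M estimates a basis of the central subspace (d = 1).
B_hat = np.linalg.eigh(M)[1][:, -1]
print(np.round(B_hat / np.linalg.norm(B_hat), 2))
```

Swapping the hinge loss in `fit_linear_svm` for a logistic or quantile (check) loss yields the other principal machines mentioned above, which is the sense in which the framework is unified.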
| Original language | English |
|---|---|
| Pages (from-to) | 235-246 |
| Number of pages | 12 |
| Journal | Communications for Statistical Applications and Methods |
| Volume | 31 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 2024 |
Bibliographical note
Publisher Copyright: © 2024 The Korean Statistical Society, and Korean International Statistical Society. All Rights Reserved.
Keywords
- M-estimation
- convex optimization
- principal machine
- principal support vector machine
- sufficient dimension reduction
ASJC Scopus subject areas
- Statistics and Probability
- Modelling and Simulation
- Finance
- Statistics, Probability and Uncertainty
- Applied Mathematics