A concise overview of principal support vector machines and its generalization

Jungmin Shin, Seung Jun Shin

Research output: Contribution to journal › Article › peer-review


In high-dimensional data analysis, sufficient dimension reduction (SDR) has been considered an attractive tool for reducing the dimensionality of predictors while preserving regression information. The principal support vector machine (PSVM) (Li et al., 2011) offers a unified approach to both linear and nonlinear SDR. This article comprehensively explores a variety of SDR methods based on the PSVM, which we call principal machines (PM) for SDR. The PM achieves SDR by solving a sequence of convex optimizations akin to those of popular supervised learning methods, such as the support vector machine, logistic regression, and quantile regression, to name a few. This makes the PM straightforward to handle and extend in both theoretical and computational aspects, as we will see throughout this article.
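The sequence-of-convex-optimizations recipe behind the linear PSVM can be sketched as follows: dichotomize the response at several cutpoints, fit a linear SVM for each dichotomized problem, and eigen-decompose the aggregated normal vectors to estimate the central subspace. This is a minimal, hypothetical illustration using scikit-learn's `LinearSVC`; the function name, cutpoint choices, and toy data are ours, not the paper's.

```python
import numpy as np
from sklearn.svm import LinearSVC

def psvm_directions(X, y, cutpoints=None, n_dir=1, C=1.0):
    """Minimal sketch of linear PSVM-style SDR (after Li et al., 2011):
    dichotomize y at several cutpoints, fit a linear SVM for each, and
    eigen-decompose the accumulated normal vectors."""
    mu, sd = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / sd                        # standardize predictors
    if cutpoints is None:
        cutpoints = np.quantile(y, [0.25, 0.5, 0.75])
    p = X.shape[1]
    M = np.zeros((p, p))                     # working matrix
    for c in cutpoints:
        ytil = np.where(y > c, 1, -1)        # dichotomized response
        svm = LinearSVC(C=C, loss="hinge", max_iter=20000).fit(Z, ytil)
        psi = svm.coef_.ravel() / sd         # map back to the X scale
        M += np.outer(psi, psi)              # accumulate normal vectors
    # Leading eigenvectors of M span an estimate of the central subspace.
    w, V = np.linalg.eigh(M)
    return V[:, np.argsort(w)[::-1][:n_dir]]

# Toy single-index model: y depends on X only through X @ b.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
b = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
y = (X @ b) ** 3 + 0.1 * rng.normal(size=500)
B = psvm_directions(X, y, n_dir=1)           # estimated SDR direction
```

Each cutpoint contributes one convex SVM fit, which is why swapping the hinge loss for another convex loss (logistic, check loss, etc.) yields the other principal machines the article surveys.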

Original language: English
Pages (from-to): 235-246
Number of pages: 12
Journal: Communications for Statistical Applications and Methods
Issue number: 2
Publication status: Published - 2024

Bibliographical note

Publisher Copyright:
© 2024 The Korean Statistical Society, and Korean International Statistical Society. All Rights Reserved.


Keywords

  • convex optimization
  • M-estimation
  • principal machine
  • principal support vector machine
  • sufficient dimension reduction

ASJC Scopus subject areas

  • Statistics and Probability
  • Modelling and Simulation
  • Finance
  • Statistics, Probability and Uncertainty
  • Applied Mathematics


