Abstract
Since the proposal of the seminal sliced inverse regression (SIR), inverse-type methods have proved canonical in sufficient dimension reduction (SDR). However, they often underperform in binary classification because binary responses yield at most two slices. In this article, we develop a forward SDR approach for binary classification based on weighted large-margin classifiers. First, we show that the gradient of a large-margin classifier is unbiased for SDR as long as the corresponding loss function is Fisher consistent. This leads us to propose the weighted outer-product of gradients (wOPG) estimator. The wOPG estimator can recover the central subspace exhaustively without the linearity (or constant variance) conditions, which, despite being routinely required, are untestable assumptions. We propose a gradient-based formulation of the large-margin classifier to estimate the gradient function of the classifier directly. We also establish the consistency of the proposed wOPG estimator and demonstrate its promising finite-sample performance through both simulated and real data examples.
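To make the outer-product-of-gradients idea concrete, below is a minimal sketch, not the authors' implementation: given gradient estimates of a classifier at each observation (here supplied directly for illustration), it forms the weighted average of gradient outer products and takes its leading eigenvectors as an estimate of the central subspace. The function name `wopg_directions`, the uniform default weights, and the plug-in gradients in the demo are all assumptions for illustration; in the paper the gradients come from a gradient-based formulation of a weighted large-margin classifier.

```python
import numpy as np

def wopg_directions(grad_f, weights=None, d=1):
    """Sketch of a weighted outer-product-of-gradients (wOPG) step.

    grad_f  : (n, p) estimated gradients of the classifier at each x_i
    weights : (n,) nonnegative weights (uniform if None)
    d       : assumed structural dimension of the central subspace

    Returns the top-d eigenvectors of M = sum_i w_i g_i g_i^T / sum_i w_i,
    whose span serves as the subspace estimate.
    """
    n, _ = grad_f.shape
    w = np.ones(n) if weights is None else np.asarray(weights, dtype=float)
    # Weighted average of gradient outer products: (p, n) @ (n, p) -> (p, p)
    M = (grad_f * w[:, None]).T @ grad_f / w.sum()
    eigval, eigvec = np.linalg.eigh(M)   # eigenvalues in ascending order
    return eigvec[:, ::-1][:, :d]        # leading d directions

if __name__ == "__main__":
    # Toy single-index binary model: P(Y=1|X) depends on X only through X @ beta.
    rng = np.random.default_rng(0)
    n, p, d = 500, 5, 1
    beta = np.zeros(p); beta[0] = 1.0    # true direction spanning the subspace
    X = rng.standard_normal((n, p))
    prob = 1.0 / (1.0 + np.exp(-(X @ beta)))
    y = rng.binomial(1, prob)            # binary labels (context; unused below)
    # Illustration only: plug in gradients of the true probability surface;
    # in practice grad_f would be estimated from a fitted margin classifier.
    grad = (prob * (1.0 - prob))[:, None] * beta
    B = wopg_directions(grad, d=d)
    print("estimated direction:", B.ravel())
```

Up to sign, the printed direction should align with `beta`, since the gradients of a single-index model all lie along the true direction.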
| Original language | English |
| --- | --- |
| Article number | 199 |
| Journal | Journal of Machine Learning Research |
| Volume | 23 |
| Publication status | Published - 2022 Jul 1 |
Bibliographical note
Funding Information: We thank the action editor, Ryan Tibshirani, and two anonymous reviewers for their constructive comments and suggestions, which have significantly improved the article. This work is supported by National Research Foundation of Korea (NRF) grants funded by the Korea government (MSIT), grant numbers 2018R1D1A1B07043034 and 2019R1A4A1028134.
Publisher Copyright:
©2022 Jongkyeong Kang and Seung Jun Shin.
Keywords
- dimension reduction
- Fisher consistency
- gradient learning
- large-margin classifier
- outer-product gradient
ASJC Scopus subject areas
- Software
- Control and Systems Engineering
- Statistics and Probability
- Artificial Intelligence