Principal weighted support vector machines for sufficient dimension reduction in binary classification

Seung Jun Shin, Yichao Wu, Hao Helen Zhang, Yufeng Liu

    Research output: Contribution to journal › Article › peer-review

    33 Citations (Scopus)

    Abstract

    Sufficient dimension reduction is popular for reducing data dimensionality without stringent model assumptions. However, most existing methods may work poorly for binary classification. For example, sliced inverse regression (Li, 1991) can estimate at most one direction if the response is binary. In this paper we propose principal weighted support vector machines, a unified framework for linear and nonlinear sufficient dimension reduction in binary classification. Its asymptotic properties are studied, and an efficient computing algorithm is proposed. Numerical examples demonstrate its performance in binary classification.
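    The algorithm itself is not reproduced on this page, but the idea summarized in the abstract (replace the slicing step of sliced inverse regression, which fails for a binary response, with a family of weighted support vector machines and recover reduction directions from their fitted normal vectors) can be sketched roughly as below. This is a minimal illustrative sketch for the linear case only, not the authors' algorithm or code: the function name pwsvm_directions, the weight grid, the class-weighting convention, and the use of scikit-learn's SVC with per-sample weights as a stand-in for a weighted hinge-loss solver are all assumptions made for illustration; the nonlinear (reproducing kernel Hilbert space) version described in the paper is omitted.

    ```python
    # Illustrative sketch (not the authors' implementation) of a principal
    # weighted SVM style procedure: fit a sequence of class-weighted linear
    # SVMs over a grid of weights, then take the leading eigenvectors of the
    # aggregated outer products of the fitted normal vectors as estimated
    # sufficient-dimension-reduction directions.
    import numpy as np
    from sklearn.svm import SVC

    def pwsvm_directions(X, y, n_directions=1, weight_grid=None, C=1.0):
        """X: (n, p) predictor matrix; y: labels coded in {-1, +1}.

        Returns the top eigenvectors of M = sum_pi beta_pi beta_pi^T
        computed on standardized predictors (hypothetical helper).
        """
        if weight_grid is None:
            weight_grid = np.linspace(0.1, 0.9, 9)  # assumed grid of weights pi
        # Standardize predictors so that directions are comparable across fits.
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)
        betas = []
        for pi in weight_grid:
            # Assumed weighting convention: weight pi on the y = +1 class and
            # 1 - pi on the y = -1 class (the paper's exact convention may differ).
            sw = np.where(y == 1, pi, 1.0 - pi)
            clf = SVC(kernel="linear", C=C)
            clf.fit(Xs, y, sample_weight=sw)
            betas.append(clf.coef_.ravel())  # normal vector of the fitted hyperplane
        # Aggregate the normal vectors and extract their principal directions.
        M = sum(np.outer(b, b) for b in betas)
        eigval, eigvec = np.linalg.eigh(M)
        order = np.argsort(eigval)[::-1]
        return eigvec[:, order[:n_directions]]
    ```

    Under a linear sufficient dimension reduction model, the leading eigenvectors of the aggregated matrix are expected to span an estimate of the central subspace; in practice the number of directions and the cost parameter C would be chosen by cross-validation or a similar criterion.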

    Original language: English
    Pages (from-to): 67-81
    Number of pages: 15
    Journal: Biometrika
    Volume: 104
    Issue number: 1
    DOIs
    Publication status: Published - 2017 Mar 1

    Bibliographical note

    Funding Information:
    Our research is partially supported by the National Institutes of Health, the National Science Foundation and the National Research Foundation of Korea.

    Publisher Copyright:
    © 2017 Biometrika Trust.

    Keywords

    • Fisher consistency
    • Hyperplane alignment
    • Reproducing kernel Hilbert space
    • Weighted support vector machine

    ASJC Scopus subject areas

    • Statistics and Probability
    • General Mathematics
    • Agricultural and Biological Sciences (miscellaneous)
    • General Agricultural and Biological Sciences
    • Statistics, Probability and Uncertainty
    • Applied Mathematics
