Penalized principal logistic regression for sparse sufficient dimension reduction

Seung Jun Shin, Andreas Artemiou

    Research output: Contribution to journal › Article › peer-review

    15 Citations (Scopus)

    Abstract

    Sufficient dimension reduction (SDR) is a successful tool for reducing the dimensionality of predictors by finding the central subspace, a minimal subspace of predictors that preserves all the regression information. When the predictor dimension is large, it is often assumed that only a small number of predictors are informative. In this regard, sparse SDR is desired to achieve variable selection and dimension reduction simultaneously. We propose principal logistic regression (PLR) as a new SDR tool and further develop its penalized version for sparse SDR. Asymptotic analysis shows that the penalized PLR enjoys the oracle property. Numerical investigation supports the advantageous performance of the proposed methods.
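    The abstract's core idea, that slice-wise binary regressions of the response on the predictors recover directions in the central subspace, can be illustrated with a minimal pure-Python sketch. This is not the authors' penalized estimator (it omits the Max-SCAD penalty and the formal eigen-analysis); the slicing cutoffs, the plain gradient-descent logistic fit, and the direction-averaging step are illustrative assumptions made for this toy example only.

    ```python
    import math
    import random

    def fit_logistic(X, z, lr=0.1, iters=500):
        """Plain full-batch gradient-descent logistic regression; returns the slope vector."""
        p, n = len(X[0]), len(X)
        w, b = [0.0] * p, 0.0
        for _ in range(iters):
            gw, gb = [0.0] * p, 0.0
            for xi, zi in zip(X, z):
                s = b + sum(wj * xj for wj, xj in zip(w, xi))
                err = 1.0 / (1.0 + math.exp(-s)) - zi   # sigmoid(s) - label
                for j in range(p):
                    gw[j] += err * xi[j]
                gb += err
            w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
            b -= lr * gb / n
        return w

    # Toy single-index model: y depends on x only through beta' x;
    # the last two predictors are pure noise.
    random.seed(0)
    beta = [1.0, -1.0, 0.0, 0.0]
    X = [[random.gauss(0, 1) for _ in range(4)] for _ in range(400)]
    y = [sum(b * x for b, x in zip(beta, xi)) + 0.1 * random.gauss(0, 1) for xi in X]

    # Dichotomize y at a few quantiles, fit a logistic regression per slice,
    # and average the normalized coefficient vectors as the estimated direction.
    y_sorted = sorted(y)
    directions = []
    for q in (0.25, 0.5, 0.75):
        c = y_sorted[int(q * len(y))]
        z = [1.0 if yi > c else 0.0 for yi in y]
        w = fit_logistic(X, z)
        nrm = math.sqrt(sum(wj * wj for wj in w))
        directions.append([wj / nrm for wj in w])

    est = [sum(d[j] for d in directions) / len(directions) for j in range(4)]
    nrm = math.sqrt(sum(e * e for e in est))
    est = [e / nrm for e in est]

    # Cosine similarity between the estimated and true directions.
    true = [b / math.sqrt(2.0) for b in beta]
    cos = abs(sum(e * t for e, t in zip(est, true)))
    ```

    Under a single-index model each slice's logistic coefficient vector is (approximately) proportional to the true direction, so the averaged estimate aligns closely with `beta`; the penalized PLR of the paper would additionally shrink the two noise coordinates exactly to zero.
    
    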

    Original language: English
    Pages (from-to): 48-58
    Number of pages: 11
    Journal: Computational Statistics and Data Analysis
    Volume: 111
    DOIs
    Publication status: Published - 2017 Jul 1

    Bibliographical note

    Publisher Copyright:
    © 2016 Elsevier B.V.

    Keywords

    • Max-SCAD penalty
    • Principal logistic regression
    • Sparse sufficient dimension reduction
    • Sufficient dimension reduction

    ASJC Scopus subject areas

    • Statistics and Probability
    • Computational Mathematics
    • Computational Theory and Mathematics
    • Applied Mathematics
