Variable selection in AUC-optimizing classification

Research output: Contribution to journal › Article › peer-review

Abstract

The receiver operating characteristic (ROC) curve is a popular tool for evaluating a binary classifier under the imbalanced scenarios frequently encountered in practice. A practical approach to constructing a linear binary classifier is presented that simultaneously optimizes the area under the ROC curve (AUC) and selects informative variables in high dimensions. In particular, the smoothly clipped absolute deviation (SCAD) penalty is employed, and its oracle property is established, which enables the development of a consistent BIC-type information criterion that greatly facilitates the tuning procedure. Both simulated and real data analyses demonstrate the promising performance of the proposed method in terms of AUC optimization and variable selection.
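The paper itself contains the precise formulation. As a rough illustration of the two ingredients named in the abstract, a smoothed empirical AUC objective and the SCAD penalty, the following Python sketch fits a sparse linear score by proximal gradient ascent. The sigmoid smoothing bandwidth `h`, the proximal-gradient scheme, the SCAD thresholding step, and all tuning values are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def scad_penalty(beta, lam, a=3.7):
    """SCAD penalty (Fan & Li, 2001), summed over coordinates."""
    b = np.abs(beta)
    quad = (2 * a * lam * b - b ** 2 - lam ** 2) / (2 * (a - 1))  # lam < |b| <= a*lam
    flat = np.full_like(b, lam ** 2 * (a + 1) / 2)                # |b| > a*lam
    return np.where(b <= lam, lam * b, np.where(b <= a * lam, quad, flat)).sum()

def scad_threshold(z, lam, a=3.7):
    """Closed-form SCAD thresholding operator, used here as a proximal step."""
    az = np.abs(z)
    soft = np.sign(z) * np.maximum(az - lam, 0.0)            # |z| <= 2*lam
    mid = ((a - 1) * z - np.sign(z) * a * lam) / (a - 2)     # 2*lam < |z| <= a*lam
    return np.where(az <= 2 * lam, soft, np.where(az <= a * lam, mid, z))

def empirical_auc(scores, y):
    """Plain (unsmoothed) empirical AUC over all positive/negative pairs."""
    d = scores[y == 1][:, None] - scores[y == 0][None, :]
    return np.mean((d > 0) + 0.5 * (d == 0))

def fit_auc_scad(X, y, lam=0.1, h=0.5, step=0.5, n_iter=300):
    """Proximal gradient ascent on a sigmoid-smoothed AUC with a SCAD prox.

    Smoothed AUC = mean over pairs of sigmoid((score_pos - score_neg) / h).
    """
    pos, neg = X[y == 1], X[y == 0]
    n_pairs = len(pos) * len(neg)
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        d = (pos @ beta)[:, None] - (neg @ beta)[None, :]    # pairwise score gaps
        sig = 1.0 / (1.0 + np.exp(-d / h))
        w = sig * (1.0 - sig) / h                            # sigmoid'(gap / h) / h
        grad = (pos.T @ w.sum(axis=1) - neg.T @ w.sum(axis=0)) / n_pairs
        beta = scad_threshold(beta + step * grad, step * lam)
    return beta

# Toy data: only the first two of six predictors carry signal.
rng = np.random.default_rng(0)
n, p = 300, 6
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, 1.0, 0.0, 0.0, 0.0, 0.0])
y = (X @ beta_true + 0.5 * rng.standard_normal(n) > 0).astype(int)

beta_hat = fit_auc_scad(X, y)
auc_hat = empirical_auc(X @ beta_hat, y)
```

Because the AUC is invariant to rescaling of the score, the fitted `beta_hat` is identified only up to a positive scalar; the SCAD thresholding supplies the sparsity, leaving large coefficients essentially unpenalized, which is the behavior behind the oracle property mentioned in the abstract.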

Original language: English
Article number: 108256
Journal: Computational Statistics and Data Analysis
Volume: 213
Publication status: Published - Jan 2026

Bibliographical note

Publisher Copyright:
© 2025 Elsevier B.V.

Keywords

  • Diverging predictors
  • Information criterion
  • Oracle property
  • ROC curve
  • SCAD penalty
  • Variable selection

ASJC Scopus subject areas

  • Statistics and Probability
  • Computational Theory and Mathematics
  • Computational Mathematics
  • Applied Mathematics
