Naive Bayes classifiers boosted by sufficient dimension reduction: applications to top-k classification

Su Hyeong Yang, Seung Jun Shin, Wooseok Sung, Choon Won Lee

Research output: Contribution to journal › Article › peer-review


The naive Bayes classifier is one of the most straightforward classification tools and directly estimates the class probability. However, because it relies on the independence assumption on the predictors, which is rarely satisfied in real-world problems, its application is limited in practice. In this article, we propose employing sufficient dimension reduction (SDR) to substantially improve the performance of the naive Bayes classifier, which often deteriorates when the number of predictors is not restrictively small. This is not surprising, as SDR reduces the predictor dimension without sacrificing classification information, and the predictors in the reduced space are constructed to be uncorrelated. Therefore, SDR makes the naive Bayes classifier no longer naive. We applied the proposed naive Bayes classifier after SDR to build a recommendation system for eyewear frames based on customers’ face shapes, demonstrating its utility in the top-k classification problem.
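The pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it substitutes linear discriminant analysis for the SDR step (LDA is one simple linear reduction for classification; the paper's specific SDR method is not given here), applies a Gaussian naive Bayes classifier on the reduced scores, and extracts the top-k classes from the estimated class probabilities. The iris data set stands in for the eyewear application.

```python
# Sketch (assumed pipeline): SDR-like reduction via LDA, then naive Bayes
# on the reduced, approximately uncorrelated predictors, then top-k labels.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)

# Step 1: reduce the predictor dimension (LDA used as a stand-in for SDR).
lda = LinearDiscriminantAnalysis(n_components=2)
Z = lda.fit_transform(X, y)

# Step 2: naive Bayes on the reduced space; the independence assumption
# is far more plausible for the uncorrelated reduced predictors.
nb = GaussianNB().fit(Z, y)
proba = nb.predict_proba(Z)  # directly estimated class probabilities

# Step 3: top-k classification — recommend the k most probable classes.
k = 2
topk = np.argsort(proba, axis=1)[:, -k:]

# Fraction of samples whose true class is among the top-k recommendations.
topk_acc = np.mean([y[i] in topk[i] for i in range(len(y))])
```

In a top-k recommendation setting, `topk` plays the role of the k eyewear-frame suggestions shown to each customer, and `topk_acc` is the usual top-k accuracy.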

Original language: English
Pages (from-to): 603-614
Number of pages: 12
Journal: Communications for Statistical Applications and Methods
Issue number: 5
Publication status: Published - 2022

Bibliographical note

Publisher Copyright:
© 2022 The Korean Statistical Society, and Korean International Statistical Society. All rights reserved.


Keywords

  • Dimension reduction
  • Recommendation system
  • Soft classification
  • Top-k classification

ASJC Scopus subject areas

  • Statistics and Probability
  • Modelling and Simulation
  • Finance
  • Statistics, Probability and Uncertainty
  • Applied Mathematics


