Improving Monte Carlo dropout uncertainty estimation with stable output layers

  • Suhan Son
  • Junhee Seok*
  • *Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Uncertainty estimation in neural networks is important for reliable predictions. Various statistical methodologies, each with its own characteristics, are used to estimate uncertainty. The bootstrap provides robust uncertainty estimation, but it incurs a high computational cost because training must be repeated. Monte Carlo dropout (MC dropout) approximates Bayesian inference without additional training, but it can induce excessive uncertainty by applying dropout at every layer. This study proposes MC dropout simulation with a Stable Output Layer (SOL) to address these issues. Our method, SOL MC dropout, requires the same amount of time as standard MC dropout but produces improved uncertainty estimation. It provides a bootstrap-like robust prediction distribution at a much lower computational cost. Experiments on benchmark datasets show that SOL MC dropout provides enhanced uncertainty estimation while maintaining the prediction performance of standard MC dropout. These results suggest that SOL MC dropout can be an efficient and practical approach for uncertainty estimation.
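The abstract does not specify the SOL architecture in detail, so the following is only a minimal NumPy sketch under one assumption: that "stable output layer" means the dropout noise feeding the output layer is removed at inference, while standard MC dropout keeps dropout active at every layer. The tiny MLP, its weights, and the `drop_before_output` flag are illustrative inventions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-hidden-layer MLP with fixed random weights (illustration only).
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 16))
W3 = rng.normal(size=(16, 1))

def forward(x, p=0.5, drop_before_output=True):
    """One stochastic forward pass with inverted dropout kept on at test time."""
    h = np.maximum(x @ W1, 0.0)
    h *= rng.binomial(1, 1 - p, h.shape) / (1 - p)   # dropout after hidden layer 1
    h = np.maximum(h @ W2, 0.0)
    if drop_before_output:                           # standard MC dropout: noise at every layer
        h *= rng.binomial(1, 1 - p, h.shape) / (1 - p)
    return h @ W3                                    # deterministic output layer

x = rng.normal(size=(1, 4))
std_preds = np.array([forward(x, drop_before_output=True) for _ in range(200)])
sol_preds = np.array([forward(x, drop_before_output=False) for _ in range(200)])

# Skipping the dropout that feeds the output layer removes one noise source,
# so the SOL-style predictive spread is typically narrower (less excessive
# uncertainty), at the same per-pass cost as standard MC dropout.
print("std MC dropout spread:", std_preds.std())
print("SOL-style spread:     ", sol_preds.std())
```

Both variants run the same number of forward passes, which is consistent with the abstract's claim that SOL MC dropout requires the same amount of time as standard MC dropout.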

Original language: English
Article number: 131927
Journal: Neurocomputing
Volume: 661
DOIs
Publication status: Published - 2026 Jan 14

Bibliographical note

Publisher Copyright:
© 2025 Elsevier B.V.

Keywords

  • Bootstrap
  • Monte Carlo dropout
  • Uncertainty estimation

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
