Resolving challenges in deep learning-based analyses of histopathological images using explanation methods

Miriam Hägele, Philipp Seegerer, Sebastian Lapuschkin, Michael Bockmayr, Wojciech Samek, Frederick Klauschen, Klaus-Robert Müller, Alexander Binder

Research output: Contribution to journal › Article › peer-review

101 Citations (Scopus)


Deep learning has recently gained popularity in digital pathology due to its high prediction quality. However, the medical domain requires explanation and insight for a better understanding beyond standard quantitative performance evaluation, and many explanation methods have recently emerged. This work shows how heatmaps generated by these explanation methods make it possible to resolve common challenges encountered in deep learning-based digital histopathology analyses. We elaborate on biases which are typically inherent in histopathological image data. In the binary classification task of tumour tissue discrimination in publicly available haematoxylin-eosin-stained images of various tumour entities, we investigate three types of biases: (1) biases which affect the entire dataset, (2) biases which are by chance correlated with class labels and (3) sampling biases. While standard analyses focus on patch-level evaluation, we advocate pixel-wise heatmaps, which offer a more precise and versatile diagnostic instrument. This insight is shown to be helpful not only for detecting but also for removing the effects of common hidden biases, which improves generalisation within and across datasets. For example, we observed a trend towards an improvement of 5% in the area under the receiver operating characteristic (ROC) curve when reducing a labelling bias. Explanation techniques are thus demonstrated to be a helpful and highly relevant tool for the development and deployment phases within the life cycle of real-world applications in digital pathology.
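As a minimal sketch of what a pixel-wise attribution heatmap means, the toy example below uses a gradient-times-input rule on a linear patch scorer; for a linear model this assigns each pixel its exact additive contribution to the prediction score. This is an illustrative stand-in, not the authors' method: the paper applies explanation methods to deep networks, and all names and shapes here are hypothetical.

```python
import numpy as np

def pixelwise_relevance(patch, weights):
    """Gradient-times-input relevance for a linear scorer f(x) = sum(w * x).

    Each pixel's relevance is its elementwise contribution to the score.
    Deep-network explanation methods (e.g. LRP) produce analogous
    pixel-wise maps; this linear case is only an illustration.
    """
    return weights * patch

rng = np.random.default_rng(0)
patch = rng.random((4, 4))             # toy 4x4 image patch
weights = rng.standard_normal((4, 4))  # toy linear "classifier" weights

heatmap = pixelwise_relevance(patch, weights)

# Conservation property in the linear case: relevances sum to the score.
score = float((weights * patch).sum())
print(np.isclose(heatmap.sum(), score))
```

The conservation check is what makes such a map a diagnostic instrument: high-relevance pixels directly account for the prediction, so a bias (e.g. a slide artefact) that drives the score shows up as concentrated relevance in the heatmap.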

Original language: English
Article number: 6423
Journal: Scientific Reports
Issue number: 1
Publication status: Published - 2020 Dec 1
Externally published: Yes

Bibliographical note

Funding Information:
This work was supported by the German Ministry for Education and Research as BIFOLD - Berlin Institute for the Foundations of Learning and Data (ref. 01IS18025A and ref. 01IS18037A). KRM is also supported by the Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (No. 2017-0-00451). SL and WS are supported by BMBF TraMeExCo grant 01IS18056A. PS is supported by BMBF MALT III grant 01IS17058. AB was supported by Ministry of Education Tier 2 Grant T2MOE1708 and the STEE-SUTD Cyber Security Laboratory. This publication only reflects the authors' views. Funding agencies are not liable for any use that may be made of the information contained herein.

Publisher Copyright:
© 2020, The Author(s).

ASJC Scopus subject areas

  • General
