Deep feature learning for pulmonary nodule classification in a lung CT

Bum Chae Kim, Yu Sub Sung, Heung Il Suk

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

44 Citations (Scopus)

Abstract

In this paper, we propose a novel method of identifying pulmonary nodules in a lung CT. Specifically, we devise a deep neural network with which we extract abstract information inherent in raw hand-crafted imaging features. We then combine the deep-learned representations with the original raw imaging features into a long feature vector. Using the combined feature vectors, we train a classifier, preceded by feature selection via a t-test. To validate the effectiveness of the proposed method, we performed experiments on our in-house dataset of 20 subjects with 3,598 pulmonary nodules (malignant: 178, benign: 3,420), manually segmented by a radiologist. In our experiments, we achieved a maximal accuracy of 95.5%, sensitivity of 94.4%, and AUC of 0.987, outperforming the competing method.
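The abstract describes a concrete pipeline: hand-crafted nodule descriptors pass through a stacked denoising autoencoder (per the keywords below), the learned representation is concatenated with the raw features, a two-sample t-test screens the combined features, and a classifier is trained on the survivors. Below is a minimal sketch of that pipeline. The layer sizes, corruption noise, training settings, p-value threshold, linear-SVM classifier, and the toy data are illustrative assumptions, not the paper's reported configuration.

```python
# Hedged sketch of the abstract's pipeline: stacked denoising autoencoder
# features + raw features -> t-test selection -> classifier.
import numpy as np
import torch
import torch.nn as nn
from scipy.stats import ttest_ind
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_denoising_autoencoder(X, hidden_dim=64, noise_std=0.1,
                                epochs=100, lr=1e-3):
    """Train one denoising-autoencoder layer and return its encoder."""
    X_t = torch.tensor(X, dtype=torch.float32)
    enc = nn.Sequential(nn.Linear(X.shape[1], hidden_dim), nn.Sigmoid())
    dec = nn.Linear(hidden_dim, X.shape[1])
    opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()),
                           lr=lr)
    for _ in range(epochs):
        noisy = X_t + noise_std * torch.randn_like(X_t)  # corrupt the input
        loss = nn.functional.mse_loss(dec(enc(noisy)), X_t)  # rebuild clean
        opt.zero_grad(); loss.backward(); opt.step()
    return enc

# Toy stand-in for the hand-crafted imaging features (assumption): rows are
# nodules, columns are descriptors; labels 1 = malignant, 0 = benign.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 96)).astype(np.float32)
y = rng.integers(0, 2, size=400)
X = StandardScaler().fit_transform(X).astype(np.float32)

# Greedy layer-wise training of two DAE layers (the depth is an assumption).
enc1 = train_denoising_autoencoder(X, hidden_dim=64)
Z1 = enc1(torch.tensor(X)).detach().numpy()
enc2 = train_denoising_autoencoder(Z1, hidden_dim=32)
Z2 = enc2(torch.tensor(Z1)).detach().numpy()

# Concatenate the deep-learned representation with the raw features.
F = np.hstack([X, Z2])

# Feature selection via a two-sample t-test between classes (p < 0.05 is an
# assumed threshold; the abstract does not state one).
_, p = ttest_ind(F[y == 1], F[y == 0], axis=0)
F_sel = F[:, p < 0.05]

# Train the final classifier (a linear SVM here; the paper's choice may differ).
clf = SVC(kernel="linear", probability=True).fit(F_sel, y)
```

Greedy layer-wise pretraining of each denoising autoencoder mirrors the standard SDAE recipe; the abstract does not state the network's exact depth or the final classifier, so both are placeholders here.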

Original language: English
Title of host publication: 4th International Winter Conference on Brain-Computer Interface, BCI 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781467378413
DOIs
Publication status: Published - 2016 Apr 20
Event: 4th International Winter Conference on Brain-Computer Interface, BCI 2016 - Gangwon Province, Korea, Republic of
Duration: 2016 Feb 22 – 2016 Feb 24

Publication series

Name: 4th International Winter Conference on Brain-Computer Interface, BCI 2016

Other

Other: 4th International Winter Conference on Brain-Computer Interface, BCI 2016
Country/Territory: Korea, Republic of
City: Gangwon Province
Period: 16/2/22 – 16/2/24

Keywords

  • Deep learning
  • Lung cancer
  • Pulmonary nodule classification
  • Stacked denoising autoencoder

ASJC Scopus subject areas

  • Human-Computer Interaction
