Mutual Information-Driven Subject-Invariant and Class-Relevant Deep Representation Learning in BCI

Eunjin Jeon, Wonjun Ko, Jee Seok Yoon, Heung-Il Suk

    Research output: Contribution to journal › Article › peer-review

    28 Citations (Scopus)

    Abstract

    In recent years, deep learning-based feature representation methods have shown a promising impact on electroencephalography (EEG)-based brain-computer interfaces (BCIs). Nonetheless, owing to high intra- and inter-subject variability, many EEG decoding studies have been designed in a subject-specific manner using calibration samples, which limits practical use because calibration is time-consuming and requires a large amount of data. To alleviate this, recent studies have adopted transfer learning strategies, especially domain adaptation techniques, and adversarial learning-based transfer learning in particular has shown potential in BCIs. However, adversarial learning-based domain adaptation methods are known to be prone to negative transfer, which disrupts the learning of generalized feature representations applicable to diverse domains, for example, subjects or sessions in BCIs. In this article, we propose a novel framework that learns class-relevant and subject-invariant feature representations in an information-theoretic manner, without using adversarial learning. Specifically, we devise two operational components in a deep network that explicitly estimate mutual information between feature representations: 1) one that decomposes features in an intermediate layer into class-relevant and class-irrelevant ones and 2) one that enriches the class-discriminative feature representation. On two large EEG datasets, we validated the effectiveness of our proposed framework by comparing its performance with that of several competing methods. Furthermore, we conducted rigorous analyses: an ablation study of the components of our network, an explanation of the model's decisions on input EEG signals via layer-wise relevance propagation, and a visualization of the distribution of learned features via t-SNE.
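
    As a rough illustration of the central idea, explicitly estimating mutual information between feature representations, the sketch below shows one common way such an estimate can be computed: a MINE-style Donsker-Varadhan lower bound implemented in PyTorch. This is not the authors' released code; the class name MIEstimator, the 64/64 feature split, the network sizes, and the framework choice are all assumptions made for illustration.

# A minimal, hypothetical sketch (not the authors' implementation) of a neural
# mutual-information estimator, usable to penalize MI between a class-relevant
# and a class-irrelevant feature split, or to reward MI between class-relevant
# features and label information.
import torch
import torch.nn as nn


class MIEstimator(nn.Module):
    """Estimates a Donsker-Varadhan lower bound on I(X; Y) from paired batches."""

    def __init__(self, dim_x: int, dim_y: int, hidden: int = 128):
        super().__init__()
        # Statistics network T(x, y): a small MLP over concatenated features.
        self.net = nn.Sequential(
            nn.Linear(dim_x + dim_y, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # Joint samples keep the pairing (x_i, y_i); "marginal" samples break
        # the pairing by shuffling y within the batch.
        t_joint = self.net(torch.cat([x, y], dim=1)).squeeze(1)
        y_shuffled = y[torch.randperm(y.size(0))]
        t_marginal = self.net(torch.cat([x, y_shuffled], dim=1)).squeeze(1)
        # Donsker-Varadhan bound: E_joint[T] - log E_marginal[exp(T)].
        log_mean_exp = torch.logsumexp(t_marginal, dim=0) - torch.log(
            torch.tensor(float(t_marginal.numel()))
        )
        return t_joint.mean() - log_mean_exp


if __name__ == "__main__":
    # Illustrative use only: split an intermediate feature batch into a
    # "class-relevant" half and a "class-irrelevant" half, then estimate the
    # MI between the two halves (a term one could minimize for decomposition).
    features = torch.randn(32, 128)          # e.g., an intermediate EEG feature batch
    z_relevant, z_irrelevant = features[:, :64], features[:, 64:]
    estimator = MIEstimator(dim_x=64, dim_y=64)
    mi_lower_bound = estimator(z_relevant, z_irrelevant)
    print(float(mi_lower_bound))

    In a full training loop, such an estimate would be one term among others (e.g., a classification loss and an MI-maximization term for class-discriminative features); the snippet above only sketches the estimation step under the stated assumptions.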

    Original language: English
    Pages (from-to): 739-749
    Number of pages: 11
    Journal: IEEE Transactions on Neural Networks and Learning Systems
    Volume: 34
    Issue number: 2
    DOIs
    Publication status: Published - 2023 Feb 1

    Bibliographical note

    Funding Information:
    This work was supported in part by the Institute for Information and Communications Technology Promotion (IITP) grant funded by the Korea Government under Grant 2017-0-00451 (Development of BCI based Brain and Cognitive Computing Technology for Recognizing User's Intentions using Deep Learning) and in part by the Institute of Information and Communications Technology Planning and Evaluation (IITP) grant funded by the Korea Government (MSIT) under Grant 2019-0-00079 [Department of Artificial Intelligence (Korea University)].

    Publisher Copyright:
    © 2020 IEEE.

    Keywords

    • Brain-computer interface (BCI)
    • deep learning
    • domain adaptation
    • electroencephalogram
    • motor imagery
    • mutual information
    • subject-independent
    • transfer learning

    ASJC Scopus subject areas

    • Software
    • Computer Science Applications
    • Computer Networks and Communications
    • Artificial Intelligence
