Epoch-level and Sequence-level Multi-Head Self-Attention-based Sleep Stage Classification

Koohong Jung, Moogyeong Kim, Wonzoo Chung

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

In this paper, we propose a Transformer-based sleep stage classification method that effectively combines epoch-level and sequence-level multi-head self-attention with multiple outputs to aid training. Most existing automated sleep staging methods use convolutional neural networks (CNNs) for feature extraction and recurrent neural networks (RNNs) or Transformers for capturing global dependencies. However, CNNs and RNNs have drawbacks (e.g., large parameter counts, information loss). Therefore, we use a Transformer-only architecture with epoch-level multi-head self-attention to capture intra-epoch dependencies and sequence-level multi-head self-attention to capture inter-epoch dependencies in 30-second PSG signals. Furthermore, a sequence of learnable class tokens is deployed to store a summary of each sleep epoch. As a result, three different outputs are produced, two of which are used as references during backpropagation in training. Numerical simulation results on the Sleep-EDFX78 dataset confirm that the proposed method improves upon existing Transformer-based methods.
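A minimal PyTorch sketch of the two-level attention idea described in the abstract: epoch-level self-attention within each 30-second epoch, one learnable class token per epoch that stores its summary, sequence-level self-attention across the epoch summaries, and three outputs of which two serve as auxiliary references during training. All module names, dimensions, the raw-signal patching, and the exact placement of the three heads are illustrative assumptions, not the authors' implementation; positional encodings are omitted for brevity.

import torch
import torch.nn as nn

class TwoLevelSleepTransformer(nn.Module):
    # Hypothetical sketch; parameter choices are assumptions, not from the paper.
    def __init__(self, patch_size=30, d_model=128, n_heads=8,
                 n_layers=2, n_classes=5):
        super().__init__()
        self.patch_size = patch_size
        # Embed raw-signal patches of each 30-second epoch as tokens.
        self.patch_embed = nn.Linear(patch_size, d_model)
        # One learnable class token summarizes each sleep epoch.
        self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))

        def encoder():
            layer = nn.TransformerEncoderLayer(
                d_model, n_heads, dim_feedforward=4 * d_model,
                batch_first=True)
            return nn.TransformerEncoder(layer, n_layers)

        self.epoch_encoder = encoder()  # intra-epoch (epoch-level) attention
        self.seq_encoder = encoder()    # inter-epoch (sequence-level) attention
        # Three classification heads; two act as auxiliary references
        # during backpropagation, as stated in the abstract.
        self.epoch_head = nn.Linear(d_model, n_classes)
        self.seq_head = nn.Linear(d_model, n_classes)
        self.comb_head = nn.Linear(2 * d_model, n_classes)

    def forward(self, x):
        # x: (batch, n_epochs, samples_per_epoch) single-channel PSG signal,
        # e.g. 3000 samples per 30-second epoch at 100 Hz.
        b, e, s = x.shape
        patches = x.reshape(b * e, s // self.patch_size, self.patch_size)
        tokens = self.patch_embed(patches)                    # (b*e, p, d)
        cls = self.cls_token.expand(b * e, -1, -1)
        tokens = torch.cat([cls, tokens], dim=1)
        epoch_summary = self.epoch_encoder(tokens)[:, 0]      # class-token output
        epoch_summary = epoch_summary.reshape(b, e, -1)       # (b, e, d)
        seq_summary = self.seq_encoder(epoch_summary)         # (b, e, d)
        out_epoch = self.epoch_head(epoch_summary)            # auxiliary output
        out_seq = self.seq_head(seq_summary)                  # auxiliary output
        out_main = self.comb_head(
            torch.cat([epoch_summary, seq_summary], dim=-1))  # main output
        return out_main, out_epoch, out_seq

# Example: 2 sequences of 10 consecutive 30-second epochs sampled at 100 Hz.
model = TwoLevelSleepTransformer()
logits_main, logits_epoch, logits_seq = model(torch.randn(2, 10, 3000))
# Each output has shape (2, 10, 5): per-epoch logits over five sleep stages.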

Original language: English
Title of host publication: 11th International Winter Conference on Brain-Computer Interface, BCI 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665464444
DOIs
Publication status: Published - 2023
Event: 11th International Winter Conference on Brain-Computer Interface, BCI 2023 - Virtual, Online, Korea, Republic of
Duration: 2023 Feb 20 → 2023 Feb 22

Publication series

Name: International Winter Conference on Brain-Computer Interface, BCI
Volume: 2023-February
ISSN (Print): 2572-7672

Conference

Conference: 11th International Winter Conference on Brain-Computer Interface, BCI 2023
Country/Territory: Korea, Republic of
City: Virtual, Online
Period: 23/2/20 → 23/2/22

Bibliographical note

Funding Information:
This work was partly supported by:

  • Institute of Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2019-0-00079, Artificial Intelligence Graduate School Program (Korea University))
  • IITP grant funded by the Korea government (MSIT) (No. 2017-0-00432, Development Of Non-invasive Integrated BCI SW Platform To Control Home Appliance And External Devices By User’s Thought Via AR/VR Interface)
  • IITP grant funded by the Korea government (MSIT) (No. 2017-0-00451, Development of BCI based Brain and Cognitive Computing Technology for Recognizing User’s Intentions using Deep Learning)
  • IITP grant funded by the Korea government (MSIT) (No. 2021-0-02068, Artificial Intelligence Innovation Hub)
  • the BK21 FOUR program through the National Research Foundation (NRF) funded by the Ministry of Education of Korea

Publisher Copyright:
© 2023 IEEE.

Keywords

  • Brain-Computer Interface (BCI)
  • Electroencephalogram (EEG)
  • Sleep Stage Classification
  • Transformer

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Signal Processing
