Abstract
An automated system for detecting a pilot's diverse mental states is an essential technology, as it could prevent catastrophic accidents caused by a deteriorated cognitive state. Various types of biosignals have been employed to develop such systems, since they reflect the neurophysiological changes that accompany mental state transitions. In this study, we investigated the feasibility of a robust detection system for the pilot's mental states (i.e., distraction, workload, fatigue, and normal) based on multimodal biosignals (i.e., electroencephalogram, electrocardiogram, respiration, and electrodermal activity) and a multimodal deep learning (MDL) network. To do this, we first constructed an experimental environment using a flight simulator to induce the different mental states and collect the biosignals. Second, we designed the MDL architecture, consisting of a convolutional neural network and long short-term memory models, to efficiently combine the information from the different biosignals. Our experimental results show that utilizing multimodal biosignals with the proposed MDL significantly enhances the detection accuracy of the pilot's mental states.
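The abstract describes the MDL architecture only at a high level: per-modality feature extraction with a convolutional neural network, temporal modeling with LSTMs, and fusion into a four-class decision. The sketch below is one plausible realization in PyTorch, not the authors' implementation; the channel counts, window length, layer sizes, late-fusion strategy, and all identifiers are assumptions made for illustration.

```python
# Illustrative sketch (not the authors' code): a multimodal CNN + LSTM
# classifier that fuses EEG, ECG, respiration, and EDA windows into one of
# four pilot mental states (normal, distraction, workload, fatigue).
# Channel counts, window length, layer sizes, and the fusion strategy are
# assumptions for this example.
import torch
import torch.nn as nn


class ModalityBranch(nn.Module):
    """1-D CNN feature extractor followed by an LSTM over the time axis."""

    def __init__(self, in_channels: int, hidden_size: int = 64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.lstm = nn.LSTM(input_size=64, hidden_size=hidden_size,
                            batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, samples) -> CNN features -> (batch, time, feat)
        feats = self.cnn(x).transpose(1, 2)
        _, (h_n, _) = self.lstm(feats)
        return h_n[-1]  # last hidden state as the modality embedding


class MultimodalMDL(nn.Module):
    """Late fusion of per-modality embeddings into a 4-class output."""

    def __init__(self, n_classes: int = 4):
        super().__init__()
        # Assumed channel counts: 32-ch EEG, 1-ch ECG, 1-ch respiration, 1-ch EDA.
        self.branches = nn.ModuleDict({
            "eeg": ModalityBranch(32),
            "ecg": ModalityBranch(1),
            "resp": ModalityBranch(1),
            "eda": ModalityBranch(1),
        })
        self.classifier = nn.Sequential(
            nn.Linear(64 * 4, 128),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(128, n_classes),
        )

    def forward(self, inputs: dict) -> torch.Tensor:
        fused = torch.cat([self.branches[m](inputs[m]) for m in self.branches],
                          dim=1)
        return self.classifier(fused)


if __name__ == "__main__":
    # One 10-second window per modality at an assumed 256 Hz sampling rate.
    batch, samples = 8, 2560
    dummy = {
        "eeg": torch.randn(batch, 32, samples),
        "ecg": torch.randn(batch, 1, samples),
        "resp": torch.randn(batch, 1, samples),
        "eda": torch.randn(batch, 1, samples),
    }
    logits = MultimodalMDL()(dummy)
    print(logits.shape)  # torch.Size([8, 4])
```

Late fusion of independent CNN–LSTM branches is chosen here only because it is a common way to combine biosignals sampled from heterogeneous sensors; the paper's actual fusion point and hyperparameters may differ.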
| Original language | English |
| --- | --- |
| Pages (from-to) | 324-336 |
| Number of pages | 13 |
| Journal | Biocybernetics and Biomedical Engineering |
| Volume | 40 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 2020 Jan 1 |
Bibliographical note
Funding Information: This work was supported by the Defense Acquisition Program Administration (DAPA) and the Agency for Defense Development (ADD) of Korea (06-201-305-001, A Study on Human–Computer Interaction Technology for the Pilot Status Recognition).
Publisher Copyright:
© 2019
Keywords
- ECG
- EDA
- EEG
- MDL
- Pilot's mental states
- Respiration
ASJC Scopus subject areas
- Biomedical Engineering