Emotion recognition research has been conducted using various physiological signals. In this paper, we propose an efficient photoplethysmogram-based method that fuses the deep features extracted by two deep convolutional neural networks with the statistical features selected by Pearson's correlation technique. A photoplethysmogram (PPG) signal can be easily obtained through many devices, and the procedure for recording this signal is simpler than that for other physiological signals. The normal-to-normal (NN) interval values of heart rate variability (HRV) were utilized to extract the time-domain features, and the normalized PPG signal was used to acquire the frequency-domain features. We then selected the features that correlated highly with emotion using Pearson's correlation coefficient. These statistical features were fused with deep-learning features extracted by a convolutional neural network (CNN). The PPG signal and the NN intervals were used as the inputs of the CNN to extract the deep features, and the concatenated feature set was used to classify valence and arousal, the basic dimensions of emotion. The Database for Emotion Analysis using Physiological signals (DEAP) was chosen for the experiment, and the results demonstrated that the proposed method achieved notable performance with a short recognition interval.
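The feature-selection and fusion steps described above can be sketched as follows. This is a minimal illustration, not the authors' code: the feature matrices, the correlation threshold, and all shapes are assumptions made for the example. Statistical features whose absolute Pearson correlation with the label exceeds a threshold are kept and concatenated with stand-in CNN features.

```python
# Sketch of Pearson-based feature selection followed by fusion with deep
# features. All data, shapes, and the threshold are illustrative assumptions.
import numpy as np

def select_by_pearson(stat_feats, labels, threshold=0.2):
    """Keep columns of stat_feats whose |Pearson r| with labels exceeds threshold."""
    r = np.array([np.corrcoef(stat_feats[:, j], labels)[0, 1]
                  for j in range(stat_feats.shape[1])])
    keep = np.abs(r) > threshold
    return stat_feats[:, keep], keep

def fuse(stat_selected, deep_feats):
    """Concatenate selected statistical features with CNN deep features."""
    return np.concatenate([stat_selected, deep_feats], axis=1)

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=64).astype(float)   # e.g. binary valence label
stat = rng.normal(size=(64, 10))                     # hypothetical HRV/PPG statistics
stat[:, 0] = labels + 0.1 * rng.normal(size=64)      # one deliberately informative feature
deep = rng.normal(size=(64, 32))                     # stand-in for CNN-extracted features

selected, mask = select_by_pearson(stat, labels)
fused = fuse(selected, deep)                         # input to the final classifier
```

In the paper's pipeline the fused vector would feed a classifier for valence and arousal; here the classifier is omitted and only the selection-and-concatenation step is shown.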
Bibliographical note
Funding Information:
This research was supported by a National Research Foundation of Korea (NRF) grant funded by the Ministry of Science and ICT (NRF-2019R1A2C1089742).
© 2020 by the authors.
Keywords
- Convolutional neural network
- Emotion recognition
- Feature fusion
- Statistical feature
ASJC Scopus subject areas
- General Materials Science
- General Engineering
- Process Chemistry and Technology
- Computer Science Applications
- Fluid Flow and Transfer Processes