We present a mobile dataset comprising scalp and around-the-ear electroencephalography (EEG) together with locomotion-sensor recordings from 24 participants moving at four different speeds while performing two brain-computer interface (BCI) tasks. The data were collected from 32-channel scalp-EEG, 14-channel ear-EEG, 4-channel electrooculography, and 9-channel inertial measurement units placed on the forehead, left ankle, and right ankle. The recording conditions were standing, slow walking, fast walking, and slight running, at speeds of 0, 0.8, 1.6, and 2.0 m/s, respectively. At each speed, two BCI paradigms were recorded: event-related potential and steady-state visual evoked potential. To evaluate signal quality, the scalp- and ear-EEG data were validated qualitatively and quantitatively at each speed. We believe this dataset will facilitate BCI research in diverse mobile environments by enabling analysis of brain activity and quantitative evaluation of BCI performance, thereby expanding the use of practical BCIs.
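The experimental design described above can be summarized in a small data structure. The sketch below is purely illustrative: the variable names and session-enumeration helper are our own and do not reflect the dataset's actual file layout or identifiers; only the speeds, channel counts, paradigms, and participant count come from the description.

```python
# Illustrative summary of the recording conditions (names are ours,
# not the dataset's actual file or variable names).

# Locomotion conditions and treadmill speeds (m/s)
SPEED_CONDITIONS = {
    "standing": 0.0,
    "slow_walking": 0.8,
    "fast_walking": 1.6,
    "slight_running": 2.0,
}

# Recorded modalities and their channel counts
MODALITIES = {
    "scalp_EEG": 32,
    "ear_EEG": 14,
    "EOG": 4,
    "IMU_forehead": 9,
    "IMU_left_ankle": 9,
    "IMU_right_ankle": 9,
}

# BCI paradigms recorded at every speed
PARADIGMS = ("ERP", "SSVEP")

def sessions(n_participants: int = 24):
    """Enumerate every (participant, speed condition, paradigm) combination."""
    for p in range(1, n_participants + 1):
        for condition in SPEED_CONDITIONS:
            for paradigm in PARADIGMS:
                yield p, condition, paradigm

# 24 participants x 4 speeds x 2 paradigms = 192 condition combinations
n_sessions = sum(1 for _ in sessions())
```

Enumerating combinations this way is a common first step when iterating over a multi-condition dataset for per-condition signal-quality analysis.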
Funding Information:
This work was supported by Institute for Information & Communications Technology Planning & Evaluation (IITP) grants funded by the Korean Government (No. 2017-0-00451: Development of BCI based Brain and Cognitive Computing Technology for Recognizing User's Intentions using Deep Learning; No. 2015-0-00185: Development of Intelligent Pattern Recognition Softwares for Ambulatory Brain-Computer Interface; and No. 2019-0-00079: Artificial Intelligence Graduate School Program (Korea University)). We would like to express our sincere gratitude to Y.-H. Kang and D.-Y. Lee for their assistance in data collection, and to N.-S. Kwak for his advice while designing the experiment.
© 2021, The Author(s).