Abstract
In this paper, we propose a highly accurate and fast spelling system that employs multi-modal electroencephalography-electrooculography (EEG-EOG) signals and visual feedback technology. Over the last 20 years, various types of speller systems have been developed in brain-computer interface and EOG/eye-tracking research; however, these conventional systems involve a trade-off between spelling (decoding) accuracy and typing speed. Healthy users and physically challenged participants, in particular, may become exhausted quickly; thus, there is a need for a speller system that types quickly while retaining a high level of spelling accuracy. Specifically, we propose the first hybrid speller system that combines EEG and EOG signals with visual feedback technology so that the user and the speller system can act cooperatively for optimal decision-making. The proposed spelling system consists of a classic row-column event-related potential (ERP) speller, an EOG command detector, and visual feedback modules. First, the online ERP speller calculates classification probabilities for all candidate characters from the EEG epochs. Second, the characters are sorted by these probabilities, and those with the highest probabilities are highlighted as visual feedback within the row-column spelling layout. Finally, the user can actively select the target character by generating an EOG command. The proposed system achieved 97.6% spelling accuracy and an information transfer rate of 39.6 (±13.2) [bits/min] across 20 participants. In our extended experiment, we redesigned the visual feedback and reduced the number of channels to four in order to enhance speller performance and increase usability. Most importantly, a new weighted strategy resulted in 100% accuracy and a 57.8 (±23.6) [bits/min] information transfer rate across six participants. This paper demonstrates that the proposed system can provide a reliable communication channel for practical speller applications and may be used to supplement existing systems.
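The abstract outlines a three-step selection pipeline (ERP probabilities → ranked visual feedback → EOG confirmation) and reports performance as an information transfer rate (ITR) in bits/min. The following is a minimal Python sketch of that pipeline together with the standard Wolpaw ITR formula; the classifier probabilities, the top-k value, the 6×6 layout size, and the per-selection time are hypothetical placeholders not specified in the abstract, and the functions are illustrative rather than the authors' implementation.

```python
import numpy as np

def wolpaw_itr_bits_per_min(accuracy, n_classes, selection_time_s):
    """Standard Wolpaw ITR in bits/min.

    accuracy: probability of a correct selection (0..1)
    n_classes: number of selectable characters (e.g., 36 for a 6x6 layout)
    selection_time_s: average time per selection, in seconds (assumed value)
    """
    p, n = accuracy, n_classes
    bits_per_selection = np.log2(n)
    if 0 < p < 1:
        bits_per_selection += p * np.log2(p) + (1 - p) * np.log2((1 - p) / (n - 1))
    return bits_per_selection * (60.0 / selection_time_s)

def rank_candidates(char_probs, top_k=3):
    """Sort candidate characters by ERP classification probability and return
    the top_k candidates to be highlighted as visual feedback."""
    ranked = sorted(char_probs.items(), key=lambda kv: kv[1], reverse=True)
    return [ch for ch, _ in ranked[:top_k]]

# Hypothetical probabilities, as produced by an online ERP classifier
char_probs = {"A": 0.42, "B": 0.31, "C": 0.12, "D": 0.08, "E": 0.07}
highlighted = rank_candidates(char_probs, top_k=3)
print("Highlighted candidates:", highlighted)

# The user confirms one highlighted character with an EOG command;
# here the confirmation of the top-ranked candidate is simulated.
selected = highlighted[0]
print("Selected character:", selected)

# Illustrative ITR, assuming a 36-character layout and ~8 s per selection
print("ITR [bits/min]:", round(wolpaw_itr_bits_per_min(0.976, 36, 8.0), 1))
```

With the assumed layout size and selection time, this reproduces the order of magnitude of the reported ITR; the actual values depend on the trial timing used in the paper.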
| Original language | English |
| --- | --- |
| Pages (from-to) | 1443-1459 |
| Number of pages | 17 |
| Journal | IEEE Transactions on Neural Systems and Rehabilitation Engineering |
| Volume | 26 |
| Issue number | 7 |
| DOIs | |
| Publication status | Published - 2018 Jul |
Bibliographical note
Funding Information: Manuscript received January 16, 2017; revised July 1, 2017, January 27, 2018, and March 2, 2018; accepted March 13, 2018. Date of publication May 21, 2018; date of current version July 6, 2018. This work was supported in part by the Ministry of Science and ICT, South Korea, through the SW Starlab Support Program supervised by the Institute for Information and Communications Technology Promotion under Grant IITP-2015-1107 and in part by the Korean Government under Grant 2017-0-00451: Development of BCI based Brain and Cognitive Computing Technology for Recognizing User's Intentions using Deep Learning. (Corresponding author: Seong-Whan Lee.) M.-H. Lee, D.-O. Won, and S.-W. Lee are with the Department of Brain and Cognitive Engineering, Korea University, Seoul 02841, South Korea (e-mail: [email protected]; [email protected]; [email protected]).
Publisher Copyright:
© 2018 IEEE.
Keywords
- Brain-computer interfaces (BCI)
- P300 speller
- electroencephalography (EEG)
- electrooculogram (EOG)
- visual feedback
ASJC Scopus subject areas
- Internal Medicine
- General Neuroscience
- Biomedical Engineering
- Rehabilitation