2018 IEEE International Conference on Pervasive Computing and Communications (PerCom) (2018)
Athens, Greece
March 19, 2018 to March 23, 2018
ISSN: 2474-249X
ISBN: 978-1-5386-3225-3
pp: 1-10
Xiang Zhang , School of Computer Science & Engineering, University of New South Wales, Sydney, Australia
Lina Yao , School of Computer Science & Engineering, University of New South Wales, Sydney, Australia
Quan Z. Sheng , Department of Computing, Macquarie University, Sydney, Australia
Salil S. Kanhere , School of Computer Science & Engineering, University of New South Wales, Sydney, Australia
Tao Gu , School of Science, RMIT University, Melbourne, Australia
Dalin Zhang , School of Computer Science & Engineering, University of New South Wales, Sydney, Australia
ABSTRACT
An electroencephalography (EEG) based brain-computer interface (BCI) enables people to communicate with the outside world by interpreting the EEG signals of their brains to interact with devices such as wheelchairs and intelligent robots. More specifically, motor imagery EEG (MI-EEG), which reflects a subject's active intent, is attracting increasing attention for a variety of BCI applications. Accurate classification of MI-EEG signals, while essential for the effective operation of BCI systems, is challenging due to the significant noise inherent in the signals and the lack of informative correlation between the signals and the underlying brain activities. In this paper, we propose a novel deep neural network based learning framework that affords perceptive insights into the relationship between MI-EEG data and brain activities. We design a joint convolutional recurrent neural network that simultaneously learns robust high-level feature representations through low-dimensional dense embeddings of raw MI-EEG signals. We also employ an autoencoder layer to eliminate various artifacts such as background activities. The proposed approach has been evaluated extensively on a large-scale public MI-EEG dataset and a smaller but easy-to-deploy dataset collected in our lab. The results show that our approach outperforms a series of baselines and competitive state-of-the-art methods, yielding a classification accuracy of 95.53%. The applicability of the proposed approach is further demonstrated with a practical BCI system for typing.
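The abstract describes a joint convolutional recurrent network with an autoencoder component for denoising. The sketch below illustrates what such an architecture could look like in PyTorch; all layer sizes, the single-conv/single-LSTM layout, and the reconstruction head are illustrative assumptions, not the authors' actual configuration from the paper.

```python
import torch
import torch.nn as nn

class ConvRecurrentAE(nn.Module):
    """Hypothetical sketch of a joint CNN-RNN with an autoencoder-style
    reconstruction head for MI-EEG classification. Hyperparameters are
    assumptions for illustration only."""

    def __init__(self, n_channels=64, hidden=128, n_classes=5):
        super().__init__()
        # Convolutional front end extracts local features along the time axis
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Recurrent layer models temporal dependencies in the feature sequence
        self.rnn = nn.LSTM(32, hidden, batch_first=True)
        # Autoencoder-style decoder reconstructs the input channels,
        # encouraging the embedding to discard noise and background activity
        self.decoder = nn.Linear(hidden, n_channels)
        # Classifier operates on the final low-dimensional embedding
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, x):                        # x: (batch, channels, time)
        f = self.conv(x)                         # (batch, 32, time)
        h, _ = self.rnn(f.transpose(1, 2))       # (batch, time, hidden)
        recon = self.decoder(h)                  # per-step reconstruction
        logits = self.classifier(h[:, -1])       # classify from last state
        return logits, recon

model = ConvRecurrentAE()
x = torch.randn(2, 64, 100)                      # 2 trials, 64 channels, 100 samples
logits, recon = model(x)
print(logits.shape, recon.shape)                 # (2, 5) and (2, 100, 64)
```

In a setup like this, training would typically minimize a weighted sum of the classification loss on `logits` and a reconstruction loss on `recon`, so the embedding stays both discriminative and robust to signal artifacts.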
INDEX TERMS
EEG, brain typing, deep learning, BCI
CITATION

X. Zhang, L. Yao, Q. Z. Sheng, S. S. Kanhere, T. Gu and D. Zhang, "Converting Your Thoughts to Texts: Enabling Brain Typing via Deep Feature Learning of EEG Signals," 2018 IEEE International Conference on Pervasive Computing and Communications (PerCom), Athens, Greece, 2018, pp. 1-10.
doi:10.1109/PERCOM.2018.8444575