IENet: a robust convolutional neural network for EEG based brain-computer interfaces.

Abstract

Brain-computer interfaces (BCIs) based on electroencephalogram (EEG) are developing into novel application areas with more complex scenarios, which places higher demands on the robustness of EEG signal-processing algorithms. Deep learning can automatically extract discriminative features and potential dependencies via deep structures, demonstrating strong analytical capabilities in numerous domains such as computer vision (CV) and natural language processing (NLP). The main work of this paper is to make full use of deep learning to design a robust algorithm capable of analyzing EEG across BCI paradigms. Inspired by the InceptionV4 and InceptionTime architectures, we introduce a neural network ensemble named InceptionEEG-Net (IENet), in which multi-scale convolutional layers and convolutions of length 1 enable the model to extract rich high-dimensional features with limited parameters. In addition, we propose the average receptive field gain for convolutional neural networks (CNNs), which optimizes IENet to detect long patterns at a smaller cost. We compare against the current state-of-the-art methods across five EEG-BCI paradigms: steady-state visual evoked potentials, epilepsy EEG, overt attention P300 visual evoked potentials, covert attention P300 visual evoked potentials, and movement-related cortical potentials. The classification results show that the generalizability of IENet is on par with state-of-the-art paradigm-agnostic models on the test datasets. Furthermore, a feature-explainability analysis of IENet illustrates its capability to extract neurophysiologically interpretable features for different BCI paradigms, ensuring the reliability of the algorithm. Significance: our results show that IENet generalizes to different BCI paradigms, and that increasing the receptive field size via the average receptive field gain is essential for deep CNNs.

Published under a Creative Commons Attribution license.
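The Inception-style idea the abstract describes — several parallel convolutions with different kernel lengths, whose outputs are merged by a length-1 (pointwise) convolution — can be illustrated with a minimal NumPy sketch. This is a hypothetical toy, not IENet itself: the function name, kernel sizes, and filter counts are assumptions, and the weights are random rather than learned.

```python
import numpy as np

def multiscale_block(x, kernel_sizes=(9, 19, 39), n_filters=4, seed=0):
    """Toy Inception-style multi-scale 1-D conv block (weights are random,
    not learned; kernel sizes and filter counts are illustrative only).

    x: 1-D signal of shape (n_samples,), e.g. one EEG channel.
    Returns a 1-D feature signal of the same length.
    """
    rng = np.random.default_rng(seed)
    branches = []
    # Parallel branches: each kernel size captures patterns at a different
    # temporal scale, mimicking the multi-scale convolutional layer.
    for k in kernel_sizes:
        for _ in range(n_filters):
            w = rng.standard_normal(k) / np.sqrt(k)
            branches.append(np.convolve(x, w, mode="same"))
    feats = np.stack(branches)  # (len(kernel_sizes) * n_filters, n_samples)
    # Length-1 convolution: a per-time-step weighted mix across channels,
    # which fuses the branches with very few extra parameters.
    w1 = rng.standard_normal(feats.shape[0]) / np.sqrt(feats.shape[0])
    return w1 @ feats

sig = np.sin(np.linspace(0, 8 * np.pi, 256))
out = multiscale_block(sig)
print(out.shape)  # (256,)
```

The pointwise mix is what keeps the parameter count low: adding a branch costs only one extra weight in the fusion step, independent of the signal length.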
