
Orthogonal convolutional neural networks for automatic sleep stage classification based on single-channel EEG.

Abstract

In recent years, several automatic sleep stage classification methods based on convolutional neural networks (CNNs), which learn hierarchical feature representations automatically from raw EEG data, have been proposed. However, such state-of-the-art methods are quite complex, and a simple CNN architecture is important for portable sleep devices. In addition, employing CNNs to learn rich and diverse representations remains a challenge. Therefore, we propose a novel CNN model for sleep stage classification.
Generally, EEG signals are better described in the frequency domain; thus, we convert the EEG data into a time-frequency representation via the Hilbert-Huang transform. To learn rich and effective feature representations, we propose an orthogonal convolutional neural network (OCNN). First, we construct an orthogonal initialization of the weights. Second, to avoid destroying this orthogonality during training, we propose orthogonality regularizations that maintain the orthogonality of the weights. Simultaneously, a squeeze-and-excitation (SE) block is employed to perform feature recalibration across different channels.
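
Below is a minimal PyTorch sketch of the two ingredients named in the abstract: orthogonal weight initialization combined with a soft orthogonality regularizer of the form ||W W^T - I||_F^2, and a standard squeeze-and-excitation block. The layer shapes, reduction ratio, and regularization weight here are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

def orthogonal_init(module):
    # Orthogonal initialization for convolutional and linear weights.
    if isinstance(module, (nn.Conv2d, nn.Linear)):
        nn.init.orthogonal_(module.weight)
        if module.bias is not None:
            nn.init.zeros_(module.bias)

def orthogonality_penalty(model):
    # Soft orthogonality regularizer: sum of ||W W^T - I||_F^2 over conv/linear
    # layers; conv kernels (out, in, kh, kw) are flattened to (out, in*kh*kw).
    penalty = 0.0
    for m in model.modules():
        if isinstance(m, (nn.Conv2d, nn.Linear)):
            w = m.weight.reshape(m.weight.shape[0], -1)
            gram = w @ w.t()
            eye = torch.eye(gram.shape[0], device=w.device)
            penalty = penalty + ((gram - eye) ** 2).sum()
    return penalty

class SEBlock(nn.Module):
    # Squeeze-and-excitation: global average pooling followed by two fully
    # connected layers that produce per-channel gating weights.
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc1 = nn.Linear(channels, channels // reduction)
        self.fc2 = nn.Linear(channels // reduction, channels)

    def forward(self, x):
        s = x.mean(dim=(2, 3))                             # squeeze: (N, C)
        s = torch.sigmoid(self.fc2(F.relu(self.fc1(s))))   # excitation
        return x * s[:, :, None, None]                     # recalibrate channels

# Hypothetical usage in a training loop:
#   model.apply(orthogonal_init)
#   loss = F.cross_entropy(model(x), y) + 1e-4 * orthogonality_penalty(model)

Because the penalty is added to the task loss at every step, the weights are pushed to stay close to orthogonal as training updates them, which is the role the abstract assigns to the regularization term.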
The proposed method achieved total classification accuracies of 88.4% and 87.6% on two public datasets, respectively. The classification performance of several other convolutional neural network models was compared with that of the proposed method, and the experimental results demonstrate that the proposed method is effective for sleep stage classification.
Experimental results indicate that the proposed OCNN can learn rich and diverse feature representations from time-frequency images of EEG data, which is important for deep learning. In addition, the proposed orthogonality regularization is simple and can easily be adapted to other architectures.
Copyright © 2019. Published by Elsevier B.V.
