ECG Classification with a Convolutional Recurrent Neural Network

Jérôme Van Zaen, Halla Sigurthorsdottir, Ricard Delgado-Gonzalo, Mathieu Lemay
Swiss Center for Electronics and Microtechnology (CSEM)


We developed a convolutional recurrent neural network to classify 12-lead ECG signals for the PhysioNet/Computing in Cardiology Challenge 2020. The provided dataset includes 6877 records sampled at 500 Hz, with durations ranging from 6 to 144 seconds, annotated with one or more labels from nine classes (AF, I-AVB, LBBB, Normal, PAC, PVC, RBBB, STD, STE). Entries are evaluated with two scores on a hidden test set: a class-weighted F-score and a class-weighted Jaccard measure.
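As a rough illustration of the two evaluation scores, the sketch below computes a class-weighted F-score and Jaccard measure for multi-label predictions with scikit-learn; the challenge uses its own class weighting, so this is only an approximation, and the label matrices here are randomly generated placeholders.

```python
import numpy as np
from sklearn.metrics import f1_score, jaccard_score

# Placeholder multi-label annotations for 4 records over the 9 classes
# (each record may carry more than one label).
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=(4, 9))
y_pred = rng.integers(0, 2, size=(4, 9))

# "weighted" averages per-class scores by class support, a stand-in
# for the challenge's class weighting.
f = f1_score(y_true, y_pred, average="weighted", zero_division=0)
j = jaccard_score(y_true, y_pred, average="weighted", zero_division=0)
print(f, j)
```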

Our neural network architecture combines convolutional and recurrent layers; it takes sliding windows of ECG signals as input and yields the probability of each class as output. The convolutional part extracts features from each sliding window with 15 layers composed of one-dimensional convolutions and leaky ReLU activations. A global average pooling layer is then applied to obtain 12 features for each window. Next, a bi-directional GRU layer with 128 units aggregates the features extracted by the convolutional part from all windows of a record. This recurrent layer handles records of different durations and is followed by a leaky ReLU activation and batch normalization. Finally, a dense layer with sigmoid activation outputs class probabilities.
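The architecture can be sketched as follows in PyTorch. The text fixes only the overall structure (15 convolutional layers with leaky ReLU, global average pooling to 12 features per window, a bidirectional GRU with 128 units, then batch normalization and a sigmoid dense layer); the kernel sizes, channel counts, and window length below are assumptions for illustration.

```python
import torch
import torch.nn as nn

N_LEADS, N_CLASSES = 12, 9

class ConvRNN(nn.Module):
    """Sketch of the described convolutional recurrent network.
    Kernel size (3) and channel count (12 throughout) are assumptions."""
    def __init__(self):
        super().__init__()
        convs, ch = [], N_LEADS
        for _ in range(15):  # 15 conv layers with leaky ReLU activations
            convs += [nn.Conv1d(ch, 12, kernel_size=3, padding=1), nn.LeakyReLU()]
            ch = 12
        self.conv = nn.Sequential(*convs)
        self.gru = nn.GRU(12, 128, batch_first=True, bidirectional=True)
        self.act = nn.LeakyReLU()
        self.bn = nn.BatchNorm1d(256)
        self.fc = nn.Linear(256, N_CLASSES)

    def forward(self, x):
        # x: (batch, windows, leads, samples) -- sliding windows of ECG
        b, w, c, t = x.shape
        feats = self.conv(x.reshape(b * w, c, t)).mean(dim=-1)  # global average pooling
        feats = feats.reshape(b, w, -1)        # 12 features per window
        _, h = self.gru(feats)                 # aggregate over all windows
        h = torch.cat([h[0], h[1]], dim=-1)    # both GRU directions -> 256
        return torch.sigmoid(self.fc(self.bn(self.act(h))))

# Two records, five windows of 500 samples each (window length assumed).
probs = ConvRNN()(torch.randn(2, 5, N_LEADS, 500))
print(probs.shape)
```

Because the GRU consumes a variable number of windows, the same model applies to records of any duration.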

To train this network, we split the dataset into a training set (80%) and a validation set (20%), stratified by labels, and normalized the signals by their standard deviation. We trained the network with the Nesterov Adam optimizer and a learning rate of 0.001. We selected a batch size of 20 and grouped records with similar durations to limit zero padding. We also applied dropout to regularize the convolutional and recurrent layers.
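The duration-based batching can be sketched as below: records are sorted by duration before being cut into batches, so each batch contains records of similar length and zero padding stays small. The durations here are randomly generated placeholders, and the exact grouping scheme is an assumption.

```python
import numpy as np

def make_batches(durations, batch_size=20):
    """Group records with similar durations into batches so that
    zero padding within a batch is limited (illustrative sketch)."""
    order = np.argsort(durations)  # indices of records, shortest first
    return [order[i:i + batch_size] for i in range(0, len(order), batch_size)]

# Placeholder durations in the dataset's 6-144 s range.
durations = np.random.default_rng(1).uniform(6, 144, size=100)
batches = make_batches(durations)

# Padding within a batch is bounded by its duration spread.
max_spread = max(durations[b].max() - durations[b].min() for b in batches)
```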

Our network achieved an F-score of 0.843 (0.831) and a Jaccard measure of 0.642 (0.653) on our validation (training) set. On the hidden test set, we obtained an F-score of 0.810 and a Jaccard measure of 0.599.