Aims: Cardiovascular signals provide relevant information on the state of the heart and the autonomic nervous system. Heart Rate Variability (HRV) has been extensively characterized by quantifying its entropy and entropy rate. However, recent studies suggest that parametric estimators of entropy might have advantages, e.g., when dealing with noisy signals. The aim of this study is to investigate the parametric estimation of entropy and related measures through Higher Order Markov Chain (HOMC) models. While the evolution of a Markov chain depends only on the previous state, in HOMCs the dynamics depend on an arbitrary number of previous steps.
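The dependence on several previous steps can be illustrated with a minimal sketch (the binary state space, order k=2, and transition probabilities below are illustrative assumptions, not values from the study): the next state is drawn conditionally on the k most recent states.

```python
import numpy as np

rng = np.random.default_rng(1)

# P[next = 1 | (s_{t-2}, s_{t-1})], indexed by the two previous states.
# Values are purely illustrative.
p_one = np.array([[0.1, 0.6],
                  [0.4, 0.9]])

def simulate_homc(n, p_one, rng):
    """Simulate n steps of a binary order-2 Markov chain."""
    s = [0, 1]  # arbitrary initial context of length k=2
    for _ in range(n - 2):
        p = p_one[s[-2], s[-1]]          # probability of emitting 1
        s.append(int(rng.random() < p))  # Bernoulli draw given the context
    return np.array(s)

x = simulate_homc(500, p_one, rng)
```

An ordinary (order-1) chain would index `p_one` by `s[-1]` alone; the order-k generalization simply widens the conditioning context.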
Methods: After obtaining the transition probabilities, entropy and entropy rate were derived in terms of the stationary distribution. First, we empirically confirmed the convergence of the estimated parameter values to the true ones. In particular, by creating synthetic signals from HOMCs with known parametrizations, we verified that the estimation error depends on the length N of the signal, but the bias is marginal for N>200. Then, we tested the methodology on real signals using the PhysioNet Normal Sinus Rhythm and Congestive Heart Failure (CHF) databases. We uniformly discretized the RR series into a few states and estimated entropy and entropy rate while varying the HOMC order up to 7. Finally, these two measures were used to train a batch of classification models.
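The estimation pipeline can be sketched as follows (a simplified plug-in version: equal-width binning, empirical context frequencies in place of the stationary distribution, and the synthetic RR series are all assumptions for illustration, not the paper's exact procedure):

```python
import numpy as np

def discretize(rr, n_states=4):
    """Uniformly discretize an RR series into n_states equal-width bins."""
    edges = np.linspace(rr.min(), rr.max(), n_states + 1)
    # digitize against the interior edges yields symbols 0..n_states-1
    return np.digitize(rr, edges[1:-1])

def homc_entropy_rate(sym, n_states=4, order=2):
    """Plug-in entropy-rate estimate (bits/symbol) of an order-k chain.

    Each context is a k-tuple of past symbols; conditional next-symbol
    distributions are estimated from transition counts, and empirical
    context frequencies stand in for the stationary distribution.
    """
    counts = {}
    for t in range(order, len(sym)):
        ctx = tuple(sym[t - order:t])
        counts.setdefault(ctx, np.zeros(n_states))[sym[t]] += 1
    total = sum(c.sum() for c in counts.values())
    h = 0.0
    for c in counts.values():
        pi = c.sum() / total   # empirical probability of this context
        p = c / c.sum()        # conditional next-symbol distribution
        p = p[p > 0]
        h -= pi * np.sum(p * np.log2(p))
    return h

rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * rng.standard_normal(1000)  # synthetic RR intervals (s)
sym = discretize(rr)
h = homc_entropy_rate(sym, n_states=4, order=2)
```

Sweeping `order` from 1 to 7, as described above, yields one entropy-rate value per order, which can then serve as classifier features.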
Results: On a population of 98 subjects, the classification models (trained with k=5-fold cross-validation) achieved an average accuracy of 80% in distinguishing normal subjects from CHF patients, with a maximum of 87.8% (quadratic discriminant classifier, AUC=0.91). The longer the series considered, the lower the order at which the maximum classification accuracy was reached.
Conclusions: HOMC modeling proved to be a fast, scalable, and sensible approach for estimating entropy measures.