Parametrizing Drug Effects with Machine Learning in a New Resonant Model of Cell Electrophysiology

Sucheta Sehgal, Nitish Patel, Mark Trew
The University of Auckland


Aims: Drugs often alter the action potential (AP) morphology of cardiac cells, and these changes can induce life-threatening arrhythmias. Although models of cellular electrophysiology (EP) are widely available, we have developed a promising alternative: a novel resonant model (RM) designed around the computational strengths of non-CPU platforms such as GPUs and FPGAs. Like most models, however, the RM requires parameterization. To parameterize the RM so that it captures drug effects on the action potential, we investigated a machine learning algorithm (MLA).

Methods: To quantitatively incorporate drug effects into an RM, we designed and evaluated a feed-forward neural network (FFNN) algorithm. A rectified linear unit (ReLU) activation function was used for its computational efficiency, and the Adam optimization algorithm was used to accelerate convergence. The FFNN was trained on data obtained by blocking currents in an EP model of the human sinoatrial node (SAN), and validated on unseen current-blockage data. The FFNN was compared against a set of piecewise-linear equations.

Results: An RM parameterized using the MLA closely matches the AP morphology of a SAN cell model with 97% block of the I_f current. The Pearson correlation coefficient between the two models over multiple beats is 0.98 and the RMSE is less than 9%. Coupling lengths over a range of I_f values (7-9%) showed that the MLA-parameterized RM closely follows predictions from a human SAN EP model.

Conclusion: A machine learning algorithm is effective for parameterizing a new resonant model of cell electrical behavior, enabling it to capture the effects of current blockage. With these parameterizations, the computational strengths of RMs can be applied to study and predict the effects of drugs in large-scale simulations of cardiac electrical activity.
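The training setup described in Methods (a feed-forward network with ReLU activations, trained with the Adam optimizer to map current-blockage levels to model parameters) can be sketched as below. The network shape, synthetic data, and hyperparameters are illustrative assumptions, not the study's actual configuration or data.

```python
# Minimal FFNN + Adam sketch. Assumptions: one hidden layer, synthetic
# 1-D regression data standing in for (blockage level -> RM parameter).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: a current-block fraction in [0, 1] mapped to a
# smooth hypothetical target parameter.
X = rng.uniform(0.0, 1.0, size=(200, 1))
y = np.sin(2.0 * X) + 0.3 * X**2

# One-hidden-layer FFNN with ReLU activation.
n_hidden = 32
W1 = rng.normal(0.0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)
params = [W1, b1, W2, b2]

# Adam optimizer state and standard hyperparameters.
m = [np.zeros_like(p) for p in params]
v = [np.zeros_like(p) for p in params]
beta1, beta2, eps, lr = 0.9, 0.999, 1e-8, 1e-2

def forward(X):
    h = np.maximum(X @ W1 + b1, 0.0)   # ReLU
    return h, h @ W2 + b2

for t in range(1, 2001):
    h, pred = forward(X)
    err = pred - y
    # Backpropagate the mean-squared-error loss.
    dpred = 2.0 * err / len(X)
    dW2 = h.T @ dpred; db2 = dpred.sum(axis=0)
    dh = dpred @ W2.T
    dh[h <= 0.0] = 0.0                  # ReLU gradient
    dW1 = X.T @ dh; db1 = dh.sum(axis=0)
    grads = [dW1, db1, dW2, db2]
    # Adam update with bias correction; in-place so params stay bound.
    for i, (p, g) in enumerate(zip(params, grads)):
        m[i] = beta1 * m[i] + (1 - beta1) * g
        v[i] = beta2 * v[i] + (1 - beta2) * g * g
        mhat = m[i] / (1 - beta1**t)
        vhat = v[i] / (1 - beta2**t)
        p -= lr * mhat / (np.sqrt(vhat) + eps)

_, pred = forward(X)
final_mse = float(np.mean((pred - y)**2))
```

After training, the fitted network plays the role the abstract describes: given an unseen blockage level, it returns the corresponding RM parameter value without re-running the full EP model.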