Authors:
Lu, Peng (School of Information Engineering, Zhengzhou University, Zhengzhou, China); Zhang, Yabin (School of Information Engineering, Zhengzhou University, Zhengzhou, China); Zhou, Bing (School of Information Engineering, Zhengzhou University, Zhengzhou, China); Zhang, Hongpo (State Key Laboratory of Mathematical Engineering and Advanced Computing, Zhengzhou, China); Chen, Liwei (Department of Automation, School of Electrical Engineering, Zhengzhou University, Zhengzhou, China); Lin, Yusong (School of Information Engineering, Zhengzhou University, Zhengzhou, China); Mao, Xiaobo (Department of Automation, School of Electrical Engineering, Zhengzhou University, Zhengzhou, China); Gao, Yang (School of Information Engineering, Zhengzhou University, Zhengzhou, China); Xi, Hao (School of Information Engineering, Zhengzhou University, Zhengzhou, China)
Abstract:
In recent years, deep neural network (DNN) based methods have achieved breakthrough performance in detecting cardiac arrhythmias, as the cost-effectiveness of computing power and the scale of available data have passed a tipping point. However, the inability of these methods to explain the basis of their decisions limits clinicians' confidence in them. In this paper, a fusion model combining a Gated Recurrent Unit (GRU) and a decision tree, referred to as T-GRU, was designed to explore the problem of arrhythmia recognition and to improve the credibility of deep learning methods. The fusion model processes time-domain and frequency-domain features along multiple paths: a decision tree performs probability analysis of the frequency-domain features, while regularization of the GRU parameters and weight control improve the output weights of the decision tree model. The MIT-BIH arrhythmia database
was used for validation. Results showed that the low-frequency band features dominated the model's predictions. The fusion model achieved an accuracy of 98.31%, sensitivity of 96.85%, specificity of 98.81%, and precision of 96.73%, indicating its high reliability and clinical significance.
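The fusion described in the abstract can be sketched as a weighted combination of two probability paths: a decision tree's class probabilities over frequency-domain features and a recurrent network's softmax output over the time-domain path. The sketch below is illustrative only and does not reproduce the paper's T-GRU: the data are random stand-ins, the GRU path is mocked with a fixed softmax, and the fusion weight `w` is a hypothetical placeholder for the tuned output weight on the decision-tree path.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Toy frequency-domain features and binary labels (stand-ins for ECG data).
X_freq = rng.normal(size=(200, 8))
y = (X_freq[:, 0] + 0.5 * X_freq[:, 1] > 0).astype(int)

# Path 1: decision-tree probability analysis of frequency-domain features.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_freq, y)
p_tree = tree.predict_proba(X_freq)  # shape (n_samples, n_classes)

# Path 2: stand-in for the GRU's softmax output on the time-domain path.
# (A real GRU would be trained on beat sequences; here its output is mocked.)
logits = X_freq[:, :1] @ np.array([[2.0, -2.0]])
p_gru = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Fusion: convex combination of the two probability paths; w stands in for
# the controlled output weight assigned to the decision-tree model.
w = 0.4
p_fused = w * p_tree + (1 - w) * p_gru
y_pred = p_fused.argmax(axis=1)
accuracy = (y_pred == y).mean()
```

Because both paths emit valid probability distributions, any convex weight `w` in [0, 1] keeps each fused row summing to one, so the argmax prediction remains well defined.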