A New Method to Improve Automated Classification of Heart Sound Signals: Filter Bank Learning in Convolutional Neural Networks

Document Type: Original Paper


1 Electrical Engineering Department, Iranian Research Organization for Science and Technology, Tehran, Iran

2 Iranian Research Organization for Science and Technology, Tehran, Iran


Introduction: Recent studies have acknowledged the potential of convolutional neural networks (CNNs) for distinguishing healthy from morbid samples through heart sound analysis. Unfortunately, the performance of CNNs depends strongly on the filtering procedure applied to the signal in their convolutional layers. The present study aimed to address this problem by applying the filter bank learning concept to CNNs.
Material and Methods: In the proposed method, the filter bank of the CNN is updated according to a cross-entropy minimization rule to extract higher-level features from the spectral characteristics of the heart sound signal. The greater depth of the extracted features, combined with their spectral nature, leads to better discrimination between healthy and morbid heart sounds. The proposed method was applied to three heart sound datasets, PASCAL-A, PASCAL-B, and Kaggle, each comprising normal and abnormal categories.
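The filter-bank-learning idea described above can be sketched in a few lines: a small bank of learnable 1-D filters is applied to the signal, and both the filters and the classifier are updated by gradient descent on a binary cross-entropy loss. The following NumPy toy is an illustrative assumption, not the paper's implementation; the network size, learning rate, and synthetic "normal"/"abnormal" signals are all hypothetical.

```python
# Minimal sketch (hypothetical, not the authors' code): a learnable 1-D
# filter bank -> ReLU -> global average pooling -> sigmoid classifier,
# trained end to end by cross-entropy gradient descent.
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

rng = np.random.default_rng(0)
N_FILTERS, FILTER_LEN = 4, 16   # illustrative sizes

def forward(x, bank, w, b):
    """Filter bank -> ReLU -> global average pooling -> sigmoid output."""
    windows = sliding_window_view(x, FILTER_LEN)   # (T, FILTER_LEN)
    conv = windows @ bank.T                        # (T, N_FILTERS)
    pooled = np.maximum(conv, 0.0).mean(axis=0)    # (N_FILTERS,)
    p = 1.0 / (1.0 + np.exp(-(pooled @ w + b)))
    return p, conv, pooled, windows

def train(signals, labels, lr=0.1, epochs=300):
    """Jointly learn the filter bank and classifier by cross-entropy SGD."""
    bank = rng.normal(0.0, 0.1, (N_FILTERS, FILTER_LEN))
    w = rng.normal(0.0, 0.1, N_FILTERS)
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(signals, labels):
            p, conv, pooled, windows = forward(x, bank, w, b)
            g = p - y                     # dLoss/dz for sigmoid + cross-entropy
            T = conv.shape[0]
            mask = (conv > 0.0).astype(float)        # ReLU derivative
            # dLoss/d(filter_i) = g * w_i * mean_t[mask_i(t) * window(t)]
            bank -= lr * g * (w[:, None] * (mask.T @ windows)) / T
            w -= lr * g * pooled
            b -= lr * g
    return bank, w, b

# Toy "normal" (low-frequency) vs "abnormal" (high-frequency) signals,
# standing in for the real heart sound recordings.
t = np.arange(256)
signals = [np.sin(2 * np.pi * f * t / 256) + 0.05 * rng.normal(size=256)
           for f in (8, 8, 40, 40)]
labels = [0, 0, 1, 1]

bank, w, b = train(signals, labels)
preds = [forward(x, bank, w, b)[0] for x in signals]
```

Because the filters themselves receive the cross-entropy gradient, the learned filter bank adapts to the spectral content that best separates the two classes, rather than being fixed in advance.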
Results: Across the examined datasets, the proposed method obtained a true positive rate (TPR) between 86% and 96% at a false positive rate (FPR) of 0%, and an FPR of 7-8% at a TPR of 100%. The accuracy ranged from 93% to 98% when the FPR was 0%, and from 96% to 96.5% when the TPR was 100%.
Conclusion: The increased TPR of the proposed method (96% vs. 87% for a standard CNN), together with its decreased FPR (7% vs. 10% for CNN), demonstrates the proposed method's superiority over its well-known alternative for automated self-assessment of the heart.



Volume 17, Issue 5
September and October 2020
Pages 331-339
  • Received: 05 March 2019
  • Revised: 26 October 2019
  • Accepted: 28 October 2019