Race, Sex, and Age Disparities in the Performance of ECG Deep Learning Models Predicting Heart Failure
Circulation: Heart Failure, Ahead of Print.
BACKGROUND: Deep learning models may combat widening racial disparities in heart failure outcomes through early identification of individuals at high risk. However, demographic biases in the performance of these models have not been well studied.

METHODS: This retrospective analysis used 12-lead ECGs taken between 2008 and 2018 from 326 518 patient encounters referred for standard clinical indications to Stanford Hospital. The primary model was a convolutional neural network trained to predict incident heart failure within 5 years. Biases were evaluated on the testing set (160 312 ECGs) using the area under the receiver operating characteristic curve, stratified across the protected attributes of race, ethnicity, age, and sex.

RESULTS: There were 59 817 cases of incident heart failure observed within 5 years of ECG collection. The performance of the primary model declined with age. No significant differences were observed between racial groups overall. However, the primary model performed significantly worse in Black patients aged 0 to 40 years than in all other racial groups of the same age range, with differences most pronounced among young Black women. Disparities in model performance did not improve by integrating race, ethnicity, sex, and age into the model architecture, by training separate models for each racial group, or by providing the model with a data set of equal racial representation. Using probability thresholds individualized for race, age, and sex offered substantial improvements in F1 scores.

CONCLUSIONS: The biases found in this study warrant caution against perpetuating disparities through the development of machine learning tools for the prognosis and management of heart failure. Customizing the application of these models by using probability thresholds individualized by race, ethnicity, age, and sex may offer an avenue to mitigate existing algorithmic disparities.
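The core of the evaluation is AUROC computed separately within each demographic stratum of the test set. A minimal sketch of that kind of stratified evaluation is shown below; it is not the authors' code, and the column names (y_true, y_score, race) are illustrative assumptions.

```python
# Stratified AUROC: score discrimination separately within each level
# of a protected attribute, rather than pooled over the whole test set.
import pandas as pd
from sklearn.metrics import roc_auc_score

def stratified_auroc(df: pd.DataFrame, attribute: str) -> pd.Series:
    """AUROC of model scores within each level of `attribute`.

    Assumes `df` has binary labels in `y_true` and predicted
    probabilities in `y_score` (hypothetical column names).
    """
    return df.groupby(attribute).apply(
        lambda g: roc_auc_score(g["y_true"], g["y_score"])
    )

# Usage with a hypothetical file of held-out predictions:
# test_df = pd.read_csv("test_predictions.csv")
# print(stratified_auroc(test_df, "race"))
```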
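The mitigation the conclusions point to is a decision rule, not a new model: choose, per demographic stratum, the probability cutoff that maximizes F1. A minimal sketch under the same illustrative column-name assumptions:

```python
# Per-group threshold selection: grid-search the probability cutoff
# that maximizes F1 within each stratum (e.g. race x sex x age group).
import numpy as np
import pandas as pd
from sklearn.metrics import f1_score

def best_f1_threshold(y_true: np.ndarray, y_score: np.ndarray) -> float:
    """Return the cutoff in (0, 1) that maximizes F1 on this group."""
    grid = np.linspace(0.01, 0.99, 99)
    f1s = [f1_score(y_true, y_score >= t) for t in grid]
    return float(grid[int(np.argmax(f1s))])

def group_thresholds(df: pd.DataFrame, attributes: list) -> dict:
    """Fit one threshold per demographic stratum of `df`."""
    return {
        key: best_f1_threshold(g["y_true"].to_numpy(), g["y_score"].to_numpy())
        for key, g in df.groupby(attributes)
    }

# thresholds = group_thresholds(test_df, ["race", "sex", "age_group"])
```

Note that such thresholds should be fit on a validation split and then applied to new patients; fitting them on the evaluation set itself would overstate the F1 gains.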