Will computerised medical records and statistical tools really help us move towards precision medicine of heart failure with preserved ejection fraction?

Therapeutic strategy in heart failure with preserved ejection fraction (HFPEF) remains a challenging problem.1 Indeed, the prevalence of this disease is constantly increasing, particularly in high-income countries as populations age, and it is a cause of significant morbidity and mortality.2 To date, and despite the welcome approval of sodium-glucose cotransporter type 2 (SGLT2) inhibitors in this context, current clinical practice still achieves only incomplete control of symptoms and disease progression.3 4

Although the management of HFPEF has changed considerably in recent years with the arrival of new therapeutic classes, large-scale clinical trials in HFPEF had never been able to show a benefit on mortality until the recent trials of SGLT2 inhibitors.5

This is due partly to a pathophysiology that is still imperfectly understood but, above all, to the marked heterogeneity of the syndrome itself, whether in terms of age, sex, comorbidities, or clinical and echocardiographic profiles.1

In this context, many machine learning clustering models have emerged, made possible by advances in computing power, allowing very large numbers of patients and variables to be analysed in order to …
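As a purely illustrative sketch of the kind of unsupervised clustering the editorial alludes to, the example below partitions a synthetic cohort into candidate phenogroups with k-means. All variables, units, and values are hypothetical assumptions for demonstration, not data from any HFPEF study, and k-means is only one of several clustering techniques used in this literature.

```python
# Hypothetical sketch: unsupervised clustering of HFPEF-like patient profiles.
# All variables and values below are synthetic and illustrative only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_patients = 300

# Synthetic cohort: age (years), BMI (kg/m^2), LVEF (%), NT-proBNP (pg/mL)
cohort = np.column_stack([
    rng.normal(75, 8, n_patients),
    rng.normal(30, 5, n_patients),
    rng.normal(60, 5, n_patients),
    rng.lognormal(6, 1, n_patients),
])

# Standardise so that no single variable dominates the distance metric
X = StandardScaler().fit_transform(cohort)

# Partition the cohort into a chosen number of candidate phenogroups
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
labels = kmeans.labels_

# Size of each phenogroup
print(np.bincount(labels))
```

In practice, published phenomapping studies use far richer inputs (clinical, biological, echocardiographic variables) and devote considerable effort to choosing the number of clusters and validating their clinical relevance, which this toy sketch does not attempt.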
