The Hidden Diagnostic Information of the Heart Rhythm Revealed by Entropy Proportions

The invention of the steam engine is generally accepted as being one of the most significant factors responsible for unleashing the industrial revolution. However, subsequent technological advances were critically dependent on increasing the efficiency of machines, that is, increasing the delivery of mechanical energy per unit of input thermal energy. These considerations led to investigations into quantifying parameters of the basic laws of thermodynamics involving transformation of thermal energy to mechanical energy. The phenomenon of heat transfer is inherent in the conceptual formulation of energy in the context of thermodynamics. The transfer of heat (Q) can be quantified by the system temperature (T) and a property (S) that defines the change of state of the whole system (i.e., Q = T·S or S = Q/T). Whilst T can be readily measured, S is essentially a conceptual entity that is qualitatively defined by all the individual parts of the system (down to the microparticles) that constitute the system structure. This implies that if there is no change in the system structure, there is no change in the order of the system parts, and so there is no energy transfer. Hence, the corollary is that when energy (heat) is transferred, there is, by necessity, a change in structural elements of the whole system, and the changes can be at either the macroscale or the microscale, or both. Thus, transfer of energy is always associated with an increase in disorder of the system due to the system undergoing transformation. This was the basic notion that led to the original definition of the parameter S as “entropy” by Rudolf Clausius in 1865, using the original Greek root τροπή for “transformation” [1]. Indeed, for the physical universe, this is fundamental to the second law of thermodynamics, which states that entropy (or disorder) always increases with time.

Structural transformation, that is, the change of state of the system in space or time, suggests the notion of “information.” Hence, from the foregoing description of entropy, it follows that there is an association between the conceptual entities of energy and information. Heat is an expression of the thermal energy of the molecules of a material, characterized by the amount of movement, or degree of disorder, or information, contained in the molecular structure of the material. Thus, it is a logical extension that entropy can be calculated from knowledge of the probabilities of the position of the molecules in space and time. So, if Q is expressed as a function of the probabilities (P1, P2, …, Pn) of the position of the system elements at the microscale, Q becomes a function of these probabilities: Q = f(P1, P2, …, Pn), and then S = f(P1, P2, …, Pn)/T. It is this association between energy and information [2] that enabled Claude Shannon in 1948 to publish the seminal use of the concept of entropy in communication theory [3]. In this work, he defines entropy as S = −K (P1·ln[P1] + P2·ln[P2] + … + Pn·ln[Pn]), where K is a scaling factor and P1, P2, …, Pn are the probabilities of the information in the various channels being true. Because of the fundamental inherent conceptual consistency between energy and information [2], it can be shown that the Clausius definition of entropy at the macroscale of thermodynamics [1] and the Shannon definition at the microscale of information [3] are equivalent. Hence, K becomes the fundamental relation between energy (heat) and temperature, the Boltzmann constant (1.38 × 10−23 J/K).
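Shannon's definition above can be stated compactly in a few lines of code. The following is a minimal illustrative sketch (the function name and scaling default are my own, not from the cited works); note that zero-probability terms are conventionally taken to contribute nothing, since p·ln(p) → 0 as p → 0.

```python
import math

def shannon_entropy(probs, k=1.0):
    """Shannon entropy S = -K * sum(P_i * ln(P_i)) over a probability
    distribution. Setting k to the Boltzmann constant (1.38e-23 J/K)
    recovers the thermodynamic (Clausius) units; k = 1.0 gives the
    dimensionless information-theoretic form. Zero-probability terms
    are skipped, following the convention p * ln(p) -> 0 as p -> 0."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over 4 outcomes maximizes entropy at ln(4):
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # ln(4) ≈ 1.386
# A certain outcome (one probability equal to 1) carries no information:
print(shannon_entropy([1.0, 0.0, 0.0]))
```

The `k` parameter makes explicit the point in the text that the Clausius and Shannon definitions differ only by a scaling factor.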

The notion of entropy thus allows the description of complexity of a system; the complexity encompasses phenomena such as chaos, fractal dimensions, and nonlinear dynamics [4]. These methods of analysis of complexity have been successfully applied to characterize the complex cellular and molecular communication involved in the fractal nature of the chaotic system that is at the basis of the intrinsic variability of the normal pattern of heart rate [5, 6]. Entropy has been shown to serve as a multiscale quantity for the analysis of heart rate variability [7] and to predict cardiovascular risk due to stroke [8, 9].

In this issue of Pulse, Rodríguez et al. [10] make use of Shannon’s definition of entropy that enables the quantification of information in any sampled time-varying signal by estimating the probability of individual sample points being within defined limits. In this and in a previous study, Rodríguez et al. [10, 11] describe a system where information is extracted from a time-varying signal of heart rate by the difference in the interbeat interval that occurs in sequential cardiac cycles. For long recordings, the cardiac cycle time (or heart rate) is determined for each beat, and each beat is compared with the successive beat to obtain a difference. If the difference is less than a set value (5 beats per minute), the 2 successive beats are assigned the same value; otherwise, the second beat is assigned the different value. This procedure is continued for contiguous beats. For the whole recording, a plot is made where the X and Y axes denote values of heart rate. A delay map is formed by locating ordered pairs of values of heart rates which correspond to the consecutive combination of 2 heart rates in a range of 5 beats/min. The X, Y coordinates in the delay map then contain numbers that give the cumulative sum of each consecutive pair corresponding to that specific heart rate. For example, if there is no change in heart rate for any beats in a 10-min recording, and the heart rate is 60 beats per minute, the X, Y coordinate (60, 60) would have a value of 600 (60 × 10), and all other coordinates in the X–Y plane would be zero. The greater the variability, the more X, Y coordinates would contain values, generally spread around the mean heart rate of the recording. The space taken up by non-zero values in the X–Y plane is designated an attractor. It was found that these attractors have different spatial dimensions and distributions in the X–Y plane and so can be used to characterize underlying patterns of cardiac function [11].

For each X, Y coordinate in the delay map, a probability (P[x, y]) of the occurrence of that specific heart rate value is computed as the ratio of the number of beats at that specific heart rate to the total number of beats in the whole recording. An entropy calculation is made using Shannon’s definition by summing the terms −P(x, y)·ln(P[x, y]) across the whole range of the X–Y plane, that is, S = −Σx=1..n Σy=1..n P(x, y)·ln(P[x, y]) (this is equivalent to normalizing the entropy [S] by the Boltzmann constant [K]). The values are then divided into orders of magnitude of powers of 10 (unity, 10, 100, and 1,000) (it is not clear why the powers of 10 are defined as “dozens” in the study by Rodríguez et al. [10]). The metric of “entropy proportions” is thus the proportion of the total computed entropy in each category of a power of 10.
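A hedged sketch of this "entropy proportions" computation follows. The grouping rule is an assumption on my part (the source does not spell it out precisely): each cell's entropy term is assigned to the order of magnitude, capped at 1,000, of that cell's beat count.

```python
import math

def entropy_proportions(cell_counts):
    """Assumed sketch of the 'entropy proportions' metric: each occupied
    delay-map cell contributes -P*ln(P) to the total entropy; the terms
    are grouped by the order of magnitude (1, 10, 100, or 1,000) of the
    cell's beat count, and each group's share of the total is reported.
    Requires at least two occupied cells so the total entropy is nonzero."""
    total_beats = sum(cell_counts.values())
    terms = {}
    for cell, n in cell_counts.items():
        p = n / total_beats
        terms[cell] = -p * math.log(p)
    total_entropy = sum(terms.values())
    groups = {1: 0.0, 10: 0.0, 100: 0.0, 1000: 0.0}
    for cell, n in cell_counts.items():
        magnitude = 10 ** min(int(math.log10(n)), 3)
        groups[magnitude] += terms[cell]
    return {m: s / total_entropy for m, s in groups.items()}

# Hypothetical delay map: one dominant cell plus two minor ones.
props = entropy_proportions({(60, 60): 500, (60, 65): 50, (65, 60): 5})
print(props)  # proportions over {1, 10, 100, 1000}, summing to 1
```

By construction the four proportions sum to unity, which is what allows them to be compared against the limits of normality established in the earlier study [11].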

The study by Rodríguez et al. [10] consists of an analysis of 230 Holter registries of at least 18 h, in which 30 were normal and 200 were diagnosed with a range of cardiac abnormalities by a cardiologist. The 18-h recordings were analyzed to produce delay maps and X–Y plots giving different attractors in various regions of the X–Y plane. The entropy values for each X, Y coordinate were ranked in terms of the order of magnitude of powers of 10, and proportions were obtained for each order of magnitude. The proportions were then compared with limits of normality and abnormality established from the previous study [11], and recordings were then assigned a normal or abnormal status, with the analysis being blind to the status assigned by the expert cardiologist. The comparison between the clinical diagnosis and the diagnosis by the entropy algorithm gave values of specificity and sensitivity of 100%.
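For clarity on what 100% specificity and sensitivity entail, the two metrics can be computed from the two sets of labels as below (a generic illustration, not the authors' code; labels of True denote an abnormal classification).

```python
def sensitivity_specificity(predicted, actual):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP), comparing the
    algorithm's labels against the reference (cardiologist) labels,
    where True marks an abnormal recording."""
    tp = sum(p and a for p, a in zip(predicted, actual))
    tn = sum((not p) and (not a) for p, a in zip(predicted, actual))
    fn = sum((not p) and a for p, a in zip(predicted, actual))
    fp = sum(p and (not a) for p, a in zip(predicted, actual))
    return tp / (tp + fn), tn / (tn + fp)

# Perfect agreement over 200 abnormal and 30 normal recordings, as
# reported in the study, means zero false negatives and zero false
# positives -- hence 100% for both metrics:
labels = [True] * 200 + [False] * 30
print(sensitivity_specificity(labels, labels))  # (1.0, 1.0)
```

Any single disagreement between the algorithm and the cardiologist would pull one of the two values below 100%, which is why such results are remarkable for a probabilistic classifier.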

The difference between the present study [10] and the previous study [11] is that the former analyzed 230 recordings of 18 h including 30 normal subjects, whereas the latter analyzed 800 recordings of 21–24 h including 50 normal subjects. The application of the entropy algorithm in the previous study also gave 100% for specificity and sensitivity. These results would seem to indicate a perfect diagnostic system. However, this is not typical of any diagnostic system that is based on parameters which rely on probabilistic prediction. In addition, the entropy proportions based on orders of magnitude of decimal numbers would seem somewhat arbitrary. There is no explicit explanation of why this specific numerical system was chosen to compute the entropy proportions, other than being some form of numerical familiarity and convenience.

Notwithstanding the potential limitations, the study by Rodríguez et al. [10] shows the potential for the basic property of entropy in providing a means to extract information for cardiovascular diagnosis. It is the same concept of entropy that is inherent in understanding the fundamental energy considerations involved in exploring the efficiency of machines and the underlying notions of information theory and communication [2]. The study elucidates how entropy can be applied to one of the most basic and essential time-varying physiological signals that can ever be recorded – the beating of the heart, the most important machine to sustain life.

Conflict of Interest Statement

There are no conflicts of interest.

Funding Sources

The author did not receive any funding.

References

1. Clausius R. The mechanical theory of heat: with its applications to the steam-engine and to the physical properties of bodies. London: John Van Voorst; 1867.
2. Tribus M, McIrvine EC. Energy and information. Sci Am. 1971;224:178–84.
3. Shannon CE. A mathematical theory of communication. Bell Syst Tech J. 1948;27:379–423.
4. Goldberger AL. Giles F. Filley lecture. Complex systems. Proc Am Thorac Soc. 2006;3(6):467–71.
5. Lakatta EG, Yaniv Y, Maltsev VA. Minding the gaps that link intrinsic circadian clock within the heart to its intrinsic ultradian pacemaker clocks. Focus on “The cardiomyocyte molecular clock, regulation of Scn5a, and arrhythmia susceptibility”. Am J Physiol Cell Physiol. 2013;304(10):C941–4.
6. Schroder EA, Lefta M, Zhang X, Bartos DC, Feng HZ, Zhao Y, et al. The cardiomyocyte molecular clock, regulation of Scn5a, and arrhythmia susceptibility. Am J Physiol Cell Physiol. 2013;304(10):C954–65.
7. Costa MD, Peng CK, Goldberger AL. Multiscale analysis of heart rate dynamics: entropy and time irreversibility measures. Cardiovasc Eng. 2008;8(2):88–93.
8. Avolio A. Heart rate variability and stroke: strange attractors with loss of complexity. J Hypertens. 2013;31(8):1529–31.
9. Graff B, Gąsecki D, Rojek A, Boutouyrie P, Nyka W, Laurent S, et al. Heart rate variability and functional outcome in ischemic stroke: a multiparameter approach. J Hypertens. 2013;31(8):1629–36.
10. Rodríguez J, Prieto S, Laguado E, Pernett F, Villamizar M, Olivella E, et al. Application of a diagnostic methodology of cardiac systems based on the proportions of entropy in normal patients and with different pathologies. Pulse. 2020;8(3–4):114–9.
11. Rodríguez J, Prieto S, Ramírez L. A novel heart rate attractor for the prediction of cardiovascular disease. Inform Med Unlocked. 2019;15:100174.

Author Contacts

Alberto Avolio, alberto.avolio@mq.edu.au

Article / Publication Details

Received: April 21, 2021
Accepted: April 21, 2021
Published online: May 27, 2021

Number of Print Pages: 3
Number of Figures: 0
Number of Tables: 0

ISSN: 2235-8676 (Print)
eISSN: 2235-8668 (Online)

For additional information: https://www.karger.com/PLS

Copyright / Drug Dosage / Disclaimer

Copyright: All rights reserved. No part of this publication may be translated into other languages, reproduced or utilized in any form or by any means, electronic or mechanical, including photocopying, recording, microcopying, or by any information storage and retrieval system, without permission in writing from the publisher.
Drug Dosage: The authors and the publisher have exerted every effort to ensure that drug selection and dosage set forth in this text are in accord with current recommendations and practice at the time of publication. However, in view of ongoing research, changes in government regulations, and the constant flow of information relating to drug therapy and drug reactions, the reader is urged to check the package insert for each drug for any changes in indications and dosage and for added warnings and precautions. This is particularly important when the recommended agent is a new and/or infrequently employed drug.
Disclaimer: The statements, opinions and data contained in this publication are solely those of the individual authors and contributors and not of the publishers and the editor(s). The appearance of advertisements or/and product references in the publication is not a warranty, endorsement, or approval of the products or services advertised or of their effectiveness, quality or safety. The publisher and the editor(s) disclaim responsibility for any injury to persons or property resulting from any ideas, methods, instructions or products referred to in the content or advertisements.
