Statistical Techniques for Improving the Repeatability of Automated ECG Interpretation

McLaughlin, Stephanie C (1992) Statistical Techniques for Improving the Repeatability of Automated ECG Interpretation. PhD thesis, University of Glasgow.

Full text available as: PDF (15MB)


The electrocardiogram is a recording of the electrical activity of the heart. By convention, 12 separate leads are studied, from which a diagnosis is made. This process of interpretation is a skill acquired by cardiologists over a period of years. During the past twenty-five years, computer-assisted techniques have been developed to undertake such interpretations. It has been shown that if consecutive ECGs are recorded from a patient within several minutes, the computer diagnoses may differ, because they are made independently of one another, even though all conditions remain unchanged. This discrepancy occurs when small changes in ECG measurements from one recording to another cause threshold values within the diagnostic program to be crossed, thereby producing conflicting diagnoses. The primary aim of the study described in this thesis was to develop techniques which would minimise such problems, thereby enhancing the repeatability of the ECG program developed in the Department of Medical Cardiology at the Royal Infirmary in Glasgow, whilst maintaining the heuristic framework of the diagnostic logic. From a statistical perspective, the problem of lack of repeatability was tackled in the following ways. Firstly, a new approach to defining upper limits of normal ECG measurements was adopted. Conventionally, the Glasgow Royal Infirmary program categorises normal limits with respect to age and sex. These limits are discontinuous in nature and can contribute to a lack of repeatability in interpretation between consecutive recordings, particularly when an individual's age category has altered between visits. These limits have been replaced with continuous equations which were calculated on the basis of a sample of 1338 'normals' using simple linear regression techniques.
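The regression approach described above can be sketched as follows. This is an illustrative reconstruction only: the measurement, the sample data, and the use of a fixed multiple of the residual standard deviation to set the upper limit are assumptions, not the thesis's actual equations.

```python
import numpy as np

def fit_continuous_limit(ages, measurements, sd_multiplier=2.0):
    """Fit a continuous upper limit of normal as a linear function of age.

    Replaces a discrete age-banded limit with an equation of the form
    limit(age) = a + b*age + sd_multiplier * residual SD, estimated by
    simple linear regression on a sample of normal subjects.
    (Hypothetical form; the thesis's fitted equations are not reproduced here.)
    """
    b, a = np.polyfit(ages, measurements, 1)            # slope, intercept of mean trend
    resid_sd = np.std(measurements - (a + b * ages))    # spread about the trend
    # The limit varies smoothly with age: no jump when a subject
    # crosses an age-category boundary between visits.
    return lambda age: a + b * age + sd_multiplier * resid_sd
```

Because the returned function is linear in age, two recordings made a few months apart yield almost identical limits, whereas a discrete age-band scheme could shift the limit abruptly at a category boundary.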
It was thought that the use of such equations, which change smoothly and continuously throughout the age range, would alleviate the problem of subtle differences in ECG readings from recording to recording producing inconsistent diagnoses. The second stage of the problem was to consider how to deal with discrete thresholds between normal and abnormal, whether such boundaries were continuous or not. Many of the diagnostic decisions throughout the program have, until now, been determined by whether the observed value of a particular ECG measurement has attained a specified discrete threshold. These decisions can be regarded as score functions for ECG measurements which take the value K if the threshold is attained, and 0 otherwise. No account has been taken of the proximity between the measurement and the boundary value. This 'all or nothing' strategy has meant that an individual whose observed ECG measurement lies very close to, but below, a threshold value on one occasion and equally close to, but above, the same threshold value on a subsequent visit may receive two conflicting diagnoses. A method was developed to replace these discrete thresholds with continuous functions, which take into account the natural day-to-day variation occurring in each ECG measurement, and to assign a new smoothed diagnostic score accordingly. Using the discrete score function as a basis, a smooth alternative was developed which increased gradually from 0 to a maximum of K. This smooth version of the scoring function was based on the family of cumulative distribution functions of the logistic distribution, suitably scaled to give the desired maximum score. In addition, the steepness of the new smoothed step was dictated by the amount of day-to-day variation in the particular ECG measurement under consideration. Often, more than one criterion needed to be satisfied before appropriate action could be taken, so an algebra which would take account of combinations of criteria was required.
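The smoothed scoring function based on the logistic CDF might look like the following. The general shape (0 up to a maximum of K, centred on the threshold, with steepness governed by day-to-day variation) is taken from the abstract; the particular way the day-to-day standard deviation is mapped onto the logistic scale parameter is an assumption for illustration.

```python
import math

def smoothed_score(x, threshold, day_to_day_sd, K):
    """Smooth replacement for the discrete rule: score K if x >= threshold, else 0.

    The logistic CDF is scaled so the score rises gradually from 0 to K
    around the threshold. Here the logistic scale parameter is chosen so
    that the curve's spread matches the measurement's day-to-day SD
    (an illustrative assumption, not necessarily the thesis's scaling).
    """
    # A logistic distribution with scale s has SD = s * pi / sqrt(3),
    # so set s from the observed day-to-day SD.
    s = day_to_day_sd * math.sqrt(3) / math.pi
    return K / (1.0 + math.exp(-(x - threshold) / s))
```

A measurement exactly at the threshold now scores K/2, and two recordings lying just either side of the threshold receive nearly equal scores instead of the conflicting 0 and K of the discrete rule.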
Some basic mathematical rules, such as the intersection rule and the union rule, were used as building blocks upon which to construct such a system. (Abstract shortened by ProQuest.)
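One plausible reading of such intersection and union rules, treating each smoothed score as a degree of satisfaction between 0 and its maximum K, is sketched below. The multiplicative intersection and the inclusion-exclusion union are assumptions for illustration; the thesis's actual algebra may differ.

```python
def score_and(score_a, score_b, K):
    """Intersection rule: both criteria must hold.

    Normalised scores in [0, 1] are combined multiplicatively
    (an assumed form, analogous to independent probabilities).
    """
    return K * (score_a / K) * (score_b / K)

def score_or(score_a, score_b, K):
    """Union rule: either criterion suffices (inclusion-exclusion form)."""
    a, b = score_a / K, score_b / K
    return K * (a + b - a * b)
```

With fully attained criteria these rules reproduce the discrete behaviour (both maximal under AND gives K; both zero under OR gives 0), while partially attained criteria combine to intermediate scores rather than flipping between 0 and K.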

Item Type: Thesis (PhD)
Qualification Level: Doctoral
Additional Information: Adviser: P W Macfarlane
Keywords: Statistics, Medical imaging
Date of Award: 1992
Depositing User: Enlighten Team
Unique ID: glathesis:1992-76411
Copyright: Copyright of this thesis is held by the author.
Date Deposited: 19 Dec 2019 09:15
Last Modified: 19 Dec 2019 09:15
