Injury

Volume 47, Issue 1, January 2016, Pages 130-134

Inter-rater agreement on assessment of outcome within a trauma registry

https://doi.org/10.1016/j.injury.2015.08.002

Abstract

Introduction

To better evaluate the degree of ongoing disability in trauma patients, it has been recommended that trauma registries introduce routine long-term outcome measurement. One of the measures recommended for use is the Extended Glasgow Outcome Scale (GOS-E). However, few registries have adopted this measure and further research is required to determine its reliability with trauma populations. This study aimed to evaluate the inter-rater agreement of GOS-E scoring between an expert rater and trauma registry follow-up staff with a sample of detailed trauma case scenarios.

Methods

Sixteen trauma registry telephone interviewers participated in the study. They were provided with a written summary of 15 theoretical adult trauma cases covering a spectrum of disability and asked to rate each case using the structured GOS-E interview. Their ratings were compared with those of an expert rater in order to calculate the inter-rater agreement for each individual rater-expert rater pair. Agreement was reported as the percentage of agreement, the kappa statistic, and weighted kappa. A multi-rater kappa value was also calculated for agreement between the 16 raters.
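The agreement statistics named above can be sketched in pure Python. This is an illustrative implementation, not the authors' analysis code; in particular, linear weights are assumed for the weighted kappa, since the weighting scheme is not stated in this snippet.

```python
from collections import Counter

def cohen_kappa(r1, r2, categories):
    """Unweighted Cohen's kappa between two raters scoring the same cases."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    p1, p2 = Counter(r1), Counter(r2)
    # Chance agreement from each rater's marginal category frequencies
    pe = sum((p1[c] / n) * (p2[c] / n) for c in categories)
    return (po - pe) / (1 - pe)

def weighted_kappa(r1, r2, categories):
    """Linearly weighted kappa: partial credit for near-miss ordinal ratings."""
    n = len(r1)
    k = len(categories) - 1
    idx = {c: i for i, c in enumerate(categories)}
    def w(a, b):
        return 1 - abs(idx[a] - idx[b]) / k  # agreement weight in [0, 1]
    po = sum(w(a, b) for a, b in zip(r1, r2)) / n
    p1, p2 = Counter(r1), Counter(r2)
    pe = sum((p1[a] / n) * (p2[b] / n) * w(a, b)
             for a in categories for b in categories)
    return (po - pe) / (1 - pe)
```

For GOS-E data the `categories` argument would be the ordinal scores 1-8; the weighted variant penalises a rater who scores "upper moderate disability" instead of "lower moderate disability" less than one who scores "upper good recovery".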

Results

Across the 15 cases, the percentage of agreement between individual raters and the expert ranged from 63% to 100%. Across the 16 raters, the percentage of agreement with the expert rater ranged from 73% to 100% (mean = 90%). Kappa values ranged from 0.65 to 1.00 across raters (mean = 0.86) and weighted kappa values ranged from 0.73 to 1.00 (mean = 0.89). The multi-rater kappa value was 0.78 (95% CI: 0.66, 0.89).
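The abstract does not state which multi-rater statistic was computed; Fleiss' kappa is a common choice for a fixed number of raters scoring the same cases, and is sketched here as an assumption.

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for m cases, each scored by the same number of raters.

    `ratings` is a list of dicts, one per case, mapping category -> number
    of raters who assigned that category to the case.
    """
    m = len(ratings)
    n = sum(ratings[0].values())  # raters per case (assumed constant)
    categories = {c for row in ratings for c in row}
    # Overall proportion of ratings falling in each category
    p = {c: sum(row.get(c, 0) for row in ratings) / (m * n) for c in categories}
    pe = sum(v * v for v in p.values())  # chance agreement
    # Mean per-case observed agreement across rater pairs
    po = sum(
        (sum(cnt * cnt for cnt in row.values()) - n) / (n * (n - 1))
        for row in ratings
    ) / m
    return (po - pe) / (1 - pe)
```

With 16 raters and 15 cases, each input row would hold the distribution of the 16 GOS-E scores for one case; a confidence interval such as the 95% CI reported above is typically obtained by bootstrap resampling of cases.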

Conclusions

Sixteen follow-up staff achieved ‘substantial’ to ‘almost perfect’ agreement with an expert rater using the GOS-E outcome measure to score 15 sample trauma cases. The results of this study lend support to the use of the GOS-E within trauma populations and highlight the importance of ongoing training where multiple raters are involved to ensure reliable outcome reporting. It is also recommended that the structured GOS-E interview guide be used to achieve better agreement between raters. Ensuring the reliability of trauma outcome scores will enable more accurate evaluation of patient outcomes, and ultimately, more targeted trauma care.

Introduction

Recent improvements in trauma care have led to a reduction in injury-related mortality, expanding the focus of trauma care to the long-term quality of survival[1], [2]. To better evaluate the degree of ongoing disability and functional loss in this patient group, it has been recommended that trauma registries introduce routine long-term outcome measurement[3]. However, there is a lack of research validating the use of long-term outcome measures with trauma patients, and few registries have adopted this recommendation[4].

One of the functional outcome measures recommended for use in trauma populations is the Extended Glasgow Outcome Scale (GOS-E)[3], [4]. This scale, widely used to evaluate outcomes in brain-injured patients, focuses on the impact of injury on overall function rather than the measurement of specific impairment and, as such, can be applied to different types of patients[5]. The GOS-E was developed as an extension to the original five-point Glasgow Outcome Scale (GOS), enabling differentiation of patients at a higher functional level[6]. To create the extended eight-point scale, the original upper three GOS categories of ‘severe disability’, ‘moderate disability’ and ‘good recovery’ were divided into upper and lower levels, creating the following functional outcome categories: dead, vegetative state, lower severe disability, upper severe disability, lower moderate disability, upper moderate disability, lower good recovery, and upper good recovery (Box 1).
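The eight-point scale and its relationship to the original five-point GOS, as described above, can be encoded directly; the mapping below is a minimal sketch (the numeric ordering 1 = dead through 8 = upper good recovery follows the conventional GOS-E coding).

```python
# The eight GOS-E categories, ordered from worst (1) to best (8) outcome.
GOSE_CATEGORIES = {
    1: "dead",
    2: "vegetative state",
    3: "lower severe disability",
    4: "upper severe disability",
    5: "lower moderate disability",
    6: "upper moderate disability",
    7: "lower good recovery",
    8: "upper good recovery",
}

def collapse_to_gos(gose_score: int) -> str:
    """Collapse a GOS-E score back to its original five-point GOS category,
    undoing the upper/lower split of the top three GOS categories."""
    if gose_score == 1:
        return "dead"
    if gose_score == 2:
        return "vegetative state"
    if gose_score in (3, 4):
        return "severe disability"
    if gose_score in (5, 6):
        return "moderate disability"
    return "good recovery"
```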

Compared to the GOS, GOS-E scoring is more complex, which can lead to increased rater error[7], [8]. To address this issue, a structured interview guide has been developed to assist raters in correctly classifying patients at the time of assessment, with questions covering domains such as independence inside and outside the home and resumption of normal social roles, including work, social and leisure activities[5]. However, in order to be confident in using the GOS-E with trauma populations, the measure must be shown to have adequate validity (i.e. measuring what it purports to measure), reliability (i.e. consistency over time and between raters) and responsiveness to change (i.e. ability to detect change)[9]. An outcome measure with these attributes will have better capacity to detect important differences between patients and changes over time, which is vital for evaluating health services and the quality of patient care.

Previous research has established that the GOS-E has sufficient responsiveness and a low ceiling effect (i.e. able to discriminate at higher functional levels) in a major trauma population[10]. Its reliability has also been established with neurological patients, but not yet with trauma patients[5], [11]. To address this knowledge gap, this study aimed to evaluate the inter-rater agreement of GOS-E scoring between an expert rater and trauma registry follow-up staff with a sample of detailed trauma case scenarios.

Section snippets

Raters

Raters included in the study were interviewers employed by the Victorian State Trauma Outcome Registry and Monitoring Group (VSTORM) to collect long-term outcomes data for the Victorian State Trauma Registry[12] and the Victorian Orthopaedic Trauma Outcomes Registry[13]. Ethics approval for the registries is provided by human research ethics committees at all participating hospitals and by the Monash University Human Research Ethics Committee, and covers regular staff training and testing. The

Results

Across the 15 cases, the percentage of agreement between individual raters and the expert ranged from 63% to 100% (Table 1). Two of the five cases with 100% agreement were those with the lowest (lower severe disability) and the highest (lower good recovery) GOS-E scores. The greatest discrepancy between individual raters and the expert was for cases six and eight (Box 2), for which only 10 out of 16 raters gave the correct scores; the remaining six raters provided a score of lower

Discussion

The aim of this study was to evaluate the inter-rater agreement of GOS-E scoring between an expert rater and 16 trained telephone interviewers employed by trauma registries to capture long-term outcomes data. One hundred per cent agreement was reached between all raters and the expert on five out of 15 sample trauma cases, and the average weighted kappa across all cases showed ‘almost perfect’ agreement.

Cases with more extreme GOS-E scores (i.e. good recovery or severe disability) generally had

Conclusion

Sixteen follow-up staff achieved ‘substantial’ to ‘almost perfect’ agreement with an expert rater using the GOS-E outcome measure to score 15 sample trauma cases. Percentage of agreement, kappa and weighted kappa values were comparable with or higher than previous inter-rater agreement studies with head-injured patients. The results of this study lend support to the use of the GOS-E within trauma populations and highlight the importance of ongoing training and feedback where multiple raters are involved.

Conflict of interest statement

The authors of this manuscript certify that they have NO affiliations with or involvement in any organisation or entity with any financial interest (such as honoraria; educational grants; participation in speakers’ bureaus; membership, employment, consultancies, stock ownership, or other equity interest; and expert testimony or patent-licensing arrangements), or non-financial interest (such as personal or professional relationships, affiliations, knowledge or beliefs) in the subject matter or

Acknowledgements

The Victorian State Trauma Registry (VSTR) is a Department of Health, State Government of Victoria and Transport Accident Commission (TAC) funded initiative. The Victorian Orthopaedic Trauma Outcomes Registry (VOTOR) is funded by the TAC via the Institute for Safety Compensation and Recovery Research (ISCRR). Professor Gabbe is supported by a National Health and Medical Research Council (NHMRC) Career Development Fellowship (ID: 1048731). Study sponsors had no involvement in the study design,

References (18)

  • A. Ardolino et al.

    Outcome measurements in major trauma - results of a consensus meeting

    Injury

    (2012)
  • D. Urquhart et al.

    Characterisation of orthopaedic trauma admitted to adult Level 1 trauma centres

    Injury

    (2006)
  • P.A. Cameron et al.

    A statewide system of trauma care in Victoria: effect on patient survival

    Med J Aust

    (2008)
  • B.J. Gabbe et al.

    Improved functional outcomes for major trauma patients in a regionalized, inclusive trauma system

    Ann Surg

    (2012)
  • G.K. Sleat et al.

    Outcome measures in major trauma care: a review of current international trauma registry practice

    Emerg Med J

    (2011)
  • J.T. Wilson et al.

    Structured interviews for the Glasgow Outcome Scale and the extended Glasgow Outcome Scale: guidelines for their use

    J Neurotrauma

    (1998)
  • B. Jennett et al.

    Disability after severe head injury: observations on the use of the Glasgow Outcome Scale

    J Neurol Neurosurg Psychiatry

    (1981)
  • A.I. Maas et al.

    Agreement between physicians on assessment of outcome following severe head injury

    J Neurosurg

    (1983)
  • J. Lu et al.

    Effects of Glasgow Outcome Scale misclassification on traumatic brain injury clinical trials

    J Neurotrauma

    (2008)
