A usability evaluation of a SNOMED CT based compositional interface terminology for intensive care

https://doi.org/10.1016/j.ijmedinf.2011.09.010

Abstract

Objective

To evaluate the usability of a large compositional interface terminology based on SNOMED CT and the terminology application for registration of the reasons for intensive care admission in a Patient Data Management System.

Design

Observational study with user-based usability evaluations before and 3 months after the system was implemented and routinely used.

Measurements

Usability was defined by five aspects: effectiveness, efficiency, learnability, overall user satisfaction, and experienced usability problems. Qualitative (the Think–Aloud user testing method) and quantitative (the System Usability Scale questionnaire and Time-on-Task analyses) methods were used to examine these usability aspects.

Results

The evaluation revealed that the usability of the interface terminology fell short (SUS scores of 47.2 and 37.5 out of 100 before and after implementation, respectively). The qualitative measurements revealed a high number (n = 35) of distinct usability problems, leading to ineffective and inefficient registration of reasons for admission. The effectiveness and efficiency of the system did not change over time. About 14% (n = 5) of the revealed usability problems were related to the terminology content based on SNOMED CT, while the remaining 86% (n = 30) were related to the terminology application. The problems related to the terminology content were more severe than those related to the terminology application.

Conclusions

This study provides a detailed insight into how clinicians interact with a controlled compositional terminology through a terminology application. The extensiveness, complexity of the hierarchy, and the language usage of an interface terminology are defining for its usability. Carefully crafted domain-specific subsets and a well-designed terminology application are needed to facilitate the use of a complex compositional interface terminology based on SNOMED CT.

Highlights

► User-based usability evaluations of an interface terminology. ► Usability is evaluated on five aspects: effectiveness, efficiency, learnability, overall user satisfaction, and experienced usability problems. ► Detailed insight into how clinicians interact with a controlled compositional terminology through a terminology application. ► The extensiveness, complexity of the hierarchy, and the language usage of an interface terminology are defining for its usability.

Introduction

Interface terminologies are used for data entry into electronic medical records, facilitating collection of clinical data while simultaneously linking users’ own descriptions to structured data elements from a reference terminology such as the Systematized Nomenclature of Medicine – Clinical Terms (SNOMED CT) [1], [2]. Before full implementation, the usability of such an interface terminology should be tested in clinical practice to evaluate whether it meets its intended purpose and to discover areas for improvement. This study reports on the usability evaluation of DICE (Diagnoses for Intensive Care Evaluation), an interface terminology based on SNOMED CT deployed through a terminology application in a Patient Data Management System (PDMS) (Metavision, iMDsoft, Sassenheim, The Netherlands) to register the reasons for admission in Intensive Care (IC) [3].

It has been argued that the correctness and specificity of terminology-based data registration in clinical settings depend not only on the content of the terminological system, but also on user characteristics such as registration habits and experience with the terminological system [4], [5]. Furthermore, usability issues concerning the design of the graphical user interface of the terminology application impact the efficacy of terminology-based data registration [4], [5]. Clinicians will optimally use an interface terminology for structured data entry if the presentation and browsing of information through the terminology application are intuitive, easy to use, and not time-consuming [6]. Furthermore, for the human–computer interaction to be effective, the action sequences that users have to carry out in the application should map to the user's mental model [7].

Typically, terminologies are evaluated in terms of content coverage while terminology applications for data entry are evaluated in terms of usability [8], [9], [10], [11]. Combined approaches to examine how clinicians interact with the terminological system during data entry are gaining interest [5], [12], [13], [14]. A user's inability to find a clinical concept using an interface terminology might be caused by a misunderstanding of the terminology content or may be due to the (lack of) functionalities and the graphical user interface design of the terminology application. Therefore, the evaluation of an interface terminology should not only concern the terminology content, but also the data entry application as integrated in a PDMS [1], [2], [5]. In this study both aspects of the DICE system were evaluated. Usability, i.e. the extent to which users can achieve specific sets of tasks in a particular environment [15], was measured on five aspects: effectiveness (accuracy and completeness of the recorded reasons for admission), efficiency (time spent to retrieve the correct concept in relation to the effectiveness), overall user satisfaction (users’ attitude with respect to the usability of the DICE system), learnability (whether the users can easily learn to use DICE and improve their performance over time), and usability problems encountered by the DICE users [1], [15], [16], [17], [18]. To assess the five usability aspects, we examined how clinicians interacted with the interface terminology during data entry. Evaluation was performed both at the baseline and three months after full implementation of the system. The purpose of evaluating the system before and after the implementation was to assess the learnability of DICE and to study whether the usability problems revealed at the baseline persisted after the system had been in routine use for three months [17]. 
Consequences of usability problems, such as reduced data quality, are out of scope and have not been studied.

Section snippets

Interface terminology and DICE application

The interface terminology based on SNOMED CT contains an IC-specific subset of SNOMED CT (referred to as terminology content) with 83,125 disorder and procedure concepts, and 150,657 attribute values to further specify the disorder and procedure concepts and their English terms [3]. This core SNOMED CT subset was extended with 325 concepts, 1243 relationships between these concepts, and 597 descriptions which are not present in SNOMED CT but are needed to cover reasons for admission in the

Setting

This study was performed in an adult Dutch ICU with 28 beds and more than 1500 yearly admissions. Since 2002, the ward has used a commercial PDMS. This PDMS is a point-of-care clinical information system that runs on a Microsoft Windows platform, uses a SQL Server database, and includes computerized order entry; automatic data collection from bedside devices; some clinical decision support; and (free-text) documentation of clinical phrases (e.g. reasons for admission and complications) during

Overall user satisfaction: the SUS score

The results of the SUS questionnaire indicate that the usability of the DICE system fell short of the users’ expectations. In the pre-implementation test, the average SUS score was 47.2 (range: 32.5–65); in the post-implementation test, this average decreased to 37.5 (range: 0–65). According to the scale provided by Bangor et al. [26] (score of 0–25: worst, 25–39: poor, 39–52: OK, 52–85: excellent, and 85–100: best imaginable), the acceptability of the
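For reference, SUS scores such as those reported above follow Brooke's standard scoring procedure: ten Likert items answered on a 1–5 scale, where odd-numbered items contribute (response − 1) points and even-numbered items contribute (5 − response) points, and the summed contributions are multiplied by 2.5 to yield a 0–100 score. A minimal sketch (the function name is illustrative, not part of the study's tooling):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten
    Likert-item responses, each in the range 1-5.

    Standard SUS scoring (Brooke): odd-numbered items (1, 3, 5, 7, 9)
    contribute (response - 1); even-numbered items (2, 4, 6, 8, 10)
    contribute (5 - response); the sum is scaled by 2.5.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses, each between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 is item 1 (odd)
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5
```

For example, uniformly neutral answers (all 3s) yield a score of 50, the midpoint of the scale.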

Discussion and conclusions

In this study we performed a usability evaluation of a large compositional interface terminology based on SNOMED CT for registration of reasons for ICU admission. Overall, the results of the usability evaluation revealed that the usability of the DICE system fell short of the users’ expectations. The use of post-coordination is cumbersome and was influenced by the poor usability of the terminology application as well as the size of the terminology content. The qualitative measurements revealed

Conclusions and directions for further research

This study provides a detailed insight into how clinicians interact with a large controlled compositional interface terminology to carry out data entry. The combination of qualitative and quantitative analyses enabled us to observe and analyze clinicians’ interactions with a domain-specific interface terminology.

The usability problems could be linked to either the interface terminology or the terminology application. Nevertheless, the source of the inadequate user performance was found in the

Authors’ contributions

FBR was the initiator of the study and contributed to conception and design of the study, acquisition of the data, analyses and interpretation of the data. She wrote the first draft of the article and processed the comments of the co-authors. NdK, RC and MJ contributed to conception and design of the study, data interpretation and critically revised all drafts of the article. MD contributed to the acquisition of data, analyses and interpretation of data and reviewed all drafts of the article.

Conflict of interest

None.

Acknowledgements

The authors thank the participants of the usability evaluation sessions for their cooperation in conducting this study. The authors would also like to express their gratitude to the usability experts R. Khajouei and L. Peute for their contribution in rating the severity of the observed usability problems.

Summary points

What was known before the study?

  • Interface terminologies are used for actual data entry into electronic records, facilitating collection of clinical data while simultaneously

References (49)

  • K. Douglas et al., System design challenges: the integration of controlled vocabulary use into daily practice, Stud. Health Technol. Inform. (1997)
  • P. Carayon, Human computer interaction in health care
  • H. Navas et al., Creation and evaluation of a terminology server for the interactive coding of discharge summaries, Stud. Health Technol. Inform. (2007)
  • J. Liaskos et al., Measuring the user acceptance of a web-based nursing documentation system, Methods Inf. Med. (2006)
  • P.L. Elkin et al., Standardized problem list generation, utilizing the Mayo canonical vocabulary embedded within the Unified Medical Language System, Proc. AMIA Annu. Fall Symp. (1997)
  • L.K. McKnight et al., Barriers to the clinical implementation of compositionality, Proc. AMIA Symp. (1999)
  • A.D. Poon et al., The PEN-Ivory project: exploring user-interface design for the selection of items from large controlled vocabularies of medicine, J. Am. Med. Inform. Assoc. (1996)
  • A. Kushniruk et al., Cognitive evaluation of the user interface and vocabulary of an outpatient information system, Proc. AMIA Annu. Fall Symp. (1996)
  • Human-centred design processes for interactive systems, ISO standard 13407:1999, ...
  • J. Nielsen, Usability Engineering (1995)
  • J. Kjeldskov et al., A longitudinal study of usability in health care: does time heal?, Int. J. Med. Inform. (2008)
  • A.W. Kushniruk et al., Usability testing in medical informatics: cognitive approaches to evaluation of information systems and user interfaces, Proc. AMIA Annu. Fall Symp. (1997)
  • R. Cornet et al., An architecture for standardized terminology services by wrapping and integration of existing applications, AMIA Annu. Symp. Proc. (2003)
  • J.E. Zimmerman et al., Acute Physiology and Chronic Health Evaluation (APACHE) IV: hospital mortality assessment for today's critically ill patients, Crit. Care Med. (2006)