A usability study of the Obamacare website: Evaluation and recommendations

https://doi.org/10.1016/j.giq.2017.01.003

Highlights

  • We drew on the usability.gov guidelines to assess the usability of healthcare.gov using a survey of 374 U.S. citizens.

  • Five key usability problems emerged: hardware and software, home page, screen, scrolling and paging, and user experience.

  • We found that citizen satisfaction and intention to use the website were rated poorly.

  • The findings suggested interesting patterns based on gender, age, and voting behavior (for Obama or not).

  • We provided illustrations of the usability problems and recommendations based on the usability.gov guidelines.

Abstract

We conducted a usability study of the healthcare.gov website, popularly known as the Obamacare website, using the guidelines available on usability.gov, which were published by the Department of Health and Human Services. The study was conducted among 374 citizens. We found that the interface design, which we conceptualized as 16 dimensions, was rated rather low. Specifically, five dimensions of usability emerged as key predictors of the overall usability of the website: hardware and software, home page, screen, scrolling and paging, and user experience. We also found that citizen satisfaction and intention to use the website were rated poorly. Based on a breakdown by gender, age, and voting behavior (for Obama or not), we found several interesting patterns of differences. Ultimately, even if the infrastructure issues that have received the bulk of the media attention are miraculously resolved, our findings suggest that the site will still be found wanting. The article offers specific illustrative examples of usability problems with the website and specific recommendations drawn from usability.gov. In addition to the practical implications for Obamacare, the article offers significant implications for researchers who seek to evaluate the usability of websites in general and healthcare websites in particular.

Introduction

From its inception, Obamacare and its associated website, healthcare.gov, have represented an ambitious e-government initiative that seeks to facilitate health insurance services for millions of Americans. It would be an understatement to say that the website has had a rough start. Although opinions vary on the nature of the website's problems, commentators agree that such problems could have been avoided with further testing. Contractors in charge of building the website testified that the administration went ahead with the launch despite insufficient testing (Somashekhar & Goldstein, 2013). In particular, complete end-to-end testing had not been properly conducted (Pear, 2013). Even assuming that time was a critical constraint in launching the website and that agile methods had been adopted, careful testing would still have been important to meet citizens' needs (Kude et al., 2014; Kude & Dibbern, 2009; Kude et al., 2012). As more issues unfolded, launching a fully functioning website by the end of November 2013, as originally promised, was deemed unrealistic (Dwyer, 2013), and the latest reports reveal that, although a few navigation issues have improved, the website continues to have problems.

The scope and nature of healthcare.gov offer an interesting opportunity to learn from the government's experience in healthcare. From a technological perspective, press reports and expert analyses of the website point to two main problem areas: interface design issues and integration/infrastructure issues, compounded by the hiring of incompetent vendors, such as CGI Global (Ferenstein, 2013). The integration issues have attracted much more of the media attention. Integration/infrastructure issues relate to data storage, telecommunications, and interoperability among the different systems that communicate with healthcare.gov.

Interface design issues have been responsible for a great share of the website's poor performance since its launch. Interface design issues concern a wide range of usability factors, such as user experience, navigation, and content (Donker-Kuijer et al., 2010; Elling et al., 2012; Huang & Benyoucef, 2014; Shareef et al., 2016; Thong et al., 2006; Venkatesh et al., 2014; Youngblood & Mackiewicz, 2012), and they have contributed to citizens' problems with the website (Brown et al., 2012; Dwyer, 2013; Hu et al., 2014; Hu & Hui, 2012; Thong et al., 2002). For example, it may not be clear to users that there is content below the virtual page fold, depending on the resolution at which users view the main page (Cardello, 2013). Error messages do not point out the specific issue that caused an error so that users can fix it easily (Cardello, 2013). Furthermore, instructions and content do not take low-literacy readers into account (Cardello, 2013), and the content is not adequately streamlined (Shah, 2013). One study (Tomlin, 2013) recruited real users of healthcare.gov and, in a user experience test, found several issues, such as difficulty in finding information about plans and costs, creating logins, and using the chat feature. Navigation was also a problem, as users needed many clicks to reach the information they sought (Shah, 2013). Wong et al. (2014) observed 33 young adults and reported their experience with healthcare.gov; the analysis revealed several content issues, such as the need for better explanations of health insurance terminology, affordability provisions for qualifying customers, and options for adult dental coverage. Overall, these recent reports on interface design issues provide interesting insights about the user experience with healthcare.gov. However, they are usually based on small sample sizes and/or rely heavily on observations and interviews, which might limit the breadth and depth of the usability issues examined. This trend in analyzing usability issues with e-government websites is understandable given the lack of contextualized usability methods and instruments (Bertot & Jaeger, 2006; de Roiste, 2013; Velsen et al., 2009).

Against this backdrop, when we consider the use of websites, critical drivers include not only basic usability, defined by the International Organization for Standardization as the extent to which a website can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use (Venkatesh & Ramesh, 2006), but also contextualized usability. Whether extensive usability testing was conducted on the Obamacare website remains an open question, but perhaps more important is whether the website's developers listened to the government itself, so to speak. Usability.gov is a government website that provides guidelines for developing usable websites. According to the usability.gov site, it "…is the leading resource for user experience (UX) best practices and guidelines, serving practitioners and students in the government and private sectors." Hence, we apply the usability.gov standards to (a) evaluate the usability of healthcare.gov and (b) improve our understanding of usability in the context of large-scale e-government applications. We tackle the following question: "Based on the usability.gov standards, what are the most important drivers of overall usability, citizen satisfaction, and intention to use in the context of large-scale e-government applications?"

In the context of large-scale e-government applications, which are rapidly gaining popularity as a key way of connecting government to citizens and delivering government services, website usability can drive citizens' trust in the services and the government, and even overall satisfaction with the government and its services (Brown et al., 2004; Chan et al., 2010). In for-profit organizations, website usability is a known driver of various outcomes, such as online retail sales, brand image, and continued intention to shop on a site (Rai et al., 2013; Rai & Tang, 2014). Well-designed sites, such as Amazon, have complemented an effective business strategy to deliver business success. Other relevant technological characteristics, such as compatibility, are important drivers of intention to adopt technology in business-to-business contexts (Fosso Wamba, Gunasekaran, Bhattacharya, & Dubey, 2016). Thus, we build on the existing body of knowledge about usability, with an emphasis on large-scale e-government applications.

The rest of the paper is organized as follows. In the next section, we describe our research approach to evaluate the usability of healthcare.gov based on the usability.gov standards. Then, we present our major findings and provide an illustration of the usability problems based on our findings. Finally, we conclude the paper with a summary of the major findings and implications for practice.

Section snippets

Research approach

We adapted the guidelines provided on usability.gov, published by the U.S. Department of Health and Human Services (U.S. Department of Health and Human Services, 2006), to develop a conceptualization and an associated survey instrument for assessing the usability of healthcare.gov. This approach is consistent with previous work using the Microsoft usability guidelines (Agarwal & Venkatesh, 2002) to evaluate several organizational websites (Agarwal & Venkatesh, 2002; Venkatesh & Agarwal, 2006), including

Findings

We first examined which of the 16 dimensions were important in terms of citizens' satisfaction with the website and the likelihood that they would continue to use it. Five dimensions emerged as statistically significant: hardware and software, home page, screen, scrolling and paging, and user experience. The remaining analyses focused on these five dimensions along with overall usability, citizen satisfaction, and continued intention to use the website. We provide the average rating of the noted
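The snippet above does not reproduce the paper's estimation procedure. As a rough, hypothetical sketch of this kind of dimension screening (assuming a respondent-level data file with one column per dimension score and a satisfaction rating, illustrative column names, and ordinary least squares purely for illustration rather than the authors' actual analysis), the significant predictors could be identified as follows:

```python
# Hypothetical sketch: regress citizen satisfaction on the 16 usability
# dimension scores and flag the statistically significant predictors.
# The column names and the data file are illustrative assumptions, not the
# authors' actual instrument or analysis.
import pandas as pd
import statsmodels.api as sm

DIMENSIONS = [
    "accessibility", "hardware_software", "home_page", "page_layout",
    "navigation", "scrolling_paging", "headings_titles_labels", "links",
    "text_appearance", "lists", "screen", "graphics_images_multimedia",
    "writing_web_content", "content_organization", "search", "user_experience",
]

df = pd.read_csv("usability_survey.csv")      # one row per respondent
X = sm.add_constant(df[DIMENSIONS])           # dimension scores as predictors
model = sm.OLS(df["satisfaction"], X).fit()   # satisfaction as the outcome

significant = model.pvalues[model.pvalues < 0.05].drop("const", errors="ignore")
print(model.summary())
print("Significant dimensions:", list(significant.index))
```

The same screening could be repeated with overall usability or continued intention to use as the outcome variable.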

Problems and recommendations: six illustrations

While the bulk of the attention has been directed at the infrastructure issues of healthcare.gov, there are substantial interface design issues with the website. Those who voted for Obama in 2012 appear to be more favorable, or perhaps more forgiving, in their evaluations (e.g., citizens aged 30 or above who voted for Obama). However, the evaluations in all groups are far from acceptable. Even in cases where citizens who favor a particular health policy may be more forgiving, designers need

Conclusion

Interface design issues with the Obamacare website render it less than usable. Even if the infrastructure issues were miraculously resolved, Obama backers are likely to continue to find the site wanting. Considering that the goal of the website is to reach citizens who are likely to be less computer literate, the usability issues are likely to hinder their use of the website. The illustrative problems noted in this paper and potential solutions, simple as they are,

References (50)

  • J.Y.L. Thong et al. Understanding user acceptance of digital libraries: What are the roles of interface characteristics, organizational context, and individual differences? International Journal of Human-Computer Studies (2002)

  • J.Y.L. Thong et al. The effects of post-adoption beliefs on the expectation-confirmation model for information technology continuance. International Journal of Human-Computer Studies (2006)

  • V. Venkatesh et al. Designing e-government services: Key service attributes and citizens' preference structure. Journal of Operations Management (2012)

  • V. Venkatesh et al. A usability evaluation of the Obamacare website. Government Information Quarterly (2014)

  • N. Youngblood et al. A usability analysis of municipal government website home pages in Alabama. Government Information Quarterly (2012)

  • A. Zuiderwijk et al. Acceptance and use predictors of open data technologies: Drawing upon the unified theory of acceptance and use of technology. Government Information Quarterly (2015)

  • R. Agarwal et al. Assessing a firm's web presence: A heuristic evaluation procedure for the measurement of usability. Information Systems Research (2002)

  • S.A. Brown et al. Who's afraid of the virtual world? Anxiety and computer-mediated communication. Journal of the Association for Information Systems (2004)

  • S.A. Brown et al. Expectation confirmation in technology use. Information Systems Research (2012)

  • J. Cardello. Healthcare.gov's account setup: 10 broken usability guidelines (2013)

  • F.K.Y. Chan et al. Modeling citizen satisfaction with mandatory adoption of an e-government technology. Journal of the Association for Information Systems (2010)

  • M.W. Donker-Kuijer et al. Usable guidelines for usable websites? An analysis of five e-government heuristics. Government Information Quarterly (2010)

  • D. Dwyer. Healthcare.gov fixes lost in translation: '???' (2013)

  • G. Ferenstein. Obamacare's rollout is a disaster that didn't have to happen (2013)

  • S. Fosso Wamba et al. Factors related to social media adoption and use for emergency services operations: The case of the NSW SES

Viswanath Venkatesh, who completed his PhD at the University of Minnesota in 1997, is a Distinguished Professor and Billingsley Chair in Information Systems at the Walton College of Business, University of Arkansas. Prior to joining Arkansas, he was on the faculty at the University of Maryland. In addition to presenting his work at universities and organizations around the world, he has held visiting appointments at universities around the world. His research focuses on understanding the diffusion of technology in organizations and society. His research has been published in leading information systems, marketing, operations management, organizational behavior, human-computer interaction, medical informatics, and psychology journals. He is widely regarded as one of the most influential scholars in business and economics. His papers have been cited about 56,000 times and 13,000 times according to Google Scholar and Web of Science, respectively. He has also published a book titled "Road to Success: A Guide for Doctoral Students and Junior Faculty Members in the Behavioral and Social Sciences" (www.vvenkatesh.com/book). In 2014, he was recognized by Thomson Reuters as one of only 95 high-impact scholars in business and economics (highlycited.com). In 2008, his MIS Quarterly (2003) paper was identified as a current classic by Science Watch (a Thomson Reuters service), and since 2009 it has been the most influential article in one of the four Research Front Maps in business and economics. In 2009, he launched an IS research rankings website (myvisionresearch.com), affiliated with the Association for Information Systems, which has received many accolades from the academic community. He has served or is currently serving on the editorial boards of MIS Quarterly, Information Systems Research, Management Science, Production and Operations Management, Journal of the AIS, and Decision Sciences.

Hartmut Hoehle is an Assistant Professor of Information Systems in the Sam M. Walton College of Business at the University of Arkansas. Dr. Hoehle received a PhD in Information Systems from Victoria University of Wellington, New Zealand. He was a lecturer at the School of Accounting and Business Information Systems, Australian National University. Stemming from his professional experience gained while working at Deutsche Bank, he is particularly interested in how services and products can be distributed through electronically mediated channels in a retail context. His work has appeared or is forthcoming in MIS Quarterly, Decision Support Systems, Journal of Computer Information Systems, and the International Journal of Electronic Finance. He has presented at the International Conference on Information Systems and the European Conference on Information Systems. He has also served as a reviewer for leading journals, including MIS Quarterly.

Ruba Aljafari is a doctoral student at the University of Arkansas. She received her master's degree from the University of Nebraska.
