Accepted for/Published in: Journal of Medical Internet Research

Date Submitted: Apr 6, 2021
Date Accepted: Aug 26, 2021

The final, peer-reviewed published version of this preprint can be found here:

Questioning the Yelp Effect: Mixed Methods Analysis of Web-Based Reviews of Urgent Cares

Hu D, Liu CMH, Hamdy R, Cziner M, Fung M, Dobbs S, Rogers L, Turner MM, Broniatowski DA

Questioning the Yelp Effect: Mixed Methods Analysis of Web-Based Reviews of Urgent Cares

J Med Internet Res 2021;23(10):e29406

DOI: 10.2196/29406

PMID: 34623316

PMCID: 8538031

What Yelp Effect? An Analysis of Hundreds of Thousands of Online Reviews of Urgent Cares

  • Dian Hu; 
  • Cindy Meng-Hsin Liu; 
  • Rana Hamdy; 
  • Michael Cziner; 
  • Melody Fung; 
  • Samuel Dobbs; 
  • Laura Rogers; 
  • Monique Mitchell Turner; 
  • David André Broniatowski

ABSTRACT

Background:

Providers of on-demand care, such as those in urgent care settings, may prescribe antibiotics unnecessarily because they fear receiving negative online reviews from unsatisfied patients, the so-called “Yelp Effect”. This effect is hypothesized to be a significant driver of inappropriate antibiotic prescribing, exacerbating antibiotic resistance.

Objective:

In this study, we aim to determine how frequently patients leave negative online reviews after expecting, but not receiving, antibiotics in an urgent care setting.

Methods:

We obtained a list of 8662 urgent care facilities from the Yelp Application Programming Interface (API). Using this list, we automatically collected 481,825 online reviews from Google Maps between January 21 and February 10, 2019. We used machine learning algorithms to summarize the contents of these reviews. Additionally, 200 randomly sampled reviews were analyzed by four annotators to verify the types of messages present and whether they were consistent with the “Yelp Effect”.
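
As an illustration of the first data collection step, the minimal sketch below queries the Yelp Fusion API's business search endpoint for urgent care facilities. It is not the authors' pipeline: the API key handling, search location, and pagination shown here are assumptions for demonstration only.

import os
import requests

# Yelp Fusion API business search endpoint; requires an API key passed as a
# Bearer token. Set YELP_API_KEY in the environment before running.
API_KEY = os.environ["YELP_API_KEY"]
SEARCH_URL = "https://api.yelp.com/v3/businesses/search"
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

def search_urgent_care(location, limit=50, offset=0):
    """Return one page of urgent care businesses near `location`."""
    params = {
        "term": "urgent care",
        "location": location,
        "limit": limit,    # Yelp caps this at 50 results per request
        "offset": offset,  # increase in steps of `limit` to paginate
    }
    resp = requests.get(SEARCH_URL, headers=HEADERS, params=params)
    resp.raise_for_status()
    return resp.json().get("businesses", [])

# Example: list facility names and addresses for one (hypothetical) city.
for biz in search_urgent_care("Washington, DC"):
    print(biz["name"], "|", " ".join(biz["location"]["display_address"]))

A facility list assembled this way could then be matched to listings elsewhere (eg, Google Maps) for review collection, a separate step not shown here.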

Results:

We collected 481,825 reviews, of which only an estimated 1696 (95% CI 1240-2152) exhibited the “Yelp Effect”. Rather than unmet expectations for antibiotics, negative reviews primarily identified operational issues: wait times, rude staff, billing, and communication.
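
The abstract does not state how the 95% CI for the count of “Yelp Effect” reviews was derived. One plausible reading, sketched below with purely hypothetical numbers, is that a proportion estimated from an annotated random sample was scaled to the full corpus using a normal (Wald) approximation; this is an assumption, not the authors' stated procedure.

import math

def count_ci(hits, sample_size, corpus_size, z=1.96):
    """Scale a sample proportion (hits / sample_size) up to corpus_size reviews
    and attach a Wald-style 95% confidence interval to the resulting count."""
    p = hits / sample_size
    se = math.sqrt(p * (1 - p) / sample_size)  # standard error of the proportion
    estimate = p * corpus_size
    margin = z * se * corpus_size
    return estimate, (max(0.0, estimate - margin), estimate + margin)

# Hypothetical inputs for illustration only; these are not the study's data.
est, (low, high) = count_ci(hits=7, sample_size=2000, corpus_size=481825)
print(f"estimated count: {est:.0f} (95% CI {low:.0f}-{high:.0f})")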

Conclusions:

Urgent care patients rarely express an expectation of antibiotics in negative online reviews. Thus, our findings do not support an association between a lack of antibiotic prescriptions and negative online reviews. Rather, patient dissatisfaction in urgent care was most strongly linked to operational issues that are not related to the clinical management plan. Clinical Trial: This research was approved by The George Washington University Committee on Human Research, Institutional Review Board (IRB), FWA00005945 (IRB #180804).


Citation

Please cite as:

Hu D, Liu CMH, Hamdy R, Cziner M, Fung M, Dobbs S, Rogers L, Turner MM, Broniatowski DA

Questioning the Yelp Effect: Mixed Methods Analysis of Web-Based Reviews of Urgent Cares

J Med Internet Res 2021;23(10):e29406

DOI: 10.2196/29406

PMID: 34623316

PMCID: 8538031


© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). Authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license upon publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.
