CC BY 4.0 license. Open Access. Published by De Gruyter June 15, 2022

Multisite assessment of emergency medicine resident knowledge of evidence-based medicine as measured by the Fresno Test of Evidence-Based Medicine

  • James Katsilometes, Michael Galuska, Chadd K. Kraus, Howard W. Levitin, Scott Leuchten, Jane Daugherty-Luck, Julie Lata, Grace Brannan, Anthony Santarelli, John Ashurst, for the FOEM Research Network

Abstract

Context

Evidence-based medicine (EBM) is the application of scientific evidence while treating a patient. To date, however, there is very little evidence describing how residents in emergency medicine understand and incorporate EBM into practice.

Objectives

The aim of this study was to determine EBM theoretical and quantitative knowledge in emergency medicine residents in community hospital-based training programs.

Methods

A sample of emergency medicine residents from nine hospitals was enrolled to complete a cross-sectional assessment of EBM skills from April 2021 through June 2021. Performance on the Fresno Test of Evidence-Based Medicine (FTEBM) was assessed utilizing descriptive statistics, t tests, and one-way analysis of variance.

Results

A total of 50.8% (124/244) of current emergency medicine residents completed the FTEBM during the study period. No significant difference on FTEBM scores was noted between the different types of medical degrees (DO vs. MD) (p=0.511), holding an advanced research degree (p=0.117), or between each postgraduate year of training (p=0.356). The mean score of those residents who rated their knowledge of EBM as average or higher was 36.0% (32.8–39.1%). The mean score of those residents who rated their programs as having an “average” or higher institutional focus on EBM was 34.9% (32.2–37.6%).

Conclusions

Participating emergency medicine residents show an incomplete understanding of EBM both in theory and applied computations despite rating themselves as having an average understanding. Emergency medicine residencies would be well suited to implement a standardized EBM curriculum that focuses on longitudinal reinforcement of key concepts needed for the practicing physician.

Evidence-based medicine (EBM) is defined as the judicious use of the current evidence in making decisions about the healthcare of individual patients [1]. Research has shown that utilizing an evidence-based strategy to treat patients reduces medical errors, increases individualized patient care, and supports the application of best practices in the clinical setting [2, 3]. However, many learners in undergraduate and graduate medical education lack the proficiency needed to adequately interpret and utilize EBM in treating patients despite it being a core competency according to many professional societies [4, 5].

According to the Accreditation Council for Graduate Medical Education (ACGME) and the American Board of Emergency Medicine (ABEM), EBM is a key milestone for those in emergency medicine graduate medical education [6]. The practice-based performance improvement milestone notes that a resident should be competent in performance improvement to optimize emergency department function and self-learning at the time of graduation. Competency in EBM has been defined as the ability of a practitioner to develop structured clinical questions, identify the available literature to answer questions, critically appraise and apply the methodology and results of the literature to clinical practice, and evaluate the approach undertaken during a literature search [7]. Assessing competency in EBM can be accomplished with the Fresno Test of Evidence-Based Medicine (FTEBM). The FTEBM is a validated assessment tool that evaluates a learner’s ability to frame a research question, search for evidence in the literature, understand the hierarchy of evidence, and interpret magnitude and validity as well as basic statistical and methodological concepts [8].

It has been reported that medical students have rated their knowledge of evidence-based principles higher than objectively measured outcomes on a standardized examination [9]. Research has also shown that senior-level medical students demonstrate competence in only about half of all evidence-based medical topics that are needed in graduate medical education [10]. Without having a baseline knowledge from undergraduate medical education, those in graduate medical education may not reach the highest levels of competency based upon the ACGME milestones. The discrepancy between perceived and objectively measured competence in EBM in medical students has necessitated the systematic study of emergency medicine residents’ knowledge of EBM. The objective of this study was to determine basic EBM theoretical and quantitative knowledge in a cross-sectional sample of emergency medicine residents.

Methods

Setting

The Foundation for Osteopathic Emergency Medicine (FOEM) research network comprises a group of ACGME emergency medicine residency training programs across the United States. A total of nine sites agreed to participate after initial contact and disseminated the enrollment survey and the FTEBM to their residents. The sites within the FOEM research network include Kingman Regional Medical Center, McLaren Macomb, Saint Barnabas Medical Center, Sunrise Medical Center, Trinity Health System, Conemaugh Memorial Medical Center, Ohio Health, Geisinger Medical Center, and Ascension Macomb.

Study design

All study procedures were approved by the Kingman Regional Medical Center Institutional Review Board and designated the approval number KHI-0204. Sixteen emergency medicine residencies in the United States were contacted in February 2021 to determine their interest in participating. Nine (56.3%, 9/16) residencies agreed to participate. Written informed consent was obtained from each participant prior to data collection. Physical copies of the enrollment survey and the FTEBM were provided to each site’s primary investigator after the site agreed to participate. Residencies enrolled PGY-1 through PGY-4 residents between April 1, 2021 and July 30, 2021. At the time of enrollment, residents were given one hour to complete the assessment before returning it to the site primary investigator.

Fresno test of evidence-based medicine (FTEBM)

The FTEBM is a questionnaire consisting of 12 items related to EBM principles that measure the understanding and practical application of research concepts such as forming a question, finding evidence to answer that question, internal and external validity, and computing basic clinical statistics. The highest total score that can be obtained is 128. Each section has the following maximum score: Writing Clinical Questions=6, Information Sources=6, Search Strategies=8, Clinical Study Design=12, Literature Relevancy=12, Internal Validity=24, Magnitude and Significance=12, Diagnostic Accuracy=20, Risk Reduction=16, Statistical Significance=4, Diagnostic Study Design=4, and Prognostic Study Design=4. Performance by question type (theoretical or quantitative) was also assessed: theoretical included questions 1–7, 11, and 12, while quantitative included questions 8, 9, and 10. Upon return of the surveys and FTEBMs to the primary investigator, two trained evaluators scored each completed FTEBM using the standardized Fresno grading system. Each evaluator was trained by the primary investigator of the project, who also resolved any scoring questions that arose during grading.
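
For illustration, the scoring arithmetic implied by these maxima can be expressed in a short script. The sketch below is not part of the published grading materials: only the section maxima and the theoretical/quantitative split are taken from the test description above, and the per-item scores of the example resident are hypothetical.

```python
# A minimal scoring sketch (not the official grading rubric). Section maxima and the
# theoretical/quantitative split follow the FTEBM description; the example resident's
# per-item raw scores below are hypothetical.
MAX_SCORES = {
    1: 6,   # Writing clinical questions
    2: 6,   # Information sources
    3: 8,   # Search strategies
    4: 12,  # Clinical study design
    5: 12,  # Literature relevancy
    6: 24,  # Internal validity
    7: 12,  # Magnitude and significance
    8: 20,  # Diagnostic accuracy
    9: 16,  # Risk reduction
    10: 4,  # Statistical significance
    11: 4,  # Diagnostic study design
    12: 4,  # Prognostic study design
}
THEORETICAL = {1, 2, 3, 4, 5, 6, 7, 11, 12}
QUANTITATIVE = {8, 9, 10}

assert sum(MAX_SCORES.values()) == 128  # highest obtainable total score

def percent(raw_scores, items):
    """Percentage of available points earned on the given set of items."""
    earned = sum(raw_scores.get(i, 0) for i in items)
    available = sum(MAX_SCORES[i] for i in items)
    return 100 * earned / available

# One hypothetical resident's graded item scores (an unanswered item, here item 9, is simply omitted)
resident = {1: 3, 2: 2, 3: 4, 4: 8, 5: 5, 6: 7, 7: 4, 8: 6, 10: 0, 11: 1, 12: 2}
print(f"overall:      {percent(resident, set(MAX_SCORES)):.1f}%")
print(f"theoretical:  {percent(resident, THEORETICAL):.1f}%")
print(f"quantitative: {percent(resident, QUANTITATIVE):.1f}%")
```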

Statistical analysis

Data were analyzed utilizing SPSS Statistics version 27 (IBM Corp., Armonk, NY). The demographic characteristics of the participating residents were assessed utilizing descriptive statistics. Total performance, performance on the theoretical questions, and performance on the quantitative questions were assessed based on postgraduate year, medical degree, sex, and additional educational degree attainment by the independent-samples t-test. As a baseline for improvement across residencies, we calculated the proportion of residents who performed above the 50% criterion on each question. Individual items on the FTEBM were then assessed for improvement over the years of residency training via an analysis of variance with polynomial contrasting. Statistical significance was set at p≤0.05.
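
The analyses themselves were run in SPSS. As a rough, non-authoritative illustration of the same plan (independent-samples t-test, one-way ANOVA, and a linear polynomial contrast across training years), a Python sketch with simulated data might look like the following; group sizes mirror the enrolled cohorts, but every score is randomly generated and purely illustrative.

```python
# Illustrative re-implementation of the analysis plan; the study used SPSS v27,
# and all scores below are simulated, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical FTEBM percentage scores by postgraduate year (PGY-1..PGY-4)
groups = [rng.normal(loc, 10, size=n) for loc, n in
          [(32, 44), (37, 32), (38, 33), (35, 15)]]

# Independent-samples t-test, e.g. DO vs. MD (simulated scores)
do, md = rng.normal(36, 10, 109), rng.normal(33, 10, 15)
t, p = stats.ttest_ind(do, md, equal_var=True)
print(f"t-test: t={t:.2f}, p={p:.3f}")

# One-way ANOVA across training years
f, p = stats.f_oneway(*groups)
print(f"ANOVA: F={f:.2f}, p={p:.3f}")

# Linear polynomial contrast across PGY-1..PGY-4 (tests a linear trend in means)
weights = np.array([-3, -1, 1, 3])            # orthogonal linear contrast weights
means = np.array([g.mean() for g in groups])
ns = np.array([len(g) for g in groups])
mse = np.sum([(len(g) - 1) * g.var(ddof=1) for g in groups]) / (ns.sum() - len(groups))
contrast = weights @ means
se = np.sqrt(mse * np.sum(weights**2 / ns))
t_lin = contrast / se
p_lin = 2 * stats.t.sf(abs(t_lin), df=ns.sum() - len(groups))
print(f"linear trend: t={t_lin:.2f}, p={p_lin:.3f}")
```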

Results

Of the 124 FTEBMs submitted for grading, only 51.6% (64/124) had all 12 questions answered. A total of 48.4% (60/124) of FTEBMs were submitted with one or more questions left unanswered. The most common question left unanswered was quantitative question 9 (risk reduction) (36.3%, 45/124) (Table 1).

Table 1:

Resident response rates by item.

Item Topic Response rate
1 Writing clinical questions 100.0% (124/124)
2 Information sources 100.0% (124/124)
3 Search strategies 98.4% (122/124)
4 Clinical study design 99.2% (123/124)
5 Literature relevancy 98.4% (122/124)
6 Internal validity 95.2% (118/124)
7 Magnitude and significance 91.9% (114/124)
8 Diagnostic accuracy 83.9% (104/124)
9 Risk reduction 63.7% (79/124)
10 Statistical significance 78.2% (97/124)
11 Diagnostic study design 82.3% (102/124)
12 Prognostic study design 83.1% (103/124)

Just over half of eligible residents (50.8%, 124/244; 35.4% PGY-1, 25.8% PGY-2, 26.6% PGY-3, 12.1% PGY-4) completed the demographics survey and FTEBM assessment during the study period (Table 2). Male residents (65.3%, 81/124) outnumbered female residents (34.7%, 43/124), and most residents self-identified as Caucasian (61.3%, 76/124) and held a doctorate in osteopathic medicine (87.9%, 109/124) (Table 2). A minority of residents held an additional degree at the Master’s level or above (23.4%, 29/124) (Table 2). Residents most frequently reported an “average” ability (58.1%, 72/124) to assess the quality of medical evidence compared with their peers (Table 2). The overall mean score among residents who reported an “average” or “above average” baseline knowledge of EBM was 36.0% (46.1/128) (32.8–39.1%). No correlation was noted between self-perceived knowledge of EBM and overall test scores on the FTEBM (τb=0.021; p=0.768). The residency program’s focus on EBM was most often rated as “average” (33.1%, 41/124) or “above average” (33.1%, 41/124) by the participating residents (Table 2). The overall mean FTEBM score of residents rating their institution as having an “average” or “above average” focus on EBM was 34.9% (44.7/128) (32.2–37.6%). No correlation was noted between perceived institutional focus on EBM and overall test scores on the FTEBM (τb=−0.018; p=0.792).
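
The correlation statistic reported here is Kendall’s tau-b between an ordinal self-rating and the FTEBM score. A minimal illustrative sketch, assuming a 1–5 Likert coding of the self-ratings and using entirely made-up values, is shown below.

```python
# Hypothetical example: Kendall's tau-b between ordinal self-ratings and FTEBM scores.
from scipy import stats

self_rating = [3, 3, 4, 2, 3, 5, 3, 4, 2, 3, 3, 4]            # 1 = poor ... 5 = excellent (assumed coding)
ftebm_pct = [36, 30, 41, 28, 35, 39, 33, 44, 31, 37, 29, 40]  # percentage scores (made up)

tau_b, p = stats.kendalltau(self_rating, ftebm_pct)           # scipy computes the tau-b variant by default
print(f"Kendall tau-b = {tau_b:.3f}, p = {p:.3f}")
```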

Table 2:

Resident respondent demographics by postgraduate year of study. The data are presented as percent and frequency.

PGY-1 (n=44) PGY-2 (n=32) PGY-3 (n=33) PGY-4 (n=15)
Age 29.0 (28.0–31.3) 31.0 (29.0–33.0) 33.0 (30.5–35.0) 32.0 (31.0–33.0)
Female 38.6% (17/44) 34.4% (11/32) 36.4% (12/33) 20.0% (3/15)

Self-reported ethnicity

White 65.9% (29/44) 56.3% (18/32) 69.7% (23/33) 40.0% (6/15)
Black 9.1% (4/44) 3.1% (1/32) 0% (0/33) 6.7% (1/15)
Asian 9.1% (4/44) 31.3% (10/32) 21.2% (7/33) 33.3% (5/15)
Hispanic 11.4% (5/44) 9.4% (3/32) 6.1% (2/33) 13.3% (2/15)
Other 4.6% (2/44) 0% (0/32) 3.0% (1/33) 6.7% (1/15)

Educational attainment

DO 79.5% (35/44) 84.4% (27/32) 97.0% (32/33) 100.0% (15/15)
Master’s degree 15.9% (7/44) 25.0% (8/32) 33.3% (11/33) 20.0% (3/15)

Science ability assessment

Poor 6.8% (3/44) 3.1% (1/32) 3.0% (1/33) 0% (0/15)
Below average 29.5% (13/44) 25.0% (8/32) 12.1% (4/33) 40.0% (6/15)
Average 54.5% (24/44) 65.6% (21/32) 57.6% (19/33) 53.3% (8/15)
Above average 6.8% (3/44) 6.3% (2/32) 21.2% (7/33) 0% (0/15)
Excellent 2.3% (1/44) 0% (0/32) 6.1% (2/33) 6.7% (1/15)

Institutional EBM focus

Below average 4.5% (2/44) 3.1% (1/32) 15.2% (5/33) 6.7% (1/15)
Average 31.8% (14/44) 40.6% (13/32) 15.2% (5/33) 60.0% (9/15)
Above average 36.4% (16/44) 31.3% (10/32) 39.4% (13/33) 13.3% (2/15)
High 27.3% (12/44) 25.0% (8/32) 30.3% (10/33) 20.0% (3/15)
  1. EBM, evidence-based medicine.

The overall mean performance on the FTEBM was 35.2% (45.1/128) (32.7–37.8%). Performance on the FTEBM did not vary by postgraduate year in residency training, with PGY-1 through PGY-4 performing on average at 32.3% (41.3/128), 36.9% (50.7/128), 37.7% (48.3/128), and 34.6% (44.3/128), respectively, on the examination (p=0.356) (Table 3). Residents holding a doctorate in osteopathic medicine performed comparably to residents holding a doctorate in allopathic medicine on the overall examination (35.6% [45.6/128] vs. 32.9% [42.1/128]; p=0.511) (Table 3). No effect on overall performance was detected among those holding an additional advanced degree at the Master’s level or above as compared to those without (38.9% [49.8/128] vs. 34.1% [43.6/128]; p=0.117). Male residents significantly outperformed female residents on the FTEBM (37.1% [47.5/128] vs. 31.7% [40.6/128]; p=0.049) (Table 3). When assessed by question type (theoretical or quantitative), no significant differences based upon resident demographics were detected (Table 3).

Table 3:

Mean performance on the Fresno Test of Evidence-Based Medicine (FTEBM). Items requiring computation and those without computation are presented as quantitative and theoretical, respectively.

Theoretical questions Quantitative questions Overall performance
PGY-1 34.6% (29.8–39.4%) 27.1% (19.4–34.8%) 32.3% (27.4–37.3%)
PGY-2 41.7% (36.5–46.8%) 27.3% (19.3–35.3%) 36.9% (32.2–41.7%)
PGY-3 41.5% (35.8–47.3%) 30.6% (23.1–38.0%) 37.7% (32.9–42.6%)
PGY-4 37.7% (15.5–42.1%) 28.8% (15.5–42.1%) 34.6% (28.0–41.3%)
P=0.132 P=0.917 P=0.356
DO 39.1% (36.3–42.0%) 28.4% (24.1–32.7%) 35.6% (32.9–38.2%)
MD 35.1% (26.6–43.7%) 27.3% (12.8–41.8%) 32.9% (23.8–42.1%)
P=0.346 P=0.860 P=0.511
Female 36.3% (32.3–40.3%) 24.5% (17.9–31.1%) 31.7% (28.0–35.5%)
Male 39.9% (36.3–43.5%) 30.2% (25.0–35.5%) 37.1% (33.7–40.5%)
P=0.206 P=0.189 P=0.049
MD or DO 37.7% (34.6–40.8%) 26.7% (22.0–31.5%) 34.1% (31.1–37.1%)
MD or DO + MS 41.9% (36.1–47.7%) 33.2% (25.1–41.4%) 38.9% (33.8–44.0%)
P=0.188 P=0.103 P=0.117
  1. Italic values represent the probability of significant differences between the groups listed above the value.

Among the core components of EBM assessed by the FTEBM, residents showed the most room for growth in understanding statistical significance and diagnostic study design, with only 5.6% and 6.5% of residents, respectively, scoring above 50% on the questions addressing these topics (Table 4). Conversely, 73.4% of residents performed above the 50% criterion on the question addressing clinical study design (Table 4). Training level was significantly associated with performance on the questions addressing clinical study design (PGY-1=46.6%, PGY-2=53.9%, PGY-3=59.3%, PGY-4=70.0%; p=0.014) and evidence search strategies (PGY-1=39.2%, PGY-2=54.3%, PGY-3=51.5%, PGY-4=30.0%; p=0.029). A linear relationship between years of postgraduate education and performance on clinical study design was detected (p=0.002).

Table 4:

Itemized mean performance by postgraduate year.

Item Topic % scoring >50% PGY-1 PGY-2 PGY-3 PGY-4 Significance between Significance linear
1 Writing clinical questions 41.9% (52/124) 38.6% (32.0–45.3) 41.1% (32.0–50.3) 42.9% (36.8–49.0) 34.4% (23.2–45.7) 0.601 0.591
2 Information sources 37.1% (46/124) 43.6% (36.0–51.1) 47.4% (36.4–58.4) 35.4% (24.6–46.1) 38.9% (27.0–50.8) 0.331 0.313
3 Evidence search strategies 40.3% (50/124) 39.2% (30.4–48.1) 54.3% (41.9–66.7) 51.5% (40.5–62.5) 30.0% (13.9–46.1) 0.029 0.295
4 Clinical study design 73.4% (91/124) 46.6% (39.6–53.6) 53.9% (43.9–63.9) 59.3% (50.1–68.6) 70.0% (56.0–84.0) 0.014 0.002
5 Literature relevancy 21.8% (27/124) 38.3% (31.1–45.5) 44.8% (34.3–55.3) 42.4% (32.4–52.5) 29.4% (14.4–44.5) 0.290 0.251
6 Internal validity 21.8% (27/124) 29.4% (20.0–38.7) 40.5% (32.1–48.9) 37.0% (27.1–46.9) 41.1% (26.2–56.1) 0.280 0.222
7 Magnitude and significance 16.1% (20/124) 31.1% (23.1–39.0) 30.2% (20.8–39.6) 38.1% (27.8–48.4) 21.7% (7.6–35.7) 0.255 0.419
8 Diagnostic accuracy 41.9% (52/124) 36.4% (24.8–47.9) 35.0% (21.7–48.3) 40.0% (28.0–52.0) 42.7% (22.2–63.1) 0.887 0.483
9 Risk reduction 21.8% (27/124) 21.2% (11.2–31.2) 22.9% (10.5–35.3) 27.3% (16.4–38.1) 28.9% (7.0–50.8) 0.810 0.382
10 Statistical significance 5.6% (7/124) 6.8% (0.0–14.6) 3.1% (0.0–9.5) 9.1% (0.0–19.4) 0.0% (0.0–0.0) 0.555 0.505
11 Diagnostic study design 6.5% (8/124) 6.8% (0.0–14.6) 9.4% (0.0–20.1%) 6.1% (0.0–14.7) 0.0% (0.0–0.0) 0.689 0.306
12 Prognostic study design 37.9% (47/124) 38.6% (23.7–53.6) 43.8% (25.6–61.9) 39.4% (51.8–57.0) 16.7% (0.0–36.7) 0.339 0.121
  1. Italic values represent the probability of significant differences between the groups listed above the value. Bold italic values represent a probability of significance for the comparison across postgraduate years. “Significance between” represents differences between individual years. “Significance linear” represents a linear change across PGY-1 through PGY-4.

Discussion

This study represents the largest independent assessment of emergency medicine residents’ knowledge of evidence-based practice in the literature to date. A recent systematic review on the development of EBM curricula for emergency medicine identified several studies assessing residents’ knowledge of topics commonly covered in EBM, but few utilized a validated assessment tool [11]. In a study conducted in 2010, Friedman et al. [12] sought to describe the impact of EBM on changes in patient-directed care by emergency medicine residents. Although the authors did not assess baseline knowledge, they described how, following a search and review of the literature, residents altered the course of care for 16.3% of patients. While the data presented in this manuscript suggest an improvement in literature search skills over the course of residency training, additional work by faculty appears warranted to further train residents on how to interpret the papers they source.

In a group of residents at a single training institution, Mohr et al. [13] reported a baseline score on the FTEBM of 82.3%. That graduate medical education program is based at an academic health center, which is usually assumed to offer more extensive EBM didactics than a community hospital-based residency program such as ours. The highest score among residencies with more than 10 participants in our study was 48.0%. Bentley et al. [14] estimate that scores start low in emergency medicine residents and then increase linearly from PGY-1 through PGY-4, with a starting point of 25% and an endpoint of 77%.

Based upon the current study and prior literature, it appears that emergency medicine residents have a limited understanding of the basic theoretical and statistical concepts needed to effectively understand and practice EBM. Although no overall differences in FTEBM scores were observed across postgraduate years, more experienced residents did show an increase in knowledge of searching for clinical literature (PGY-3 residents outperforming PGY-1 residents by 12.3 percentage points) and of study design (PGY-3 residents outscoring PGY-1 residents by 12.7 percentage points). Unlike many of the items on the FTEBM, these items represent the initial phases of appraising clinical evidence. Residents likely develop more practiced, if ad hoc, search strategies as they learn to utilize available online resources over the course of their training. Although residents learn to better identify literature, the remaining items on the FTEBM do not show training-year-related increases in performance. Thus, while residents may be able to locate literature, it is unlikely that they are able to evaluate the scope of the reported findings based upon the statistical analysis.

Residency programs may be well served to increase the amount of time spent teaching both research-focused concepts and biostatistics in order to better prepare trainees to incorporate the principles of EBM into future practice. Although not addressed in the current study, when asked about attitudes toward research, residents report little perceived impact of EBM on their medical training [15]. These attitudes represent a concerning barrier to future training.

Based upon these results and the current body of literature, the open question that remains is which educational activities can be implemented during residency to improve understanding of the core concepts of EBM. Although sparsely reported in the literature, several interventions have attempted to improve knowledge of research appraisal. Validated instructional techniques for postgraduates include dedicated journal clubs [16], individual research methodology workshops [17], and full curricula with monthly instruction [18]. Unfortunately, only a minority of graduate medical education programs have an established EBM curriculum [19].

Limitations

Although this was a cross-sectional survey of a diverse group of emergency medicine residents from across the nation, it did not encompass all residents in training. Overall study findings could be altered by including a larger sample of residents and by including more MDs for a better representation of that population. Residents were not excluded from the study if they had prior knowledge of the FTEBM or had previously taken it as part of their education or during another research project. The EBM curriculum at each institution was also not included in the original data gathered. These two limitations could have altered the results because residents may have had a prior understanding of the concepts on the examination or may have taken the examination at a previous point in their academic training.

Conclusions

The surveyed emergency medicine residents show an incomplete understanding of the basic concepts in EBM as tested by the FTEBM. No difference in knowledge was seen based on the postgraduate level of training, on the different medical school degrees, or if a resident held an advanced degree in addition to that of medicine. A heightened focus on teaching the basic theoretical and statistical concepts of research is needed if residents are truly expected to not only meet the specific milestones prior to graduation but also effectively treat patients with an eye toward EBM.


Corresponding author: Anthony Santarelli, PhD, Department of Graduate Medical Education, Kingman Regional Medical Center, Hualapai Mountain Campus, 3801 Santa Rosa Drive, Kingman, AZ 86401, USA, E-mail:

  1. Research funding: This project was supported by the Foundation for Osteopathic Emergency Medicine Research Network David A. Kuchinski Memorial Grant.

  2. Author contributions: All authors provided substantial contributions to conception and design, acquisition of data, or analysis and interpretation of data; all authors drafted the article or revised it critically for important intellectual content; all authors gave final approval of the version of the article to be published; all authors agree to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

  3. Competing interests: None reported.

  4. Ethical approval: All study procedures were approved by the Kingman Regional Medical Center Institutional Review Board (approval number, KHI-0204).

References

1. Sackett, DL, Rosenberg, WM, Gray, JA, Haynes, RB, Richardson, WS. Evidence based medicine: what it is and what it isn’t. Br Med J 1996;312:71–2. https://doi.org/10.1136/bmj.312.7023.71.

2. Segal, MM, Williams, MS, Gropman, AL, Torres, AR, Forsyth, R, Connolly, AM, et al. Evidence-based decision support for neurological diagnosis reduces errors and unnecessary workup. J Child Neurol 2014;29:487–92. https://doi.org/10.1177/0883073813483365.

3. Miller, FG, Joffe, S, Kesselheim, AS. Evidence, errors, and ethics. Perspect Biol Med 2014;57:299–307. https://doi.org/10.1353/pbm.2014.0024.

4. Smith, AB, Semler, L, Rehman, EA, Haddad, ZG, Ahmadzadeh, KL, Crellin, SJ, et al. A cross-sectional study of medical student knowledge of evidence-based medicine as measured by the Fresno test of evidence-based medicine. J Emerg Med 2016;50:759–64. https://doi.org/10.1016/j.jemermed.2016.02.006.

5. Smith, CA, Ganschow, PS, Reilly, BM, Evans, AT, McNutt, RA, Osei, A, et al. Teaching residents evidence-based medicine skills: a controlled trial of effectiveness and assessment of durability. J Gen Intern Med 2000;15:710–5. https://doi.org/10.1046/j.1525-1497.2000.91026.x.

6. Ling, LJ, Beeson, MS. Milestones in emergency medicine. J Acute Med 2012;2:65–9. https://doi.org/10.1016/j.jacme.2012.06.002.

7. Ilic, D. Assessing competency in Evidence Based Practice: strengths and limitations of current tools in practice. BMC Med Educ 2009;9:53. https://doi.org/10.1186/1472-6920-9-53.

8. Ramos, KD, Schafer, S, Tracz, SM. Validation of the Fresno test of competence in evidence based medicine. Br Med J 2003;326:319–21. https://doi.org/10.1136/bmj.326.7384.319.

9. Caspi, O, McKnight, P, Kruse, L, Cunningham, V, Figueredo, AJ, Sechrest, L. Evidence-based medicine: discrepancy between perceived competence and actual performance among graduating medical students. Med Teach 2006;28:318–25. https://doi.org/10.1080/01421590600624422.

10. Lai, NM, Teng, CL. Competence in evidence based medicine of senior medical students following a clinically integrated training programme. Hong Kong Med J 2009;15:332–8.

11. Halalau, A, Holmes, B, Rogers-Snyr, A, Donisan, T, Nielsen, E, Cerqueira, T, et al. Evidence-based medicine curricula and barriers for physicians in training: a scoping review. Int J Med Educ 2021;12:101–24. https://doi.org/10.5116/ijme.6097.ccc0.

12. Friedman, S, Sayers, B, Lazio, M, Friedman, S, Gisondi, MA. Curriculum design of a case-based knowledge translation shift for emergency medicine residents. Acad Emerg Med 2010;17(2 Suppl):S42–8. https://doi.org/10.1111/j.1553-2712.2010.00879.x.

13. Mohr, NM, Stoltze, AJ, Harland, KK, Van Heukelom, JN, Hogrefe, CP, Ahmed, A. An evidence-based medicine curriculum implemented in journal club improves resident performance on the Fresno test. J Emerg Med 2015;48:222–9.e1. https://doi.org/10.1016/j.jemermed.2014.09.011.

14. Bentley, S, Slovis, BH, Shah, K. Introduction of a novel evidence-based medicine curriculum in emergency medicine. Med Sci Educ 2018;28:497–501. https://doi.org/10.1007/s40670-018-0575-9.

15. Koo, J, Bains, J, Collins, MB, Dharamsi, S. Residency research requirements and the CanMEDS-FM scholar role: perspectives of residents and recent graduates. Can Fam Physician 2012;58:e330–6.

16. Ahmadi, N, McKenzie, M, Maclean, A, Brown, C, Mastracci, T, McLeod, R, et al. Teaching evidence based medicine to surgery residents-is journal club the best format? A systematic review of the literature. J Surg Educ 2012;69:91–100. https://doi.org/10.1016/j.jsurg.2011.07.004.

17. Gogtay, NJ. Research methodology workshops: a small step towards practice of evidence-based medicine. Perspect Clin Res 2018;9:59–60. https://doi.org/10.4103/picr.PICR_28_18.

18. Aneese, AM, Nasr, JA, Halalau, A. A prospective mixed-methods study evaluating the integration of an evidence based medicine curriculum into an internal medicine residency program. Adv Med Educ Pract 2019;10:533–46. https://doi.org/10.2147/AMEP.S203334.

19. Carpenter, CR, Kane, BG, Carter, M, Lucas, R, Wilbur, LG, Graffeo, CS. Incorporating evidence-based medicine into resident education: a CORD survey of faculty and resident expectations. Acad Emerg Med 2010;17(2 Suppl):S54–61. https://doi.org/10.1111/j.1553-2712.2010.00889.x.

Received: 2022-02-01
Accepted: 2022-05-16
Published Online: 2022-06-15

© 2022 James Katsilometes et al., published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
