
The Association of Student Examination Performance with Faculty and Resident Ratings Using a Modified RIME Process

Original Article
Journal of General Internal Medicine

Abstract

Background

RIME is a descriptive framework in which students and their teachers can gauge progress throughout a clerkship from R (reporter) to I (interpreter) to M (manager) to E (educator). RIME, as described in the literature, is complemented by residents and attending physicians meeting with a clerkship director to discuss individual student progress, with group discussion resulting in assignment of a RIME stage.

Objective

(1) To determine whether a student’s RIME rating is associated with end-of-clerkship examination performance, and (2) to determine whose independent RIME rating is most predictive of a student’s examination performance: that of attendings, residents, or interns.

Design

Prospective cohort study.

Participants

Third-year medical students from academic year 2004–2005 and early academic year 2005–2006 at one medical school.

Measurements and Main Results

Each attending, resident, and intern independently assessed the final RIME stage each student had attained. For analysis, the stages were coded numerically: R = 1, I = 2, M = 3, and E = 4. Regression analyses were performed with examination scores as dependent variables (the National Board of Medical Examiners [NBME] medicine subject examination and a clinical performance examination [CPE]) and with mean attending, mean resident, and mean intern RIME ratings as independent variables. For the 122 students, the significant predictors of NBME subject examination score were the resident RIME rating (p = .008) and the intern RIME rating (p = .02). The significant predictor of CPE performance was the resident RIME rating (p = .01).
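As a rough illustration only (the authors’ actual analysis code is not available), the sketch below fits the two regressions described above in Python with pandas and statsmodels, assuming a hypothetical table with one row per student that holds the mean RIME rating from each rater group (coded R = 1 through E = 4) and the two examination scores; every column name and value is invented for illustration.

    # Minimal sketch, not the study's analysis code.
    # Column names and values are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per student: mean RIME rating from each rater group
    # (R = 1, I = 2, M = 3, E = 4) plus the two examination scores.
    students = pd.DataFrame({
        "attending_rime": [2.5, 3.0, 2.0, 3.5, 2.8, 3.2],
        "resident_rime":  [2.7, 3.2, 2.1, 3.6, 3.0, 3.3],
        "intern_rime":    [2.6, 3.1, 2.2, 3.4, 2.9, 3.1],
        "nbme_score":     [72, 81, 65, 88, 78, 83],
        "cpe_score":      [70, 79, 66, 85, 76, 80],
    })

    # One multiple regression per examination, with the three rater
    # means entered together as independent variables.
    for outcome in ("nbme_score", "cpe_score"):
        model = smf.ols(
            f"{outcome} ~ attending_rime + resident_rime + intern_rime",
            data=students,
        ).fit()
        print(model.summary())

In a model of this form, each rater group’s coefficient and p value indicate how strongly that group’s mean rating predicts the examination score after adjusting for the ratings of the other two groups.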

Conclusion

House staff RIME ratings of students are associated with student performance on written and clinical skills examinations.



Acknowledgments

We are especially grateful to our anonymous external reviewers for their extremely helpful suggestions.

Conflict of interest

None disclosed.

Author information

Corresponding author

Correspondence to Charles H. Griffith III, MD, MSPH.

Additional information

Portions of this paper have been presented at the National Meeting of the Clerkship Directors of Internal Medicine, October 27, 2006, in New Orleans, LA; and at the southern regional SGIM meeting, February 10, 2007, New Orleans, LA.


Cite this article

Griffith III, C.H., Wilson, J.F. The Association of Student Examination Performance with Faculty and Resident Ratings Using a Modified RIME Process. J GEN INTERN MED 23, 1020–1023 (2008). https://doi.org/10.1007/s11606-008-0611-3
