Standard Setting for Clinical Competence at Graduation from Medical School: A Comparison of Passing Scores Across Five Medical Schools

Abstract

While Objective Structured Clinical Examinations (OSCEs) have become widely used to assess clinical competence at the end of undergraduate medical courses, the method of setting the passing score varies greatly, and there is no agreed best methodology. While there is an assumption that the passing standard at graduation is the same at all medical schools, there is very little quantitative evidence in the field. In the United Kingdom, there is no national licensing examination; each medical school sets its own graduating assessment, and successful completion by candidates leads to licensure to practise, granted by the General Medical Council. Academics at five UK medical schools were asked to set passing scores for six OSCE stations using the Angoff method, following a briefing session on this technique. The results were collated and analysed. The passing scores set for each of the stations varied widely across the five medical schools. The implication for individual students at the different medical schools is that a student with the same level of competence may pass at one medical school but fail at another, even when the test is identical. Postulated reasons for this difference include different conceptions of the minimal level of competence acceptable for graduating students and the possible unsuitability of the Angoff method for performance-based clinical tests.
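For context, the Angoff method referred to above asks each judge to estimate, for every checklist item on a station, the probability that a borderline (minimally competent) graduate would achieve that item; the estimates are averaged across judges and combined to give the station's passing score. The sketch below is a minimal illustration of that calculation, assuming equally weighted checklist items; the function name and the example estimates are hypothetical and are not taken from the study's actual data or marking schemes.

    # Minimal sketch of an Angoff-style passing-score calculation for one OSCE
    # station (illustrative only; names and data are hypothetical).

    def angoff_passing_score(judge_estimates):
        """Return the passing score as a percentage of the station's maximum mark.

        judge_estimates: list of lists; judge_estimates[j][i] is judge j's
        estimate of the probability (0-1) that a borderline, minimally
        competent candidate would be awarded checklist item i.
        Assumes every item carries one mark.
        """
        n_judges = len(judge_estimates)
        n_items = len(judge_estimates[0])
        # Average each item's estimate across judges, then sum over items to get
        # the expected number of items a borderline candidate would score.
        expected_items = sum(
            sum(judge[i] for judge in judge_estimates) / n_judges
            for i in range(n_items)
        )
        return 100.0 * expected_items / n_items

    # Example: three hypothetical judges rating a five-item station.
    estimates = [
        [0.8, 0.6, 0.7, 0.5, 0.9],
        [0.7, 0.5, 0.6, 0.6, 0.8],
        [0.9, 0.6, 0.8, 0.4, 0.9],
    ]
    print(f"Angoff passing score: {angoff_passing_score(estimates):.1f}%")  # ~68.7%

The study's finding that identical stations received widely different passing scores implies that the judges' probability estimates themselves, not the arithmetic, differed systematically between schools.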


Author information

Corresponding author

Correspondence to Katharine A. M. Boursicot.

About this article

Cite this article

Boursicot, K.A.M., Roberts, T.E. & Pell, G. Standard Setting for Clinical Competence at Graduation from Medical School: A Comparison of Passing Scores Across Five Medical Schools. Adv Health Sci Educ Theory Pract 11, 173–183 (2006). https://doi.org/10.1007/s10459-005-5291-8
