
Original article

Vol. 152 No. 1112 (2022)

Bottom-up feedback to improve clinical teaching: validation of the Swiss System for Evaluation of Teaching Qualities (SwissSETQ)

  • Jan Breckwoldt
  • Adrian P. Marty
  • Daniel Stricker
  • Raphael Stolz
  • Reto Thomasin
  • Niels Seeholzer
  • Joana Berger-Estilita
  • Robert Greif
  • Sören Huwendiek
  • Marco P. Zalunardo
DOI
https://doi.org/10.4414/SMW.2022.w30137
Cite this as:
Swiss Med Wkly. 2022;152:w30137
Published
18.03.2022

Summary

AIMS OF THE STUDY: Clinical teaching is essential in preparing trainees for independent practice. To improve teaching quality, clinical teachers should be provided with meaningful and reliable feedback from trainees (bottom-up feedback) based on up-to-date educational concepts. For this purpose, we designed a web-based instrument, "Swiss System for Evaluation of Teaching Qualities" (SwissSETQ), building on a well-established tool (SETQsmart) and expanding it with current graduate medical education concepts. This study aimed to validate the new instrument in the field of anaesthesiology training.

METHODS: Based on SETQsmart, we developed an online instrument (initially comprising 34 items) with generic items to be used in all clinical disciplines. We integrated the recent educational frameworks of CanMEDS 2015 (Canadian Medical Education Directives for Specialists) and of entrustable professional activities (EPAs). Newly included themes were "Interprofessionalism", "Patient centredness", "Patient safety", "Continuous professional development", and "Entrustment decisions". We ensured content validity through iterative discussion rounds between medical education specialists and clinical supervisors. Two think-aloud rounds with residents investigated the response process. Subsequently, the instrument was pilot-tested in the anaesthesia departments of four major teaching hospitals in Switzerland, involving 220 trainees and 120 faculty. We assessed the instrument's internal structure (to determine its factorial composition) using exploratory factor analysis, its internal consistency (using Cronbach's alpha as an estimate of reliability, regarding alpha >0.7 as acceptable, >0.8 as good and >0.9 as excellent), and inter-rater reliability (using generalisability theory to determine the minimum number of ratings needed for reliable feedback to a single supervisor).
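
For orientation, the reliability coefficient referred to here is the standard Cronbach's alpha for a k-item scale (a generic textbook formula, not reproduced from the study itself):

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X}\right)

where \sigma^2_{Y_i} is the variance of item i and \sigma^2_X the variance of the summed total score; the thresholds given above (>0.7 acceptable, >0.8 good, >0.9 excellent) are applied to this coefficient.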

RESULTS: Based on 185 complete ratings for 101 faculty, exploratory factor analysis revealed four factors explaining 72.3% of the variance (individual instruction 33.8%, evaluation of trainee performance 20.9%, teaching professionalism 12.8%, entrustment decisions 4.7%). Cronbach's alpha for the total score was 0.964. After factor analysis, we removed one item, resulting in 33 items in the final instrument. Generalisability studies indicated that a minimum of five to six individual ratings is needed to provide reliable feedback to one supervisor.
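
The minimum number of ratings per supervisor follows from a generalisability decision (D) study. As an illustration only, in a simple one-facet design with raters nested within supervisors, the generalisability coefficient for the mean of n_r ratings takes the form (a generic D-study formula; the exact variance-component model used in the study may differ):

E\rho^2(n_r) = \frac{\sigma^2_s}{\sigma^2_s + \sigma^2_{r:s,e} / n_r}

where \sigma^2_s is the between-supervisor variance and \sigma^2_{r:s,e} the residual rater-within-supervisor variance; n_r is increased until the coefficient reaches a predefined threshold (commonly 0.7 or 0.8), which in this study corresponded to five to six ratings.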

DISCUSSION: The SwissSETQ integrates up-to-date graduate medical education concepts and shows high content validity and an "excellent" internal structure. The tool thereby allows reliable bottom-up feedback by trainees to support clinical teachers in improving their teaching. Transfer to disciplines other than anaesthesiology needs to be explored further.

