
Simulation video: a tool to evaluate communications skills in radiologist residents

Abstract

Background

Effective communication is a crucial component of radiology resident training, and many different aspects need to be considered when teaching and evaluating communication skills. To measure radiology residents’ communication skill levels accurately, a standardized evaluation tool has been introduced. In twenty hospitals in Beijing, simulation videos have been developed to assess the communication skills of radiology residents during their certification exams and to minimize evaluation bias. This study aims to assess the performance of a simulation video model for evaluating communication skills compared with the standardized patient model.

Methods

This is a retrospective observational study. The performance of the standardized patient and simulation video models was evaluated through an eight-year examination of communication skills in radiology residents. From 2014 to 2021, communication skill tests were administered to 1003 radiology residents in 20 hospitals in Beijing. The standardized patient (SP) model was applied in 2014, and simulation videos were used from 2015 to 2021. The difficulty and discrimination ratio of the tests were evaluated. A subjective survey of candidates on the two models of communication skills evaluation was conducted and analyzed.

Results

The simulation video model demonstrated stable difficulty (ranging from 0.92 to 0.98) and discrimination ratio (ranging from 0.37 to 0.49), with the exception of the discrimination ratio in 2019 (0.58) and 2020 (0.20). Furthermore, the Kruskal-Wallis H test revealed no significant differences in average scores between 2016 (93.9 ± 4.6) and 2018 (94.5 ± 4.2), 2016 and 2019 (97.3 ± 3.9), 2017 (97.0 ± 5.6) and 2019, 2017 and 2020 (97.7 ± 4.7), as well as 2019 and 2020 (all p ≥ 0.05). In addition, among candidates who responded to the survey (77.2% response rate), 62.7% preferred the simulation video model over the SP model for communication skills evaluation.

Conclusion

The simulation video model demonstrated a stable and more acceptable construct for assessing radiology residents’ communication skills.


Introduction

Since the original version of Tomorrow’s Doctors in 1993 [1], medical education and resident training have shifted from a system based solely on time and process to one that emphasizes multiple competencies. This significant change has had a considerable impact on UK medical schools, many of which have started creating dynamic and innovative curricula inspired by the book’s guidelines.

Historically, radiologists have been primarily responsible for interpreting medical imaging and generating reports. However, with the shift towards multi-competency resident training in the medical field, evaluation tools became a new requirement for the profession.

Communication is one of the core competencies of radiology residents [2]. Effective communication is an essential aspect of providing high-quality patient care, and this applies to many medical subspecialties, including radiology. While most medical imaging results are traditionally provided directly to the referring clinician, direct communication between radiologists and patients has become increasingly important, especially in situations such as the direct interpretation of written reports to patients or interventional radiology [3]. The study by Gutzeit et al. showed that direct communication from radiologists to patients after MRI examinations improved the radiology service and the bond between radiologists and patients [4].

Radiologists must communicate with colleagues, technicians, nurses, surgeons, internal physicians, and patients. The standardized patient (SP) model has a long history in medical training; it has played a role in professional medical teaching for more than 50 years [5, 6]. The first reported SP was coached by a neurologist to exhibit various neurological symptoms to assess the diagnostic skills of students. SPs have also been used to cultivate and evaluate communication skills in medical students [7,8,9]. However, training SPs for evaluation can be time-consuming and hard to standardize, especially for large-scale evaluation, and it is difficult to obtain high reproducibility across different SPs [9].

Assessing communication skills is challenging, and communication training needs to be improved in both undergraduate and postgraduate education [10, 11]. According to the Chinese national survey on radiology residency training, training programs mainly focus on patient care and medical knowledge rather than other competencies such as communication [10]. In China, resident training for radiology was nationalized in 2014, and all medical students looking to become radiology staff are required to complete a three-year residency training in radiology. This requirement is mandatory, irrespective of whether the medical student has achieved a bachelor’s, master’s, or doctoral degree, and was in place at the time this study was conducted. To date, there is no national resident certification exam in China. The experience of the twenty hospitals in Beijing is the most advanced and representative in the country [10]. Twenty hospitals in Beijing are qualified as standardized training centers for radiology residents: Peking University Third Hospital, Peking University First Hospital, Peking University People’s Hospital, Peking University Cancer Hospital, Beijing Jishuitan Hospital, Beijing Tsinghua Changgung Hospital, Beijing Hospital, Beijing Chaoyang Hospital Affiliated to Capital Medical University, Beijing Shijitan Hospital Affiliated to Capital Medical University, Beijing Tiantan Hospital Affiliated to Capital Medical University, Beijing Tongren Hospital Affiliated to Capital Medical University, Beijing Friendship Hospital Affiliated to Capital Medical University, Xuanwu Hospital of Capital Medical University, Chinese People’s Liberation Army Air Force Special Medical Center, the Sixth Medical Center of the General Hospital of the Chinese People’s Liberation Army, the First Medical Center of the Chinese People’s Liberation Army General Hospital, Peking Union Medical College Hospital (Chinese Academy of Medical Sciences), Chinese Academy of Medical Sciences Cancer Hospital, Sino-Japanese Friendship Hospital, and Beijing Aerospace General Hospital.

Beijing’s radiology residency programs have recognized the significance of communication skills and taken action to cultivate them. Although there are studies on simulation training and evaluation of radiology residents’ communication skills [12, 13], none has used simulation video for communication skills evaluation. Considering the shortcomings of the SP model for communication skills evaluation, the evaluation team initiated the simulation video model. The 2014 radiology resident certification exam initially used SP conversation for communication skills evaluation: each resident conducted a medical inquiry with a trained SP and received a score from two examiners. From 2015 to 2021, the novel simulation video model was used. This retrospective study compared the advantages and disadvantages of the two evaluation models.

Methods

This is a retrospective observational study in which both objective and subjective evaluations were performed. The performance of the standardized patient and simulation video models was evaluated through an eight-year examination of communication skills in radiology residents. From 2014 to 2021, communication skill tests were administered to 1003 radiology residents in 20 hospitals in Beijing. The standardized patient (SP) model was applied in 2014, and simulation videos were used from 2015 to 2021. The average score, difficulty, and discrimination ratio of the tests were evaluated. A subjective survey of candidates on communication skills was conducted and analyzed.

The standardized patient conversation evaluation model

The evaluation process was designed in four phases. The first phase was to write the script, which included the SP’s medical history, emotional status, and primary concern. In the second phase, SPs were recruited according to the criteria in Table 1; a professional medical background was not required. Standardized training was carried out for all enrolled SPs. The training covered fundamental medical knowledge as well as doctor–patient communication knowledge and skills, with the goal that each SP could “truly show the complaint”, that is, act out symptoms and chief complaints such as headaches and stomachaches. The third phase was the evaluation of the SPs after training to ensure that they met the requirements. The fourth phase was the final examination, in which each resident had 10 min to communicate with the SP while the whole process was observed and independently scored by two examiners.

Table 1 SP Recruitment criteria

Simulation video evaluation

The evaluation tool was designed in three phases. The first phase was to write a script; the topics included breaking bad news, interpreting imaging reports, and taking a medical history. Poor communication practices were deliberately incorporated into the video through spoken words, nonverbal expressions, tone of voice, body gestures, professional values, and attitude toward the patient. The second phase was to film a 5-minute video based on the script. The third phase was the final evaluation, in which the short video was shown to the candidate, who was asked to identify the poor practices that should have been avoided. The final evaluation lasted 10 min.

The evaluation standards shown in Table 2 were applied in both the SP and simulation video models. The items were scored, and the average score, difficulty, and discrimination ratio of the test were analyzed. Two examiners, each with at least ten years of working experience in radiology, independently rated each resident, and the final score was the average of the two scores. The total possible score was 120.

Table 2 Evaluation item and scale of marks

Measures and statistical analyses

Data analysis and statistics were performed with the Statistical Package for the Social Sciences (SPSS) software, version 25.0.

The final performance score of each candidate was the average of the two evaluators’ scores. The difficulty and discrimination ratios were calculated for each year. The difficulty was calculated as D = M/F, where D is the difficulty index, M is the mean score of all candidates, and F is the full (maximum possible) score of the test.

The discrimination ratio was calculated as DR = (XH − XL) / [N(H − L)], where DR is the discrimination ratio, XH is the total score of the high-score group (the top 27% of candidates), XL is the total score of the low-score group (the bottom 27% of candidates), N is 27% of all candidates, H is the highest score, and L is the lowest score.
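To make these two indices concrete, the sketch below computes them from a hypothetical list of candidate scores. It is written in Python rather than SPSS (which the study actually used), and the sample data and the full score of 120 are illustrative assumptions, not study data.

```python
import numpy as np

def difficulty(scores, full_score=120):
    """Difficulty index D = M / F: mean candidate score divided by the full test score."""
    return np.mean(scores) / full_score

def discrimination_ratio(scores):
    """Discrimination ratio DR = (XH - XL) / (N * (H - L)),
    based on the top and bottom 27% of candidates."""
    s = np.sort(np.asarray(scores, dtype=float))   # scores in ascending order
    n = max(1, int(round(0.27 * len(s))))          # N = 27% of all candidates
    xl = s[:n].sum()                               # XL: total score of the low-score group
    xh = s[-n:].sum()                              # XH: total score of the high-score group
    h, l = s[-1], s[0]                             # H: highest score, L: lowest score
    return (xh - xl) / (n * (h - l))

# Hypothetical scores for one exam year (illustrative only)
scores = [112, 118, 109, 115, 120, 101, 117, 113, 108, 119]
print(f"difficulty = {difficulty(scores):.2f}")
print(f"discrimination ratio = {discrimination_ratio(scores):.2f}")
```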

The Kruskal-Wallis H test was used for pairwise comparison of the average scores between any two years in which the simulation video model was used. Statistical significance was set at P < 0.05.
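A minimal sketch of such a pairwise comparison, using Python’s scipy.stats.kruskal on invented per-year score lists rather than the SPSS workflow and the study data, might look like this:

```python
from itertools import combinations

from scipy import stats

# Hypothetical communication-skill scores per exam year (illustrative only)
scores_by_year = {
    2016: [94, 90, 96, 93, 97, 92, 95],
    2017: [98, 95, 99, 96, 97, 100, 94],
    2019: [97, 99, 96, 98, 95, 100, 97],
}

# Kruskal-Wallis H test for each pair of years
for (year_a, a), (year_b, b) in combinations(scores_by_year.items(), 2):
    h_stat, p_value = stats.kruskal(a, b)
    verdict = "significant" if p_value < 0.05 else "not significant"
    print(f"{year_a} vs {year_b}: H = {h_stat:.2f}, p = {p_value:.3f} ({verdict})")
```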

Results

Objective evaluation: performance of the eight-year communication skills examination

A total of 1003 candidates completed the communication skills assessment from 2014 to 2021: 99 candidates underwent the SP assessment in 2014, and 904 candidates underwent the simulation video assessment from 2015 to 2021.

Overall, 47.4% of the candidates had a bachelor’s degree, 37.2% had a master’s degree, and 18.4% had a doctoral degree. The age range of radiologists in Beijing taking the board examination was 24–32 years. For the SP model in 2014, the average score, difficulty, and discrimination ratio were 86.7 ± 13.4, 0.87, and 0.95, respectively. The difficulty and discrimination ratio of the simulation video model were relatively stable from 2015 to 2021, with slight exceptions in 2019 and 2020 (Fig. 1). The annual average score, difficulty, and discrimination ratio of the simulation video model are listed in Table 3. Pairwise comparison using the Kruskal-Wallis H test showed no significant difference in average scores between the 2016 and 2018, 2016 and 2019, 2017 and 2019, 2017 and 2020, and 2019 and 2020 exams (all p ≥ 0.05) (Table 4).

Fig. 1

Annual discrimination ratio and difficulty from 2015 to 2021

Table 3 Annual performance of the communications skills evaluation (average of two evaluators)
Table 4 Kruskal-Wallis H test of the annual score for pairwise comparison

* indicates P < 0.05.

Communication skills survey of the candidates

In total, 657 residents participated in the communication skills survey (77.2% response rate) from 2014 to 2021 (Table 5). Most radiology residents were aware of communication skills training (78.4%). However, only half had access to such training, meaning either that communication teaching was incorporated into the candidate’s yearly learning program or that the candidate took the initiative to learn communication knowledge and skills. Most residents preferred the simulation video model (62.7%).

Table 5 Communication skills survey feedback

Discussion

As far as we know, the evaluation of communication skills among radiology residents over eight years in 20 hospitals in Beijing is the most comprehensive standardized evaluation reported to date. According to the quantitative analysis of the average score, difficulty, and discrimination ratio, using the simulation video model resulted in a stable evaluation of communication skills. The subjective assessment of candidates via a survey demonstrated that the simulation video model was more acceptable than the SP model.

While most of the average scores of the simulation video model from 2015 to 2021 were relatively stable, statistical analysis revealed that the scores in 2015 and 2021 differed significantly from those in other years. Several factors may have influenced the evaluation scores in those years. The adoption of the newly developed simulation video test in 2015 required candidates to adjust to this new evaluation format, which could have affected their performance. In 2016, one year after the simulation video model was introduced into communication skills evaluation, the candidates demonstrated better communication skills than in 2015 under similar test difficulty. Additionally, in 2021, the clinical rotations of the residents who took the exams were disrupted for over six months by the COVID-19 pandemic, which could have affected their preparation and performance.

The survey showed that nearly half of the radiology residents did not have convenient access to appropriate communication skills training. The scarcity of communication skills training may not be unique to radiology; in many other medical specialties, novel trials advancing communication skills training for Chinese doctors are emerging [14,15,16]. Nevertheless, these attempts, with their limited sample sizes, show that there is still a long way to go to popularize communication education.

Standardized patients have been used in medical education to improve communication skills [17,18,19] and in many other subspecialty training settings, such as pre-anesthetic assessment [15] or counseling by community pharmacists [8]. We initially used standardized patients for the evaluation in 2014; in that year, with nearly 100 candidates, four standardized patients were included in the final evaluation, and the recruitment and training process was time-consuming. Compared with the simulation video modality, the higher discrimination ratio of the SP model indicated heterogeneity among SPs. This is understandable, as every SP has his or her own communication style and subjective judgment; in that sense, using different SPs in one evaluation may compromise its fairness and standardization. Compared with SPs, simulation videos can achieve fairness quickly and with higher stability. We collected feedback from the candidates, and the majority perceived the simulation video model as better than the SP model for assessing communication skills. Performance in the first year (2015) of using the simulation video model was significantly lower than in the following years (2016–2021, p < 0.001), even though the test had the lowest difficulty level. This may be due to maladjustment to the novel test method. However, as both the assessment team and the candidates gained experience, performance remained stable in the years following the introduction of the simulation video model (2016–2021).

Based on the annual performance of the evaluation, the evaluation itself was a powerful intervention for improving awareness and mastery of communication skills among the radiology residents. From the residency candidates’ perspective, even with little formal training, communication skills can be improved through self-reflection [20, 21] or through observation of and learning from senior colleagues [22, 23].

Our communication skills evaluation model was based on real-life hospital scenarios. The scoring was based on the premise that a doctor can respond appropriately during patient-doctor communication only when he or she can pick up on clues from the patient. The original intention of the communication evaluation was to use deliberately poor behavior as a mirror to inspire candidates’ reflection on the proper patient-doctor relationship. We are convinced that residents will take this reflection back to their daily work environments as a toolkit for proper action and will benefit daily in patient-doctor communication. Further study will evaluate the effect of incorporating videos demonstrating good doctor-patient interactions into the communication training program. Moreover, we also intend to carry out an individualized improvement education program based on trainees’ real-life communication scenarios.

To conclude, after eight years of experience, the simulation video model showed better acceptance and stability in assessing communication skills among radiology residents. It could be used as a benchmark for evaluating and training communication skills in other medical specialties.

Data Availability

The datasets used and analyzed during the current study are available from the corresponding author upon reasonable request.

References

  1. Rubin P, Franchi-Christopher D. New edition of Tomorrow’s Doctors. Med Teach. 2002;24(4):368–9.

  2. Larson DB, Froehle CM, Johnson ND, Towbin AJ. Communication in diagnostic radiology: meeting the challenges of complexity. AJR Am J Roentgenol. 2014;203(5):957–64.

  3. Rockall AG, Justich C, Helbich T, Vilgrain V. Patient communication in radiology: moving up the agenda. Eur J Radiol. 2022;155:110464.

  4. Gutzeit A, Heiland R, Sudarski S, Froehlich JM, Hergan K, Meissnitzer M, Kos S, Bertke P, Kolokythas O, Koh DM. Direct communication between radiologists and patients following imaging examinations. Should radiologists rethink their patient care? Eur Radiol. 2019;29(1):224–31.

  5. Barrows HS. An overview of the uses of standardized patients for teaching and evaluating clinical skills. AAMC. Acad Med. 1993;68(6):443–51; discussion 451–3.

  6. Barrows HS, Abrahamson S. The programmed patient: a technique for appraising student performance in clinical neurology. J Med Educ. 1964;39:802–5.

  7. Berger-Estilita JM, Greif R, Berendonk C, Stricker D, Schnabel KP. Simulated patient-based teaching of medical students improves pre-anaesthetic assessment: a rater-blinded randomised controlled trial. Eur J Anaesthesiol. 2020;37(5):387–93.

  8. Paravattil B, Kheir N, Yousif A. Utilization of simulated patients to assess diabetes and asthma counseling practices among community pharmacists in Qatar. Int J Clin Pharm. 2017;39(4):759–68.

  9. May W, Park JH, Lee JP. A ten-year review of the literature on the use of standardized patients in teaching and learning: 1996–2005. Med Teach. 2009;31(6):487–92.

  10. Zhang J, Han X, Yang Z, Wang Z, Zheng J, Yang Z, Zhu J. Radiology residency training in China: results from the first retrospective nationwide survey. Insights Imaging. 2021;12(1):25.

  11. Yin K, Huang Y, Wilkes MS, Gao H. Teaching communication skills to undergraduate medical students in China. Med Teach. 2016;38(6):636.

  12. Uppot RN, Laguna B, McCarthy CJ, De Novi G, Phelps A, Siegel E, Courtier J. Implementing virtual and augmented reality tools for radiology education and training, communication, and clinical care. Radiology. 2019;291(3):570–80.

  13. DeBenedectis CM, Gauguet JM, Makris J, Brown SD, Rosen MP. Coming out of the dark: a curriculum for teaching and evaluating radiology residents’ communication skills through simulation. J Am Coll Radiol. 2017;14(1):87–91.

  14. Sun C, Zou J, Zhao L, Wang Q, Zhang S, Ulain Q, Song Q, Li Q. New doctor-patient communication learning software to help interns succeed in communication skills. BMC Med Educ. 2020;20(1):8.

  15. Wu X, Wang Z, Hong B, Shen S, Guo Y, Huang Q, Liu J. Evaluation and improvement of doctor-patient communication competence for emergency neurosurgeons: a standardized family model. Patient Prefer Adherence. 2014;8:883–91.

  16. Chan CS, Wun YT, Cheung A, Dickinson JA, Chan KW, Lee HC, Yung YM. Communication skill of general practitioners: any room for improvement? How much can it be improved? Med Educ. 2003;37(6):514–26.

  17. Schreckenbach T, Ochsendorf F, Sterz J, Rüsseler M, Bechstein WO, Bender B, Bechtoldt MN. Emotion recognition and extraversion of medical students interact to predict their empathic communication perceived by simulated patients. BMC Med Educ. 2018;18(1):237.

  18. Gamble AS, Nestel D, Bearman M. Listening to young voices: the lived experiences of adolescent simulated patients in health professional education. Nurse Educ Today. 2020;91:104476.

  19. Ashida R, Otaki J. Survey of Japanese medical schools on involvement of English-speaking simulated patients to improve students’ patient communication skills. Teach Learn Med. 2021:1–8.

  20. Pawluk SA, Zolezzi M, Rainkie D. Comparing student self-assessments of global communication with trained faculty and standardized patient assessments. Curr Pharm Teach Learn. 2018;10(6):779–84.

  21. Greenberg KB, Baldwin C. Use of a self-reflection tool to enhance resident learning on an adolescent medicine rotation. J Adolesc Health. 2016;59(2):230–5.

  22. Daley LK, Menke E, Kirkpatrick B, Sheets D. Partners in practice: a win-win model for clinical education. J Nurs Educ. 2008;47(1):30–2.

  23. Fallowfield L, Jenkins V, Farewell V, Saul J, Duffy A, Eves R. Efficacy of a Cancer Research UK communication skills training model for oncologists: a randomised controlled trial. Lancet. 2002;359(9307):650–6.


Acknowledgements

Not applicable.

Funding

This work was supported by the 2020 CMB Open Competition Research Program (Grant #20–385) to Huadan Xue; the Construction of an Innovative Teaching Platform Based on the New Generation of Artificial Intelligence (201920200102) to Zhengyu Jin; the Top-Scale Medical Discipline Construction Project of Peking Union Medical College (201910600101) to Zhengyu Jin; the Medical Imaging Longitudinal Course Construction Project (2019zlgc0140) to Daming Zhang; the National High-Level Hospital Clinical Research Funding (No. 2022-PUMCH-B-68); and the Beijing Municipal Key Clinical Specialty Excellence Program to Zhengyu Jin.

Author information

Authors and Affiliations

Authors

Contributions

All authors contributed to the creation and organization of the yearly communication evaluations. DMZ and ND drafted the manuscript; DMZ, HDX, and ZYJ participated in the design of the study; GH, ND, and DMZ performed the statistical analysis. XW, HS, LS, and YC participated in its design and coordination and helped to draft the manuscript. All authors read and approved the final manuscript.

Corresponding authors

Correspondence to Daming Zhang, Huadan Xue or Zhengyu Jin.

Ethics declarations

Ethics approval and consent to participate

All methods of this study were carried out in accordance with the Declaration of Helsinki. This study was approved by the Institutional Review Board of Peking Union Medical College Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College (No. S-K2067). Written informed consent was obtained from all individual participants included in the study.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Ding, N., Hu, G., Wang, X. et al. Simulation video: a tool to evaluate communications skills in radiologist residents. BMC Med Educ 23, 586 (2023). https://doi.org/10.1186/s12909-023-04582-w

