Career Focus

Assessment tools for foundation programmes—a practical guide

BMJ 2005; 330 doi: https://doi.org/10.1136/bmj.330.7500.s195-a (Published 14 May 2005) Cite this as: BMJ 2005;330:s195
  1. Helena Davies, consultant in medical education,
  2. Julian Archer, medical education research fellow,
  3. Shelley Heard, deputy dean and director,
  4. Lesley Southgate, professor of medical education
  1. Sheffield Children's Hospital, Sheffield Children's NHS Trust, Sheffield S10 2TH
  2. Sheffield Children's Hospital
  3. London Deanery, 20 Guilford Street, London WC1N 1DZ
  4. St George's Hospital Medical School, London

Abstract

Do you know your mini-PAT and mini-CEX from your CbD, not to mention your DOPS? Helena Davies, Julian Archer, Shelley Heard, and Lesley Southgate explain how doctors will be assessed

A major reform of postgraduate medical education is set out in Modernising Medical Careers: The Next Steps (MMC).1 One of the key principles of foundation programmes is that they should be trainee based and that competencies should be assessed. Their effective implementation therefore depends on effective, quality assured assessment in the workplace.

The benefits of firm foundations: the Gherkin Tower, London

Before foundation programmes go “live” in August 2005, a large assessment pilot is being undertaken across England, Wales, and Northern Ireland. The assessment tools for the pilot were chosen on the basis of the draft MMC curriculum,2 Good Medical Practice,3 the principles for assessment set out by the Postgraduate Medical Education and Training Board (PMETB),4 and the available research evidence. Key themes emerging from the literature include the need for wide sampling, both of content and across assessors, and the important concept that individual tools must form part of a cohesive assessment system mapped to the relevant curriculum. An essential requirement of assessment methods for foundation programmes is that they can be used in a wide range of clinical contexts and generate feedback that informs developmental planning for trainees.

The tools

The assessment system consists of four tools.

Mini-PAT (multisource feedback)

The mini peer assessment tool (mini-PAT) consists of 16 questions mapped to the domains of Good Medical Practice. Views from a range of clinical colleagues are sought and collated. The average score per question and the average overall score are compared with the trainee's self ratings and with an equivalent population (the whole cohort of senior house officers in their second foundation year). Free text responses are anonymised but fed back verbatim. The questions are derived from a longer peer assessment tool, SPRAT (Sheffield peer review assessment tool), which has been validated for use with more senior doctors. Mini-PAT will be undertaken twice in the year for each trainee. For developmental purposes, the trainee's educational supervisor will give feedback after the first cycle. Responses from both cycles will be collated to inform each trainee's overall assessment profile at the end of the year.
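Purely as an illustration, a minimal sketch of how this kind of aggregation might work is shown below. The question labels, the six point rating scale, and the data layout are assumptions for illustration only; they do not describe the pilot's actual forms or software.

```python
# Illustrative sketch: mean assessor score per question and overall,
# set alongside the trainee's self rating and the cohort mean.
# Question names, the 1-6 scale, and the data layout are assumptions.
from statistics import mean

def summarise_mini_pat(assessor_ratings, self_ratings, cohort_means):
    """assessor_ratings: list of dicts mapping question -> score (1-6)."""
    summary = {}
    for q in self_ratings:
        scores = [r[q] for r in assessor_ratings if q in r]
        summary[q] = {
            "assessor_mean": round(mean(scores), 2),
            "self_rating": self_ratings[q],
            "cohort_mean": cohort_means[q],
        }
    overall = round(mean(s["assessor_mean"] for s in summary.values()), 2)
    return summary, overall

# Toy example: two questions, three assessors
assessors = [
    {"communication": 5, "teamworking": 4},
    {"communication": 4, "teamworking": 5},
    {"communication": 5, "teamworking": 5},
]
summary, overall = summarise_mini_pat(
    assessors,
    self_ratings={"communication": 4, "teamworking": 4},
    cohort_means={"communication": 4.6, "teamworking": 4.5},
)
print(summary)
print("Overall mean:", overall)
```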

In the West Midlands, a locally developed and validated method, team assessment of behaviours (TAB), will be used instead of mini-PAT.

Mini-CEX

The mini clinical evaluation exercise (mini-CEX7) is an assessment of an observed clinical encounter, using a structured checklist, with developmental feedback provided immediately after the encounter. The foundation programme mini-CEX has been modified from the American Board of Internal Medicine's mini-CEX. In the United Kingdom, the Royal College of Physicians has undertaken pilot work with mini-CEX.

CbD

Case based discussion (CbD8) consists of a structured discussion, using patients' notes and the trainee's actual entries as the focus. Its particular strength is in the evaluation of clinical reasoning. Work using CbD as part of the General Medical Council's performance procedures has informed development of the foundation CbD tool.

DOPS

Direct observation of procedural skills (DOPS9) is an assessment of an observed technical skill by using a structured checklist. Developmental feedback is provided immediately after the encounter. Based on other tools for the assessment of technical skills, DOPS was developed and initially evaluated by the Royal College of Physicians.

Trainees will be asked to undertake six each of the mini-CEX, CbD, and DOPS over the course of their second foundation year.

Practicalities

Administration and forms

A central administrative system, with specialist software for scanning and generating forms, is used to minimise the paperwork and administrative burden on deaneries. From this system an electronic assessment profile can be built up for each trainee and fed back to them at the end of the year.

All trainees participating in the pilot will receive a pack containing a basic data form, a mini-PAT self rating form, and a form for nominating raters (assessors). They complete these and return them to the foundation programme coordinator in their trust, who sends them on to the central administration centre. Distribution of mini-PAT forms to assessors is administered centrally, and completed forms are returned directly to the administration centre for scanning and collation.

In addition, each trainee receives three carbonised triplicate pads, one each for mini-CEX, CbD, and DOPS. These pads contain 10 copies of the rating forms and written guidance for trainees and assessors. Each trainee is expected to undertake six assessments with each tool. Trainees retain one copy of each form for their portfolio; one copy goes to their educational supervisor, and one is returned centrally via the trust's foundation coordinator.

Doing the assessments

Ensuring that the mini-CEX, CbD, and DOPS assessments are undertaken, and selecting when they take place and who assesses them, is the responsibility of the trainee. Trainees should select a range of clinical problems, to demonstrate as many as possible of the competencies required from foundation training, and use a range of assessors. Assessors may include consultants, general practitioners, specialist registrars, nurses, and allied health professionals. Whoever undertakes the assessment must be an appropriate person to make a sensible judgment about how well the procedure or encounter has gone. For mini-CEX, CbD, and DOPS, feedback is given immediately after the encounter and includes highlighting anything especially good, identifying any developmental needs, and agreeing an action plan, in line with good feedback practice.10

Trainees who have used the tools to date in the United Kingdom and the United States have welcomed the opportunity to be observed by someone more experienced while they interact with a patient and to receive direct and immediate feedback. Each encounter is expected to take 15-20 minutes, including feedback. Where possible, encounters should be built into the daily routine. For example, a specialist registrar could take 15 minutes to observe a foundation doctor explaining results and a management plan to a patient in the acute medical admissions unit, or a consultant could spend 15 minutes at the end of the ward round undertaking a CbD around an entry in the notes of a patient whom the foundation doctor admitted on the take.

Three practical, workshop based training days have taken place, and materials from these are available on the MMC website.11

Challenges

Many challenges remain. Widespread implementation of workplace based assessment represents a cultural change for the NHS. The assessment pilot aims to spread the workload of assessment across several healthcare professionals and settings and to integrate it into clinical practice. The potential for educational impact, and thus increased satisfaction with training for trainees and trainers, was a major driver in planning the assessment system. It should also enhance patient safety by increasing formal observation of trainees. Detailed quality assurance (reliability, validity, feasibility, identification of systematic sources of bias) is essential and is built into the programme. Finally, the assessment pilot will be evaluated in detail so that positive as well as problematic aspects of the process can be identified.

For more information, see www.mmc.nhs.uk/assessment

References