Are lots of exams at medical school a bad thing?
Is there a link between the amount of assessment at medical school and performance of doctors in postgraduate exams? Oliver Devine and Andrew Harborne explain the results of their study
- By: Oliver Devine, Andrew Harborne
When we reflected on our different experiences of medical school we found striking disparity in how we were taught and assessed. Medical schools are required by the General Medical Council to make sure that the students they graduate fulfil the outcomes of Promoting Excellence: standards for education and training, but how they teach and assess students to meet these requirements is at the discretion of individual institutions.
Studies have shown that graduates from some UK medical schools have higher first time pass rates in postgraduate exams than graduates from others. With this in mind we wanted to explore a potential underlying cause of this variation: the volume of assessment across UK medical schools. Other factors are involved, but this was one of the more straightforward variables that we could quantify and compare with postgraduate performance.
How does assessment volume vary between medical schools?
Our first step was to determine how many minutes of written and practical assessment each UK medical school requires students to sit over the course of their degree. We defined assessment as any type of exam that was taken under formal conditions and counted towards students’ end of year mark. Written assessments encompassed anything from multiple choice questions (MCQs) and extended matching questions (EMQs) to essays or reflective pieces. Practical assessments were defined as objective structured clinical examinations (OSCEs). We contacted the assessment offices of 25 of the 33 medical schools in the UK to verify the data.
The median volume of written assessment across all medical courses was 1900 minutes (see fig 1; mean 2000, standard deviation 600). These data highlighted the broad differences in how each medical school examines its students. For example, the first two years of the University of Cambridge course contain more minutes of exams than 16 schools require over their entire five or six year courses. The most written assessment time was set at Cambridge, Birmingham, Nottingham, Queen Mary (London), and Newcastle. The least written assessment time was set at Liverpool, Keele, Leeds, Hull-York, and Dundee. 1
Across the span of a degree the median volume of practical assessment was 400 minutes (see fig 2; mean 500, standard deviation 200). Some UK medical schools do not conduct any preclinical practical examinations. Hull-York required the highest number of minutes (1000), whereas Southampton required the lowest (200). 2
Box 1: Student views
View from University of Cambridge student
“Although exam weeks were a week from hell (six days of exams back to back), they were useful in retrospect. It meant that you left with fewer knowledge gaps; if you had a weak spot, you would be found out very quickly. The number of exams forced you to cover everything well.”
View from University of Leeds student
“A lot of the assessment in Leeds was based on reflective and case based essays plus formative work books to complete. In my opinion these were time consuming for students, but they were typically justified by the medical school as evidence we were meeting the GMC outcomes.”
Does total number of medical school exam minutes influence postgraduate exam performance?
Our next step was to see if there was any correlation between the total number of exam minutes at each medical school and graduates’ first time pass rate in major postgraduate exams (MRCP, PACES, MRCGP, MRCS) between 2008 and 2014. Overall, there was a significant correlation between total minutes of assessment time and mean postgraduate attainment (Spearman’s rs=0.515, P=0.014, n=22). 3 Notably, we also found a strong association between written assessment volume and postgraduate practical performance, but no correlation between practical assessment volume at medical school and performance in postgraduate practical assessment.
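For readers unfamiliar with the statistic reported above, Spearman’s rank correlation measures how consistently one ranking tracks another. A minimal pure-Python sketch is below; the school minute counts and pass rates in it are entirely made up for illustration and are not the study’s data.

```python
# Illustrative sketch of a Spearman rank correlation, the statistic used to
# compare assessment minutes with postgraduate pass rates. All numbers here
# are hypothetical, NOT the study's data.

def ranks(xs):
    """Return 1-based average ranks for xs, assigning tied values their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # extend j over any run of tied values
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of positions i..j, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rs = Pearson correlation of the two rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical schools: total written-assessment minutes vs first-time pass rate
minutes   = [2900, 2600, 2500, 1900, 1700, 1400]
pass_rate = [0.78, 0.74, 0.75, 0.69, 0.71, 0.63]

print(round(spearman(minutes, pass_rate), 3))  # → 0.886
```

Because the statistic works on ranks rather than raw values, it captures any monotonic association, which suits school-level data where exact minute counts matter less than relative ordering.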
What do the results mean?
There could be several reasons for the variation in postgraduate exam performance. Students who are assessed more may be better at taking exams; schools that teach students more may feel entitled to assess them more; a high volume of assessment may compel students to learn more.
Assessment is just one of the factors that influence postgraduate performance, and it is a much easier variable to measure than volume of teaching time, teaching style, or environmental factors that could also cause this variation. Further research is needed to see what effect, if any, these other factors have on postgraduate performance.
What does this mean for students?
Students who do not pass their postgraduate exams first time face the prospect of slower career progression and the additional financial and academic pressure of re-sitting exams. If the volume of assessment time is a contributing factor, then it is important that schools are transparent and accountable about how they prepare their students for the future, particularly as tuition fees continue to rise. We should also ask what role the Medical Schools Council and the GMC should be playing in explaining and correcting these differences to ensure that graduates are all of a similar standard.
The GMC has announced the introduction of the UK Medical Licensing Assessment, which is due to be sat for the first time by final year medical students in 2021. In recent years more national exams have been introduced, such as the situational judgement test and the prescribing safety assessment—both need to be passed to get on to the foundation programme.
The United States already has a national licensing exam—the US Medical Licensing Examination (USMLE)—which alone represents more assessment than is administered by six UK medical schools over the entire five year programme. A study has even shown an inverse association between USMLE scores and mortality in patients with acute myocardial infarction treated by US clinicians: for every additional point scored on the USMLE, mortality decreased by 0.2%. Similar evidence exists for the Canadian licensing examination.
One of the primary arguments for the UKMLA is that differences in local assessment practices make it difficult to compare students from different medical schools at the point of graduation. A national examination is seen as a way to gain an overview of the competency of all students wishing to register with the GMC. In response to this research, Niall Dickson, chief executive of the GMC said: “the introduction of a licensing assessment would provide the reassurance to patients that medical graduates from the UK, and those from overseas, are meeting the same standards. There is still much to do on this, but working with everyone involved in medical education, we are confident we can together create an internationally renowned benchmark for entry to medicine that will reflect and enhance the world class standing of UK medicine.”
What this study tells us
- The number of minutes that you are assessed for in written exams at medical school correlates with your chances of passing postgraduate exams first time around
- UK universities with the most written assessment time were Cambridge, Birmingham, Nottingham, Queen Mary (London), and Newcastle
- UK universities with the least written assessment time were Liverpool, Keele, Leeds, Hull-York, and Dundee
- No correlation was found between the number of minutes of practical assessment at medical school and performance in postgraduate practical assessments
What we don’t know yet
- Which factors other than the total number of minutes students are examined contribute to postgraduate performance. Factors such as teaching style, clinical teaching time, and peer to peer teaching are harder to quantify
- Whether students who have more exams timetabled are compelled to learn and revise more
- Whether being assessed more improves your exam technique
- Whether there is a style of written assessment that best prepares students for the format of postgraduate exams
1University College London, 2Hull Royal Infirmary
Correspondence to: firstname.lastname@example.org
Competing interests: OD and AH are the authors of the original research paper.
Provenance and peer review: Not commissioned; not externally peer reviewed.
- McManus IC, Elder TA, de Camplain A, et al. Graduates of different UK medical schools show substantial differences in performance on MRCP(UK) Part 1, Part 2 and PACES examinations. BMC Med 2008;6:5.
- McCrorie P, Boursicot KAM. Variations in medical school graduating examinations in the United Kingdom: are clinical competence standards comparable? Med Teach 2009;31:223-9.
- Devine OP, Harborne AC, McManus IC. Assessment at UK medical schools varies substantially in volume, type and intensity and correlates with postgraduate attainment. BMC Med Educ 2015;15:146.
- Hately P. A new national exam for medical students. Student BMJ 2015;23:h4208.
- Norcini JJ, Boulet JR, Opalek A, Dauphinee WD. The relationship between licensing examination performance and the outcomes of care by international medical school graduates. Acad Med 2014;89:1157-62.
- Tamblyn R, Abrahamowicz M, Dauphinee WD, et al. Association between licensure examination scores and practice in primary care. JAMA 2002;288:3019-26.
Cite this as: Student BMJ 2015;23:h6516
- Published: 08 September 2016
- DOI: 10.1136/sbmj.h6516