Medical schools should stop cherry picking league table data
Applicants should examine the GMC’s new database instead, says Richard Wakeford
How can two universities both claim to be the best? The student recruitment website for Aberdeen’s School of Medicine and Dentistry declares that it is “Ranked top in Scotland and fifth in the UK” while, from 70 miles away, Dundee’s website claims that its Bachelor of Medicine and Bachelor of Surgery (MBChB) degree course is “Ranked fifth in the Guardian 2013 University Guide” and “Dundee is ranked no 1 for medicine in Scotland.”
I was startled to see two adjacent medical schools each claiming to be top for medicine in Scotland, so I decided to examine the three main UK university league tables. These are the Complete University Guide (CUG), the Guardian University Guide’s league tables and the Times and Sunday Times’ Good University Guide.
Comparing the guides, I found that Aberdeen was ranked fifth in the UK by the CUG, but 16th by the Guardian. Dundee was fifth according to the Guardian, but 22nd on the CUG list. The Times had the two universities as sixth and 10th in the UK, respectively. So Aberdeen was rated first in Scotland by the CUG and the Times, but Dundee was first in Scotland according to the Guardian. If three different systems can throw up such different statistics for just two institutions, how well do they agree with each other about how other medical schools should be ranked, and more importantly, which should we believe is most credible?
Although there is some agreement, especially at the top end, there is plenty of variance (table 1). The correlation between the CUG and Guardian rankings shows 14% of shared variance, between the CUG and the Times 31%, and between the Guardian and the Times 60%. Table 2 aggregates the listings from the three 2015 guides. At the top there is some consistency—Cambridge, Oxford, and UCL are in each guide’s top five schools—but further down there is much variation, and at the tail only King’s College London is in the bottom five of each of the listings.
Table 1 | Correlations between the three guides' 2015 rankings of medical schools (extract)

| Guide | Complete University Guide | Guardian | Times |
| --- | --- | --- | --- |
| Complete University Guide | 1 | 0.373* | 0.554** |
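The "shared variance" figures quoted above are simply the squares of the correlation coefficients (r²) expressed as percentages. A minimal Python sketch, using the two coefficients reported in table 1, shows how the 14% and 31% figures arise:

```python
# Shared variance between two rankings is the square of their
# correlation coefficient, expressed as a percentage.
correlations = {
    ("CUG", "Guardian"): 0.373,  # from table 1
    ("CUG", "Times"): 0.554,     # from table 1
}

for (a, b), r in correlations.items():
    shared = round(r ** 2 * 100)
    print(f"{a} vs {b}: r = {r}, shared variance = {shared}%")
# CUG vs Guardian -> 14%; CUG vs Times -> 31%
```

The same arithmetic run in reverse implies that the 60% shared variance between the Guardian and the Times corresponds to a correlation of roughly 0.77.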
Table 2 | Medical school rankings averaged across the three 2015 guides (extract)

| Rank according to averaged rankings | Medical school | Complete University Guide 2015 | Guardian 2015 | Times and Sunday Times 2015 | Average of 2015 ranks |
| --- | --- | --- | --- | --- | --- |
| 23 | Brighton & Sussex | 24 | 18 | 23 | 21.7 |
| 28 | Hull & York | 29 | 26 | 18 | 24.3 |
Such differences between the listings appear to stem partly from variation in the variables that the different table makers aggregate. The CUG and the Times take into account the research quality of the institutions; the Guardian does not, but it incorporates several indicators from the National Student Survey (NSS), such as how satisfied final year medical students were with the quality of feedback and teaching on each course.
Perhaps it is not surprising that universities’ marketing departments want to use the data from the guide most favourable to their institution. Cherry picking the most advantageous listing may seem appropriate behaviour to a university’s press relations department. However, these are scientific institutions that should apply rigorous standards to the evidence that they use and publish. So if you know that you are fifth in one league table but 16th or 22nd in another, how honest is it to assert that you are fifth? Unreferenced assertions are also common. Imperial College’s website says it is: “second in Europe and third in the world for clinical, pre-clinical and health”—though bafflingly elsewhere on the website it claims: “third in Europe and fourth in the world for clinical, pre-clinical and health.” These data apparently come from the Times Higher Education World University Rankings, one of two international listings of universities. But why is the ranking not referenced?
What’s the solution? Do we need a code of practice from the Medical Schools Council to regulate use of these statistics? That seems extreme. Perhaps it would be better if prospective applicants examined the General Medical Council’s (GMC) new database, which lists how well doctors from different medical schools actually perform in their various postgraduate assessments. A league table (table 3) based on these outcomes can be found in the online version of this article.
Table 3 | Pass rates of doctors in postgraduate assessments, by medical school (extract)

| Rank according to pass rate | Qualifying body: medical school | No of attempts in year | Pass rate (%) | GMC listed statistical significance of pass rate compared with average |
| --- | --- | --- | --- | --- |
| 5 | Imperial College London | 978 | 77.8 | Above average |
| 6 | University College London | 757 | 76.2 | Above average |
| 19 | King's College London | 576 | 68.2 | |
| 21 | London** (including St George's) | 4900 | 67.9 | Below average |
| 27 | East Anglia | 426 | 59.4 | Below average |
How well do the different league tables predict this ranking of actual performance? The CUG predicts it quite well (r=0.69; P<0.01), the Times less well (r=0.41; P<0.05), and the Guardian not significantly (r=0.21; not significant).
Armed with these new data, I analysed whether there were any big differences between these rankings of performance and those of the three guides. Using a criterion of ten or more places' difference between the GMC's rankings and the averaged rankings of the guides (table 2), Cardiff, Leicester, Nottingham, and most of all Bristol are undervalued by the guides, which appear to overvalue Aberdeen, Dundee, Keele, Manchester, and Peninsula.
Frankly, medical schools should behave more honestly, quoting all three guides and not just the one that suits them. In the meantime, I would advise applicants to medicine to mistrust and ignore all quotations from league tables on medical school websites, because they may be economical with the truth, or even misleading.

Richard Wakeford, life fellow
Hughes Hall, University of Cambridge, Cambridge CB1 2EW
Correspondence to: firstname.lastname@example.org
Competing interests: I have read and understood the BMJ Group policy on declaration of interests and have no relevant interests to declare.
Provenance and peer review: Not commissioned; not externally peer reviewed.
- University of Aberdeen. School of Medicine and Dentistry. Study Medicine (MBChB) www.abdn.ac.uk/smd/medicine/index.php.
- University of Dundee. Choose Scotland, choose Dundee. Dundee is ranked No. 1 for Medicine in Scotland. July 2014. http://medicine.dundee.ac.uk/news/choose-scotland-choose-dundee.
- General Medical Council (GMC). Progress of doctors in training split by medical school. www.gmc-uk.org/education/25496.asp.
Cite this as: Student BMJ 2015;23:h1359