The association between exaggeration in health related science news and academic press releases
Retrospective observational study
Sumner P, Vivian-Griffiths S, Boivin J, Williams A, Venetis CA, Davies A, et al. The association between exaggeration in health related science news and academic press releases: retrospective observational study. BMJ 2014;349:g7015.
Objective To identify the source (press releases or news) of distortions, exaggerations, or changes to the main conclusions drawn from research that could potentially influence a reader’s health related behaviour.
Design Retrospective quantitative content analysis.
Setting Journal articles, press releases, and related news, with accompanying simulations.
Sample Press releases (n=462) on biomedical and health related science issued by 20 leading UK universities in 2011, alongside their associated peer reviewed research papers and news stories (n=668).
Main outcome measures Advice to readers to change behaviour, causal statements drawn from correlational research, and inference to humans from animal research that went beyond those in the associated peer reviewed papers.
Results 40% (95% confidence interval 33% to 46%) of the press releases contained exaggerated advice, 33% (26% to 40%) contained exaggerated causal claims, and 36% (28% to 46%) contained exaggerated inference to humans from animal research. When press releases contained such exaggeration, 58% (95% confidence interval 48% to 68%), 81% (70% to 93%), and 86% (77% to 95%) of news stories, respectively, contained similar exaggeration, compared with exaggeration rates of 17% (10% to 24%), 18% (9% to 27%), and 10% (0% to 19%) in news when the press releases were not exaggerated. Odds ratios for each category of analysis were 6.5 (95% confidence interval 3.5 to 12), 20 (7.6 to 51), and 56 (15 to 211). At the same time, there was little evidence that exaggeration in press releases increased the uptake of news.
Conclusions Exaggeration in news is strongly associated with exaggeration in press releases. Improving the accuracy of academic press releases could represent a key opportunity for reducing misleading health related news.
Why do the study?
News coverage of medical research generates interest, influence for academics and their institutions, and all important funding for further research. Health related news also makes money for news outlets, including newspapers and broadcasters. Those producing medical research and those trying to sell it as entertainment to the paying public both have a vested interest in making health related news stories as “big” as possible. Exaggeration and hype are well documented problems in medical science reporting.
When news is exaggerated, public perception is distorted. People are misinformed, with potentially damaging consequences to their health. The research agenda is also distorted, along with the flow of money. Funds are diverted from research that we all need to research that makes a good story, which is bad for everyone and for the credibility of medical research. Journalists are often blamed for hype and misrepresentation in health related news, but these authors decided to investigate another possible culprit: the press releases issued by academic institutions and their researchers.
What did the authors do?
They started with all press releases issued in 2011 by 20 of the United Kingdom’s leading universities, and identified 462 releases about health related research published in academic journals. For each press release, they sourced the published paper and all associated print and online news (668 stories in total). Finally, they looked for exaggeration in each press release and news story, where exaggeration meant claims, advice, or inferences that went beyond those in the peer reviewed paper.
The authors developed a set of codes to quantify three types of exaggeration in news stories and press releases: inappropriately strong advice to change behaviour, inferring cause and effect from observational research (for example “wine causes stress” when the paper reported an association between wine and stress), and drawing conclusions for humans from research done on animals, cells, or simulations.
The main analyses report the prevalences of these three types of exaggeration in health related news stories and their linked press releases, then explore any association between the two. In other words, is a news story more likely to be exaggerated when a linked press release is exaggerated in the same way?
What did they find?
Forty per cent of press releases and 36% of health related news gave firmer advice to readers than their linked research papers, 33% of press releases and 39% of news stories made stronger causal inferences than their linked papers, and 36% of press releases and 47% of news stories exaggerated the implications of non human research.
For all three types, exaggeration in news was significantly associated with exaggeration in the related press release. You can see this clearly in the figure. If you look at the white bars, news stories are significantly more likely to overstate the results of research when academic press releases overstate results in the same way. For example, when press releases exaggerated advice, 58% of related news did too. When press releases did not exaggerate advice, just 17% of news stories did (odds ratio 6.5, 95% confidence interval 3.5 to 12.4).
These respective rates were 81% v 18% for news overstating cause and effect (19.7, 7.6 to 51.4) and 86% v 9.6% for news overstating the human implications of non human research (56.1, 14.9 to 211).
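The odds ratios above can be roughly reconstructed from the reported percentages. A minimal sketch of that arithmetic is below; the study's own estimates come from its statistical models, so the figures it produces differ slightly from the reported 6.5, 19.7, and 56.1.

```python
def odds_ratio(p_exposed, p_unexposed):
    """Odds ratio from two proportions: (p1/(1-p1)) / (p2/(1-p2))."""
    odds_exposed = p_exposed / (1 - p_exposed)
    odds_unexposed = p_unexposed / (1 - p_unexposed)
    return odds_exposed / odds_unexposed

# News exaggeration rates when the press release was / was not exaggerated
print(round(odds_ratio(0.58, 0.17), 1))    # advice: 6.7 (study reports 6.5)
print(round(odds_ratio(0.81, 0.18), 1))    # causal claims: 19.4 (study reports 19.7)
print(round(odds_ratio(0.86, 0.096), 1))   # human inference: 57.8 (study reports 56.1)
```

This illustrates why the third odds ratio is so large: when the unexposed rate is near zero, its odds shrink towards zero and the ratio inflates, which is also why that confidence interval (14.9 to 211) is so wide.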
In further analyses, press releases that were categorised as being exaggerated were no more likely to generate news stories (and did not generate more news overall) than other press releases.
What are the study’s strengths and weaknesses?
The complex and detailed coding used to identify and quantify three varieties of exaggeration is a strength of this paper and a key advance on previous work. But the complexity is also a weakness. Each set of articles (press release, published paper, and related news) took between three and four hours to code, so this research will be hard for others to replicate and confirm. Double coding is an important check on reliability, and these authors were able to double code only a small proportion of articles, presumably because coding took so long. Concordance between coders was a reassuring 91%.
The authors had a reasonably large sample of press releases, papers, and news stories to analyse. But the source material was taken from only 20 UK universities, so the results can’t be generalised to other universities or to other countries.
Papers published in academic journals are not meant to overstate their results, but overenthusiasm can sneak through the most rigorous peer review. This study takes no account of the exaggeration already present in published papers. Nor does it explore the contribution of exaggeration in press releases issued by academic journals.
This study is retrospective, which means the authors looked back at existing data sources (press releases, papers, and news stories) to answer their research question. Prospective studies that start with a question then collect the specific data they need to answer it are generally stronger and easier to interpret. They collect tailor made data as they go, rather than making do with what’s already available.
Finally, the analyses comparing the volume of news generated by exaggerated and non exaggerated press releases were fairly crude. The hyped and unhyped releases were about different sets of papers that may have had different news values (newsworthiness). To find out if exaggerated press releases generate more news, ideally you would compare the volume of news from pairs of press releases about the same paper—one that overstated the results and one that did not.
What do the findings mean?
Press releases issued by top UK universities regularly overstate the findings of medical research. The releases give unwarranted advice, make unjustified claims of causality, and infer outcomes for humans from research done on animals or cells. These misdemeanours are associated with similar exaggerations and misrepresentations in related news stories in print and online. Journalists are often blamed for poor reporting and hype, but it now seems clear that at least some of it is already present in academic press releases.
Will fixing press releases help prevent potentially damaging misinformation hitting the headlines? It’s too early to say, but, in his linked editorial, Ben Goldacre considers some practical steps for further investigation: ensure that press releases are signed by all authors with a stake in the content, published alongside peer reviewed academic papers, and subject to post publication scrutiny and feedback in the response sections of medical journals. 
Reforming a global research enterprise that rewards media visibility and impact above the value of research to patients may take a little longer.
How not to exaggerate the results of this study
This study has an observational design. The authors were looking for a link between an exposure (exaggeration in press releases) and an outcome (exaggeration in related news), and they found one. We can’t conclude that hype in university press releases causes hype in related news stories. All we can say is that these two things are associated: if a press release overstates a paper’s results, news stories about the same paper are also more likely to overstate the results. Direct cause and effect is one explanation. Another possibility is that some research topics or types of research question lend themselves to exaggeration more than others, so exaggeration appears independently in both press releases and news stories. This is called confounding. Look out for it in all observational research.
Correspondence to: email@example.com
Competing interests: AT helps select research for publication in The BMJ, and was the handling editor for Sumner and colleagues’ paper.
Provenance and peer review: Commissioned; not externally peer reviewed.
- Goldacre B. Preventing bad reporting on health research. BMJ 2014;349:g7465.
Cite this as: BMJ 2015;350:h381