
Why you can’t rely on the news media to understand… science



March 13, 2018   4 mins

One of UnHerd’s key concerns is how we can better measure and report our fast-changing times. With that in mind, we are running a series on which subject areas, institutions and parts of the world are least helped by how the news media reports upon them. Tom Chivers kicks us off…

Here is a fact that many people might not realise, and that explains a lot of what the media gets wrong about science: usually, scientific studies are just not that interesting or important. That may sound ridiculous, but it’s true. Scientific studies, plural, are interesting and important; any given one of them, not so much. It’s not always true – the first papers on the discovery of the Higgs boson or gravitational waves were pretty interesting and important – but often, especially in complex areas like healthcare or psychology, it is.

That’s because science is messy. Say you’re trying to find out whether wine gums prevent or cause haemorrhoids, so you do a literature review. You find 20 studies, but they all say different things. Five of them say people who eat wine gums are a bit more likely to get piles. But six say a bit less. Two of them say “much less likely”, one says “much more likely”. And six find no significant effect at all. Findings like these are completely normal: any real effects are often hard to tease out from noisy data.

If you actually wanted to know what the impact of wine gums on haemorrhoids was, you’d look at all 20 studies in the aggregate. Maybe, after carefully looking at the data and checking that the studies were well conducted, you’d conclude, cautiously, that they might have a small protective effect.
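To see how easily a real but small effect hides in noise, here is a minimal sketch in Python (my own toy illustration, with made-up numbers – nothing here comes from any actual wine gum research): simulate 20 small studies of the same tiny protective effect and look at the spread of their individual estimates.

```python
import random
import statistics

random.seed(42)

TRUE_EFFECT = -0.1   # assumed small protective effect (negative = fewer piles)
N_PER_STUDY = 50     # small samples, common in diet research
N_STUDIES = 20

estimates = []
for _ in range(N_STUDIES):
    # each study measures the same true effect, plus sampling noise
    samples = [random.gauss(TRUE_EFFECT, 1.0) for _ in range(N_PER_STUDY)]
    estimates.append(statistics.mean(samples))

print("Individual study estimates:")
for est in sorted(estimates):
    print(f"  {est:+.2f}")   # some negative, some positive: a real scatter

# pooling across all the studies recovers something near the truth
pooled = statistics.mean(estimates)
print(f"\nPooled estimate: {pooled:+.2f} (true effect: {TRUE_EFFECT:+.2f})")
```

Run it and you see the pattern described above: individual studies pointing in both directions, with a sensible answer only emerging once you pool them all.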


But if you wanted to tell people that wine gums cause haemorrhoids, to scare them, then you’d just take one of the studies that show the opposite. Then you might, for instance, put it on the front page of your newspaper, under the headline “Wine gums cause piles, says new study”. It would be literally true; the study does say that. But it would also be nonsense, because the evidence does not show that. As a wise blogger once said: beware the man of one study.

Unfortunately, the media – by its nature – is largely made up of Men Of One Study, or more accurately Stories Of One Study. That’s because we are incentivised, as journalists, to show new things and sudden change. We need events. Plane crashes, not mortality risk statistics.

In science, the events are usually the publication of new studies. I found 1,700 examples of the exact phrase “new study says” on the Daily Mail website alone. New study says half a glass of wine could stop some babies breathing. New study says a glass of wine a day can lead to the shakes. New study says coffee can make you live longer. Some of these studies may accurately represent the state of reality, but many will not.


It’s also worth remembering that the media is incentivised to find the most dramatic studies, because drama sells – and the most dramatic, surprising results are the least likely to be true. (If a result is surprising, we weren’t expecting it, so it doesn’t fit the existing body of evidence.)
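There is a well-known back-of-envelope way to make that precise, using Bayes’ theorem. The sketch below uses a tiny helper of my own with illustrative numbers – the priors, power and false-positive rate are assumptions, not figures from this piece: if surprising hypotheses are rarely true to begin with, even a properly significant result leaves them more likely false than not.

```python
def p_true_given_significant(prior_true, power=0.80, false_pos=0.05):
    """Bayes' theorem: P(hypothesis true | study found a significant result)."""
    p_sig = power * prior_true + false_pos * (1 - prior_true)
    return power * prior_true / p_sig

# a surprising claim: suppose only 1 in 20 such hypotheses is actually true
print(f"Surprising claim: {p_true_given_significant(0.05):.0%}")   # ~46%

# an unsurprising claim that fits the existing evidence: a 50/50 prior
print(f"Plausible claim:  {p_true_given_significant(0.50):.0%}")   # ~94%
```

Same study design, same statistics; the only difference is how plausible the claim was before the study ran. Which is exactly why “dramatic and surprising” and “probably true” pull in opposite directions.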

It’s not fair to pick only on the Mail: this happens across the industry, although some outlets do it more than others. And it is, I think, the fundamental problem with how science is represented in the media. At my old employer, BuzzFeed, we used to try to get around it by not reporting on single studies unless we felt they met a threshold of believability and importance; otherwise we tried to do original reporting. But that’s time-consuming and expensive, and few reporters have that luxury.

I’ve spoken to science journalists who have to write five stories a day. At that pace, all you have time to do is find the sexiest press releases and write them up. As the media withers, and fewer reporters are left to fill the same space, the problem won’t go away.

This matters. It gives people the impression that science is making it up as it goes along (“Oh, I see red wine prevents cancer this week”). More important, it gives people bad information about health and risk: there are real things you should know about diet and exercise and so on, but they’re lost in a welter of dramatic-sounding but thinly evidenced stories about low-carb diets and broccoli preventing heart disease. And it leads to things like the MMR scare, a ridiculous bout of overexcitement caused by a single study that would have been small and uninteresting even if it hadn’t used falsified data. That’s an example that literally cost lives.

Normally it’s nice to end this sort of piece on a “how we’ll fix it” note, but honestly I don’t know how. The incentives to publish dramatic stories on the back of a single study are simply too strong. All I can recommend is that readers, if they see a story about substance X having effect Y on humans, look out for the phrase “new study says” and treat it with extreme caution.


Tom Chivers is a science writer. His second book, How to Read Numbers, is out now.

