Wednesday 15 September 2010

Science and journalism: an uneasy alliance


“Fish oil helps schoolchildren to concentrate,” shouted the headline in the Observer; “US academics discover high doses of omega-3 fish oil combat hyperactivity and attention deficit disorder,” it continued. Previous research on this topic has been decidedly underwhelming (see slides for the 7th BDA international conference), so I set off to track down the source article.

Well, here's a surprise: the study did not include any children with ADHD. It was an experiment with 33 typically-developing boys. And another surprise: on a test of sustained attention, there was no difference between boys who'd been given supplementation of an omega-3 fatty acid (DHA) for 8 weeks and those given placebo. Indeed, boys given low-dose supplementation made marginally more errors after treatment. So where on earth did this story come from? Well, in a brain scanner, children given DHA supplementation showed a different pattern of brain activity during a concentration task, with greater activation of certain frontal cortical regions than the placebo group. However, the placebo group showed greater activation in other brain regions. It was not possible to conclude that the brains of the treated group were working better, given the large number of brain regions being compared and the lack of relationship between activation pattern and task performance.

A day or two later, another article was published, this time in the Guardian, with the headline ‘Male involvement in pregnancy can weaken paternal bond’. I tried to track down the research report, but couldn’t find it. When I traced the researcher, he told me that the piece was not referring to published research, but rather to views he had expressed in an interview with a journalist. He had not intended to recommend that fathers stay away from antenatal classes. He was also concerned that the article had described him as Director of his research institute; in fact he is a lecturer.

At this point, inspired by the example of the Ig Nobel prize, I announced the Orwellian Prize for Journalistic Misrepresentation, an award for the most inaccurate newspaper report of a piece of academic work, using strict and verifiable criteria. An article would get 3 points for each inaccuracy in the headline, 2 points for each inaccuracy in the subtitle, and 1 point for each inaccuracy in the body of the article (a small sketch of the arithmetic is given below). The fish oil piece totalled 16 points.
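For readers who like their scoring rules explicit, here is the scheme as a few lines of Python. The weights are the ones stated above; the function name is my own, and since I haven't broken the fish oil article's 16 points down by category, the particular error counts in the example are hypothetical, chosen only to show how a total of 16 could arise:

    def orwellian_score(headline_errors, subtitle_errors, body_errors):
        """Weighted tally of inaccuracies: 3 points per headline error,
        2 per subtitle error, 1 per error in the body of the article."""
        return 3 * headline_errors + 2 * subtitle_errors + body_errors

    # Hypothetical breakdown: one headline error, one subtitle error
    # and eleven body errors would be one way to reach 16 points.
    print(orwellian_score(1, 1, 11))  # 3 + 2 + 11 = 16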

Comments on the prize were mostly supportive. I had thought I might attract hordes of journalistic trolls, but they did not materialise. Indeed, several journalists responded positively, though they also noted some difficulties with my scoring system. They politely pointed out, for instance, that headlines, to which I gave particular weight in the scoring, are not written by the journalist. Also, it is not unknown for university press officers, who regard it as their job to get their institution mentioned in the media, to issue misleading and over-hyped press releases, sometimes endorsed by attention-seeking researchers.

But over in the mainstream media, a fight was brewing. Ben Goldacre, whose Bad Science column in the Guardian I’ve long regarded as a model of science communication, independently picked up on the fish oil article and gave its author a thorough lambasting. Jeremy Laurance of the Independent retorted with a piece attacking Goldacre. Laurance made three points: first, science journalism is generally good; second, reporters can’t be expected to check everything they are told (implying that the fault for inaccuracy in this case lay with the researcher); and third, journalists work under intense pressure and should not be castigated for sometimes making mistakes.

I would be the first to agree with Laurance’s initial point. During occasional trips to Australia and North America, I've found the printed media to be mostly written as if for readers with rather few neurons and no critical faculties. Only when deprived of British newspapers do you appreciate them. They employ many talented people who can write engagingly on a range of issues, including science. Regarding the second point, I am less certain. While I have some sympathy with the dilemma of a science reporter who has to report on a topic without the benefit of expertise, stories of hyped-up press releases and self-publicising but flawed researchers are numerous enough that I think any journalist worth their salt should at least read the abstract of the research paper, or ask a reputable expert for their opinion, rather than taking things on trust. This is particularly important when writing about topics such as developmental disorders that make people’s lives a misery. Many parents of children with ADHD would feed their child a diet of caviare if they felt it would improve their chances in life. If they read a piece in a reputable newspaper stating that fish oil will help with concentration, they will go out and buy fish oil. (I've no idea whether fish oil sales spiked in June, but if anyone knows how to check that out, I'd be interested in the answer.) In short, reporting in this area has consequences: it can raise false hopes and make people spend unnecessarily.

On the third point, lack of time, Goldacre’s supporters pointed out that working as a doctor is not exactly a life of leisure, yet Ben manages to do a meticulously researched column every week. Other science bloggers write excellent pieces while holding down a full-time day-job.

It was unfortunate indeed that the following week, Laurance, whom I've always regarded as one of our better science journalists, produced a contender for the Orwellian in an Independent report on a treatment for people with Alzheimer’s disease. Under the title 'Magnets can improve Alzheimer’s symptoms' he described a small-scale trial of a treatment based on repetitive transcranial magnetic stimulation, a well-established method for activating or inhibiting neurons by using a rapidly changing strong magnetic field. In this case, the account of the research seemed accurate enough. The problem was the context in which Laurance placed the story: he drew parallels with ‘magnet therapy’ involving the use of bracelets and charms. Several commentators on the electronic version of the story went on the attack, with one stating, “This is not worthy of print and it is absolutely shameful journalism.”

I was recently interviewed about the Orwellian Prize for the Radio 4 programme More or Less, together with a science journalist who clearly felt I was being unfair in not making allowances for the way journalists work – using arguments similar to those made by Jeremy Laurance. At one point when we were off the air, she said, “But don’t you make loads of mistakes?” When I said no, I realised I was simultaneously tempting fate and giving an impression of arrogance. Of course I make mistakes all the time, but I go to immense lengths, checking and rechecking papers, computations and so on, to avoid errors in published work. A degree of obsessionality is an essential attribute for a scientist. If our published papers contained ‘loads of’ mistakes, we’d be despised by our peers, and probably out of a job.

But is the difference between journalists and scientists just one of accuracy? My concern is that there is much more to it than that. I did a small experiment with Google to find out how long it would take to find an account of transcranial magnetic stimulation. Answer: less than a minute. Wikipedia gives a straightforward description that makes it abundantly clear that this treatment has nothing whatever to do with 'magnet therapy'. Laurance may be a busy man, but that is no excuse for failing to check this out.

So here we come to the nub of the matter, and the reason why scientists tend to get cross about misleading reporting: it is not just down to human error. The errors aren't random: they fall in a particular pattern suggesting that pressure to produce good stories leads to systematic distortion, in a distinctly Orwellian fashion. Dodgy reporting comes in three kinds:

1. Propaganda: the worst case of misleading information, when there is deliberate distortion or manipulation of facts to support the editor’s policy. I think and hope this is pretty rare, though some reporting of climate change science seems to fall in this category. For instance, the Australian, the biggest-selling national daily newspaper in Australia, seems much happier to report on science that queries climate change than on science that provides evidence for it. A similar pattern could be detected in the hysteria surrounding the MMR controversy, where some papers only covered stories that argued for a link between vaccination and autism. It is inconceivable that such bias is just the result of journalists being too inexpert or too busy to check their facts. Another clue that a story is propaganda is when it goes beyond reporting of science and becomes personal, querying the objectivity, political allegiances and honesty of the scientists. Because scientists are no more perfect than other human beings, it is important that journalists do scrutinise their motives, but the odd thing is that this happens only when scientists are providing inconvenient evidence against an editorial position. The Australian published 85 articles about the 'climategate' leaked emails, in which accusations of dishonesty by scientists were repeated, but it did not cover the report vindicating the scientists at all.

2. Hype. This typically does not involve actual misrepresentation of the research, but a bending of its conclusions to fit journalistic interests, usually by focusing more on the future implications of a study than on its actual findings. Institutional press officers, and sometimes scientists themselves, may collude with this kind of reporting, because they want to get their story into the papers and realise it needs some kind of spin to be publishable. In my interview with More or Less, I explained how journalists always wanted to know how research could be immediately applied, and this often led to unrealistic claims (see my blog on screening for examples). The journalist’s response was unequivocal: she was perfectly entitled to ask a scientist what the relevance of their work was, and if the answer was none, then why were they taking public money to do it? But this reveals a misunderstanding of how research works. Scientific discoveries proceed incrementally, and the goal of a study is often increased understanding of a phenomenon. This may take years: in terms of research questions, the low-hanging fruit was plucked decades ago, and we are left with the difficult problems. Of course, if one works on disorders, the ultimate goal is to use that understanding to improve diagnosis or treatment, but the path is a long and slow one. I discussed the conflict between the nature of scientific progress and the journalists’ need for a ‘breakthrough’ in another blog. So the typical researcher is, on the one hand, being encouraged by their institution to talk to the media, and on the other hand knows that their research will be dismissed as uninteresting (or even pointless) if it can’t be bundled into a juicy sound-bite with a message for the lay person. One of two reactions ensues: many scientists just give up attempting to talk to the media; others are prepared to mould an account of their research into what the journalists want. This means that the less scrupulous academics are more likely to monopolise media attention.

3. Omission: this is harder to pin down, but is nonetheless an aspect of science journalism that can be infuriating. What happens is that the papers go overboard for a story on a particular topic, but totally ignore other research in the same area. So, a few weeks before the fish-oil/ADHD paper was covered, a much larger and well-conducted trial of omega-3 supplementation in school-children was published but ignored by the media. Another striking example was when the salesman Wynford Dore was actively promoting his expensive exercise-based treatment for dyslexia, skilfully using press releases to get media coverage, including a headline item on the BBC News. The story came from a flawed small-scale study published in a specialist journal. While this was given prominence, excellent trials of other more standard interventions went unreported (for just one example, see this link). I guess it is inevitable: telling the world that you can cure dyslexia by balancing on a wobble board is newsworthy, since it has both novelty and human interest. Telling the world that you can improve reading with a phonologically-based intervention has a bit of human interest but is less surprising and less newsworthy. Telling the world that balancing on a wobble board has no impact on dyslexia whatsoever is not at all surprising, and is only of interest to those who have paid £3000 for the intervention, so it's totally un-newsworthy. It's easy to see why this happens: it's just a more extreme form of the publication bias that also tarnishes academic journals, whose editors favour 'interesting' research (see also Goldacre on similar issues). The problem is, it has consequences.

For an intelligent analysis of these issues, see Zoe Corbyn’s article in the Times Higher Education, and for some ideas about alternative approaches to science reporting, a blog by Alice Bell. I, meanwhile, am hoping that there won’t be any nominations for the Orwellian Prize that earn more points than the fish oil story, but I’m not all that confident.

P.S. I wanted to link to the original fish oil article, but it is no longer available on the web. The text is on my blog page describing the Orwellian prize.

P.P.S. Ah, I’ve just had a new nomination that gets 17 points, largely because it ignored wise advice tweeted recently by Noah Gray (@noahWG), Senior Editor at Nature: “Journalism Pro Tip: If your piece starts talking more about a study's untested implications rather than what the science showed, start over.”

P.P.P.S. It has been gently pointed out to me that I erred in the original version of this blog, saying that the Laurance magnet piece was in the Guardian, when in fact it was in the Independent. Deeply embarrassing, but now corrected.

3 comments:

  1. Ok, something I CAN quite seriously recommend as worth reading if you are interested in these issues is this:

    http://www.esrc.ac.uk/ESRCInfoCentre/Images/Mapdocfinal_tcm6-5505.pdf

    I'd especially emphasise their point that when thinking about media effects, it's best not to think of a headline or story making its way into someone's mind; headlines and stories are more the building blocks which get selectively used to make a more complex structure of ideas around a topic. So headlines can frame or set what is seen to be important by readers, but there's a lot more going on.

    So, my point is that there's a lot else happening in terms of the public understanding of science, and activists who care about changing the public's ideas need to think carefully about their tactics. Perhaps headline shaming is part of this, but it really can only be a very small part, and personally I suspect this sort of prize's role is largely to make everyone feel a bit better about something which a lot of people get really upset about. Maybe I'm wrong though; maybe it'll build into some social change. Headlines should be better! They bug me, and I know they can annoy journalists a lot!

  2. "While I have some sympathy with the dilemma of a science reporter who has to report on a topic without the benefit of expertise..."

    But I think we have to ask - why are science reporters in such a predicament? No-one would get someone who didn't know who George Osborne was to write a political feature, or someone who didn't know what GDP was to write about the economy. So why do we expect science writers to write about stuff they don't know about? That's asking the impossible.

    Now you might say that the science journalists are the victims here, that they are doing the best they can, and I'm sure they are. But if the result is poor, as it often is, that's not an excuse. Other branches of journalism manage to provide expert reporting.

  3. Much of what I say here conflicts with the more optimistic and proactive view of science communication by David Dobbs on @guardiansciblog
    http://bit.ly/bFj5A2
    But the commentators on that seem to be of a more cynical persuasion. One of them pointed to this very apposite cartoon:
    http://www.phdcomics.com/comics/archive.php?comicid=1174
