I recently got into a small kerfuffle with a journalist, actually a sports writer who decided to dabble in science journalism. The exchange started at science-based medicine when I wrote a piece critical of the claims being made for a new device called the GyroStim, which is being offered as a treatment for brain injury.
In this article I linked to a piece in the popular press about the treatment, in the Denver Post by a sports writer, Adrian Dater. Dater thought I was being unfair in my criticism of his piece, and so wrote a response on his blog. The exchange and the comments have exposed many of the problems with journalism in general, and science journalism in particular, that I would like to explore further here.
First I have to say that there are many excellent journalists and science journalists out there. I am not implying that there are no good journalists. I do find, however, that the baseline quality of science journalism is lacking and, if anything, getting worse. Part of the problem is the evaporating infrastructure for full-time journalists. Many outlets no longer maintain specialist journalists, and use generalists (including editors) to cover science news stories.
What follows can be seen as a quick primer, or at least a list of helpful suggestions, to journalists who wish to cover science topics. I will primarily use examples from the recent exchange over the GyroStim.
Think About the Narrative
Almost all news stories have a clear narrative. The facts of the story are presented in a way that creates a meaningful story. Even if all the facts are individually correct, the choice of which facts to present, in what balance, and in which order affects the bottom-line impression left by the article. The choice of headline is also important. I know that at big news outlets the article's author is often not the headline writer, but that doesn't mean the headlines don't matter — they tend to frame the article.
For example, Dater defended his article by claiming that he got all the facts right, that he included "balance" (more on that below) by indicating the GyroStim is not FDA approved and quoting a doctor saying more evidence is needed, and that he did not directly endorse the treatment.
My criticism, however, was based on the narrative that he blatantly created. I find it interesting that he seems to be unaware of this narrative or its effects.
The story was framed as a touching account of a father who is an engineer who decided to build a machine to cure his daughter of cerebral palsy. Right out of the gate the reader is rooting for this machine to work. The pull-out quotes include, “Spinning stimulates the brain,” and “Miracles almost every day.” He also ended with a hopeful anecdote:
“The machine is amazing, it really is,” said Hishon, who played nine games for the Lake Erie Monsters this season after nearly two years of concussion symptoms. “It just seemed to wake something up in my brain. I can’t explain it, but it definitely worked wonders with me.”
The fact that there is some token skepticism tucked in the middle is just part of the narrative – every story needs a villain, right? Dater may not have intended the skeptics to be the villain of his narrative, but that is the role he assigned them. On the one hand you have a loving father, hope, the device is being researched, you have excited practitioners, and many patients who are thrilled with their results, and on the other side some talking-head canned skepticism – “more evidence is needed, not FDA approved, blah, blah.”
It seems to me that many journalists don’t even think about the narrative – it just emerges as a default story format. Start with a human interest angle to draw in the reader, then just report what both sides are saying, be sure to include plenty of anecdotes, and then end on a hopeful note.
What many non-science journalists don’t seem to get is that this is not a proper narrative for a science story. Further – all journalists need to decide what their narrative is before writing the story (although hopefully after they have researched it – often journalists decide on their narrative first then just backfill the facts and anecdotes).
Here are some other narratives that journalists covering science stories might consider:
– The allure and harm of false hope, and the exploitation of false hope by dubious practitioners and companies.
– A cautionary tale about getting excited prematurely by some newfangled treatment before it is adequately tested, given that most new treatments do not pan out.
– Is new and high tech always better – does this machine that costs tens of thousands of dollars work better than a $20 device that has a similar function?
Put the Story Into Context
The most challenging part of science reporting is putting a new story into a deeper scientific context. This requires background research and talking to a variety of experts – and asking the right questions and really listening to what they say. This context includes:
– What is the plausibility of the new claim? Does it confirm or contradict what is currently believed to be true?
– Does the new device, treatment, product resemble anything that has come before? Is it truly new, or just a rebranding of an old concept – and if the latter, how have previous incarnations fared?
– What is the current consensus, if any, on this new claim? Is it truly controversial, or very one-sided with the majority of scientists taking one position and only a few outliers disagreeing with the consensus?
– What are the credentials and backgrounds of the experts on which you are relying? Is their degree generally recognized as valid? Do they have a history of making other dubious or controversial claims? Do they have a history of fraud?
– Overall, how does the new discovery, claim, treatment, etc. fit into existing evidence and scientific theories?
– What are the implications of all this for the current stance one should have toward the claim – should it be considered experimental, should it be taught in public science classes, should it be legal, etc.?
– What steps are needed in the future? What questions need to be resolved?
Adding the above context is exactly what we do at Science-Based Medicine, and at many other “skeptical” blogs. Good science journalists also do this. This is the real story, not the fluff narrative that is better suited to covering the local dog show.
False Balance and Token Skepticism
The balance of the article should generally reflect the balance of scientific acceptance. If 95% of the scientific community accepts one consensus, then that is what the bulk of the article should reflect. If you feel the other 5% deserves a mention, then it should be given appropriate space, and also put into context (as above).
Stories about politics and social issues require obsessive balance, because these are mostly based on value judgments and opinions. For these stories a journalist needs to get the facts right, and make sure that all credible sides have their say.
Science does not work like that. In science, some opinions are objectively better than others. Science stories are about the evidence and the process of science – about finding the best current answer. Science articles need to reflect that.
As soon as you put a pseudoscientist up against a genuine and respected scientist, you have elevated the pseudoscientist to a stature they likely do not deserve. You have framed the story in a very deceptive way that does not reflect the reality.
Bad science journalism generally falls into one of three categories in this regard. Some stories have false balance, where a pseudocontroversy is presented as if it is a real scientific controversy. This is the false-balance fallacy.
Other stories have what we call token skepticism – most of the article is dedicated to giving a forum to the crank and to glowing anecdotes, with scant mention of doubt and/or quick commentary by a real scientist. Dater’s article fell into this category.
The third is when even token skepticism is lacking – the story is presented without a hint of skepticism or actual investigation.
Good science journalism requires putting a science news story into a proper context, making sure the narrative that emerges is fair and appropriate to the actual story, and properly balancing different points of view according to the scientific consensus and scientific legitimacy.
This is not easy. It requires, in my opinion, at least a baseline of scientific literacy. It also requires significant background research into the topic, and into any experts upon which the journalist relies.
What we have from Adrian Dater is an excellent example of what happens when a non-science journalist thinks they can dabble in science reporting, without understanding any of the special requirements of competent science reporting. Even more telling than the article itself is Dater’s defense of his journalism. As is often the case with defensive overreaction, he just dug himself in deeper and deeper.
The exchange also highlights for me the new role that science blogs are playing in the reporting of science news. Journalists who write bad science news stories now have to contend with the second wave of science blog analysis. Now actual scientists, or at least dedicated science journalists, can add the missing context, deconstruct a misleading narrative, and rebalance a science news story.
Journalists, like Dater, who encounter this science-blog pushback when they write a naive and misleading piece would be better off embracing the criticism and trying to learn from it, rather than getting into an online fight with someone who actually knows what they are talking about.