Communicating Science Through Storytelling: Beware Exaggeration


Image Credit: NASA, www.nasa.gov

Scientists are being encouraged to develop better communication skills and engage the public by finding the story in their research and tailoring it for a wider audience. I’ve certainly encouraged readers to design their science videos so that they tell a compelling story: The Stories We Tell; Silver Linings, Kurt Vonnegut, and Telling Science Stories; I’m Not Interesting, But My Research Is; Can Scientists Be Taught To Talk and Act Like Normal People? However, there’s a downside to reframing one’s research to be more interesting to a broader audience.

Three recent events have made me ponder the dangers inherent in designing science messages that appeal to non-scientific audiences. I’ll describe the first two events in this post and the third in a follow-up post.

Press Releases That Exaggerate the Science

The first event that got me going on this topic was publication of a paper in the British Medical Journal (BMJ), which reported an association between exaggerated news stories about biomedical or health-related research findings and press releases that inflated or extrapolated the research findings beyond what was reported in the original paper. Of 462 press releases issued by 20 leading universities in the UK, 40% contained exaggerated advice, 33% contained exaggerated causal claims, and 36% contained unsupported inference from animal research to humans. Of the related news stories written by journalists, 58%, 81%, and 86%, respectively, expressed similar exaggerations in those three ways. By comparison, news stories related to more accurate press releases had exaggeration rates of 17%, 18%, and 10%, respectively. The results pointed to a previously unrecognized source of inaccurate news stories about medical research findings: press releases prepared by the scientists’ own institutions.

Of course, this is only one study, which was based on a correlative approach. That is, the correlation between press releases and news stories may not indicate a cause and effect relationship, and the exaggeration may not be the fault of press officers or journalists but could instead reflect a third factor—scientists who themselves exaggerated their work in interviews, for example. This paper did not address these points. Also, this study focused on biomedical research in the UK; we don’t know whether these findings are representative of science in general.

Whatever the reason for the results, though, this BMJ paper definitely raises important issues about how research is described in the news media, who is designing the message, and how exaggeration gets propagated.

Workshops That (Inadvertently) Teach Scientists to Exaggerate

The second event that caught my attention was a post on the British Ecological Society’s blog: Maximising the reach of your research paper. The post described a workshop at a recent BES meeting that provided some “practical tips on how to sell your research”. The workshop participants designed tweet-length messages for six research papers to be distributed to three hypothetical audiences: the general public, interested non-specialists such as policy makers, and other scientists.

As I was reading this, I was nodding and thinking that this workshop was a worthy effort and a nice way to teach researchers to share their work more broadly. But then I got to the examples of messages that the workshop participants designed and saw a problem.

Here are the workshop-generated examples of messages about a paper reporting that foraging by bumblebees is affected by pesticides:

Public: “Chemical cocktails intoxicate bumblebees and may lead to increasing food prices”

Policy: “Pesticides make bees worse at foraging for food: may affect bee survival and pollination rates”

Scientists: “New study of bumblebee foraging uses RFID tags to highlight the importance of looking at long-term, realistic exposure to sub-lethal effects of pesticides”

The messages written for policy makers and scientists appear to be accurate reflections of the original paper. However, I can’t say the same for the public message. That message replaced scientific terms with words that are more familiar to the average person (cocktail, intoxicate) and that would presumably attract more attention than the technical terms (pesticides, impaired foraging behavior). The public message also suggests that cocktail-imbibing bumblebees will somehow lead to higher food prices—included presumably to show how the research might affect the average consumer. I see three problems with this message. One is that it lacks a key word—pesticide—which was the focus of the research; instead, the message uses the term “chemical cocktail”. The second problem is that the message uses “titillating” terms (cocktails and intoxicate) that are either inaccurate or confusing. Cocktail gives no hint that this is a pesticide and, along with “intoxicate”, implies that alcohol is one of the ingredients, downplaying its potential toxicity. Third, the message makes an exaggerated claim regarding effects on food prices—exaggerated because the researchers did not examine impacts of impaired bumblebee foraging on the cost of food.

In other words, the public message not only fails to accurately convey what the researchers studied and found, it makes an unsubstantiated causal claim, similar to the problems noted in exaggerated press releases and news stories.

Now, I’m not saying that these workshop organizers set out to teach participants how to exaggerate their research. Nor am I bashing the idea of teaching scientists how to communicate their research to the public. I realize that the questionable bumblebee message was the result of a workshop exercise by participants with limited experience. Maybe the workshop organizers critiqued all the messages, including this one, and pointed out where they needed improvement. I don’t know. I wasn’t there. All I know is that this example shows how easy it is to exaggerate research findings, especially in a tweet-length message.

My intent here is not to discourage such workshops or to criticize this particular workshop, but to make people aware of some of the pitfalls in designing such messages. Clearly, making exaggerated claims in a press release or tweet is not the way to “sell” your research. I’m all for using everyday terms in place of scientific jargon, but not at the expense of accuracy or clarity. Also, I think the public recognizes hyped verbiage and is getting tired of it.

Had I not first read the BMJ paper about exaggerated press releases and news stories, I probably would not have noticed anything wrong with the bumblebee message or started thinking about why we need to take more care in designing our stories to convey a science message. Fortunately, I did, because I think the problem is likely to get worse as more scientists try to “sell their research” through storytelling and social media.

There is clearly an art to designing short messages that are both accurate and appealing to the general public. You may be wondering, as I did, how this bumblebee research was described in real press releases and in the popular press. Did they use titillating terms or make exaggerated claims? You decide. Here are links to the research institutions’ press releases and several online news stories about the study:

Radio tags reveal how pesticides are impairing bumblebees’ ability to forage (Imperial College London)

Bee foraging chronically impaired by pesticide exposure: Study (University of Guelph)

Bee foraging chronically impaired by pesticide exposure: Study (Science Daily)

Environment: Bumblebees lose foraging skills after exposure to systemic neonicotinoid pesticides (Summit County Citizens Voice)

Commonly Used Pesticide Has Detrimental Consequences On Bumblebee Foraging (IFLScience)

Bee foraging skills impaired by neonicotinoid pesticides (CBC News)

As you can see, the two press releases and the subsequent news stories (I’ve included only those on the first search page in Google) about the bumblebee research all used titles that accurately conveyed the research findings. The news stories repeated or slightly rephrased the titles put out by the researchers’ institutions. They did not substitute more flamboyant terms for “pesticide” or “bee foraging”, and they did not make unsubstantiated claims as to how impaired bumblebee foraging might affect humans. I could not find any other news stories that took major liberties with the message or that used questionable words to describe the research. I did see one news story with this title: Cocktail of pesticides increases bee deaths, says study (Farmers Weekly). Unlike the workshop message, however, this title makes it clear what the cocktail refers to.

In our haste to jump on the science communication bandwagon, we may be making some mistakes (I certainly have). It’s important for scientists to participate in science communication and engage the public; however, we should not get so carried away in making our research more palatable to non-specialists that we compromise our scientific ideals. Those of us who are encouraging and training students to be better communicators have an obligation to point out the pitfalls. We especially need to reiterate the idea that all of our communications, both technical and non-technical, should be held to a high standard. Otherwise, we risk losing the trust of the public.

This article by Virginia Hughes is an excellent summary of this issue and contains some good advice for scientists and science communicators on how to avoid misrepresenting science.

That’s not all…

There is another danger in putting out inflated research stories, especially with titillating or lascivious titles: Such research may get noticed by anti-science political groups. And those “catchy” titles? They become weapons that are turned against the scientists who conducted the research. In the next post, I’ll describe the third event that has made me rethink how we design our science messages for public consumption.