Posted: February 25, 2016
Article Summary

On February 22, NBC reporter Maggie Fox emailed me and my wife and colleague Jody Lanard for help with a story she wanted to write “shooting down the Zika rumors that won't die.” She asked: “Why do people love, love, love to blame ‘toxic chemicals’ – in this case, larvicides? Why do people love to be scared of GMOs? Why are the conspiracy theorists ALWAYS the first ones to comment, and in ALL CAPS, on our stories?” Jody and I emailed her a quick response, some of which tried to answer her questions and some of which riffed on other aspects of Zika rumors. Maggie used parts of what we wrote in her story. Here is the whole email. It includes some points she probably didn’t expect us to make, including a defense of rumors in general and Zika rumors in particular, and a claim that experts and officials sometimes spread rumors too.

Zika Rumors

(a February 22, 2016 email in response to a query from Maggie Fox of NBC)
Maggie Fox’s February 25 article drew from this email.

Before we get to the heart of your questions, we want to clear away some underbrush.

1. Linking a cause to a hot news peg is automatic for any activist worth his or her salt.

There’s nothing weird about activists for a cause – any cause – trying to appropriate a hot news peg for use as a teachable moment. They are smart to grab onto the coattails of the issue.

Normally, when activists put out press releases, hardly anyone covers them. But with a slow-motion big story like Zika, the mainstream media are eager for new angles. Reporters seem happy to give the activists free advertising for their viewpoints, almost no matter how whacko. Remember the British astrobiologist who used SARS as a news peg for his hypothesis that some pathogens might come from outer space via meteor dust? CNN published that, and CNN reporter Miriam Falco was obliged to ask then-CDC director Julie Gerberding what she thought of the possibility.

You’re focusing on “conspiracy theorists” who blame the increase in microcephaly not on the Zika virus but on genetically modified mosquitoes or the larvicide pyriproxyfen. But let’s add to the list. We have seen plenty of articles about how the essence of the Zika/microcephaly story is climate change, or women’s reproductive rights, or poverty, or cutbacks in member states’ funding of the World Health Organization.

Granted, there’s a stronger case for blaming the potential dire outcome of Zika infection on climate change, restrictive laws governing birth control and abortion, poverty in Latin America, or WHO budgets than for blaming microcephaly on GMOs or pesticides. But what about the arguments from the other side of those two issues – assertions that genetically modified mosquitoes or resumption of DDT spraying might be the silver bullet that saves the world from Zika? Isn’t the science underlying those claims nearly as questionable as the science underlying the anti-GMO and anti-pesticide claims? And if so, does that make them rumors, or conspiracy theories?

2. Rumors are natural too, especially when there’s a lot of mistrust and expert uncertainty. And rumors are often useful.

Despite the negative connotations of the word “rumor,” rumors are what people inevitably, naturally, and often usefully absorb and spread in the absence of information they consider reliable. Although we think most expert statements about Zika are responsibly and repeatedly emphasizing the various levels of uncertainty, the certainty void still leaves open an entry point for rumors of all kinds.

While responsible experts and public health officials are acknowledging a lot of Zika uncertainty, less responsible ones are claiming or implying more confidence than the situation merits. Especially given the low level of trust in government in Brazil and many other Zika-affected countries, and to a lesser extent in the U.S., overconfident official claims about Zika may well raise more doubts among ordinary citizens than appropriately tentative claims. Distrust and doubt provide another entry point for rumors.

Bear in mind that rumors sometimes turn out true – and sometimes spread faster than scientists’ or officials’ formal claims. We recall when ProMED, an infectious disease surveillance listserv, posted rumors about atypical pneumonia cases in China before the Chinese government said a word. It turned out to be SARS, the first new global epidemic of the 21st century. That’s why the World Health Organization systematically monitors rumors as a key part of its early warning system for novel infectious disease outbreaks.

The hypothesis that Zika virus might be responsible for microcephaly started as a rumor spread by Brazilian healthcare workers. If that rumor had turned out mistaken, it would have been cited as an example of how irresponsible it is to spread rumors you’re not sure about. Since it looks likely to turn out correct, it will probably be forgotten as an example of how valuable rumors are as an early warning system.

3. Officials and experts sometimes spread rumors too. Only then we don’t call them rumors.

When officials or experts wish to convince the public of something they consider important to public health, they sometimes make claims they know – or should know – are false or exaggerated. These claims often go unexposed. Even when exposed, they aren’t called rumors. They should be.

In 2015, for example, U.S. officials spread the rumor that measles was roaring back. These officials were, as we are, extremely pro-vaccine. They used the Disneyland measles outbreak as an activist peg to increase the rate of childhood vaccination. The media followed along complacently.

Measles was not roaring back. Due partly to massive public health intervention, and partly to the already high level of measles immunity in the U.S., most measles cases launched from Disneyland landed on infertile ground, and caused either no clusters or very small clusters. But few if any mainstream sources at the time dared to say that measles was not roaring back.

Right now in early 2016, there is a dramatic and widespread rumor that an entire generation of children in Flint, Michigan has been catastrophically damaged by lead in their tap water. We don’t for a moment doubt the evidence that government misbehavior genuinely caused some kids in Flint to absorb more lead into their bodies than would have been the case if not for that misbehavior. Flint outrage is entirely justified.

But there are grounds for questioning the conventional wisdom about Flint hazard. The one article documenting an increase in the number of young Flint children with blood lead levels above 5 micrograms per deciliter provides no data about how many kids’ levels were how much above 5. We have been unable to get that information – which is surely relevant to whether an entire generation has been catastrophically damaged … or a few kids have really high blood lead levels and more have levels typical of their older siblings at the same age and well below their parents’ at the same age … or some more complicated truth.

We can find no reporters who are seeking these data. The few commentators who have even dared to wonder whether the Flint lead hazard may have been overstated have been almost universally attacked; one skeptic’s candor cost him his job as a contributing writer for “Inside Michigan Politics,” a political analysis newsletter. Lead experts with similar questions in their minds have understandably held back. Some rumors are just too dangerous to question, let alone rebut.

Just as the “measles is roaring back” rumor increased vaccination rates, the “Flint kids are ruined” rumor has provoked useful investigations of lead levels throughout the country. That doesn’t make the rumors true.

4. Rebutting false rumors requires taking them seriously on their merits – not dismissing them just because they’re rumors.

We have written before on how best to rebut false rumors. (See Peter’s 2008 website column, “Rumors: Information Is the Antidote.”) We’re not talking here about rumors that are merely mistaken about some of the details, or rumors that are unconfirmed and officials still hope they’ll turn out false, or even rumors that are probably false but anything’s possible. We’re talking about absolutely bet-your-life-on-it false rumors.

Not too long ago, communication professionals often counseled their clients to ignore false rumors. “Don’t dignify them with a response.” This profoundly unwise recommendation has slowly given way to the far sounder principle that false rumors should be acknowledged and rebutted. Of course it makes sense to ignore a rumor you’re sure that few people have heard and few will hear. But silence about a rumor that people have already heard or are likely to hear can be disastrous.

A good response to a false rumor has six components:

1. Repeat the rumor you’re rebutting, and then explain why you think (or why you’re nearly sure) it is false. You can’t rebut it if you won’t repeat it.

2. Be empathic toward those who believe the rumor. You can’t convince people the rumor is false if it’s clear you think they’re idiots for believing it.

3. Demonstrate that you have taken the rumor seriously. “Here’s what we did to look into the rumor…. And here’s what we learned….” Better yet, let people watch you look into the rumor, so they learn that it’s false at the same moment you do.

4. Give evidence that the rumor is false. Don’t expect people to take your word for it.

5. Discuss all evidence – even questionable evidence – that the rumor might be true. Assume people have heard or will hear the other side’s most persuasive arguments. Discuss them. If the totality of the evidence is 95% on your side, don’t claim it’s 100% on your side. Talk about the discrepant 5%, and explain why you have decided to discount it.

6. Promise to stay alert. Good science is always tentative, and so is good risk communication. “Even though there are no convincing signs so far that this rumor is true, we are keeping an open mind and remaining vigilant.” What if you are absolutely 100% certain the rumor is false? Then it’s okay to say so, we think – and immediately add that you remain open to evidence that you’re wrong.

Officials usually do a superb job with #4, which of course means they do #1 as well. They typically default on #s 2, 3, 5, and 6.

5. Certain risks are alarming, and certain claims about the causes of those risks are convincing, because they tap into primal “outrage factors.”

This is the heart of your questions, we think: Why do some explanations of microcephaly – competitors to the Zika explanation – have more staying power than others, even in the face of evidence against them?

Experts in risk perception and risk communication have a list of factors that predispose people to take a risk seriously – to respond with worry or concern or sometimes anger. Some of these “outrage factors” (as we call them) are characteristics of the risk itself – factors like unfamiliarity, absence of control, undetectability, industrial rather than natural origins, and dread. Others are characteristics of the organizations people associate with the risk: untrustworthiness, unresponsiveness, immorality, and the like.

A risk that is high-outrage is likely to alarm people even if it is low-hazard – that is, not terribly dangerous. A risk that’s low-outrage tends to be shrugged off even if it’s high-hazard.

Microcephaly is obviously a high-outrage risk. We know very few people, and no parents, who can look at a news photo of a microcephalic infant without shuddering. Microcephaly is also high-hazard, of course, in terms of its often horrific impact on baby and family. And if the Zika connection proves true and not rare, microcephaly will become a more serious hazard in terms of frequency as well. But it’s the outrage, not the hazard, that provokes people’s strong emotional response. This is true even though microcephaly hazard is really high and people genuinely believe that’s why they’re upset; other even higher hazards fail to arouse similar concern because they lack enough outrage factors.

The same list of outrage factors also determines which claims about the cause of a risk people are tempted to believe.

The Zika virus itself is high-outrage in only a few ways, most notably its novelty, mysteriousness, undetectability, and unavoidability. But because the virus is natural, there’s no one to blame, no enemy except nature. It’s just a virus. How much more satisfying to blame microcephaly on the evil scientists who genetically modify existing organisms to create creatures God never thought to create! How much more satisfying to blame the evil chemical industry that invents and markets pesticide after pesticide after pesticide, until it’s nearly impossible to find a person or animal without pesticide-contaminated tissues!

Or: How much more satisfying to blame the evil denialists who refuse to take climate change seriously; or the evil Catholic prelates who refuse to countenance reproductive freedom in Latin America; or the evil developed world that consigns the developing world to endless poverty; or the evil national governments that starve WHO’s response capacity; or the evil opponents of GMOs or pesticides who won’t accept genetically modified mosquitoes or DDT as the Zika final solution!

GMOs in particular tap into nearly all the outrage factors we know about, and pesticides aren’t far behind. They’re both readily available high-outrage explanations for whatever’s going wrong in the world.

Climate change, the Church, poverty, WHO funding declines, and “the stupid public’s fear of GMOs and pesticides” all have less capacity to arouse outrage than GMOs and pesticides themselves. Or at least they are less able to arouse widespread outrage in the U.S. general public – though they arouse more than ample outrage in smaller segments of the public.

In all of these cases, the dynamic is the same: Independent of the science – the actual hazard – it’s mostly outrage that determines hazard perception. Of course plenty of people are sufficiently outraged (fearful, worried, concerned) about Zika, and sufficiently confident in official sources, that they do not need a villain nastier than the virus itself to blame. But plenty of others are open to the appeals of activists who are seizing the opportunity to market an even higher-outrage villain.



Copyright © 2016 by Peter M. Sandman and Jody Lanard
