Posted: October 12, 2016
Article Summary

Confirmation bias is our universal tendency to hang onto our beliefs in the face of evidence to the contrary. This column begins by describing the cognitive defenses that confirmation bias relies on: selective exposure, selective attention, selective perception, framing, selective interpretation, and selective retention. Then the column addresses strategies risk communicators can use to reduce their audience’s confirmation bias. The key is to avoid challenging the audience more than necessary by finding things (sometimes even irrelevant ones) to reinforce or agree with. The column closes with pointers on how to disagree when disagreeing is necessary. The entire column is about ways to overcome your audience’s confirmation bias; a sequel on ways to overcome your own confirmation bias is also on this site.

Confirmation Bias (Part One): How to Counter Your Audience’s Pre-Existing Beliefs

(See also “Confirmation Bias (Part Two): How to Overcome Your Own Pre-Existing Beliefs.”)

This is the 33rd in a series of risk communication columns I have been asked to write for The Synergist, the journal of the American Industrial Hygiene Association. The columns appear both in the journal and on this website. This one can be found (at significantly shorter length and with minor copyediting changes) in the October 2016 issue of The Synergist, pp. 20–21.

Confirmation bias is the universal tendency of human beings to hang onto what they already believe in the face of evidence to the contrary. You may know it by its endearing nickname, “myside bias,” which nicely captures its essence.

I’m not talking about intentional bias. That happens too. People sometimes go hunting for evidence that they’re right and then intentionally distort what they find, consciously building a biased case in hopes of winning an argument. Confirmation bias is unintentional. It’s how we win our internal arguments, how we convince ourselves we’re right.

Since this is a risk communication column, I want to focus here on the implications of confirmation bias for risk communicators. Your audience members are sure to filter your warnings and reassurances through their own preexisting opinions about what’s safe and what isn’t, resisting anything you say that tries to change their views. How should this fact affect your messaging?

There’s a second part to this column, postponed to a later issue, on ways to minimize your own confirmation bias – ways to be more open to challenging information.

One goal of the two parts, taken together, is to increase your awareness of confirmation bias. If I succeed, you should start experiencing confirmation bias about confirmation bias. As the cartoon caption has it: “Ever since I’ve heard about confirmation bias, I’ve started seeing it everywhere.”

How confirmation bias works

Confirmation bias is a system of defenses aimed at protecting you – and me, and everybody – from uncomfortable information. Here are some of its key components.

Selective exposure and selective attention are our first lines of defense against information we don’t want to know about. We try not to encounter messages that disagree with us, and if we run into them by accident we try to tune them out.

These two defenses have always been major components of confirmation bias, but the rise of the Internet and social media has made selective exposure a much more potent defense than ever before. A generation ago, if we followed news at all we had little choice but to watch the same newscasts and read the same newspapers as everybody else. Now we can choose from literally millions of information providers. Not surprisingly, we choose the ones that tell us we’re right.

When we fail to tune out messages we don’t agree with, selective perception is a key unconscious strategy for avoiding their meaning. We simply misperceive them.

Numerous studies, for example, have found that whites and African-Americans literally “see” different things when they look at the same image (or body cam video) of a racial confrontation. If the image is ambiguous, people of both races “clarify” it based on their preexisting expectations. And even if the image is clear, those whose preexisting expectations are challenged may simply see what isn’t there. To reverse the conventional proverb, “believing is seeing.”

Closely related to selective perception is framing. We see new information through the frame of what we know or believe already. When a U.S. public health official states that she expects “local outbreaks” of Zika virus disease, for example, some in the audience picture widespread, devastating outbreaks, while others picture a few very small ones. Their prior opinions about how much Zika we’re likely to experience are the frame through which they perceive what the official meant by “local outbreaks.”

Selective interpretation is more conscious than selective perception and framing. If we possibly can, we find a way to interpret – opponents would say “misinterpret” – messages so they don’t challenge our preconceptions. Suppose you asked Trump and Clinton supporters to listen to a speech by either candidate, and then to tell you what they heard. In addition to a lot of selective perception, their answers would reflect a lot of selective interpretation as well. “What he (or she) really meant was….”

Selective interpretation leads to what I sometimes call “still-thinking.” We are all accomplished still-thinkers, capable of interpreting discrepant information away and then claiming, “I still think….”

Our final defense is selective retention. If we can’t avoid or ignore or misperceive or misinterpret the messages that tell us we’re wrong about something, we forget them or misremember them. Try sitting down with someone you had a fight with last week to reconstruct who said what. Your recollections will be quite different, and both will be self-serving. (I don’t actually recommend doing this. You’re too likely to end up in a fight about the fight.)

If all these defenses fail and we actually absorb a challenging message accurately, we can still choose whether to accept it or decide it’s wrong. But both of these alternatives are painful. One means change and the other means conflict – and choosing between them means self-doubt. We would much rather ward off the message in the first place. That’s what confirmation bias is for.

Although most of my examples have been hotly emotional interactions, exactly the same confirmation bias defenses run interference between your risk communication efforts and your audience’s ability to absorb what you’re saying. There is a substantial literature on how confirmation bias and related cognitive biases lead even seasoned professionals to commit disastrous safety errors. An entirely different body of research addresses the identical ways in which people convinced that something is dangerous resist learning that it’s safer than they thought.

Not every risk message encounters confirmation bias, of course. Sometimes your audience has absolutely no preexisting opinions, attitudes, values, or expectations relevant to your topic. You’re talking to the proverbial tabula rasa (“blank slate”), and the main barrier to the audience absorbing your message is probably apathy.

When your message is actually of interest to your audience, on the other hand, they’re likely to be testing that message against what they already know and believe and feel – and confirmation bias will rear its ugly head.

Employees whose years of accident-free work have taught them they don’t need to adhere to safety procedures will deploy their confirmation bias defenses against your safety messaging. Neighbors whose outrage about your facility’s emissions tells them the facility is causing cancer in the community will deploy these same defenses against your reassurances.

How can you overcome their confirmation bias?

Overcoming confirmation bias in your audience

The most important implication of confirmation bias for risk communicators – or any communicators – is this: If you possibly can, confirm something. Don’t challenge your audience any more than you absolutely have to.

I’m not saying you should change your core message. It is what it is. But look for ways to reframe your core message so it is more compatible with your audience’s preexisting opinions, attitudes, values, and expectations.

Decades ago, many construction workers and others in hazardous occupations resisted wearing hardhats, in part because they were proud of their courage and competence, and felt the hardhat requirement called both into question – needlessly so, since their experience had taught them they were capable of avoiding injury without any head protection. Strict rules forbidding the use of hardhats in low-risk parts of the worksite turned the meaning of the safety gear on its head (so to speak). The new meaning: Only workers skilled enough and brave enough to work in dangerous places wear hardhats. The change was so successful that these workers came to be called, and to call themselves, “hardhats.”

In precaution advocacy – urging people to take a risk more seriously – the single most basic principle is to find something in your audience that already predisposes them to do what you want them to do … and build on that. If you’re lucky, you’ll find something substantively relevant to build on. But it’s better to build on preexisting opinions, attitudes, values, and expectations that are only obliquely relevant or even totally irrelevant than to build on nothing at all. In the face of your audience’s confirmation bias, jujitsu works better than frontal attack.

If you can’t convince parents to vaccinate their children because infectious diseases are a lot more dangerous than vaccines, maybe you can convince them to do so because they don’t want to offend their neighbors. Or because they don’t want to be different from everybody else. Or even just because a rock star they admire is a big vaccination proponent.

The substantively relevant reasoning often comes later. You get me to vaccinate my kids because I want to be like my favorite musician. Understandably, I feel a little weird about having made such an important parenting decision for such an irrational reason. (This weird feeling is called cognitive dissonance.) So I start looking for evidence that vaccination is good. I’m still under the sway of confirmation bias. But what I want to confirm has changed. Now instead of seeking to confirm my former opinion that vaccines are dangerous, I’m motivated to confirm that I was right to vaccinate my kids last week. So your pro-vaccination messages are no longer messages I’m trying to avoid, ignore, misperceive, misinterpret, or misremember. They have become messages that can help me feel better about myself.

Trying to calm people who are excessively upset about a risk (outrage management, in my jargon) is in many ways the opposite of precaution advocacy. But the confirmation bias problem is pretty much the same.

Suppose I think your factory threatens my children’s health. You’re confident I’m wrong on the merits. But you can’t just tell me that “the science” says I’m being stupid.

Knowing that confirmation bias is more powerful when people are in the grip of strong emotions, you should make time to listen to me vent, so I get a little calmer and a little more willing to hear what you’ve got to say. Then when it’s finally your turn to speak, you should validate my valid concerns. If you can’t agree with me that your dimethylmeatloaf emissions are killing my kids (because you’re convinced they’re not), you can find other things to agree with me about: that your company shouldn’t have stonewalled my demands to know how much dimethylmeatloaf you emit; that a couple of studies do suggest high doses of dimethylmeatloaf could cause cancer; that there are things your company can do to reduce the volume of emissions; etc.

One of the core confirmation bias lessons for outrage management: Use two-sided rather than one-sided messaging. There are circumstances under which one-sided messages are wiser, at least in the short term. If your audience is uninterested and uninformed, and likely to remain so, there’s a case to be made for simply telling them what you want them to know.

But outraged audiences are by definition highly interested, even obsessed. And they’re highly informed, though their information (thanks to confirmation bias) has been cherry-picked to favor their outraged conviction. So you need to use two-sided messaging. You need to acknowledge their outrage. You need to acknowledge what your organization has done or failed to do that exacerbated the outrage. And you need to acknowledge everything technical they’re right about – their half, or their ten percent, or even their one-tenth of one percent of the technical truth.

When you have to disagree

Two-sided messaging doesn’t mean abandoning your key message just because your audience disagrees. If your basic risk communication goal is to convince your audience that they’re wrong about something, you can’t just confirm and confirm and confirm. Eventually you need to get to where you disagree.

Won’t confirmation bias make that disagreeable part of your message hard for your audience to absorb? Sure. But not as hard as if your entire message were disagreeable.

Obviously there are better and worse ways to disagree with someone. For sure, establishing some common ground is a good first step. “Yes and” or even “Yes but” is a lot better than “No, you’re totally wrong.” Here are some additional pointers for overcoming your audience’s confirmation bias when you have to disagree:

  • Identify the confrontational part of your message explicitly and empathically. Instead of trying to sneak it in, say something like “Now we’re coming to a piece of information I think you may find hard to accept.”
  • Draw a clear distinction between understanding and agreement. Give your audience “permission” to disagree, thereby making it less threatening to understand. Once the challenging information gets in, maybe it will provoke some reluctant agreement.
  • Test for understanding, but not for agreement. Ask your audience to repeat back what you said, but give them a chance to say that it’s your position they’re parroting, not necessarily theirs.
  • Warn your audience in advance that you’re going to test later for understanding. Frame this as a test of whether you did a good job of communicating the message, not whether they did a good job of hearing it. Even in the face of confirmation bias, people listen much better when they know they’re going to be asked to repeat what they heard.
  • Break down your message into smaller and more concrete steps – especially the parts of your message likeliest to fall prey to confirmation bias.
  • Consider an inductive rather than a deductive reasoning structure. Deductive arguments start with their conclusion and then present supporting evidence. They’re easier to understand – but an audience that disagrees with your conclusion will be turned off at the get-go. Inductive arguments start with the evidence and reason their way to the conclusion. They’re much harder to follow, but they provoke less confirmation bias.

But more important than any of these pointers is this: Don’t disagree more than you have to. My clients pick a lot of unnecessary fights with their stakeholders, arousing confirmation bias defenses they didn’t need to arouse.

Think again about the hardhat example I gave earlier. Your reason for wanting employees to wear their hardhats is to prevent head injuries. But a quite different reason might be more persuasive to them: to signal their courage and their competence to work in dangerous places. Or maybe they would happily accept hardhats in order to display decals demonstrating both their group allegiances and their individual commitments to spouse, pet, or motorcycle. Or maybe all they want is a chance to pick the style of hardhat they think looks coolest.

As long as they wear their hardhats, do you really care why? Do you really need to argue with them about whether they’re likely to suffer head injuries? Are you provoking confirmation bias unnecessarily?

Copyright © 2016 by Peter M. Sandman
