Types of Risk-Related Numbers and Statistics
Levels of risk are only one of several kinds of numbers and statistics that need to be conveyed when communicating with the public. Other numbers and statistics include the following:
- Concentrations – such as parts per million or billion.
- Probabilities – the likelihood of the event.
- Quantities – such as how many tons of air toxic X were emitted into the air near the plant, how much effluent was released into the water, how much soil was contaminated.
This list is neither complete nor precise. The point to remember is that plant managers often think they are providing information about “risk” when they actually are providing risk-related numbers and statistics. There is nothing wrong with the latter. Since the most difficult number to conceptualize, present, and explain is risk itself, at times it makes sense to talk about one of the other numbers or statistics. Indeed, providing risk-related numbers and statistics often contributes to a clearer understanding of how risk is calculated. For example:
The plant releases z pounds of air toxic X per month (quantity) into the air, causing y ppb of air toxic X in the air at the plant gate (concentration) on a bad day. A population exposed to that dose of air toxic X for 70 years would experience a w percent increase in the rate of lung cancer (risk).
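The arithmetic behind a statement like this can be sketched in a few lines. All of the numbers below are hypothetical: a single assumed "dispersion factor" stands in for an air-dispersion model, and a single assumed unit-risk value stands in for a full dose-response assessment. For simplicity, the result is expressed as an added lifetime risk per person rather than as a percentage increase in the lung cancer rate.

```python
# Illustrative only: the emission rate, the dispersion factor, and the unit risk
# are all hypothetical. A real assessment would use air-dispersion modeling and a
# chemical-specific dose-response value, not single assumed constants.

emissions_lb_per_month = 2000.0    # quantity: pounds of air toxic X released per month
dispersion_factor = 1.5e-3         # hypothetical: ppb at the plant gate per (lb/month) emitted
unit_risk_per_ppb = 1.0e-6         # hypothetical: added lifetime cancer risk per ppb of lifetime exposure

# Concentration at the plant gate (ppb) implied by the emission rate
concentration_ppb = emissions_lb_per_month * dispersion_factor

# Added lifetime cancer risk for a population exposed at that concentration for 70 years
lifetime_risk = concentration_ppb * unit_risk_per_ppb

print(f"Quantity:      {emissions_lb_per_month:.0f} lb/month released")
print(f"Concentration: {concentration_ppb:.2f} ppb at the plant gate")
print(f"Risk:          {lifetime_risk:.1e} added lifetime cancer risk "
      f"({lifetime_risk * 1e6:.1f} per million people exposed for 70 years)")
```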
Using Comparisons for Explaining Risk-Related Numbers and Statistics
Risk-related numbers and statistics can be presented as part of a comparison. However, comparisons often create their own problems. For example, “concentration” comparisons (“a drop of vermouth in a million-gallon martini”), though common (see Appendix A), are seldom helpful. Most concentration comparisons have the same drawbacks as the worst risk comparisons. They seem to minimize and trivialize the problem and also to prejudge its acceptability. Therefore concentration comparisons – usually designed to convince the audience to stop worrying – are likely to fail. Furthermore, concentration comparisons can also be misleading, since chemicals vary widely in potency (one drop of chemical x in a swimming pool might kill everyone, while one drop of chemical y will have no effect whatsoever). In view of these reservations and others, it is best to use concentration comparisons sparingly, and with sensitivity.
Like “concentration” comparisons, “probability” comparisons can be misleading. For example, some analysts suggest that instead of comparing the probability of one unlikely (chemically hazardous) event to the probability of another unlikely but negative event, such as getting hit by lightning, you would do better to compare it to an unlikely positive event, such as winning the Irish Sweepstakes.
“Quantity” comparisons, on the other hand, tend to be more useful. For example, it can be quite informative to translate tons of ash into Astrodomes-ful or swimming pools-ful. Quantity comparisons usually help the audience visualize how much is there.
Alternative Ways to Express Risk-Related Numbers and Statistics
Due to recent changes in the law, one of the things you will have to do is offer the public a variety of numbers and statistics related to plant inventories and emissions. There are many ways to express such data. To communicate clearly, you should not simply pass the data on to the public in the form in which you received them. Instead, you should select the best way(s) to present and explain the data. As noted below, the terms in which you couch the data are all-important.
- Often when you provide the public with a reassuring risk-related number or statistic, you will find yourself attacked by critics for “lying with statistics.” The main point to remember is that in arguments about risks, no number can be “the right number.” Each way of expressing a risk “frames” the risk a little differently and thus has a different impact. Doctors, for example, are much more likely to prescribe a medication that saves the lives of 60 percent of seriously ill patients than a medication that loses 40 percent of the patients – even though the results are exactly the same.
Risk is usually presented as the expected number of deaths per unit of something over a given period of exposure. The various ways of expressing this number, however, are not equivalent. For example, consider the issue of how many people will die annually as a result of emissions of air toxic X. Look at some of the ways this risk information can be expressed:
- Deaths per million people in the population
- Deaths per million people within n miles of the facility
- Deaths per unit of concentration (LD-50, for example, or any toxicity measure)
- Deaths per facility
- Deaths per ton of air toxic X released
- Deaths per ton of air toxic X absorbed by people
- Deaths per ton of chemical produced
- Deaths per million dollars of product produced, etc.
Depending on the circumstances, these different expressions will strike an audience as being more, or less, appropriate; more, or less, frightening; more, or less, comprehensible; and more, or less, credible. Consider and compare the first two for a minute. People living or working near a facility are obviously at much greater risk from emissions of air toxic X than people farther away. Therefore, dividing the expected number of deaths from air toxic X, virtually all of them near the plant, by the U.S. population, yields a reassuringly low risk figure – but a grossly misleading one. This approach is like including nonsmokers in a calculation of the risk of smoking.
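A rough calculation shows how much the choice of denominator matters. The death count and the two population figures below are hypothetical; the point is only how the denominator changes the impression.

```python
# Hypothetical figures, chosen only to show how the denominator changes the impression.
expected_deaths_per_year = 2.0          # essentially all of them occurring near the plant
us_population = 250_000_000             # rough U.S. population
nearby_population = 40_000              # people living within a few miles of the facility

deaths_per_million_us = expected_deaths_per_year / us_population * 1_000_000
deaths_per_million_nearby = expected_deaths_per_year / nearby_population * 1_000_000

print(f"Deaths per million people in the U.S. population: {deaths_per_million_us:.3f}")
print(f"Deaths per million people near the facility:      {deaths_per_million_nearby:.1f}")
```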
- Transformations of scale are also important in presenting risk-related numbers and statistics. Should you describe the amount of effluent in pounds or tons (or barrels or truckloads)? Six parts per billion sounds like a good deal more than 0.006 parts per million.
- Risk data can be expressed in alternative ways by looking at different levels of exposure. One possibility, for example, is the period of exposure that is equal to some unit risk: n years at the plant gate, say, equals a one-in-a-million chance of cancer. Or n years within 10 miles of the plant (a less alarming number). Or n years at the highest exposure so far detected (a more alarming number). (A brief sketch of this arithmetic appears after this list.)
- One alternative unit for expressing risk data is lost life expectancy (on the basis of a conservatively presumed normal life expectancy of 70 years). This measure gives more weight to early deaths and less to deaths in old age. Risk can also be expressed in terms of lost work days. This measure cannot be used, however, for estimating risks to children, non-working parents, and retired people.
- If data are available – which is seldom the case – risk can be calculated for health consequences other than death. These include such consequences as cancers, birth defects, genetic damage, immune-system damage, neurological damage, fertility problems, behavioral disorders, liver disorders, kidney disorders, blood problems, and cardiovascular problems. All are adverse chronic health consequences. Adverse acute health consequences include trauma, burns, rashes, and eye problems. One alternative way to express such acute consequences is to count hospital days or medical costs. Risks can also be calculated for adverse consequences on wildlife, endangered species, resources, plants, agriculture, air quality, water quality, habitat, aesthetics, climate, and so on.
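As a brief sketch of the “period of exposure equal to a unit risk” idea mentioned above: the annual risks below are hypothetical, and the arithmetic simply divides the unit risk by the annual risk.

```python
# Hypothetical annual excess cancer risks; real values would come from exposure
# and dose-response modeling. The point is only how the framing changes.
annual_risk_at_plant_gate = 4.0e-8      # per year of exposure at the plant gate
annual_risk_within_10_miles = 2.0e-9    # per year, averaged over a 10-mile radius

UNIT_RISK = 1.0e-6   # a one-in-a-million lifetime chance of cancer

def years_to_reach_unit_risk(annual_risk: float, unit_risk: float = UNIT_RISK) -> float:
    """Years of exposure at a constant annual risk that add up to the unit risk."""
    return unit_risk / annual_risk

print(f"At the plant gate:  {years_to_reach_unit_risk(annual_risk_at_plant_gate):.0f} years "
      f"equal a one-in-a-million chance of cancer")
print(f"Within 10 miles:    {years_to_reach_unit_risk(annual_risk_within_10_miles):.0f} years "
      f"(a less alarming number)")
```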
Having chosen a unit and calculated a number, one can choose among a variety of ways of expressing it. Suppose emissions of an air toxic are expected to kill 1.4 people in 1,000 over a lifetime. Which of the following would you want to tell people?
- The lifetime risk is 0.0014.
- The lifetime risk is 0.14 percent.
- The lifetime risk is one in 710.
- In a community of a thousand people, we could expect 1.4 to die as a result of exposure.
Although these alternatives are equivalent, their meaning to, and effect on, audiences are not. Some terms may make the risk seem smaller, while others may make it seem larger. Given this fact, you should choose your terms of expression consciously and responsibly.
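For reference, these four statements are simple restatements of one probability. A minimal sketch of the conversions, using the 1.4-in-1,000 figure from the example above:

```python
# The same probability, restated four ways (figures from the example above).
lifetime_risk = 1.4 / 1000        # 1.4 expected deaths per 1,000 people over a lifetime

as_decimal = lifetime_risk                      # 0.0014
as_percent = lifetime_risk * 100                # 0.14 percent
as_one_in_n = 1 / lifetime_risk                 # about 714, often rounded to "one in 710"
per_thousand_community = lifetime_risk * 1000   # 1.4 expected deaths per 1,000 people

print(f"Lifetime risk:     {as_decimal:.4f}")
print(f"As a percentage:   {as_percent:.2f} percent")
print(f"As one in N:       one in {as_one_in_n:.0f}")
print(f"Per 1,000 people:  {per_thousand_community:.1f} expected deaths")
```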
- When you present and explain risk-related numbers and statistics, be aware that (1) most risk numbers are estimates, and (2) these estimates are often based on data from animals exposed to very high doses and on modeling of one sort or another. We know how many people die each year from auto accidents and from being struck by lightning. We can measure these rates. But we can only calculate and estimate – not measure – how many people die each year from chronic exposure to trace quantities of chemicals. Given this context, you, as the plant manager, have several options. You can offer your best estimate of the risk. You can (as some federal agencies do) offer an upper-bound or “worst case” estimate. In such an estimate, each step in the calculation moves toward greater risk, thus attempting to assess the maximum risk that could reasonably be expected. Or you can offer the most likely estimate, along with the highest and lowest estimates. This last option is recommended whenever possible, since people want to be given complete information and may resent your choosing for them.
Guidelines for Providing and Explaining Risk-Related Numbers and Statistics
With this background, what then can be said about how best to provide and explain risk-related numbers and statistics?
- Do not just take the risk-related numbers and statistics as you find them. The first step in selecting numbers and statistics wisely is realizing that you need to select.
- Pick a risk number for which the data are good. Some of the data manipulations that have been discussed are just simple transformations of the same number. However, some transformations (for example, lost-life-expectancy data) require additional information. Similarly, if you have data about deaths and illnesses but not environmental data, discuss deaths and illnesses. If, however, you expect that people will be concerned about environmental effects such as odors or haze, be prepared to talk about the partial data that are available and what you are going to do to check into environmental effects.
- Use whole numbers and simple fractions whenever possible, such as 6 parts per billion instead of 0.006 parts per million. However, if you are presenting a table with several values, it is best to express all of them with the same denominator, even if it means violating the above rule. For example:

Substance       Risk per million persons exposed
Air toxic X     3.0
Air toxic Y     0.6
Air toxic Z     4000.0

- Pick a number that you think will be easy for your audience to understand. Avoid unfamiliar units or overly complex concepts. Embed the number in words that help clarify its meaning. For example, “a risk of 0.047” is comprehensible only to a select number of educated people – but everyone can understand that roughly five people in an auditorium of 100 would be affected. Similarly, “a cancer risk of 4.7 × 10⁻⁶” is hard going even for an expert. An alternative way of expressing this number is as follows:
Imagine 10 cities of 100,000 people each, all exposed to n amount of air toxic X. In five of these 10 cities, probably no one would be affected. In each of the other five cities, on average there will be one additional cancer.
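One way to arrive at wording like this is simply to multiply the risk by a convenient population size. A minimal sketch, using the 4.7 × 10⁻⁶ figure above; the city size and count are just round numbers chosen for illustration.

```python
# Illustrative only: the risk figure comes from the example above; the population
# sizes are round numbers chosen to make the result easy to picture.
risk_per_person = 4.7e-6        # added lifetime cancer risk per exposed person
city_population = 100_000
number_of_cities = 10

total_population = city_population * number_of_cities
expected_cases = risk_per_person * total_population   # about 4.7 additional cancers

print(f"Among {total_population:,} people ({number_of_cities} cities of {city_population:,}),")
print(f"we would expect about {expected_cases:.1f} additional cancers:")
print("roughly one in each of five of the ten cities, and none in the other five.")
```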
- Use graphs, charts, and other visual aids to help present and clarify the number (see the appendices for examples).
- Pick a number that you, both as the plant manager and as a member of the community, consider fair and relevant. “Fair” means that the number does not give a misleading impression of the risk – that is, the number makes serious risks look serious and modest risks look modest. “Relevant” means that the number speaks to the issue at hand. For example, if you are talking to an audience of nearby residents, it is more relevant to discuss risk at the plant gate than average risk to the larger community.
- Pick a number that your audience will also consider fair and relevant. If you are not sure which health or environmental consequences your audience cares about, find out. You can use various methods to obtain this information, including interviews, surveys, and focus groups. Explaining cancer risks to an audience concerned about miscarriages or property values or diminished elk hunting is pointless. Similarly, if an audience is angry or distrustful (a not uncommon state of affairs), bend over backwards to frame the risk in a way the audience will not resent – even if in doing so you must give an impression of greater risk than you think the data justify. It is pointless to offer a risk estimate that you know in advance your audience will reject as not credible.
- Strive for comprehensibility and clarity, but do not oversimplify. Most people, if sufficiently motivated, are capable of understanding quite complex quantitative information. For example, people who have not been trained in probability and statistics can still understand mortgage rates and the complex odds at poker games and at race tracks. If people do not or cannot seem to understand what you are telling them about risk, you may be overloading them with technical jargon and details, or they may simply not trust you enough to want to understand what you are saying.
- Pay close attention to the numbers others are using. While you are not obliged to use someone else’s risk numbers (especially if you think the numbers are not fair), confusion will result if everyone is using a different measure of the same risk. If you think you must change terms to an altogether different measure, explain what you are doing. This tactic can be risky, however. Many of the other participants in debates about risk have more credibility with the public than you do as a company spokesperson. All the more reason to use their measures if you can, and explain why if you cannot.
- If you have sufficient opportunity, consider offering several different estimates of the same risk. For example:
Our best estimate is a. Our cautious, worst-case estimate is b. The highest estimate we have heard, from the Environmental Defense Fund and the Sierra Club, is c.
If someone else has a higher calculated estimate of the risk than you do, you should be the first to say so.
Personalizing Risk-Related Numbers and Statistics
Technical experts see risk as a statistical number. But laypeople view risk in much more emotional terms – bound up with prospects of individual suffering and death. For many people, the term “low expected mortality” is fraught with emotional meaning – it could mean the death of the baby of the woman in the third row.
These different views are not just a matter of rational versus emotional. They are also a matter of “macro” versus “micro.” Risk calculations are made at the macro level: what will happen to the community as a whole. But citizens’ concerns are micro: what might happen to me and those I love.
One solution to this dilemma is to personalize the risk information – especially the quantitative risk information, which most needs personalizing. Several approaches are available:
- Use examples and anecdotes – hypothetical if necessary, real if possible – to make the risk data come alive.
- Talk about yourself – what risks you personally find unacceptable or frightening, how you and others close to you feel about the risk under discussion, or even how you feel about the emotional distress or the hostility you may be encountering at that moment.
- Use concrete images to give substance to abstract risk data.
- Avoid distant, abstract, unfeeling language about death, injury, and illness.
- Listen to people when they express their concerns. Reflect back to them your understanding of the content of their comments and the emotions they are expressing.
Copyright © 1988 by Chemical Manufacturers Association