Posted: February 21, 2012
Article Summary
I don’t know anything about Rabbit Calicivirus Disease (RCD), which the Australian government apparently used in an effort to control its rabbit population. This article argues that the Australian government was making the same mistakes with regard to RCD that the British government had made with regard to BSE (mad cow disease) – and that these mistakes are best understood in terms of my work on the hazard-versus-outrage distinction.

Copepodology for the ornithologist,
or what BSE can tell us about RCD

Paper presented to CSIRO Workshop on RCD and Rabbits, Canberra, 29 April 1997

When G.E. Hutchinson (1951) entitled his famous Ecology paper, Copepodology for the ornithologist, just what was he trying to say? He was merely making the simple point that we need to look over the disciplinary fence now and then. We need to learn from the experience of others. And that is the only point I want to make in introducing this discussion on RCD and rabbits. However, it is a point that bears making with some force if our recent experience is any guide. We can make a strong case that, with BSE, we have not learned Hutchinson’s lesson, nor have we – and this may be worse – shown much inclination to admit these failures.

That these are direct criticisms of many hard working scientists I fully acknowledge, but being hard working is no defence against my claims. Science, as our only self-correcting path to understanding, depends on the generation of new criticism as much as on the generation of new knowledge. What is more important here is that the criticism is disinterested, and that it comes from one whose science has also been subject to intense criticism.

So I ask you to indulge me while I develop my argument that we have not done too well so far, because I will conclude it with some ideas on how we can do better. I will argue that, when we look over Hutchinson’s fence, we can see some interesting and controversial examples of outbreaking species. I will argue that these provide examples and counter-examples of how to handle the science, and, importantly, its links to public policy.

To see where we went wrong, I want to begin, as I alluded to above, with the case of mad cow disease or BSE in Britain, an accidental release of an outbreaking ‘species’ – if we can call prions that. We can now see that, at least initially, the science was dreadful to the point that it attracted well-deserved opprobrium not only in the scientific journals (Anon. 1996; Butler 1996; Gready 1997), but also in the editorial pages of the quality press, such as The Economist (1997). The gist of their criticism is that the science, in trying to serve public policy, became subservient to it.

In the BSE case, the science was monolithic to the point where one agency, Britain’s Ministry of Agriculture, Fisheries and Food (MAFF), controlled the process and access to all material. It was secretive instead of open to the point where external scientists could not easily obtain scientific data or even the information on how to obtain the data. It was uncritical in its design and execution to the point where important analyses were impossible to make. It was inward looking and narrow in its assumptions to the point where the possible human implications were initially ignored; and it was oversensitive to criticism to the point where its defensiveness has begun to look like cover-up. It was, in short, deplorable science.

Jill Gready (1997), commenting in Today’s Life Science, says: ‘MAFF adopted policies that rested on unsafe scientific assumptions and introduced regulations that were impossible to implement. Compliance could not be monitored because the necessary tests did not exist.’

As Nature (1996) points out in an extraordinary editorial: ‘The contrast of that approach to the openness involving data on the AIDS epidemic is striking, while its poor consequences for science are all too obvious.’

And perhaps most tellingly, given the ‘culture of secrecy’ (Butler 1996) at MAFF, the ministry was arrogant and unapologetic when confronted with its failure. It has never said it was sorry! With this refusal to acknowledge its mistakes, MAFF has comprehensively joined the sorry ranks of bungled science (Pearce 1996).

Not only was the science bungled; the public policy implications were bungled too, and in such a way that, in Britain, a bruised and battered public is in no mood to be sympathetic to all the other bio-political issues waiting in the wings – the release of GMOs, the use of pesticides in salmon farming, the use of prophylactic antibiotics in intensive animal husbandry and so on. So not only has MAFF wasted a decade of research into BSE, it has almost certainly condemned other work in the public’s mind for at least as long. As The Economist (1997) points out:

The BSE crisis is a case study in how science should not be done. MAFF is a government agency, and its scientists answer to politicians. Yet science and politics are often incompatible. Instead of a broad, decentralised effort to crack the problem – as with HIV, a disease which has been around for much longer than BSE – the study of BSE has been hobbled by secrecy and government bungling. How has this served the public interest?

To see why this is so, we need to understand a little about the way organisations behave. Armed with this insight, we might not only understand MAFF in a more sympathetic light, but also learn from its mistakes. After all science is self-correcting only because people themselves learn and adapt.

For this insight, I want to turn to the work of Peter Sandman (1993), an organisational psychologist specialising in the ways in which ‘risk controversies’ are handled. Sandman has an interesting perspective on risk, one worth developing a little here, because, as I hope you will see, it helps explain MAFF’s behaviour quite elegantly. Sandman has shown that if you make a list of, say, environmental health risks in order of how many people they kill each year, then list them again in order of how alarming they are to the general public, the two lists will be very different. Risk managers in industry and government often deduce from this that public perception of risk is ignorant or irrational. But he believes that a better way to conceptualise the problem is that the public defines ‘risk’ more broadly than the scientists. He therefore offers a new definition: call the death rate ‘hazard’; call everything else that the public considers part of risk, collectively, ‘outrage’. Risk, properly conceived, includes both hazard and outrage.
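
Sandman’s two-lists observation can be made concrete with a toy calculation. The sketch below ranks a handful of risks first by annual deaths (hazard) and then by public alarm (outrage); all figures are invented for illustration only, not data from Sandman or anyone else. The point is simply that the two orderings disagree almost everywhere.

```python
# Toy illustration of Sandman's hazard-versus-outrage gap.
# All numbers below are invented for the sketch.

hazard_deaths = {            # hypothetical deaths per year
    "smoking": 400_000,
    "motor vehicles": 40_000,
    "air travel": 500,
    "nuclear power": 100,
    "pesticide residues": 50,
}

public_alarm = {             # hypothetical alarm score, 0-10
    "smoking": 4,
    "motor vehicles": 3,
    "air travel": 7,
    "nuclear power": 9,
    "pesticide residues": 8,
}

def ranking(scores):
    """Return the risks ordered from highest to lowest score."""
    return sorted(scores, key=scores.get, reverse=True)

by_hazard = ranking(hazard_deaths)
by_outrage = ranking(public_alarm)

print("ranked by hazard: ", by_hazard)
print("ranked by outrage:", by_outrage)
# The two lists place almost every risk in a different position --
# the divergence Sandman labels 'outrage'.
```

With these invented numbers the hazard list starts with smoking while the outrage list starts with nuclear power; it is exactly this mismatch that risk managers misread as public irrationality.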

In Sandman’s model, risks can be high in the public’s perception even if the hazard is low, and efforts to explain the hazard are unlikely to succeed as long as the outrage is high. More importantly, when people are outraged, they tend to think the hazard is serious, regardless of what the spin doctor from the ministry says.

Here are Sandman’s ‘four traditional stages of a risk controversy’:

  • Ignore them. Your research tells you the hazard is low, so you do nothing. This typically generates more outrage.
  • Bury them in data. Ignoring them didn’t work, so you try to convince them they’re wrong. This typically generates more outrage.
  • Impugn their motives. If they are local, call them ignorant or hysterical. If they are nonlocal, call them radicals, mercenaries, or outside agitators. This typically generates more outrage.
  • Give them what they asked for. Management wants them to go away! Nothing else has worked, so you finally decide to pretend the hazard is huge, though you know it is not. Even this typically generates more outrage. They wanted an apology and a Community Advisory Panel; instead you gave them a cleanup or an expensive piece of equipment. They are still outraged – and now so are you!

It is a trivial Gedankenexperiment to see MAFF and BSE in all of this.

To see how we can move forward, I want to look over another of Hutchinson’s fences, to another controversial outbreaking species, Acanthaster planci, the crown-of-thorns starfish. I won’t deal with all the sorry history of this research (Moran 1986; Raymond 1986), except to say that the parallels with BSE and RCD are stark indeed, or at least were until we got smart. But we have had two bites at the cherry with Acanthaster and are now on our third, because it came back. Our response to the outbreaks of the late sixties and early seventies looks like Star Wars to BSE’s The Empire Strikes Back and RCD’s Return of the Jedi: same basic plot, just older technology. We were monolithic, secretive, arrogant and hypersensitive, and the public were duly outraged and suspicious.

By the next series of outbreaks in the early eighties, we had become a little smarter: we were more inclusive, less arrogant and far less secretive and monolithic (Bradbury 1990; Moran and Bradbury 1989). All stakeholders, even critics, were involved and a competitive funding regime emerged. With the current series of outbreaks, this process has gone a little further, and the public outrage now seems to be in line with the actual hazard. From all the press coverage, you would not suspect that a new wave of outbreaks is underway.

Significantly, the involvement of a wider scientific community from the early eighties onwards has arguably produced much better science (Bradbury 1990). We have begun to understand this puzzling beast, and have been able to place it in the context of the reef ecosystem in a more comprehensive way. We may even be on the way to understanding it.

So how does this help the RCD debate? What does all this ‘copepodology’ tell us?

The science lessons from BSE tell us, I think, that we need to gain a better understanding of the hazard itself through more inclusive, open, critical and humble scientific research. The science lessons from Acanthaster confirm this, but also tell us that this research should be systems based: it should focus on the interactions between host and parasite in the full context of the ecosystem in which they live their lives. There is, in the end, no substitute for understanding natural history: even Robert MacArthur, perhaps the greatest ecological theoretician, really knew his warblers (Cody and Diamond 1975).

The policy lessons from BSE and Acanthaster tell us that we need to do much more to reduce the public’s outrage over RCD, and that we need to do that in a dramatically different way than we have done hitherto: less hubris, less arrogance, more remorse, and more involvement of all the stakeholders. RCD is only the most current of a range of bio-political policy issues impacting on the public just now. Others, such as the release of GMOs, the importation of species for research or biological control, and the invasion of Australia’s waters and lands by alien species, are waiting in the wings to demand the public’s attention. Our scientific neighbours across other fences will not appreciate it if our work with RCD in our own backyard makes theirs more difficult.


Anon. 1996. The hazards of government data. Nature 383 (10 October 1996):463.

Anon. 1997. The other BSE scandal. The Economist 22 February 1997:67-68.

Bradbury, R.H., ed. 1990. Acanthaster and the coral reef: a theoretical perspective. Vol. 88, Lecture Notes in Biomathematics. Berlin: Springer-Verlag.

Butler, Declan. 1996. BSE researchers bemoan ‘ministry secrecy’. Nature 383 (10 October 1996):467–468.

Cody, M.L., and J.M. Diamond, eds. 1975. Ecology and evolution of communities. Cambridge, Massachusetts: Belknap Press of Harvard University Press.

Gready, Jill E. 1997. It’s a sad, not mad, world. Today’s Life Science February 1997.

Hutchinson, G.E. 1951. Copepodology for the ornithologist. Ecology 32:571–577.

Moran, P.J. 1986. The Acanthaster phenomenon. Oceanogr. Mar. Biol. Ann. Rev. 24:379–480.

Moran, P.J., and R.H. Bradbury. 1989. The crown-of-thorns starfish controversy. Search (Syd.) 20:3–6.

Pearce, Fred. 1996. Silence of the experts. New Scientist (30 November 1996):50.

Raymond, R. 1986. Starfish wars: coral death and the crown-of-thorns. Melbourne: Macmillan.

Sandman, Peter M. 1993. Responding to community outrage: strategies for effective risk communication. Fairfax, VA: American Industrial Hygiene Association.

Copyright © 1997 by R.H. Bradbury
