Three Case Studies from Mad Cows and Mother’s Milk: Dioxins, or Chemical Stigmata; Hamburger Hell; & Mother’s Milk
By: Cameron A. Straughan
June 29, 2002
I chose to read Mad Cows and Mother’s Milk mainly because it deals in depth with risk communication concerning PCBs and dioxins – something I have had ample experience with. Also, having read about risk selection in Risk and Culture, reading about risk communication seemed the next logical step. As it turned out, Risk and Culture was a great introduction to this book. Lastly, as always, I looked for whatever advice I could get on how to make my documentary more effective.
I found Powell and Leiss’s book easy to read and succinct. I liked the layout of the book. The addition of figures, tables, and point-form lists turned the book into a useful manual – a checklist of risk communication do’s and don’ts. I also liked the metaphors used (e.g., the information vacuum and risk amplification). In fact, the book itself is an example of successful communication about a complex topic! Accordingly, I believe the book would be valuable for practitioners of hard and soft sciences alike. My only quibble is that some photos would have been nice, complete with analysis of their content. The case studies covered are – by and large – still relevant; and since they generally occurred within my recent memory, they proved to be excellent, interesting, and provocative examples.
The thesis put forward by Powell and Leiss is that the failure of good risk communication creates a risk information vacuum, with grave and expensive consequences for risk managers. The authors support their thesis by combining thorough research of scientific sources with media analysis. In some cases, quantitative analysis was used (e.g., graphs of the number of E. coli articles in Canadian and American newspapers over time). This allowed the authors to pinpoint the beginnings of each risk issue (i.e., risk selection and management by scientists), the point at which the media became involved (via media analysis), the disparity (if any) between science and the media, and the final cumulative effect of science and media, over time, on public perception of risk.
I thought this was a very thorough, well-rounded approach. However, I often wished the authors had conducted more thorough media and cultural analysis – including film, art, and literature produced at the time. Instead, they relied on newspaper analysis. This is understandable, since newspapers are easily archived and can be accessed and searched electronically. My only other criticism is that, in the Hamburger Hell case study, I wished more differences between Canadian and American risk perception and communication had been explored. In particular, I’m interested in the impact that American culture had on us. Did it cause risk amplification amongst the Canadian public? I recall a very good episode of Law & Order that dealt with the Jack-in-the-Box story, and films like The Rock and Outbreak banked on the public’s fear of killer bacteria, so it is likely that – despite the Canadian government’s apparent lack of concern – the Canadian public would become worried after viewing these US-based cultural products.
Criticisms aside, I found that this book related strongly to both my Plan of Study and my previous work experience. In fact, Figures 2.1 and 2.2 in the book look surprisingly like the figures I included in my Plan of Study.
Basically, through my Plan of Study, I want to position myself to become an interface by which the public can obtain easy-to-understand information concerning science and the environment. According to Powell and Leiss, what I’d really be doing is filling the “information vacuum”, thereby initiating good risk communication. When I realized this coincidence, it was a very inspirational and provocative moment. Having read this book, I’m convinced that I’m on the right track with my plan!
With regard to my previous work experience, Powell and Leiss’s description of the problems of communicating PCB and dioxin risk to the public rang a bell. When I was the principal author and coordinator of the Detroit River Update Report, I had to figure out how to communicate complex sediment contaminant dynamics to the general public – let alone information on PCB and dioxin levels in fish and birds! As described in the book, I encountered the tension between scientific and public discourse first-hand. I chose to develop a simple metric by which I could classify parts of the river as low, medium, or highly contaminated. I presented the results as a colour-coded map of the river that was very easy for the public to understand. The problem was that MOE scientists didn’t like my designations; yet – oddly – EPA scientists said there was nothing wrong with the colour code, since they had known for years where the river’s problem areas were. The general public, on the other hand, liked the colour-coded maps. They could clearly see the impacts on the areas of the river that they used.
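The colour-coded metric described above can be sketched as a simple threshold classification. This is only an illustration: the cutoff values, contaminant units, and river segments below are all invented, since the actual criteria used in the Detroit River Update Report are not given here.

```python
# Hypothetical sketch of a colour-coded contamination metric.
# Thresholds and concentrations are invented for illustration only.

def classify(concentration_ppm, low_cutoff=1.0, high_cutoff=10.0):
    """Bin a sediment contaminant concentration into a colour category."""
    if concentration_ppm < low_cutoff:
        return "green"   # low contamination
    elif concentration_ppm < high_cutoff:
        return "yellow"  # medium contamination
    else:
        return "red"     # high contamination

# Classify a few (hypothetical) river segments for the map legend
segments = {"segment A": 0.4, "segment B": 14.2, "segment C": 3.7}
colours = {name: classify(ppm) for name, ppm in segments.items()}
print(colours)
```

The appeal of such a scheme for public communication is exactly what the essay describes: the reader needs no toxicology background, only the ability to match a colour on a map to a plain-language category.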
Powell and Leiss described in detail how the public came to fear dioxins via risk amplification. I saw evidence of this when I worked on the Detroit River Update Report. A local non-profit group, no doubt keen on Greenpeace’s stance, insisted that I include something on dioxins. I was able to include a good temporal dataset on dioxin levels in herring gulls, which suggested that dioxin levels were not only well below levels of concern but also appeared to be decreasing over time. Dioxins were perceived as a major problem in the Detroit River, and I demonstrated, in layperson’s terms, that they were not. According to Powell and Leiss, that was good risk communication!
During the Detroit River Update Report, I presented scientific information in magazine format, with photo contributions from local stakeholders. In this manner, I wanted to make the science more palatable using a standard, pop-culture format. It worked, but I was left wondering whether scientists normally publish alternative versions of their papers for public consumption. I doubt they do. That’s unfortunate, because scientific journals have pre-set guidelines and formats that render them incomprehensible to the general public – not to mention the scientific lingo contained within. Along these lines, I find it both sad and humorous that, as the authors put it, scientists believe that by some stroke of luck their research will diffuse into the public mind.
It is interesting to note that dioxins also have a stigma in the scientific community. In one paper I co-authored about PCB levels in herring gulls and cormorants, I was asked to convert PCB levels into dioxin toxic equivalents. This calculation would help other scientists see the toxicological significance of the levels we detected. However, I cannot help but think that this technique is accepted because risk amplification has legitimized it within the scientific community. Why else would a comparison to dioxin be relevant and justified? I think it’s a scare tactic to make other scientists sit up and take notice of the results.
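For readers unfamiliar with the conversion mentioned above, the standard approach weights each congener’s concentration by its toxic equivalency factor (TEF) relative to 2,3,7,8-TCDD and sums the results. The TEF values below follow the WHO scheme for a few example congeners; the sample concentrations are invented for illustration, and the paper in question may have used a different TEF set.

```python
# Dioxin toxic-equivalent (TEQ) calculation: each congener concentration
# is multiplied by its toxic equivalency factor (TEF) relative to
# 2,3,7,8-TCDD, and the weighted values are summed into one TEQ figure.
# TEFs shown follow the WHO 2005 scheme; sample data are hypothetical.

TEFS = {
    "2,3,7,8-TCDD": 1.0,      # reference congener
    "1,2,3,7,8-PeCDD": 1.0,
    "PCB-126": 0.1,           # a dioxin-like PCB
    "PCB-118": 0.00003,
}

def toxic_equivalents(concentrations_pg_per_g):
    """Sum TEF-weighted congener concentrations into a single TEQ value."""
    return sum(conc * TEFS[name]
               for name, conc in concentrations_pg_per_g.items())

# Hypothetical egg-tissue sample, concentrations in pg/g
sample = {"2,3,7,8-TCDD": 2.0, "PCB-126": 50.0, "PCB-118": 1000.0}
print(toxic_equivalents(sample))  # TEQ in pg/g
```

The rhetorical effect the essay describes is visible in the arithmetic: a large PCB concentration collapses into a small dioxin-equivalent number (or vice versa), so the result reads as "this much TCDD" regardless of what was actually measured.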
Having read Mad Cows and Mother’s Milk, I also developed a better appreciation of Risk and Culture. As I have already mentioned, reading Risk and Culture before this book proved to be an excellent choice. I also felt that the tension described between border and centre organizations was very similar to the tension Powell and Leiss described between quantitative (expert) language and qualitative (public) language.
Again, this tension must exist to some extent, yet I find it somewhat absurd. The book suggests that the tension is continuous, yet good communication is very rare. When it does occur, good communication creates an equilibrium, or steady state, between the “warring” factions. It’s interesting that the authors use the term “vacuum”, which implies a search for equilibrium. After all, all things move towards equilibrium; the catch is that the equilibrium point represents a very small window of opportunity. It is that small window that risk communicators must open!
Based on my readings, there seems to be a threshold at which scientists must stop withholding their research and report something to the public. Once that threshold has been crossed, the information vacuum quickly develops. The trick is: how can scientists predict when this threshold will appear? How will they know that further silence – whether for further research or otherwise – will harm them in the long run? Obviously, this depends on the individual situation. Perhaps the threshold should be the very first inkling that the media has caught hold of the story; otherwise, the scientists will not get a chance to voice their views before the facts become sensationalized. In addition, once a story is out of the gate and the public is alarmed, it is very hard for scientists to rein that story back in and renew public trust.
Reading Mad Cows and Mother’s Milk, I discovered that risk communication is a relatively new “science” – the term was coined in 1984. Thus, in retrospect, reading Risk and Culture (published in 1982) really did provide me with a good history of risk communication, since it was published during risk communication’s infancy.
When Powell and Leiss speak of the need for experts to understand and appreciate the general public’s view (i.e., know the public framing), they seem to be advocating the use of cultural analysis as outlined in Risk and Culture. The end goal is – once again – to build or change the institutional structure for the better. In fact, Powell and Leiss go on to say that phase II of risk communication (1995 to the present) emphasizes the consideration of social content. So it seems that the cultural analysis outlined in Risk and Culture has been applied in recent times and is still very relevant. Risk and Culture was more forward-looking and provocative than I had originally thought!
Within Mad Cows and Mother’s Milk, there are some good examples of the ties between culture and risk perception. For example, in the Hamburger Hell case study, one of the reasons that the risk was taken so seriously is that hamburger, like apple pie, is an integral part of American culture. It’s a staple of family reunions and barbecues. Another (even better) example is the Mother’s Milk case study.
Scientists’ hesitation to release results, the media’s sensationalization of the impacts, and the subsequent reversal of decisions concerning consumption of PCB-contaminated country foods caused northern Aboriginal peoples great distress. This distress was primarily due to the potential impacts on their culture. First, there was a history of mistrust, since Aboriginal peoples had been forced onto reservations. Second, the fish, fowl, and game they eat are an integral part of their culture; the acts of skinning, eating, storytelling, and burning whale oil for warmth are all tied to their food. Thus, if they had to switch to southern imported foods, their culture would be at stake. Third, there was the danger that the advice of elders would no longer be highly regarded in the face of southern scientific advice. Lastly, scientific lingo has no equivalent in the Inuit language.
In this case, I think it was highly irresponsible for southern scientists to release results without considering the cultural ramifications. I’d argue that the Inuit were better off without the study. If anything, the Mother’s Milk case demonstrated that a broad-based risk management scheme cannot be applied across all communities. Instead, local involvement at all stages, along with cultural analysis (and sensitivity), is required; otherwise, stakeholder conflicts will arise.
“ … there is an obligation on the part of major institutional actors in society to communicate effectively about risks, not by simply touting the superiority of their own technical risk assessments but through making an honest effort to understand the bases of public risk perceptions and experimenting with ways to construct a reasoned dialogue around different stakeholder assessments of risk situations.” (p. 37)
With regard to my documentary, Mad Cows and Mother’s Milk did provide me with some useful tips. The book emphasized the importance of building credibility and trust – the final important step in risk communication. In documentary terms, this translates into not cheating an audience with trick camera work, staged events, etc. – unless they are warned up front. To achieve this trust, as the book suggests, it helps to have a transparent process leading up to the risk communication. That way, the public is not left wondering what is going on. More importantly, when the process is transparent from start to finish, a third party, such as the media, cannot leap into the information vacuum and misrepresent the facts so easily. In a way, I have achieved this with my documentary by allowing interview subjects to view and comment on their videotaped interviews.
The book also provided a convenient checklist for maximizing the effectiveness of risk communication, which in turn could be applied to documentaries. Accordingly, documentaries should avoid biases, sensationalism, distorted truths, hidden agendas, irrational standpoints, and the use of language the public will not understand. Further still, documentaries should include credible sources of information (and/or be made by a credible person), have a clear message, make effective use of channels (in the case of documentaries, this could be extended to mean visuals, sound effects, music, etc.), and focus on the needs and perceived reality of the audience. It also helps if, as in the Mother’s Milk case, there is some sort of local involvement.
The documentary could be considered successful if it is persuasive. Persuasiveness could be evaluated by asking four key questions: Did it gain attention? Was it understood? Was it believed (indeed, trust seems to be a major indicator of effectiveness)? Did it provoke action of some sort? However, the authors are careful to point out that the use of persuasive techniques (e.g., using sound or images to manipulate the audience’s emotions) should be limited, or the rational message will be subverted by clever techniques. Also, audiences know when they are being manipulated!
While Mad Cows and Mother’s Milk provided me with a lot of advice and insight, it also incited a lot of reflection. The book was very provocative and I was left with many questions, some of which may be worthy of further research. For example, how good is the current risk communication about skin cancer? I wonder because some dermatologists tell me to avoid sunlight at all costs whereas others say sunbathing is OK. Doctors disagree on it as well. So where does that leave me? Are certain people in certain areas at certain ages more susceptible than others? For me, “just say no to the sun” seems suspicious, given all the disagreement on the matter.
I was also left wondering how Greenpeace chooses which risks it will pursue. Do they follow media, public, or scientific perceptions – or do they act on information leaked from within institutions? Perhaps more importantly, did scientists and risk managers learn anything at all from the Greenpeace dioxin campaign? That is, did they decide that bearing mute witness, and letting someone else pick up the facts and run with them, was not the way to go? Did it improve their risk communications?
I also wondered why Canadian and American government agencies, with much more money than Greenpeace, do not have the best risk communications possible. Is it not in their best interests? Do their scientists fear criticism from their peers or damage to their reputations? But, as Powell and Leiss have pointed out, isn’t the aftermath of poor risk communication (an information vacuum) far worse? Powell and Leiss suggest that these institutions refuse to be responsible. This reminds me of the quote in Risk and Culture:
“Science and risk assessment cannot tell us what we need to know about threats of danger since they explicitly try to exclude moral ideas about the good life. Where responsibility starts, they stop.” (p. 81)
If government is not responsible for risk assessment, then who is? And if they shirk this responsibility, then why complain about Greenpeace picking up the pieces? Reading this book, I was angered by what I see as a vicious cycle of arrogance, ignorance, stubbornness, and complacency with regards to risk communication – particularly with regards to the Canadian government.
In the case of dioxins, it is obvious that science played a very small role and public perception, fueled by dramatic media accounts, was of utmost importance. But I wonder: if scientists had suddenly filled the risk information vacuum with their version of the facts, would it have helped at all, considering how the story got started? Lastly, I wonder if the authors have tried presenting Mad Cows and Mother’s Milk to the general public. It’s a valuable book, so I think it is their responsibility to do so.
In conclusion, I think Mad Cows and Mother’s Milk effectively defended the thesis that a lack of good risk communication causes a dangerous information vacuum, leading to public mistrust, fear, and resentment. It could even ruin political careers and bring industries to bankruptcy! In fact, as in the case of silicone breast implants, public fears and perceptions alone can win a lawsuit – even in the absence of scientific data. In defending this thesis, Powell and Leiss gave me a better understanding of my past experiences communicating environmental information to the general public. Their book inspired me, while confirming the importance of my Plan of Study. Lastly, the book provided me with helpful tips for future environmental communications.