Mad Cows & Mother’s Milk (Ten Lessons & Appendix)
Risk Literacy: Is it the Missing Link in Environmental Education?
Lay Foibles & Expert Fables in Judgements about Risks
By: Cameron A. Straughan
July 7, 2002
I chose these readings for a variety of reasons. Firstly, since I enjoyed Mad Cows and Mother's Milk so much, I wanted to complete my reading by examining the conclusions that Powell and Leiss reached, and their views on media analysis as it applies to science. Secondly, I wanted to read Risk Literacy: Is it the Missing Link in Environmental Education? because environmental education is a component of my Plan of Study. I've often thought that environmental education should be included in the school curriculum from grade seven to grade twelve, and, being a science/technical/environmental writer, I am concerned with public literacy along these lines.
Lastly, I read Lay Foibles & Expert Fables in Judgements about Risks because I’ve often thought that scientists have (to their detriment) ignored why the public sees things the way they do. More importantly, I think this article continues a nice progression that my readings have been following. Specifically, I started off with risk selection, then risk communication and perception, and now – at the end of the process – specifics regarding how the public perceives risk.
These readings provided me with a well-rounded view of risk communication issues since they suited different purposes and used a variety of approaches. As I mentioned in my previous report, Mad Cows and Mother's Milk supports the thesis that the failure of good risk communication creates a risk information vacuum, with grave and expensive consequences for risk managers. The authors support their thesis by combining thorough research of scientific sources with media analysis; thus, both quantitative (media analysis) and qualitative analyses were used. The end result was a more holistic, well-rounded approach to risk communication.
Risk Literacy: Is it the Missing Link in Environmental Education? sought to lay down a broad foundation for teaching risk literacy. As such, it is a research essay, providing a general overview of its topic through qualitative research (after all, it is labelled a "Viewpoint").
Lastly, Lay Foibles & Expert Fables in Judgements about Risks is the most technical piece I have read on risk communication to date. In general, the chapter demonstrates how scientists hold misconceptions about how the public perceives risk. To that end, the authors review a great deal of empirical evidence regarding lay attitudes and behaviours towards risk, so there are plenty of graphs, figures, and tables. The evidence is well defended and many good examples are presented. In fact, while this chapter was written for resource managers and environmental planners, who must consider stakeholder values in their professions, I think other scientists (e.g., toxicologists) should read it as well. I think they would appreciate the technical detail and thorough research that went into the chapter.
Reading Ten Lessons and the Appendix in Powell and Leiss's book proved very useful. Together, they succinctly summarized the book and refreshed my memory; thus, I could relate the book to the other two readings included in this report.
Essentially, Ten Lessons concluded that there are three phases to risk communication, but the first two rely on "magic bullets", and there is no quick fix in risk communication. Thus, the final phase should be a long-term institutional commitment to good risk communication. Reliance on magic bullets leads to exactly the problems examined in Lay Foibles & Expert Fables in Judgements about Risks. The chapter also stated that an information vacuum will counteract the efforts of scientists, government, and industry to alleviate a risk, so it is best that the vacuum never be allowed to form in the first place.
Having read the articles, I realized that there were some tensions worth noting. For example, while Risk Literacy: Is it the Missing Link in Environmental Education? seeks to lay down a broad foundation for teaching risk literacy, as I found previously in the Mother's Milk test case, a broad-based approach to risk management will not work; instead, an individually tailored, community-based approach is required. Would this not extend to education as well? In addition, Powell and Leiss state that educating the public about science is no substitute for good risk communication practices (implying that cultural analysis is necessary). This directly contradicts Riechard's point of view (personally, I side with Powell and Leiss). While I support Riechard's call for more risk education in schools, I found his article lacking in many ways, and it left me with many questions.
If, as Riechard points out, there is no risk-specific conceptual theme or theory base, only marginalized technical information, then how did risk managers learn their trade? When did risk education start? I think a more thorough history of the profession is needed here. I also wonder if, given this supposed absence of themes and theories, Riechard's article is really pointing out that whatever education risk managers do have is weak.
I also wonder at what age Riechard proposes that risk education be taught. Do we want to "scare" children with theory and conjecture meant for adults? I'm reminded of the Mother's Milk test case and the impact it had on Inuit culture. Clearly, scientists and educators must tread lightly here because children are, in a way, a different culture from adults. Thus, considering cultural analysis and a community approach, risk education should be tailored to the age of the student, as well as to their cultural background and other factors, as appropriate.
I also think that teaching children and teenagers about risk is a slippery slope for adults. Teenagers generally think they will live forever; thus, they are far more likely to accept risks that adults would no longer tolerate. With this in mind, I think some students would balk at teachers telling them what is good and bad for them, especially since those teachers were once teenagers themselves and did basically the same things. Moreover, I fear that, if not handled properly, risk education may evolve into draconian laws about drinking, smoking, and sex. Students are likely to feel that their freedoms, including the freedom to be young, are being restricted by hypocritical teachers and parents. Riechard's description of minimizing irrational and maximizing rational behaviour sent shivers up my spine. This sounds like 1984 meets A Clockwork Orange!
I also think that Riechard’s definition of “risk literacy” is basically “science”. But not all people can be science minded, as he seems to be suggesting – nor should they! I got the feeling that Riechard was suggesting a utopian dream – a well-informed, logical, scientific society. Even though I am a scientist, this sounds very dull to me. I can’t help but think that risk communication is much more important that Riechard’s idea of teaching risk literacy since, at least as far as Riechard’s definition is concerned, risk communication can have more mass appeal. It can be geared towards both scientific and lay audiences, without trying to turn everyone into a scientists. I was also disappointed with his definition of risk literacy since I originally thought he’d be delving into how people assimilate written information about risks. I guess I took the title too literally!
I also wonder at Riechard’s statement that the primary contribution to perception is learning. What type of learning is he referring to – school learning or learning from the media? Learning by experience or by example? I take it that he means school learning, since he’s discussing environmental education, but if that’s the case I can’t agree with his conclusion. Like most people, there were lots of things taught to me in school that had negligible impact on how I perceive the world.
Riechard also points out that heredity is important in risk perception. Here he means psychobiological and psychoanalytical abilities, not knowledge passed down through the generations. I'd argue that the latter is more important! For example, consider the cultural significance of First Nations stories passed down through the ages, and how those stories affect risk perception. Obviously, I've inherited some things from my parents, but I'm of a different generation and I perceive risk much differently than they do. That difference was caused by my environment. I think Riechard's notion of hereditary influences has little impact on individuals compared to the effects of the environment surrounding them; however, these two elements do, of course, operate in concert.
I agree with Riechard that environmental education is important. Unfortunately, he suggests that it is failing within the current system (as of 1993), and he is probably right. It would also be typical, based on what I've read in Mad Cows and Mother's Milk, for risk managers to see public education as running counter to their interests. Despite this, and in agreement with Riechard, I've often thought that environmental education should start as early as possible. In fact, when I worked for EcoSource Mississauga, I created a Web site to teach elementary school children about recycling and reusing. If you can get environmental messages to children when they are young, they will be more likely to become environmentally aware adults. I'd also like to see actual courses in environmental science start in grade nine and extend through to grade twelve.
Overall, I thought Riechard’s article was well-meaning but too provocative. I bet that much of it was met with criticism and skepticism. However, there are some good things to be said about it.
The article is important in that it provided me with a good psychological grounding in risk perception. For example, the tension between the psychological self and the social environment creates fear and, with it, risk. Perception of risk is acknowledged as developmental in nature (and thus age-dependent). This points to the complexity of risk perception, something that risk managers and scientists are unwilling to embrace. It also set a nice foundation for my reflections on Lay Foibles & Expert Fables in Judgements about Risks.
I thought it was interesting that educators felt it was the EPA's job to teach about risk management. It seems like nobody wants to accept responsibility for communicating and educating about risk. Why is that? Do they fear that a well-informed public may somehow delay decisions on already complex issues? As it turns out, some answers to these questions are provided in Lay Foibles & Expert Fables in Judgements about Risks.
Fischhoff, Slovic, and Lichtenstein's chapter criticizes scientists for treating people as homogeneous, predictable, statistical objects. The chapter seems to advocate the cultural analysis first presented in Risk & Culture. The authors state that the public wants to know the risks and that, in general, lay risk selection and perception occur without the influence of risk management.
I think this chapter presents a thorough, well-written criticism of scientific thinking and methodology. I agreed with the authors' statement that conclusions are never obvious; thus, wording like "obvious trend" should be avoided. I recall my supervisor at the University of Windsor telling me that very thing when I was preparing a presentation for a major conference. The chapter also points to the need for scientists to say something before an information vacuum forms. In this regard, it agrees with Powell and Leiss's findings.
I also agree with Fischhoff, Slovic, and Lichtenstein’s statement that specializations can lead to tunnel vision. I think environmental education – biology and otherwise – should be geared towards a holistic view that incorporates a variety of other disciplines. That is exactly why I am currently at York – I want to blend my Fisheries Biology specialization with the study of communications, documentary filmmaking, culture, and the social sciences.
This week’s readings also provided me with some tips for my wolf documentary project.
For example, Ten Lessons provided me with a quick checklist of what causes a risk information vacuum. This is essential to my documentary since – via interviews with scientists, politicians, and the public – I can see if any of these characteristics are present, thus signaling an information vacuum.
With regard to the Algonquin wolf, my hypothesis is that an information vacuum exists, since policy does not reflect the science and public perception seems far removed from the facts. This may help refine my documentary question from "what is more important to wolf management and public perceptions: facts or ideology?" to something like "is there a risk information vacuum concerning the wolf?"
The Appendix states that people get most of their risk and science information from the media, and that increased coverage of a risk will increase its significance in the public mind. This suggests that a well-done, scientifically sound documentary, using the same techniques as television, could achieve some very positive results. However, there are pitfalls to watch out for.
Increasing the technical information in a documentary, while reassuring to scientists, will only make the audience more concerned and worried. Also, drama should be used to get the issue(s) explored in the documentary into the “magic seven”. Personal examples will also stick in the public’s mind. I intend to achieve this by interviewing locals who live around Algonquin Park.
As outlined in Riechard's article, I could use the same tools that teachers use to judge the effectiveness of their teaching by creating an audience survey form. Also, if there is stress in the relationship between the individual, society, and nature, then risks evolve. The questions posed by my documentary will hopefully uncover any such stresses.
Fischhoff, Slovic, and Lichtenstein’s Lay Foibles & Expert Fables in Judgements about Risks turned out to be particularly valuable for my documentary. They went into detail concerning how the questions being asked can determine the content (and the wisdom) of the response. This reminds me of Peter Cole’s comments on documentary questions that I was going to pose to First Nations Elders.
I had organized the questions to move towards dramatic conclusions. However, I explained to Peter that I let all interview subjects view the questions well in advance of the interview. Thus, some wrote copious notes as part of their responses, and some chose simply to tell a story; I gave them that option.
In the future, I may also try to avoid leading questions by developing a few possible wordings for each question and then randomly choosing one to use. I would also randomly order the questions, to avoid steering the subject towards a conclusion for the purposes of the documentary. Finally, I let interview subjects view their interviews, so they know how they turned out.
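To make this concrete, here is a minimal sketch of the randomization I have in mind (in Python; the question variants are hypothetical illustrations of my own, not material from the readings):

    import random

    # Each interview topic has several differently worded variants;
    # picking one variant at random guards against always using the
    # same (possibly leading) phrasing.
    question_pools = [
        ["How would you describe the wolves around Algonquin Park?",
         "What comes to mind when you think of the local wolves?"],
        ["Where do you get most of your information about wolves?",
         "What sources shape your views on wolf management?"],
    ]

    # Choose one variant per topic, then shuffle the overall order so
    # the sequence of questions does not steer the subject towards a
    # preconceived conclusion.
    questions = [random.choice(pool) for pool in question_pools]
    random.shuffle(questions)

    for q in questions:
        print(q)

The same shuffled list could be printed out and handed to subjects in advance, preserving the transparency of the process described above.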
It is also interesting to note that my racial background probably affected my interviews with First Nations peoples. Yet, since I made the process very transparent, I sense they felt comfortable being interviewed by me. Also, because of my left-wing politics and strong interest in the environment, I feel they still felt free to criticize the way government operates with regard to the environment and to Native rights and culture.
Fischhoff, Slovic, and Lichtenstein's chapter also provided me with a convenient checklist for judging the accuracy of public perceptions and the public's cognitive skills. This checklist will be very useful for my audience survey forms, for the purpose of determining how effective my documentary is. The authors also state that a vivid film, such as Jaws, impacts risk judgement, and that it helps if a risk message is repeated over time.
Lay Foibles & Expert Fables in Judgements about Risks also related strongly to the central question my documentary will ask. The chapter explains that public fears will guide risk selection and perception, as opposed to the "facts", if lay people know something experts do not; if lay people do not trust, or are not convinced by, expert results; and/or if lay people have a deep emotional investment in their beliefs. I can easily test for these three criteria via on-camera interviews using carefully prepared questions. This will help me recognize whether scientific facts or ideology is having the greater impact on wolf management.
These three readings brought to mind some research questions. I wondered why Canada lagged behind the US with regard to announcements about the effects of mad cow disease. Was it because, as a nation, we typically do not make bold, brash statements that could expose us to criticism? Also, is industry more effective at risk communication than the government? I know the public doesn't trust industry as much, even though industry knows the risks of its products inside out, but has there ever been a comparison of these two risk communication strategies?
In conclusion, these three readings fit in nicely with my reading schedule, introducing me to key ideas and details in a surprisingly logical order, considering that I chose the readings randomly! While I had problems with Risk Literacy: Is it the Missing Link in Environmental Education?, it was still worthwhile reading it because I often learn more from my criticism of something than I do from my agreement with it. In the end, all three articles provided some nuggets of knowledge for my documentary and helped further affirm the importance of my Plan of Study.