“Kahan’s argument about the woman who does not believe in global warming is a surprising and persuasive example of a general principle: if we want to understand others, we can always ask what is making their behaviour ‘rational’ from their point of view. If, on the other hand, we just assume they are irrational, no further conversation can take place.”
The New Republic has a short summary of the Cultural Cognition Project: how to talk to climate change deniers.
Those who ‘deny’ climate change aren’t mad, deluded or evil – they’re just paying close attention to the community to which they owe allegiance. Various groups make use of publicly held views to create a kind of ‘badge of membership’. That’s why, for example, conservatives rarely wax enthusiastic about climate change policy. The issue has been polarised. The communal viewpoint is strong, which means that for individuals there’s little to be gained and much to be lost in opposing it. It’s all-important, in Margaret Thatcher’s timeless phrase, to remain ‘one of us’.
A great example of this is the case of former Congressman Bob Inglis. He’s a bona fide conservative who came unstuck in 2010 when the Tea Party decided it didn’t like his stance on climate change. Since losing his seat, far from giving up and toeing the line, he’s set up an initiative that aims to construct a conservative dialogue on climate and energy policy: ‘Putting free enterprise to work on energy and climate’. He’s proof that there’s little or nothing inherently liberal about climate change. Imaginative policy makers should be able to work with almost any kind of raw material. This American Life had a great piece on the issue.
When people don’t accept the scientific evidence, it may be useless to present them with yet more evidence. They are not stupid. They are simply protecting their cultural identity.
Here’s the journalism:
And here’s the original study, Motivated Numeracy and Enlightened Self-Government
Kahan, Dan M., Peters, Ellen, Dawson, Erica Cantrell and Slovic, Paul, Motivated Numeracy and Enlightened Self-Government (September 3, 2013). Available at SSRN: http://ssrn.com/abstract=
Why does public conflict over societal risks persist in the face of compelling and widely accessible scientific evidence? We conducted an experiment to probe two alternative answers: the “Science Comprehension Thesis” (SCT), which identifies defects in the public’s knowledge and reasoning capacities as the source of such controversies; and the “Identity-protective Cognition Thesis” (ICT) which treats cultural conflict as disabling the faculties that members of the public use to make sense of decision-relevant science. In our experiment, we presented subjects with a difficult problem that turned on their ability to draw valid causal inferences from empirical data. As expected, subjects highest in Numeracy—a measure of the ability and disposition to make use of quantitative information—did substantially better than less numerate ones when the data were presented as results from a study of a new skin-rash treatment. Also as expected, subjects’ responses became politically polarized—and even less accurate—when the same data were presented as results from the study of a gun-control ban. But contrary to the prediction of SCT, such polarization did not abate among subjects highest in Numeracy; instead, it increased. This outcome supported ICT, which predicted that more Numerate subjects would use their quantitative-reasoning capacity selectively to conform their interpretation of the data to the result most consistent with their political outlooks. We discuss the theoretical and practical significance of these findings.
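The "difficult problem" described in the abstract is a 2×2 covariance-detection task: the valid causal inference requires comparing proportions between groups, while the intuitive shortcut of comparing raw counts gives the wrong answer. A minimal sketch in Python (the figures below are illustrative of the widely reported skin-cream version of the task, not quoted from the paper itself):

```python
# Hypothetical counts for a 2x2 covariance-detection problem of the kind
# used in the study. Rows: used treatment vs. did not; columns: outcome.
treated = {"improved": 223, "worsened": 75}
control = {"improved": 107, "worsened": 21}

def improvement_rate(group):
    """Proportion of the group whose condition improved."""
    return group["improved"] / (group["improved"] + group["worsened"])

treated_rate = improvement_rate(treated)   # ~0.75
control_rate = improvement_rate(control)   # ~0.84

# The tempting-but-wrong strategy compares raw counts (223 > 107) and
# concludes the treatment works. The valid inference compares rates,
# and here the control group actually did better.
effective = treated_rate > control_rate
print(f"treated: {treated_rate:.2f}, control: {control_rate:.2f}, "
      f"treatment better: {effective}")
```

The point of the design is that getting this right demands deliberate quantitative reasoning; the study found that highly numerate subjects applied that reasoning selectively once the same table was labelled as gun-control data.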
- Politics Makes Morons of Us All (motherjones.com)
Dan Kahan's blog at the Cultural Cognition Project makes some conjectures about whether experts think in similar ways to non-experts. Specifically he wonders whether experts exhibit the kinds of cultural biases already demonstrated by non-experts. Do experts use cultural cognition?
My observation is that care would need to be taken to avoid something like the fundamental attribution error. That is to say, being an ‘expert’ in a given field is strongly conditioned by situation, so the very choice of who the experts are may itself be shaped by unacknowledged cultural bias. My conjecture is that experts therefore say what their audiences and sponsors expect them to say; otherwise they would be unrecognizable as experts. In situations where the message is critiqued, so is the messenger’s status as an expert. In situations where the message is positively received, that status is regarded as obvious.
Three possible examples:
Who is an expert in local economic development? Rob Hopkins, the founder of the Transition Towns movement, tends to have a strongly Egalitarian outlook on the world. He recently complained that the ‘growth as usual’ mindset of local council officers called into question their competence as experts in their own field. His position is that true economic development experts would take into account peak oil, economic crises and climate change, and allow for the possibility that economic growth, as it has been understood, may be a thing of the past.
Second example: Climate science has its experts and it is an open question as to whether the geologist Ian Plimer is one of them. At one level he is not an expert in climate science since that is not his area of professional competence. However, he has written a book on the subject and since he is a ‘climate sceptic’, there are some people who wish to present him as an expert in climate change. His new book for students was launched by a former prime minister of Australia.
The third example is that of US judges, experts in legal deliberation, many of whom are appointed on specifically political grounds. Voters have a sense of the liberal and conservative candidates for office and they vote accordingly. To those of us living in places where the judiciary is appointed on merit rather than elected, this appears strange indeed. After all, what could be less political than judgements concerning the facts?
In these examples the kinds of statements made by ‘experts’ are received not on the basis of whether the person in question actually has qualifications or professional standing, but on whether their words fit with a particular cultural bias. That is to say, each cultural bias already has its own experts, who are brought into the argument in order to cast doubt on the competence of the other side’s supposed expertise.
So before we can identify how experts behave it’s necessary to create a definition of expert that is broadly acceptable across the conflicting cultural solidarities described by Cultural Theory. The three examples given above show that this may be quite difficult.
Who would you trust to tell you what the risks are?
Research from the Cultural Cognition Project suggests the cultural identity of the presenter matters significantly to the public reception of a particular message about risk. In other words, we need our experts to be our experts, not the other side’s experts.
It follows from this that one way of reducing the polarization of debates on risk may be to provide a variety of views on an issue from within a particular cultural bias. Two examples of this in practice are presented below, one quite successful, the other less so.
Fourcultures’ insightful post put me in mind of something important that in fact he has said explicitly before: Understanding the contribution that cultural influences have on our perceptions of risk (and like facts) can not only explain but also improve our situation. If we know we have cultural “blind spots” & where they are, then we should be able to do something to reduce their dimensions even if we are constrained (not so unhappily!) always to be who we are and thus see what we see.
In that spirit:
Imagine a “cultural theory” response to the “marketplace of ideas” view of free speech. This view holds that “truth” can be expected to emerge naturally & rapidly take hold in society through the competition of ideas in a “free speech” market (associated with J.S. Mill; Justice O.W. Holmes Jr., US S Ct; and others).
Cultural Theory helps to show why this laissez faire attitude toward transmission of knowledge is naive. Through biased search and weighting of evidence, people conform their assessments of information to their cultural values. Accordingly, even if the market of ideas furnishes them with an ample supply of information that it would be very much in their interest to accept and act on (because, say, they are more likely to die if they don’t), culturally diverse people won’t come to see it as true (or at least not nearly so quickly) if it denigrates the worldviews of some portion of them. This “cultural market failure,” Cultural Theory tells us, warrants some sort of corrective intervention. Some possibilities:
1. Affirmation framing
A cognitive rendering of Cultural Theory would say we are unconsciously motivated to resist information that threatens our cultural worldview. One way to mitigate the potential for bias inherent in this dynamic, then, is to strive to frame information in ways that affirm a plurality of worldviews simultaneously. Thus, when presenting information about climate change, it might make a lot of sense to give prominent billing to greater use of nuclear power or to the development of geoengineering, steps that are identity-affirming for individualists, rather than focus predominantly on carbon-emission limits, a policy that threatens individualists.
2. “Subsidize” hierarchy
Wildavsky believed that signature blind spots of each worldview meant that societies are most likely to prosper when they have a rich inventory of all worldview types. He was worried that in contemporary America, at least, hierarchy was being driven out by “the rise of radical egalitarianism” and so he proposed that hierarchists should be treated with respect and not vilified so that the value society gets from having hierarchical insight remains available. (Mary Douglas too was very anxious about the decline of hierarchy.) Actually, I think conspicuous efforts by egalitarians and individualists to find ways for hierarchical meanings to co-exist with theirs through adroit framing (point 1) is a way to subsidize; it puts a brake on the instinct to attack and also furnishes evidence to persons of hierarchical sensibilities that they are not under attack and thus promotes their full participation in public debate.
3. Puncturing culture-pluralistic ignorance
It turns out that people tend to overestimate how uniform & how strongly held positions on risk are within their cultural group & within opposing ones. This perception feeds on itself: because individuals sense that they will likely be put at odds with their peers if they take a dissenting view, they are less likely to form one and less likely to express it; such reticence amplifies the signal that views are uniform and strongly held, which increases the pressure to conform, etc. Well, one way around this is to promote (particularly in formal deliberative settings) a deliberative norm of acknowledging the “strongest counterargument” to one’s position. Such a norm gives people an “immunity” from sanction within their own group so they voice equivocation and dissent more freely. The voicing of equivocation and dissent mitigates the impression that views are uniform and strongly held; as that impression recedes, so does the pressure to conform . . . voilà!
I’m sure others can think of more ideas. But the point — as the post makes clear — is that Cultural Theory is not just a theory of bias but also a guide to possible debiasing as well. After all, wasn’t that what Douglas & Wildavsky were trying to provide us?
Willful Blindness (fourcultures.com)
Image credit: http2007/flickr
Thanks for your interesting message. I’d certainly like to make a ‘guest post’ of it. It fits very well with the next piece coming up here on the London riots – but of course you say things that hadn’t even occurred to me. For example the whole idea of a marketplace of ideas as an Individualist fantasy is intriguing. (Likewise the idea of nuclear power as an Individualist institution. I have seen it as implicated in a Hierarchical or at least strong Grid world view and have perhaps been wilfully blind to its Promethean, cutting-edge-of-progress aspects.)
Your three suggestions for debiasing public debate show that we could be doing much better than we currently are – and that the problem of market failure in the marketplace of ideas has some encouraging solutions. I’m sure the readers of Fourcultures will be fascinated…
Dan Kahan is a part of the Cultural Cognition Project. Watch this space for his guest post.
It’s been hard to move recently for people leaping to conclusions. Everyone with an Internet connection has already posted an opinion about the supposedly obvious causes of the London riots.
Mehdi Hasan’s heartfelt plea for pundits to stop generalising certainly makes sense. The introduction reads:
The debate about the riots is being hijacked by those who want to push partisan agendas and narratives. But shouldn’t we wait for evidence?
Yes we should, in the same way we should shut the door after the horse has bolted. Unfortunately, the evidence will not help us in quite the ways we might expect. The Cultural Cognition Project claims to have shown that in at least one public debate (over climate change), the greater people’s scientific knowledge, the more (not less) their preconceived opinions are reinforced. The facts tend to fuel, not calm, the fire.
Ropeik is the author of How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts. His website offers excerpts from the book and – wait for it – a quiz.
While you’re here, though, you could take our little Fourcultures quiz just to the right of this page.
You know you want to.
…and if you really can’t get enough quiz in your life, why not try the cultural theory quiz posted at the OkCupid website (no, really). According to its creator, “The test items are taken from Gunnar Grendstad and Susan Sundback’s paper ‘Socio-demographic effects on cultural biases’ published in Acta Sociologica, vol. 46, no. 4, 2003, pp. 289-306.”
Maybe one day I’ll get round to writing about my scepticism of these kinds of tests. There, I said it.
- David Ropeik: Where You Stand on the Culture War Issues, and Why! (huffingtonpost.com)
George Monbiot at the Guardian has finally begun to take account of Cultural Theory as a possible explanation for why people either believe or ‘refuse’ to believe in climate change. He cites an article in Nature by Dan Kahan of the Yale Law School Cultural Cognition Project.
Prof Kahan says:
‘we need a theory of risk communication that takes full account of the effects of culture on our decision-making.’
However, Monbiot claims the cultural biases in CT don’t fit his particular case, since he sees himself as an Egalitarian who has unwillingly been put in the invidious situation of defending scientists against their detractors, many of whom are themselves Egalitarians.
But a closer look at Monbiot’s article reveals that he has in mind an ‘ideal type’ of scientist, who precisely fits the Egalitarian conception of how scientists should behave. There are three key characteristics.
- First, Egalitarian scientists should do no evil. Weaponising anthrax is out, as is the development of terminator genes in food crops. A non-Egalitarian argument can be made for both these activities, but Monbiot isn’t interested in that.
- Second, Egalitarian scientists should produce freely accessible knowledge. Locking it away in pay-to-access journals isn’t on, and all well-meaning scientists should act together to end the monopolisation of knowledge the journal publishers have created for themselves (actually I think it’s a cartel, but we’ll let that pass).
- Third, and most importantly, the kind of scientific knowledge Monbiot as an Egalitarian is especially interested in is what he thinks scientists should be producing impartially: hard evidence of major threats to civilization. A fact, on this view, is something that has the power to bring the group closer together and promote group behaviour. What self-evidently guarantees the veracity of such facts is the classic Egalitarian resort to ‘consensus’.
Taken together, these features of ideal science make it clear that the Egalitarian worldview describes Monbiot’s position to a tee.
He asks how it is possible to persuade people who just don’t want to be persuaded – and has no answer. The answer, from a Cultural Theory perspective, is fairly straightforward.
People and institutions with different cultural biases create, fund, support and pay attention to four very different types of evidence. What matters then is to produce and shape a variety of evidence, not only the Egalitarian evidence that Monbiot privileges as the only kind of truth.
Here are some suggestions: Continue reading “Culture and the Science of Climate Change”