How cultural commitments damage your ability to reason

start counting. Source: unlistedsightings/flickr

When people don’t accept the scientific evidence, it may be useless to present them with yet more evidence. They are not stupid. They are simply protecting their cultural identity.

Here’s the journalism:

Science confirms: politics wrecks your ability to do math

And here’s the original study, Motivated Numeracy and Enlightened Self-Government

Kahan, Dan M., Peters, Ellen, Dawson, Erica Cantrell and Slovic, Paul, Motivated Numeracy and Enlightened Self-Government (September 3, 2013). Available at SSRN.

Why does public conflict over societal risks persist in the face of compelling and widely accessible scientific evidence? We conducted an experiment to probe two alternative answers: the “Science Comprehension Thesis” (SCT), which identifies defects in the public’s knowledge and reasoning capacities as the source of such controversies; and the “Identity-protective Cognition Thesis” (ICT) which treats cultural conflict as disabling the faculties that members of the public use to make sense of decision-relevant science. In our experiment, we presented subjects with a difficult problem that turned on their ability to draw valid causal inferences from empirical data. As expected, subjects highest in Numeracy—a measure of the ability and disposition to make use of quantitative information—did substantially better than less numerate ones when the data were presented as results from a study of a new skin-rash treatment. Also as expected, subjects’ responses became politically polarized—and even less accurate—when the same data were presented as results from the study of a gun-control ban. But contrary to the prediction of SCT, such polarization did not abate among subjects highest in Numeracy; instead, it increased. This outcome supported ICT, which predicted that more Numerate subjects would use their quantitative-reasoning capacity selectively to conform their interpretation of the data to the result most consistent with their political outlooks. We discuss the theoretical and practical significance of these findings.
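The covariance task at the heart of the experiment is easy to reproduce. The sketch below uses illustrative cell counts in the spirit of the skin-rash version of the problem (hypothetical numbers, not necessarily the paper's exact figures), chosen so that the intuitive shortcut of comparing raw counts gives the wrong answer:

```python
# Hypothetical 2x2 results for a new skin-rash treatment,
# chosen so the raw-count heuristic misleads.
treated_better, treated_worse = 223, 75
control_better, control_worse = 107, 21

treated_rate = treated_better / (treated_better + treated_worse)
control_rate = control_better / (control_better + control_worse)

print(f"Improved with treatment:    {treated_rate:.1%}")   # 74.8%
print(f"Improved without treatment: {control_rate:.1%}")   # 83.6%

# The heuristic answer says the cream works (223 > 107).
# The valid inference compares proportions, and points the other way.
print("Treatment outperforms control:", treated_rate > control_rate)  # False
```

Relabel the same table as crime rates in cities that did or did not ban handguns and the arithmetic is unchanged; only the cultural stakes differ, which is precisely the experimental manipulation.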

Science communication and conservative values

image CC via flickr/jeffreypriebe

Roger Scruton’s recent article in Prospect Magazine provides an interesting illustration of what Dan Kahan and Chris Mooney have been discussing on their respective blogs. (Kahan blogs regularly now at the Cultural Cognition Project and Mooney writes at the DeSmogBlog.)

The topic of their discussion: Is it possible to take the polemics out of science communication, and if so, how?

Scruton’s article, Nature, Nurture and Liberal Values, reviews three recent books on neuroscience and discusses the moral and philosophical implications of these new inflections of the nature/nurture debate, from a highly intelligent conservative perspective:

“The real question raised by evolutionary biology and neuroscience is not whether those sciences can be refuted, but whether we can accept what they have to say, while still holding on to the beliefs that morality demands of us.”

This, it seems to me, is exactly the kind of question Kahan and Mooney are discussing. Scruton’s statement, though, raises the very question that conservatism reviles: whose morality? This is where Cultural Theory comes in, suggesting as it does that there is more than one worldview, more than one morality, and that therefore more than one kind of reconciliation is required between science and morality, between descriptive and normative claims. However, the promise of Cultural Theory is that this is not an endless pluralism, or a morally bankrupt relativism, but rather a constrained pluralism. Yes, there are competing cultural worldviews. No, they are not endlessly differentiated. We can map them.

Scruton’s latest book, Green Philosophy, provides a kind of conservative re-imagining of the environmentalist terrain that he seems to think has been left almost entirely to the egalitarian left for thirty years and more. It’s a philosophical restatement of that old question: why should the devil have all the best music? Why should so-called ‘environmentalists’ keep the environmental high ground to themselves? One way of looking at this is to hypothesise that conservatives might be more receptive to ‘environmentalist’ subject matter if they think it will make the world a more conservative place. This will probably not take the polemics out of discussions about climate change policy – quite possibly the reverse – but it might just help to end the rather strange situation in which some political groups and leaders simply deny, resist or ignore climate change and other environmentalist causes célèbres and try to make them disappear.

Scruton spoke about his book at the RSA recently (audio available), with Matthew Taylor chairing.

Experts and Cultural Cognition

Dan Kahan’s blog at the Cultural Cognition Project makes some conjectures about whether experts think in similar ways to non-experts. Specifically, he wonders whether experts exhibit the kinds of cultural biases already demonstrated by non-experts. Do experts use cultural cognition?

My observation is that care would need to be taken to avoid something like the fundamental attribution error. That is to say, being an ‘expert’ in a given field is strongly conditioned by situation. So the very choice of who the experts are may itself be conditioned by unacknowledged cultural bias. My conjecture is that experts therefore say what their audiences and sponsors expect them to say; otherwise they would be unrecognizable as experts. In situations where the message is critiqued, so is the messenger’s status as an expert. In situations where the message is positively received, the messenger’s status as an expert is regarded as obvious.

Three possible examples:

Who is an expert in local economic development? Rob Hopkins, the founder of the Transition Towns movement, tends to have a strongly Egalitarian outlook on the world. He recently complained that the ‘growth as usual’ mindset of local council officers called into question their competence as experts in their own field. His position is that true economic development experts would take into account peak oil, economic crises and climate change, and allow for the possibility that economic growth, as it has been understood, may be a thing of the past.

Second example: climate science has its experts, and it is an open question whether the geologist Ian Plimer is one of them. At one level he is not an expert in climate science, since that is not his area of professional competence. However, he has written a book on the subject, and since he is a ‘climate sceptic’, there are some people who wish to present him as an expert in climate change. His new book for students was launched by a former prime minister of Australia.

The third example is that of US judges, experts in legal deliberation, many of whom are appointed on specifically political grounds. Voters have a sense of the liberal and conservative candidates for office and they vote accordingly. To those of us living in places where the judiciary is appointed on merit rather than elected, this appears strange indeed. After all, what could be less political than judgements concerning the facts?

In these examples the kinds of statements made by ‘experts’ are received not on the basis of whether the person in question actually has qualifications or professional standing, but on whether their words fit with a particular cultural bias. That is to say, each cultural bias already has its own experts, who are brought into the argument in order to cast doubt on the competence of the other side’s supposed expertise.

So before we can identify how experts behave it’s necessary to create a definition of expert that is broadly acceptable across the conflicting cultural solidarities described by Cultural Theory. The three examples given above show that this may be quite difficult.


It matters who presents the message

unsafe area

Who would you trust to tell you what the risks are?

Research from the Cultural Cognition project suggests the cultural identity of the presenter matters significantly to the public reception of a particular message about risk. In other words, we need our experts to be our experts, not the other side’s experts.

It follows from this that one way of reducing the polarization of debates on risk may be to provide a variety of views on an issue from within a particular cultural bias. Two examples of this in practice are presented below, one quite successful, the other less so.


Moving beyond a failure in the marketplace of ideas

The following is a guest post from Prof Dan Kahan in response to a previous post here, on Margaret Heffernan’s book, Willful Blindness.

4culture’s insightful post put me in mind of something important that in fact he has said explicitly before: understanding the contribution that cultural influences have on our perceptions of risk (and like facts) can not only explain but also improve our situation. If we know we have cultural “blind spots” & where they are, then we should be able to do something to reduce their dimensions even if we are constrained (not so unhappily!) always to be who we are and thus see what we see.

In that spirit:

Imagine a “cultural theory” response to the “marketplace of ideas” view of free speech. This view holds that “truth” can be expected to emerge naturally & rapidly take hold in society through the competition of ideas in a “free speech” market (a position associated with J.S. Mill, Justice O.W. Holmes Jr. of the US Supreme Court, and others).

Cultural Theory helps to show why this laissez faire attitude toward transmission of knowledge is naive. Through biased search and weighting of evidence, people conform their assessments of information to their cultural values. Accordingly, even if the market of ideas furnishes them with an ample supply of information that it would be very much in their interest to accept and act on (because, say, they are more likely to die if they don’t), culturally diverse people won’t come to see it as true (or at least not nearly so quickly) if it denigrates the worldviews of some portion of them. This “cultural market failure,” Cultural Theory tells us, warrants some sort of corrective intervention. Some possibilities:
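The "biased weighting" mechanism can be made concrete with a toy model. The sketch below is my own minimal illustration, not a model from the literature: two hypothetical agents start with strong opposite leanings, see an identical and genuinely mixed stream of evidence, and down-weight whatever cuts against their current view. They end up further apart than they began.

```python
import random

def update(belief, supports, discount=0.3):
    """One additive belief update on a toy log-odds scale.
    Evidence that contradicts the agent's current lean is
    down-weighted by `discount`; aligned evidence counts in full."""
    step = 1.0 if supports else -1.0
    if step * belief < 0:  # evidence cuts against the current lean
        step *= discount
    return belief + step

rng = random.Random(1)
stream = [rng.random() < 0.5 for _ in range(200)]  # a genuinely mixed evidence record

egalitarian, individualist = 5.0, -5.0  # hypothetical strong opposite leanings
for e in stream:
    egalitarian = update(egalitarian, e)
    individualist = update(individualist, e)

# Same evidence stream, yet the gap between the two views grows rather than shrinks.
print(f"egalitarian: {egalitarian:+.1f}  individualist: {individualist:+.1f}")
```

This is why simply supplying more information in the marketplace need not converge views: each side's weighting scheme turns a mixed record into net support for its prior.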

1. Affirmation framing

A cognitive rendering of Cultural Theory would say we are unconsciously motivated to resist information that threatens our cultural worldview. One way to mitigate the potential for bias inherent in this dynamic, then, is to strive to frame information in ways that affirm a plurality of worldviews simultaneously. Thus, when presenting information about climate change, it might make a lot of sense to give prominent billing to greater use of nuclear power or to the development of geoengineering, steps that are identity-affirming for individualists, rather than focus predominantly on carbon-emission limits, a policy that threatens individualists.

2. “Subsidize” hierarchy

Wildavsky believed that the signature blind spots of each worldview meant that societies are most likely to prosper when they have a rich inventory of all worldview types. He was worried that in contemporary America, at least, hierarchy was being driven out by “the rise of radical egalitarianism”, and so he proposed that hierarchists should be treated with respect and not vilified, so that the value society gets from hierarchical insight remains available. (Mary Douglas too was very anxious about the decline of hierarchy.) Actually, I think conspicuous efforts by egalitarians and individualists to find ways for hierarchical meanings to co-exist with theirs through adroit framing (point 1) are a way to subsidize; such efforts put a brake on the instinct to attack and also furnish evidence to persons of hierarchical sensibilities that they are not under attack, and thus promote their full participation in public debate.

3. Puncturing culture-pluralistic ignorance

It turns out that people tend to overestimate how uniform & how strongly held positions on risk are within their cultural group & within opposing ones. This perception feeds on itself: because individuals sense that they will likely be put at odds with their peers if they take a dissenting view, they are less likely to form one and less likely to express it; such reticence amplifies the signal that views are uniform and strongly held, which increases the pressure to conform, etc. Well, one way around this is to promote (particularly in formal deliberative settings) a deliberative norm of acknowledging the “strongest counterargument” to one’s position. Such a norm gives people an “immunity” from sanction within their own group so they voice equivocation and dissent more freely. The voicing of equivocation and dissent mitigates the impression that views are uniform and strongly held; as that impression recedes, so does the pressure to conform . . . voilà!
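That feedback loop, and the effect of the "strongest counterargument" norm, can be sketched as a toy simulation. This is my own illustration with made-up parameters, not a model from the research: agents voice a privately held dissenting view only in proportion to the dissent they hear around them, unless a deliberative norm guarantees a floor of expressed counterargument.

```python
import random

def simulate(n_agents=1000, rounds=20, norm_floor=0.0, seed=0):
    """Toy spiral-of-silence model. 30% of agents privately dissent,
    but each voices dissent only with probability equal to the share
    of dissent heard last round, never below `norm_floor` (the
    'strongest counterargument' norm)."""
    rng = random.Random(seed)
    dissenter = [rng.random() < 0.3 for _ in range(n_agents)]
    heard = 0.3  # initial sense of how common dissent is
    for _ in range(rounds):
        voiced = sum(1 for d in dissenter
                     if d and rng.random() < max(heard, norm_floor))
        heard = voiced / n_agents  # the signal everyone updates on next round
    return heard

print(f"No norm:   {simulate():.3f}")                 # audible dissent collapses toward 0
print(f"With norm: {simulate(norm_floor=0.5):.3f}")   # dissent stays audible
```

Note that the norm does not manufacture dissent (the share of private dissenters is fixed throughout); it only protects its expression, and that alone is enough to keep the perceived distribution of views honest.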

I’m sure others can think of more ideas. But the point — as the post makes clear — is that Cultural Theory is not just a theory of bias but also a guide to possible debiasing. After all, wasn’t that what Douglas & Wildavsky were trying to provide us?

Related articles

Kahan, D. Fixing the Communications Failure. Nature 463, 296-297 (2010).

Sherman, D.K., Nelson, L.D. & Ross, L.D. Naïve Realism and Affirmative Action: Adversaries are More Similar Than They Think. Basic & Applied Social Psychology 25, 275-289 (2003).

Heffernan, M. Willful Blindness: Why We Ignore the Obvious at Our Peril.

Image credit: http2007/flickr

Culture and the Science of Climate Change

George Monbiot at the Guardian has finally begun to take account of Cultural Theory as a possible explanation for why people either believe or ‘refuse’ to believe in climate change. He cites an article in Nature by Dan Kahan of the Yale Law School Cultural Cognition Project.

Prof Kahan says:

‘we need a theory of risk communication that takes full account of the effects of culture on our decision-making.’

However, Monbiot claims the cultural biases of Cultural Theory don’t fit his particular case, since he sees himself as an Egalitarian who has unwillingly been put in the invidious position of defending scientists against their detractors, many of whom are themselves Egalitarians.

But a closer look at Monbiot’s article reveals that he has in mind an ‘ideal type’ of scientist, who precisely fits the Egalitarian conception of how scientists should behave. There are three key characteristics.

  • First, Egalitarian scientists should do no evil. Weaponising anthrax is out, as is the development of terminator genes in food crops. A non-Egalitarian argument can be made for both these activities, but Monbiot isn’t interested in that.
  • Second, Egalitarian scientists should produce freely accessible knowledge. Locking it away in pay-to-access journals isn’t on, and all well-meaning scientists should act together to end the monopolisation of knowledge the journal publishers have created for themselves (actually I think it’s a cartel, but we’ll let that pass).
  • Third, and most importantly, the kind of scientific knowledge Monbiot as an Egalitarian is especially interested in is what he thinks scientists should be producing impartially: hard evidence of major threats to civilization. A fact, on this view, is something that has the power to bring the group closer together and promote group behaviour. What self-evidently guarantees the veracity of such facts is the classic Egalitarian resort to ‘consensus’.

Taken together, these features of ideal science make it clear that the Egalitarian worldview describes Monbiot’s position to a tee.

He asks how it is possible to persuade people who just don’t want to be persuaded – and has no answer. The answer, from a Cultural Theory perspective, is fairly straightforward.

People and institutions with different cultural biases create, fund, support and pay attention to four very different types of evidence. What matters then is to produce and shape a variety of evidence, not only the Egalitarian evidence that Monbiot privileges as the only kind of truth.

Here are some suggestions:

Cultural bias and the HPV vaccine

Health communicators need to be able to handle… political issues skilfully and they need the training and tools to do so. Otherwise, their health messages run the risk of being ignored in a storm of political outrage. (Abraham 2009)

Prof Dan Kahan at the Yale Cultural Cognition Project has been involved in work on cultural influences in the public debate about the HPV vaccine. For many, the HPV vaccine will save lives and improve health, while providing strong returns for the manufacturers. For others, though, jabs are just risky or even downright dangerous. For yet others, providing the vaccine to teenagers implicitly condones promiscuity. Whichever it is, the scientific evidence seems to fuel a political debate. Sales of Gardasil, says the Wall St Journal, “have slowed over the past two years, as Merck has encountered difficulty persuading women ages 19 to 26 to get the shot.”

The Cultural Cognition Project is investigating just how people come to their beliefs about scientific evidence.

Some really interesting results: