It matters who presents the message

unsafe area

Who would you trust to tell you what the risks are?

Research from the Cultural Cognition Project suggests that the cultural identity of the presenter matters significantly to how the public receives a particular message about risk. In other words, we need our experts to be our experts, not the other side’s experts.

It follows from this that one way of reducing the polarization of debates on risk may be to provide a variety of views on an issue from within a particular cultural bias. Two examples of this in practice are presented below, one quite successful, the other less so.


Moving beyond a failure in the marketplace of ideas


The following is a guest post from Prof Dan Kahan in response to a previous post here, on Margaret Heffernan’s book, Willful Blindness.

Fourcultures’ insightful post put me in mind of something important that in fact he has said explicitly before: Understanding the contribution that cultural influences have on our perceptions of risk (and like facts) can not only explain but also improve our situation. If we know we have cultural “blind spots” & where they are, then we should be able to do something to reduce their dimensions even if we are constrained (not so unhappily!) always to be who we are and thus see what we see.

In that spirit:

Imagine a “cultural theory” response to the “marketplace of ideas” view of free speech. This view holds that “truth” can be expected to emerge naturally & rapidly take hold in society through the competition of ideas in a “free speech” market (a view associated with J.S. Mill, Justice O.W. Holmes Jr. of the US Supreme Court, and others).

Cultural Theory helps to show why this laissez-faire attitude toward the transmission of knowledge is naive. Through biased search and weighting of evidence, people conform their assessments of information to their cultural values. Accordingly, even if the market of ideas furnishes them with an ample supply of information that it would be very much in their interest to accept and act on (because, say, they are more likely to die if they don’t), culturally diverse people won’t come to see it as true (or at least not nearly so quickly) if it denigrates the worldviews of some portion of them. This “cultural market failure,” Cultural Theory tells us, warrants some sort of corrective intervention. Some possibilities:

1. Affirmation framing

A cognitive rendering of Cultural Theory would say we are unconsciously motivated to resist information that threatens our cultural worldview. One way to mitigate the potential for bias inherent in this dynamic, then, is to strive to frame information in ways that affirm a plurality of worldviews simultaneously. Thus, when presenting information about climate change, it might make a lot of sense to give prominent billing to greater use of nuclear power or to the development of geoengineering, steps that are identity-affirming for individualists, rather than to focus predominantly on carbon-emission limits, a policy that threatens individualists.

2. “Subsidize” hierarchy

Wildavsky believed that the signature blind spots of each worldview meant that societies are most likely to prosper when they have a rich inventory of all worldview types. He worried that in contemporary America, at least, hierarchy was being driven out by “the rise of radical egalitarianism”, so he proposed that hierarchists should be treated with respect and not vilified, in order that the value society gets from hierarchical insight remains available. (Mary Douglas, too, was very anxious about the decline of hierarchy.) Actually, I think conspicuous efforts by egalitarians and individualists to find ways for hierarchical meanings to co-exist with theirs through adroit framing (point 1) are a way to subsidize: such framing puts a brake on the instinct to attack, and it furnishes evidence to persons of hierarchical sensibilities that they are not under attack, which in turn promotes their full participation in public debate.

3. Puncturing culture-pluralistic ignorance

It turns out that people tend to overestimate how uniform & how strongly held positions on risk are within their cultural group & within opposing ones. This perception feeds on itself: because individuals sense that they will likely be put at odds with their peers if they take a dissenting view, they are less likely to form one and less likely to express it; such reticence amplifies the signal that views are uniform and strongly held, which increases the pressure to conform, etc. Well, one way around this is to promote (particularly in formal deliberative settings) a deliberative norm of acknowledging the “strongest counterargument” to one’s position. Such a norm gives people an “immunity” from sanction within their own group so they voice equivocation and dissent more freely. The voicing of equivocation and dissent mitigates the impression that views are uniform and strongly held; as that impression recedes, so does the pressure to conform . . . voilà!
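To make the feedback loop in point 3 concrete, here is a minimal agent-based sketch in Python. It is purely illustrative: every name and parameter (simulate, norm_immunity, voice_threshold, and the rest) is hypothetical rather than drawn from the Cultural Cognition research. Dissenters speak only when they believe enough peers share their view, everyone estimates peer support from what is actually voiced, and norm_immunity stands in for the “strongest counterargument” norm.

```python
# Toy illustration (all parameters hypothetical) of the dynamic described
# above: voiced dissent shapes perceived dissent, which shapes who voices.
import random

def simulate(n_agents=1000, private_dissent=0.3, voice_threshold=0.15,
             norm_immunity=0.0, initial_perceived=0.05, rounds=20, seed=1):
    """Return the fraction of agents voicing dissent after `rounds`.

    norm_immunity: probability a dissenter speaks regardless of perceived
    support -- a stand-in for the 'strongest counterargument' norm, which
    shields dissenters from within-group sanction.
    """
    rng = random.Random(seed)
    # Each agent privately dissents with probability private_dissent.
    dissenters = [rng.random() < private_dissent for _ in range(n_agents)]
    # Everyone starts out overestimating uniformity: the perceived share
    # of dissent (0.05) understates true private dissent (0.3).
    perceived = initial_perceived
    for _ in range(rounds):
        # A dissenter voices if peer support looks sufficient, or if the
        # deliberative norm grants them immunity this round.
        voiced = sum(1 for d in dissenters
                     if d and (perceived >= voice_threshold
                               or rng.random() < norm_immunity))
        # Next round's impression of dissent comes from what was voiced.
        perceived = voiced / n_agents
    return perceived

print(simulate())                    # -> 0.0: the silence spiral
print(simulate(norm_immunity=0.6))   # -> ~0.3: dissent surfaces and sticks
```

Run as-is, the first call collapses into the silence spiral (no one voices, so no one ever perceives support). In the second, immunity lifts voiced dissent above the threshold, triggering a cascade in which voiced dissent climbs to roughly the true 30% share and stays there, just as the argument above predicts.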

I’m sure others can think of more ideas. But the point, as the post makes clear, is that Cultural Theory is not just a theory of bias but also a guide to possible debiasing. After all, wasn’t that what Douglas & Wildavsky were trying to provide us?

Related articles

Kahan, D. Fixing the Communications Failure. Nature 463, 296-297 (2010).

Sherman, D.K., Nelson, L.D. & Ross, L.D. Naïve Realism and Affirmative Action: Adversaries are More Similar Than They Think. Basic & Applied Social Psychology 25, 275-289 (2003).

Willful Blindness (fourcultures.com)

Image credit: http2007/flickr

Guest post coming up

Dan,

Thanks for your interesting message. I’d certainly like to make a ‘guest post’ of it. It fits very well with the next piece coming up here on the London riots – but of course you say things that hadn’t even occurred to me. For example, the whole idea of a marketplace of ideas as an Individualist fantasy is intriguing. (Likewise the idea of nuclear power as an Individualist institution. I have seen it as implicated in a Hierarchical, or at least strong Grid, worldview and have perhaps been wilfully blind to its Promethean, cutting-edge-of-progress aspects.)

Your three suggestions for debiasing public debate show that we could be doing much better than we currently are – and that the problem of market failure in the marketplace of ideas has some encouraging solutions. I’m sure the readers of Fourcultures will be fascinated…

Dan Kahan is part of the Cultural Cognition Project. Watch this space for his guest post.

Evidence-based riots


It’s been hard to move recently for people leaping to conclusions. Everyone with an Internet connection has already posted an opinion about the supposedly obvious causes of the London riots.

Mehdi Hasan’s heartfelt plea for pundits to stop generalising certainly makes sense. The introduction reads:

The debate about the riots is being hijacked by those who want to push partisan agendas and narratives. But shouldn’t we wait for evidence?

Yes we should, in the same way we should shut the door after the horse has bolted. Unfortunately, the evidence will not help us in quite the ways we might expect. The Cultural Cognition Project people claim to have shown that in at least one public debate (over climate change), the greater people’s scientific knowledge, the more (not less) their preconceived opinions are reinforced. The facts tend to fuel, not calm, the fire.

Read more:

The ground zero of meaning

Never let a crisis go to waste

Image credit: Sean MacEntee/Flickr [CC]

Willful Blindness

Willful Blindness by Margaret Heffernan

Margaret Heffernan has written a book on willful blindness [excerpt] and there’s a great article in the New Statesman. Here’s just one of the telling quotations Heffernan uses to illustrate her case. It comes from the economist Paul Krugman, speaking of the blind spots in his own economic modelling:

“I think there’s a pretty good case to be made that the stuff that I stressed in the models is a less important story than the things I left out because I couldn’t model them.” [Paul Krugman]

We all risk seeing only part of the story – the part we want to see. It’s really important to notice this and try to do something about it. I’ve written before that Cultural Theory is one attempt at trying not to fool yourself. It seeks to understand how our social contexts effectively do some of our thinking for us. They make some thoughts easy and others hard. They make some things easy to see and render others invisible. Margaret Heffernan cites the example of Richard Fuld, the former head of Lehman Brothers. Before that company’s collapse, Fuld would get to work by helicopter and chauffeured limo in such a way as to avoid seeing anyone. The point is that while he may have made the bubble in which he lived, the bubble also made him.

The subtitle of Willful Blindness is “Why we ignore the obvious at our peril”. Surely part of the answer is that the obvious is less obvious than it should be. It is our institutions, not just our brains, that make it so.

More reading:

Flaw in the model

Switching Strategies

How to spot a model

Models, reality and the limits to growth