It’s been hard to move recently for people leaping to conclusions. Everyone with an Internet connection has already posted an opinion about the supposedly obvious causes of the London riots.
Mehdi Hasan’s heartfelt plea for pundits to stop generalising certainly makes sense. The introduction reads:
The debate about the riots is being hijacked by those who want to push partisan agendas and narratives. But shouldn’t we wait for evidence?
Yes we should, in the same way we should shut the stable door after the horse has bolted. Unfortunately, the evidence will not help us in quite the ways we might expect. The Cultural Cognition Project researchers claim to have shown that in at least one public debate (over climate change), the greater people’s scientific knowledge, the more (not less) their preconceived opinions are reinforced. The facts tend to fuel the fire, not calm it.
The ground zero of meaning
Never let a crisis go to waste
image credit: Sean MacEntee/Flickr [CC]
Everyone loves a quiz and Psychology Today magazine has a cultural cognition quiz for you, courtesy of David Ropeik.
Ropeik is the author of How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts. His website offers excerpts from the book and – wait for it –
While you’re here, though, you could take our little fourcultures quiz just to the right of this page. How much is there?
You know you want to.
…and if you really can’t get enough quiz in your life, why not try the cultural theory quiz posted at the OK Cupid website (no, really). According to its creator, “The test items are taken from Gunnar Grendstad and Susan Sundback’s paper ‘Socio-demographic effects on cultural biases’, published in Acta Sociologica, vol. 46, no. 4, 2003, pp. 289-306.”
Maybe one day I’ll get round to writing about my scepticism of these kinds of tests. There, I said it.
George Monbiot at the Guardian has finally begun to take account of Cultural Theory as a possible explanation for why people either believe or ‘refuse’ to believe in climate change. He cites an article in Nature by Dan Kahan of the Yale Law School Cultural Cognition Project.
Prof Kahan says:
‘we need a theory of risk communication that takes full account of the effects of culture on our decision-making.’
However, Monbiot claims the cultural biases described by Cultural Theory don’t fit his particular case, since he sees himself as an Egalitarian who has unwillingly been put in the invidious position of defending scientists against their detractors, many of whom are themselves Egalitarians.
But a closer look at Monbiot’s article reveals that he has in mind an ‘ideal type’ of scientist, who precisely fits the Egalitarian conception of how scientists should behave. There are three key characteristics.
- First, Egalitarian scientists should do no evil. Weaponising anthrax is out, as is the development of terminator genes in food crops. A non-Egalitarian argument can be made for both these activities, but Monbiot isn’t interested in that.
- Second, Egalitarian scientists should produce freely accessible knowledge. Locking it away in pay-to-access journals isn’t on, and all well-meaning scientists should act together to end the monopolisation of knowledge the journal publishers have created for themselves (actually I think it’s a cartel, but we’ll let that pass).
- Third, and most importantly, the kind of scientific knowledge Monbiot as an Egalitarian is especially interested in is what he thinks scientists should be producing impartially: hard evidence of major threats to civilisation. A fact, on this view, is something that has the power to bring the group closer together and promote group behaviour. What self-evidently guarantees the veracity of such facts is the classic Egalitarian resort to ‘consensus’.
Taken together, these features of ideal science make it clear that the Egalitarian worldview describes Monbiot’s position to a tee.
He asks how it is possible to persuade people who just don’t want to be persuaded – and has no answer. The answer, from a Cultural Theory perspective, is fairly straightforward.
People and institutions with different cultural biases create, fund, support and pay attention to four very different types of evidence. What matters then is to produce and shape a variety of evidence, not only the Egalitarian evidence that Monbiot privileges as the only kind of truth.
Here are some suggestions:
…according to law professor Don Braman, that is. NPR has an interview with members of the Cultural Cognition Project, who have been demonstrating experimentally that people’s climate change beliefs are strongly linked to their worldview.
It’s intuitively obvious that our views, opinions and beliefs are linked together a bit like constellations in the night sky, but when it comes to working out what exactly it is that connects them, it’s quite hard to come up with a viable answer. Now it seems the pattern is becoming clearer.
Dr Clare Saunders, from Southampton University, was awarded the first British Journal of Sociology prize for her 2008 ethnographic work on environmental organisations in London.
You can hear a podcast of her describing her research, and read the original article (as long as your institution subscribes to Wiley Interscience).
She argues that:
Health communicators need to be able to handle… political issues skilfully and they need the training and tools to do so. Otherwise, their health messages run the risk of being ignored in a storm of political outrage. (Abraham 2009)
Prof Dan Kahan at the Yale Cultural Cognition project has been involved in work on cultural influences in the public debate about the HPV vaccine. For many the HPV vaccine will save lives and improve health, while providing strong returns for the manufacturers. For others, though, jabs are just risky or even downright dangerous. For yet others, in providing the vaccine to teenagers there is an implicit condoning of promiscuity. Whichever it is, the scientific evidence seems to fuel a political debate. Sales of Gardasil, says the Wall St Journal “have slowed over the past two years, as Merck has encountered difficulty persuading women ages 19 to 26 to get the shot.”
The Cultural Cognition Project is investigating just how people come to their beliefs about scientific evidence.
Some really interesting results:
It was a trick, of course. Yesterday I used Grid-Group Cultural Theory to ‘predict’ the Fatalist viewpoint of Nassim Nicholas Taleb, author of The Black Swan. But like the magician who successfully predicted the lottery numbers, it’s more about sleight of hand than actual magic…
Despite the name, Cultural Theory isn’t really a theory at all. It’s a conceptual scheme – a heuristic we use, it might be argued, because we are cognitive misers who like taking shortcuts in our thinking. The world is big and hard to understand, so we make biased assumptions about what ‘usually’ happens, what ‘must’ happen, what ‘really’ happens, what ‘the facts of the matter’ are, and so on. (To give just one example from millions, some more trivial, some less: Nick Cave’s deep-voiced assertion, ‘People they ain’t no good’.)

Moreover, we don’t just get these ideas out of our own heads somehow. They are embodied in the institutions in which we participate, from the family mealtime to the Copenhagen climate summit, so that they are accounts of actual reality as we experience it. They really do explain how the world is – at least parts of it – so that our ideas seem like common sense.

It is argued that these cultural biases or cultural solidarities coalesce into four basic ‘ideal types’: the four cultures of Grid-Group Cultural Theory. But can we actually measure this? And can we really use the result to predict anything?