Health communicators need to be able to handle… political issues skilfully and they need the training and tools to do so. Otherwise, their health messages run the risk of being ignored in a storm of political outrage. (Abraham 2009)
Prof Dan Kahan at the Yale Cultural Cognition Project has been involved in work on cultural influences in the public debate about the HPV vaccine. For many, the HPV vaccine will save lives and improve health, while providing strong returns for the manufacturers. For others, though, jabs are just risky or even downright dangerous. For yet others, providing the vaccine to teenagers implicitly condones promiscuity. Whichever it is, the scientific evidence seems to fuel a political debate. Sales of Gardasil, says the Wall St Journal, “have slowed over the past two years, as Merck has encountered difficulty persuading women ages 19 to 26 to get the shot.”
The Cultural Cognition Project is investigating just how people come to their beliefs about scientific evidence.
Some really interesting results have emerged.
John Adams of Imperial College London produced a new preface for the Brazilian translation of his important book Risk. His very interesting analysis of the social construction of risk is strongly informed by Grid-group cultural theory:
“I have been increasingly impressed by the ability of cultural theory to bring a modicum of order and civility to debates about risk. It is not a typology for pigeonholing participants in debates about risk. Occasionally one encounters a pure type, but most of us are too complex and multi-faceted to be captured by a simple label. It does however provide a useful framework and vocabulary for describing the attitudes encountered in discussions about the best way to approach an uncertain future. It helps people to introspect about their own biases and prejudices.”
‘No planes fell from the sky, but a lot happened to keep them from doing so’.
This is a common view of the Y2K bug among software engineers and IT professionals in Anglo-American societies. For them it may be true that their hard work saved civilization from digitally-challenged-date Armageddon, but in much of the rest of the world next to nothing was done, and yet, conspicuously, planes still didn’t fall from the sky.
So what was going on?
The story of the Y2K bug is a marvellous example of how our subjective conceptions don’t just shape our view of reality; they shape objective reality itself.
Was the Y2K bug a serious threat or not? You’d think there’d be a straight and clear answer to this question, but it seems impossible to find one. The distinction between subjective and objective truth appears to dissolve before our eyes, and if it can do so in relation to a super-expensive, high-stakes, worldwide emergency like Y2K, where else can it similarly dissolve?
The outcome of the Y2K bug has been used as a vindication of the ‘precautionary principle’ but also as a critique of that principle and an argument in favour of the ‘fix on failure’ principle. Most of the positive reporting has focussed on the positive ‘unintended consequences’, the ‘surprising legacy’ of Y2K preparation (especially the structural development of the IT industry) rather than demonstrating that a disaster actually was averted.
Economist John Quiggin has been the single most cogent thinker on Y2K, especially since his measured scepticism predates the benefit of hindsight. Two of the points he makes are especially worth reflecting on: blame-allocation schemes generally produce bad policy; some form of institutionally-sanctioned scepticism is indispensable.
Below is a list of resources, placed in order of increasing depth of coverage/insight.
US Senate Committee final report
Public Radio miniseries – the surprising legacy of Y2K
Phillimore, J. and Davison, A. (2002) ‘A precautionary tale: Y2K and the politics of foresight’, Futures, 34(2), pp. 147–157.
John Quiggin paper
For a fourcultures take on this kind of thing, see The Dam Bursts.
At New Statesman magazine, Hugh Aldersey-Williams quotes Mary Douglas and Aaron Wildavsky’s Risk and Culture: “people select their awareness of certain dangers to conform with a specific way of life”. He worries that we may reach a state in which “all we have in common is our fears”.
Actually, it’s very unlikely we’ll reach a consensus on our fears. The question of risk is a vexed one. According to Ulrich Beck, modernity is the process by which progress is overtaken by its negative side effects, so that the side effects, especially pollution of all sorts, become the main event. This is the ‘risk society’, in which we are increasingly defined by our status vis-à-vis threats to life: we take up ‘social risk positions’.

In stark contrast, Frank Furedi sees this as shamefully defeatist. For Furedi, human ingenuity is the flame that burns eternal, and there is no threat that isn’t in the end a wonderful opportunity. He disparages Beck’s thesis as ‘the culture of fear’.

So who is correct? My money is on grid-group cultural theory (developed by Douglas, Wildavsky and others), which proposes that there are four mutually antagonistic cultural perspectives which institutions, and the individuals in them, can adopt. Beck speaks for ‘Egalitarianism’, Furedi for ‘Individualism’, but there are two others: ‘Fatalism’ and ‘Hierarchy’. All coalitions of opinion about risk (e.g. the idea that wearing seatbelts in cars has saved lives; see the work of John Adams) are no more than fairly unstable temporary agreements between two or more of these.