
Tuesday, October 15, 2013

Motivated Reasoning: Fuel for Controversies, Conspiracy Theories and Science Denialism Alike

From the Scientific American Blog "Absolutely Maybe"

By Hilda Bastian
October 14, 2013

Pieces of information, disputed and not, can be woven very quickly into competing explanatory narratives. Press the right buttons, and they can be ordered, lightning fast, into a line leading to this or that logical conclusion.

Those narratives can cause feuds that won’t quit and entrenched positions of extreme certitude from which people can’t budge. These days, that seems to be accompanied by a trend not just towards uncivil discourse, but towards reveling in the power of incivility. As though an increase in aggression – even as a first resort – could make our society better.

It’s always been understandable to me when people choose that path. In my early days as an activist, I went there too, ignoring to some extent how hard it was to reconcile that with my deep admiration for the heroes of non-violent action: non-violence in word and deed. Like many traversing the same issue amid a great, desperate challenge of my generation of activists, I found it painful at first to reach for the integrity of other ideals from a vulnerable place.

I still don’t find it easy. Nor am I particularly great at it, although I seem to have less cause for bouts of shame and regret as the decades go by. Experience eventually came into sync with my aspirations: although far harder than whipping up a frenzy, non-aggression is usually, in the end, more powerful.

Like many in the USA – especially perhaps those of us not allowed to go to work right now, or with reason to feel anxious about the future – I’m particularly preoccupied by these issues at the moment, and by the various impediments to clear, community-spirited thinking.

And then researchers serendipitously dropped another relevant study on cognitive bias into the literature. Lewandowsky, Gignac and Oberauer conducted a study of people in the USA; it’s published in PLOS ONE.

They discuss a way that science communication processes can sometimes backfire. When the views of people seen as experts converge on an issue, it has a strong influence on other people’s thinking. So generally, a strong scientific consensus can be convincing to many others, too. Climate science is an example where growing consensus among scientists reduced the influence of climate change denial.

However, in people who are prone to conspiracist thinking, strong consensus around science can have the reverse effect: it can be seen as evidence that they’re all in cahoots. As happens for some people with vaccination, say. Presenting yet more facts or another study could paradoxically confirm their rejection of science.

The study’s authors describe conspiracist thinking as a cognitive style that doesn’t have to conform to expectations of coherence or consistency: its “explanatory reach” is therefore greater than that of competing scientific theories. And it can readily provide an explanation of why a consensus is wrong.

Lewandowsky and his colleagues surveyed Americans’ attitudes to two issues where views on science are polarized: climate science and GM foods. Based on their sample, conspiratorial ideation could, they conjecture, be a more consistently explanatory factor in science denialism than people’s educational levels or world views.

Cultural or political world view and conspiracist thinking may be close relatives. They could both be seen as motivated reasoning, according to Lewandowsky: “Motivated reasoning refers to the discounting of information or evidence that challenges one’s prior beliefs accompanied by uncritical acceptance of anything that is attitude-consonant.”

World view can be associated with some types of science rejection – but not others. Thus, people with a “conservative” political world view could be more likely to reject climate science than “liberals,” but less likely, say, to reject childhood vaccination. And people with a more “conservative” world view who are more highly educated could be more skeptical of climate science than those who have fewer years of education.

How might it be countered? One of the articles they point to on this question is another Lewandowsky paper – about misinformation and countering it. Misinformation, it’s argued there, can be worse than ignorance. When you’re uninformed, you can fall back on heuristics, which are less likely to lead you astray than misinformation is. And it might be easier to acquire information than to wipe away misinformation.

When people have an organized explanatory narrative, they may need a complete functional narrative to replace it, not just isolated bits of information that break the internal logic. A logical and respectful explanation of how the mistaken belief arose might be useful. And hearing it repeatedly might help. Lots of food for thought and experimental research in science communication. Learning how to effectively correct misinformation and stay reasonable is feeling pretty urgent this week.


...................................................................................

Meanwhile, if you’re frustrated by thinking that’s resolutely impervious to evidence and need some levity, spare 10 minutes to watch Storm – an animated defense of scientific thinking and rant against its opposite.
