Why do we believe what we believe? One of the many complicating factors in marine policy is that different stakeholders hold different beliefs. Is pulse trawling a boon or a disaster for the marine environment? Do Marine Protected Areas enhance fish catches? Does human activity change the global climate? Finding consensus on how to allocate marine resources is difficult enough when all the people involved agree on the facts. Problems become a lot more difficult when facts are disputed, or worse, when beliefs run parallel to divisions between political, religious, or other segments of society.
I recently started reading some of the psychological literature on this issue and I came across an article by Dan Kahan of Yale Law School. It addresses the phenomenon that for some policy topics, such as climate change, nuclear power, or genetic modification, acceptance of the scientific consensus correlates strongly with one’s political preferences. In a nutshell, conservatives are more likely to be skeptical of climate science, and to accept the scientific consensus that transgenic crops are safe; progressives, on the other hand, tend to accept the scientific consensus on climate change but are often suspicious of genetic modification. Kahan’s article explains and tests three hypotheses for this phenomenon:
- Bounded Rationality. A common theory in psychology posits that our brains have two systems to process information: a “fast” system that works through low-effort heuristics and associations (System 1); and a “slow” system that works through high-effort systematic reasoning (System 2). According to this hypothesis, a member of the Dutch Party for the Animals is against biotechnology not because she has read all the literature and attended all the conferences, but because she applies a heuristic that accepts a belief if people like her believe it.
- Ideological Asymmetry. Empirical studies have shown that conservatives tend to be averse to complexity and ambiguity: they prefer their beliefs and principles to be simple, clear, and unshakable. According to this hypothesis, these preferences make them more closed-minded and less willing to accept facts that contradict their current beliefs.
- Expressive Utility. The naive economic view is that values come before affiliations: people hold particular values and beliefs, and then vote for parties and politicians that conform to those values and beliefs. A recent publication by political scientists Christopher Achen and Larry Bartels, however, turns this idea on its head: people choose which politician they like, or what party they prefer to associate with, and adapt their values accordingly. According to the Expressive Utility hypothesis, people express group membership through their beliefs: in a sense, they are willing to ignore facts in order to avoid becoming a heretic to their group.
Kahan goes on to test the three hypotheses, and he finds that people who use more systematic reasoning are more, not less, likely to align their beliefs with their political affiliation. This makes it unlikely that they use a heuristic (System 1); rather, they invest considerable effort to justify their rejection of scientific facts. This supports the Expressive Utility hypothesis: conservatives are skeptical of climate science because accepting it would make them a bad conservative. I find it a plausible conclusion, particularly considering the findings of Achen and Bartels, and it resonates with other research suggesting that more highly educated people are more skeptical of vaccination.
I’m not buying the Ideological Asymmetry hypothesis that conservatives are inherently dogmatic. I’ve encountered similar closed-mindedness among progressives, especially the radical Left, so I suspect it is more a feature of political extremism than of any particular end of the political spectrum.
Neither am I convinced by the Bounded Rationality hypothesis, although I do think there could still be a grain of truth in it. As far as I understand the dual-process model (i.e. as an economist with a half-read copy of Thinking, Fast and Slow on his bedside table), System 1 is a fast, associative heuristic: it makes people give the wrong intuitive answer to the Bat and Ball Problem when they don’t have the time or the energy to use the kind of cognitive effort typical of System 2. But climate skeptics and vaccinophobes are not being asked to give their response on the spot. They have Googled the issue, they have read blogs and articles, they have discussed it with friends, relatives, and colleagues. This is not a System 1 activity. But there could still be a mechanism at work that is related to Bounded Rationality – just not in the way suggested by dual-process theory.
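(For anyone who doesn’t know the puzzle, here is the standard version: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball – how much does the ball cost? The intuitive System 1 answer is 10 cents; a moment of System 2 algebra, sketched below with b for the ball and B for the bat, shows it has to be 5 cents.)

```latex
% Standard version of the puzzle; b = price of the ball, B = price of the bat.
\begin{align*}
b + B = 1.10, \qquad B &= b + 1.00 \\
\Rightarrow\ (b + 1.00) + b &= 1.10 \\
\Rightarrow\ 2b &= 0.10 \\
\Rightarrow\ b &= 0.05 \quad\text{(not the intuitive } 0.10\text{)}
\end{align*}
```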
All knowledge is hearsay
I’m sure there is a theory on this in social psychology or a related field, so I’d be happy to hear about it if anyone who reads this knows more. I suspect that what is going on is that very little of what we know (or think we know, or hold to be true) comes from genuine first-hand experience or logical reasoning. The vast majority of our beliefs depend crucially on information we got from other people, for example through newspaper articles, scientific publications, radio programmes, and coffee-table chit-chat. Every such piece of information involves a messenger, and whether you accept the information depends on whether you trust the messenger. Moreover, we don’t have time to check all our beliefs, so every time we receive new information we need to decide whether to trust the messenger, to dismiss the message right away, or to verify it against other information – which, again, has to come from other messengers we trust to varying degrees. There is always a point where we stop verifying and go with the information we have, in other words where we trust the messengers of the information we have not been able to verify. So it is not a black-or-white question of using either the heuristics of System 1 or the systematic reasoning of System 2, but rather of finding the mix of believing and verifying that makes efficient use of our cognitive effort.
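As a toy illustration of that trade-off (my own sketch, not anything from Kahan’s paper or the dual-process literature; the trust scores, stakes, and cost numbers are made-up parameters), the accept/verify/dismiss decision might look something like this:

```python
# Toy sketch of the believe-vs-verify trade-off described above.
# All names and numbers are illustrative assumptions, not a model from the literature.

def decide(trust: float, stakes: float, verification_cost: float) -> str:
    """Decide what to do with a claim from a messenger.

    trust             -- how much we trust the messenger, between 0 and 1
    stakes            -- how costly it would be to act on a false belief
    verification_cost -- cognitive effort needed to check the claim ourselves
    """
    expected_loss_if_wrong = (1.0 - trust) * stakes
    if expected_loss_if_wrong > verification_cost:
        return "verify"   # worth spending System 2 effort
    if trust >= 0.5:
        return "accept"   # go with the messenger
    return "dismiss"      # not trusted enough, and not worth the effort to check

# A low-stakes claim from a trusted colleague: just accept it.
print(decide(trust=0.9, stakes=1.0, verification_cost=2.0))   # accept
# A high-stakes claim from an unfamiliar blog: verification pays off.
print(decide(trust=0.3, stakes=10.0, verification_cost=2.0))  # verify
# A low-stakes claim from a distrusted source: dismiss and move on.
print(decide(trust=0.2, stakes=1.0, verification_cost=2.0))   # dismiss
```

The numbers are beside the point; what the sketch is meant to show is that the decision to spend System 2 effort is itself driven by how much we trust the messenger and how much is at stake.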
What makes us trust a given messenger? There used to be messengers with a particular authority, either through formal designations such as college degrees or through informal ones such as reputation. The views of a professor would carry more weight than those of a layman because we assumed the professor had earned his degree by studying hard, gathering a lot of knowledge, and being really smart; newspapers like The Economist or The Washington Post had solid reputations for honest and well-researched reporting. Both forms of authority have eroded of late, due to the increased role of social media and online news outlets, but also due to a growing anti-intellectualism.
And this erosion of the authority of traditional messengers points towards another mechanism: we tend to trust people who are like ourselves. A conservative is more likely to reject the consensus on climate change because the Wall Street Journal does so too, and he is more likely to trust the Wall Street Journal than the New York Times. Likewise, an environmentalist is more likely to believe The Ecologist than The Economist on the merits of free trade and biotechnology.
And of course I’m no exception. On all the issues I mentioned above – pulse trawling, MPAs, climate change, vaccination, and biotechnology – I tend to adopt the scientific consensus. Is that because scientists can objectively be trusted to do their work right, or is it just because I work in science myself and am therefore more willing to trust them?