Work and pleasure, arts and science

One can dream up uncountable categories in any profession, of course, but among academics, and perhaps especially among economists, two types stand out for me: the athlete and the artist/entrepreneur.

Athletes want to be the best in whatever competition they perceive themselves to be in. Rankings are all that matter: all admiration goes to those in the Top 3. Athletes have a strong sense of who is ‘in’ and who is ‘out’: you want to associate with people who are ‘in’ because they publish in all the cool journals and go to the cool conferences, and some of that coolness may someday rub off on you. Like real athletes, these academics choose their game, learn the rules, and try to be really good at it. Does this field require me to eschew interdisciplinary research and prove difficult mathematical propositions? Then by heck I’m going to be the best at it. An athlete is another athlete’s competitor, first and foremost: if other athletes score, he grits his teeth in jealousy and swears to beat them in the next game.

Artists/entrepreneurs want to make a good product. A product is good if they themselves think it is good (the artist) or if it is good enough for a sufficient number of people (the entrepreneur). Artists/entrepreneurs don’t choose games or follow rules: they invent their own game, their own rules. When other scientists produce a great product, like a highly original paper, an artist/entrepreneur is eager to read it, and learn from it. Where athletes are driven by a constant comparison of themselves with others, artists/entrepreneurs are intrinsically motivated: they want to make something they themselves can be proud of.

Fiddle Tunes and IIFET

I have a lot more affinity with artists/entrepreneurs than with athletes – no surprises there. The current system in academia is largely geared towards athletes, with its emphasis on journal citation scores, the H-index, and university rankings. This worries me. Athletes may be rule-followers, but they are also more likely to cheat – just witness the doping scandals in bicycle racing and other sports. Rule-following also kills creativity – an essential ingredient of science.

I had plenty of opportunity to reflect on the importance of creativity and inventing your own rules in science in the past three weeks. The first week of July I was at the Festival of American Fiddle Tunes in Port Townsend, Washington. It was an overwhelming experience to immerse myself in the music and the hospitality of all the folks at this beautiful spot on a peninsula on Puget Sound. One of the highlights was an improvisation workshop by bluegrass fiddler Tatiana Hargreaves. I don’t want to give away too many details about what we did (perhaps to preserve the secret, but actually just because the truth is too embarrassing), but an important lesson that scientists might want to draw from it is that to get out of your comfort zone you should not take yourself too seriously!

And then there was the conference of the International Institute of Fisheries Economics and Trade (IIFET) in Seattle, in the third week of July. It was my second IIFET meeting, but I’m sure it won’t be my last. One of the things I like about IIFET is its broadness: it includes not only economists but also policy scientists, sociologists, and people from NGOs and the fishing industry. Where the environmental economics conferences can feel like a gathering of athletes, IIFET is the place to go for artists/entrepreneurs. I was also excited to hear that 2020 will see the second edition of MSEAS, a conference on marine social-ecological systems, in Japan! The first edition, in Brest in 2016, yielded what must be the first comic in a peer-reviewed journal – another example of how art and science can make a happy marriage. More of that please!

All knowledge is hearsay

Why do we believe what we believe? One of the many complicating factors in marine policy is that different stakeholders hold different beliefs. Is pulse trawling a boon or a disaster for the marine environment? Do Marine Protected Areas enhance fish catches? Does human activity change the global climate? Finding consensus on how to allocate marine resources is difficult enough when all the people involved agree on the facts. Problems become a lot more difficult when facts are disputed, or worse, when beliefs run parallel to divisions between political, religious, or other segments of society.

I recently started reading some of the psychological literature on this issue and I came across an article by Dan Kahan of Yale Law School. It addresses the phenomenon that for some policy topics, such as climate change, nuclear power, or genetic modification, acceptance of the scientific consensus correlates strongly with one’s political preferences. In a nutshell, conservatives are more likely to be skeptical of climate science, and to accept the scientific consensus that transgenic crops are safe; progressives, on the other hand, tend to accept the scientific consensus on climate change but are often suspicious of genetic modification. Kahan’s article explains and tests three hypotheses for this phenomenon:

  1. Bounded Rationality. A common theory in psychology posits that our brains have two systems to process information: a “fast” system that works through low-effort heuristics and associations (System 1); and a “slow” system that works through high-effort systematic reasoning (System 2). According to this hypothesis, a member of the Dutch Party for the Animals is against biotechnology not because she has read all the literature and attended all the conferences, but because she applies a heuristic that accepts a belief if people like her believe it.
  2. Ideological Asymmetry. Empirical studies have shown that conservatives tend to be averse to complexity and ambiguity: they prefer their beliefs and principles to be simple, clear, and unshakable. According to this hypothesis, these preferences make them more closed-minded and less willing to accept facts that contradict their current beliefs.
  3. Expressive Utility. The naive economic view is that values come before affiliations: people hold particular values and beliefs, and then vote for parties and politicians that conform to those values and beliefs. A recent publication by political scientists Christopher Achen and Larry Bartels, however, turns this idea on its head: people choose which politician they like, or which party they prefer to associate with, and adapt their values accordingly. According to the Expressive Utility hypothesis, people express group membership through their beliefs: in a sense, they are willing to ignore facts in order to avoid becoming a heretic to their group.

Kahan goes on to test the three hypotheses and he finds that people who use more systematic reasoning are more, not less, likely to align their beliefs with their political affiliation. This makes it unlikely that they use a heuristic (System 1); rather, they invest considerable effort to justify their rejection of scientific facts. This supports the Expressive Utility hypothesis: conservatives are skeptical of climate science because accepting it would make them a bad conservative. I find it a plausible conclusion, particularly considering the findings of Achen and Bartels, and it resonates with other findings that more highly educated people are more skeptical of vaccination.

I’m not buying the Ideological Asymmetry hypothesis that conservatives are inherently dogmatic. I’ve encountered similar closed-mindedness among progressives, especially the radical Left, so I suspect it is more a feature of political extremism than of any particular end of the political spectrum.

Neither am I convinced by the Bounded Rationality hypothesis, although I do think there could still be a grain of truth in it. As far as I understand the dual-thinking model (i.e. as an economist with a half-read copy of Thinking, Fast and Slow on his bedside table), System 1 is a fast, associative heuristic: it makes people give the wrong intuitive answer to the Bat and Ball Problem when they don’t have the time or the energy to use the kind of cognitive effort typical of System 2. But climate skeptics or vaccinophobes are not being asked to give their response on the spot. They have Googled the issue, they have read blogs and articles, they have discussed it with friends, relatives, and colleagues. This is not a System 1 activity. But there could still be a mechanism at work that is related to Bounded Rationality – just not in the way suggested by dual-thinking theory.
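(For readers who don’t know the Bat and Ball Problem, it goes roughly like this: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball; how much does the ball cost? The intuitive System 1 answer is 10 cents, but writing out the arithmetic – b + (b + 1.00) = 1.10, so 2b = 0.10 and b = 0.05 – shows that the ball actually costs 5 cents, and the bat $1.05.)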

All knowledge is hearsay

I’m sure there is a theory on this in social psychology or a related field, so I’d be happy to hear about it if anyone who reads this knows more. I suspect that what is going on is that very little of what we know (or think we know, or hold to be true) comes from genuine first-hand experience or logical reasoning. The vast majority of our beliefs depend crucially on information that we got from other people, for example through newspaper articles, scientific publications, radio programmes, and coffee-table chit-chat. Every such piece of information involves a messenger, and whether you accept that information depends on whether you trust the messenger. Moreover, we don’t have time to check all our beliefs, so any time we receive new information we need to decide whether to trust the messenger, to dismiss the message right away, or to verify it with other information – which, again, has to come from other messengers with varying degrees of trust. There will always be a point where we stop verifying and go with the information we have, in other words trusting the messengers of the information we have not been able to verify. So it is not a black-or-white question of using either the heuristics of System 1 or the systematic reasoning of System 2, but rather finding the mix of believing and verifying that leads to an efficient use of cognitive effort.
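To make that trade-off a bit more concrete, here is a toy sketch of what such a mix of believing and verifying could look like. It is not a serious model of cognition: the messengers, trust scores, thresholds, and ‘cognitive budget’ are all made up for illustration.

```python
import random

# Toy model of believing vs. verifying (all numbers are made up for illustration).
# Each claim arrives via a messenger with a trust score between 0 and 1. Highly
# trusted messengers are believed outright, distrusted ones are dismissed, and
# the rest are checked by consulting yet another messenger, which costs cognitive
# effort, so at some point we stop verifying and go with what we have.

TRUST = {"colleague": 0.8, "newspaper": 0.6, "random blog": 0.3}

def evaluate(claim, messenger, budget, accept_at=0.7, dismiss_at=0.4, cost=1):
    """Return (believe the claim?, remaining cognitive budget)."""
    # Note that the content of the claim plays no role here: only the messenger does.
    trust = TRUST.get(messenger, 0.5)
    if trust >= accept_at:
        return True, budget              # believe without checking
    if trust <= dismiss_at:
        return False, budget             # dismiss without checking
    if budget < cost:
        return trust >= 0.5, budget      # out of effort: go with gut-level trust
    # Verifying means asking another messenger, who in turn has to be trusted...
    second_opinion = random.choice(list(TRUST))
    return evaluate(claim, second_opinion, budget - cost)

print(evaluate("MPAs enhance fish catches", "newspaper", budget=3))
```

The sketch only illustrates the point of the paragraph above: the interesting question is not whether we use System 1 or System 2, but where we stop spending effort on verification and start trusting messengers.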

What makes us trust a given messenger? There used to be messengers with a particular authority, either through formal designations such as college degrees or through informal ones such as reputation. The views of a professor would carry more weight than those of a layman because we assumed that the professor had earned his degree by studying hard, gathering a lot of knowledge, and being really smart; newspapers like The Economist or The Washington Post had solid reputations for honest and well-researched reporting. Both forms of authority have eroded of late, due to the increased role of social media and online news outlets, but also due to a growing anti-intellectualism.

And this erosion of the authority of traditional messengers also points towards another mechanism: we tend to trust people who are like ourselves. A conservative is more likely to reject the consensus on climate change because the Wall Street Journal does so too, and he is more likely to trust the Wall Street Journal than the New York Times. Likewise, an environmentalist is more likely to believe The Ecologist than The Economist on the merits of free trade and biotechnology.

And of course I’m no exception. On all the issues I mentioned above (pulse trawling, MPAs, climate change, vaccinations, and biotechnology), I tend to adopt the scientific consensus. Is that because scientists should be trusted, objectively, to do their work right, or is it just because I work in science myself and am therefore more willing to trust them?

Dutch biologists complain about publishing culture in academia

An interesting article in the Dutch newspaper NRC Handelsblad: biologists complain about the pressure in the current academic climate not to try to replicate (let alone refute) other scientists’ results, but rather to exaggerate the results of your own research.

“In my field there are articles in Nature, Cell or Science, of which all experienced people know: this can’t be right,” says Hans Clevers, director of the Hubrecht Instituut in Utrecht and former president of the Dutch Academy of Science, “but rarely does somebody write that explicitly in an article. So every now and again I am approached at a conference by a PhD researcher from a remote university who has been trying for years to replicate that publication. It is very inefficient.” […] Ecologist Raymond Klaassen of Groningen University blames the “short-winded academic climate, that focuses on scoring.” “If you find a deviating pattern in one year, then the current practice is to publish that with a lot of ballyhoo in as high-ranking a journal as you can.”

The Dutch word used in the original article (which I translated here as “short-winded”) is “hijgerig”: from hijgen, Dutch for “to pant”. It evokes an image of heavy competition and short-termism. It reminds me of the atmosphere at a high-ranking Dutch university that has made quite a name in behavioural economics: scoring was the norm, in the best economics journals, but I saw little of a long-term research agenda. Nevertheless, I don’t believe it got as bad there as the biologists describe in this article (knock on wood). I do see it in fisheries science: 2048, anyone?

On interdisciplinarity

Check out the really cool cover of Nature’s special feature on interdisciplinarity!

Of course, as an economist I especially like their inclusion of “Invisible Hand” as the sole superhero representing the social sciences in their scientific team of Avengers. But it is also symbolic of the fact that economists have, in my view, gone the furthest in integrating their discipline with the natural sciences. This holds particularly for environmental and resource economists, who by definition deal with problems of the natural environment, such as pollution and overfishing. The reason is pretty geeky: most economic research is quantitative, and quite a lot of it involves the development of mathematical models. And whaddayaknow: so do climate science, population biology, hydrology, and a host of other natural sciences. Give me your equations and I’ll plug them into my CGE model.
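To illustrate how little friction there is in that kind of plugging, here is a minimal sketch (a toy example, not an actual CGE model, and all parameter values are made up purely for illustration): take the logistic growth equation from population biology, drop it into a simple revenue-and-cost calculation, and you essentially have the textbook Gordon-Schaefer fisheries model mentioned below.

```python
# A toy Gordon-Schaefer-style sketch: logistic fish population growth (biology)
# coupled to a simple harvest economy (economics). All numbers are illustrative.

r, K = 0.4, 1000.0        # intrinsic growth rate and carrying capacity (biology)
q, p, c = 0.01, 5.0, 2.0  # catchability, fish price, cost per unit of effort (economics)

def sustained_yield(E):
    """Long-run catch at fishing effort E, once biomass settles at B = K(1 - qE/r)."""
    B = K * (1 - q * E / r)
    return q * E * B

def profit(E):
    """Long-run profit at effort E: revenue from the catch minus the cost of effort."""
    return p * sustained_yield(E) - c * E

# Three classic reference points that fall straight out of the coupled equations:
E_msy = r / (2 * q)                     # effort at maximum sustainable yield
E_oa = (r / q) * (1 - c / (p * q * K))  # open access: effort expands until profit is zero
E_mey = E_oa / 2                        # maximum economic yield (profit-maximizing effort)

for label, E in [("MSY", E_msy), ("Open access", E_oa), ("MEY", E_mey)]:
    print(f"{label:12s} effort={E:5.1f}  catch={sustained_yield(E):6.1f}  profit={profit(E):6.1f}")
```

The point is not the numbers but how little glue is needed: because both disciplines speak in equations, the biological growth function slots straight into the economic profit function.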

It is actually much, much harder to truly integrate qualitative social sciences like sociology or anthropology with quantitative sciences – even with a social science like economics. Models like IMAGE and DICE describe the global climate as well as the economy; the Gordon-Schaefer fisheries model and Colin Clark’s work on renewable resource use, which build on basic models from population biology like logistic growth, have been part of the standard canon of resource economics for decades; when Daniel Pauly criticizes the limited impact of the “social sciences” on fisheries research, he lumps economics in with biology, not with sociology. Meanwhile, it took until 2009 for the Nobel committee to finally recognize political scientist Elinor Ostrom for her contributions to the economics of common pool resources, and economists and sociologists share little but contempt for each other’s fields. The Indian economist Jagdish Bhagwati is said to have joked that good economists reincarnate as physicists; wicked economists reincarnate as sociologists. But Ostrom’s Nobel also shows that things are changing, especially in the field of institutional economics. Let’s have more of that in the future.