My two cents on the pulse ban

As I am writing this, the European Parliament, the European Council, and the European Commission are still discussing a ban on pulse fishing. Dutch fishers have invested millions of euros in this technology, assuming they would be able to use it for many years to come. And judging by the scientific results so far, they had every reason to do so: pulse trawls have lower fuel use, lower CO2 emissions, less sea floor penetration, and lower bycatch of plaice than traditional beam trawls. So no wonder they are angry and frustrated at the prospect of a ban.

Most likely it will come to pass. There are plenty of articles on how the Netherlands has lost the battle for the hearts and minds of other EU member states, such as this article in a Dutch newspaper, or this paper by my colleagues at Wageningen University (paywall – sorry!). By and large the experts seem to agree that the Dutch government dramatically overplayed its hand, and that a more careful introduction of the technology might have been more effective in the long run. I can’t judge this, but I have two reflections.

Don’t people ever listen?

First, the scientists working on the effects of pulse trawling are also frustrated, and it’s a frustration I also sense with respect to other societal debates. By and large, the scientific consensus is that pulse trawling is most likely better than its alternative, beam trawling:

  • “No injuries were found in fish exposed to the electrical pulses.” (Soetaert et al., 2018, North American Journal of Fisheries Management)
  • “Exposure of Sole embryos at 2 d postfertilization and larvae at 11 d posthatching to pulsed DC used to catch brown shrimp did not result in a lower survival 8 d after exposure. Additionally, no differences in yolk sac resorption and morphometric length measurements of the notochord, muscle, eye, and head were observed in the developing larvae.” (Desender et al., 2018, North American Journal of Fisheries Management)
  • “Compared to tickler-chain beam trawlers, pulse trawlers showed relatively higher discard survival under fishing conditions pertinent to these studies.” (van der Reijden et al., 2017, ICES Journal of Marine Science)
  • “These results indicate that, under the laboratory circumstances as adopted in this study, the small-spotted catshark are still able to detect the bioelectrical field of a prey following exposure to [pulsed direct current] used in pulse trawls.” (Desender et al., 2017, Journal of Experimental Marine Biology and Ecology)
  • “These data reveal the absence of irreversible lesions in sole as a direct consequence of exposure to electric pulses administered in the laboratory, while in cod, more research is needed to assess cod’s vulnerability for spinal injuries when exposed to the cramp pulses.” (Soetaert et al., 2016, Fisheries Research)
  • “Electrode diameter and pulse amplitude showed a positive correlation with the intensity of the fish’s reaction. However, the present experiments confirmed that cod also show variable vulnerability, with injury rates ranging from 0% to 70% after (almost) identical exposures near the electrode.” (Soetaert et al., 2016, Marine and Coastal Fisheries)
  • “In conclusion, under the circumstances as adopted in this study, the electrical field seemed to have only limited immediate impact on the exposed animals.” (Desender et al., 2016, Fisheries Research)
  • “Some of the large cod (n = 260) developed haemorrhages and fractures in the spine, and haemal and neural arches in the tail part of the body. The probability of injuries increased with field strength and decreased when frequency was increased from 100 to 180 Hz. None of the small cod (n = 132) were injured and all survived. The field strength at the lateral boundaries of the trawl was too low to inflict injuries in cod.” (de Haan et al., 2016, ICES Journal of Marine Science)
  • “The evidence presented here suggests that the electrified trawls are superior to conventional trawls regarding different aspects, including ecological impact on the North Sea (less bottom impact), management of commercial fishing stocks (less discards) and carbon footprint (reduction of fuel consumption).” (Soetaert et al., 2015, Fish and Fisheries)
  • “The pulse trawls had fewer fish discards […]. The pulse fishing technique resulted in a lower fuel consumption (37-49%), and consequently in spite of lower landings net revenues were higher. A downside of using pulse trawls is the possible spinal damage of marketable cod (Gadus morhua L.), but because total cod landings by beam trawls are low (4-5%), the implication will likely be limited.” (van Marlen et al., 2014, Fisheries Research)

So to summarise:

  • Cod may indeed be affected more severely by pulse trawls than by beam trawls;
  • For other species the effects appear negligible;
  • There are clear benefits in lower fuel use, greenhouse gas emissions, selectivity, and sea floor penetration.

Basically it’s a trade-off: pulse trawls may cause some (but possibly limited) damage to cod, whereas beam trawls have higher fuel use and greenhouse gas emissions and do more damage to shellfish and other benthic life. These scientific findings have led ICES to conclude that “pulse trawling has fewer environmental and ecological effects than beam trawls.”

The response by BLOOM, the NGO at the forefront of this crusade against pulse trawling, to ICES’s advice? A tweet (Bloom_FR, status 1002253942108061698).

Note that all studies I just cited appeared in peer-reviewed scientific journals. Add to this that ICES is more or less to North Atlantic fisheries research what the IPCC is to climate research, and you see the similarities. How often do climate scientists get to hear they’re only in it for the money? How often do pseudosceptics ignore the vast body of scientific evidence that climate change is happening, that much of it is driven by man-made emissions, and that this is a huge problem? Really, there is not much difference between climate deniers (I’m sorry, I mean to say deniers of the vast current scientific consensus that climate change is happening, anthropogenic, and problematic) and the folks at BLOOM.

And these are not the only examples where science and reason lose against emotions, fake news, and conspiracy theories. Australian green NGOs convinced the government to revoke the license of the Margiris trawler, even though CSIRO found its concessions respected ecological limits. Scientists have time and again found no detrimental effects of genetic modification, but opponents pay no attention, or refer to studies done so poorly they had to be retracted. And don’t get me started on vaccination.

Perhaps we’re not speaking the right language?

This brings me to the second lesson: all too often, scientists expect facts to speak for themselves. It may work like that for us (or so we think, wrongly), but there is a big bad world out there full of emotion, cherry-picking, motivated reasoning, political cynicism, and other monsters that will tear that illusion to shreds. Whether we like it or not, facts are not enough if we want our research to have the impact it deserves. Scientists, especially natural scientists (but probably also economists), need to learn that explaining our research is not enough: we must also consider how our findings are used and interpreted in the wider political and societal debate, and we may need to involve different societal actors in the research at a much earlier stage, both to build trust and to understand the concerns and loyalties that determine people’s support for, or opposition to, a policy.

These two, for me, are the main lessons from this sorry saga.

All knowledge is hearsay

Why do we believe what we believe? One of the many complicating factors in marine policy is that different stakeholders hold different beliefs. Is pulse trawling a boon or a disaster for the marine environment? Do Marine Protected Areas enhance fish catches? Does human activity change the global climate? Finding consensus on how to allocate marine resources is difficult enough when all the people involved agree on the facts. Problems become a lot more difficult when facts are disputed, or worse, when beliefs run parallel to divisions between political, religious, or other segments of society.

I recently started reading some of the psychological literature on this issue and I came across an article by Dan Kahan of Yale Law School. It addresses the phenomenon that for some policy topics, such as climate change, nuclear power, or genetic modification, acceptance of the scientific consensus correlates strongly with one’s political preferences. In a nutshell, conservatives are more likely to be skeptical of climate science, and to accept the scientific consensus that transgenic crops are safe; progressives, on the other hand, tend to accept the scientific consensus on climate change but are often suspicious of genetic modification. Kahan’s article explains and tests three hypotheses for this phenomenon:

  1. Bounded Rationality. A common theory in psychology posits that our brains have two systems to process information: a “fast” system that works through low-effort heuristics and associations (System 1); and a “slow” system that works through high-effort systematic reasoning (System 2). According to this hypothesis a member of the Dutch Party for the Animals is against biotechnology not because she has read all the literature and visited all the conferences, but because she applies a heuristic that accepts a belief if people like her believe it.
  2. Ideological Asymmetry. Empirical studies have shown that conservatives tend to be averse to complexity and ambiguity: they prefer their beliefs and principles to be simple, clear, and unshakable. According to this hypothesis these preferences make them more closed-minded and less willing to accept facts that contradict their current beliefs.
  3. Expressive Utility. The naive economic view is that values come before affiliations: people hold particular values and beliefs, and then vote for parties and politicians that conform to those values and beliefs. A recent publication by political scientists Christopher Achen and Larry Bartels, however, turns this idea on its head: people choose which politician they like, or what party they prefer to associate with, and adapt their values accordingly. According to the Expressive Utility hypothesis, people express group membership through their beliefs: in a sense, they are willing to ignore facts in order to avoid becoming a heretic to their group.

Kahan goes on to test the three hypotheses and he finds that people who use more systematic reasoning are more, not less, likely to align their beliefs with their political affiliation. This makes it unlikely that they use a heuristic (System 1); rather, they invest considerable effort to justify their rejection of scientific facts. This supports the Expressive Utility hypothesis: conservatives are skeptical of climate science because accepting it would make them a bad conservative. I find it a plausible conclusion, particularly considering the findings of Achen and Bartels, and it resonates with other findings that more highly educated people are more skeptical of vaccination.

I’m not buying the Ideological Asymmetry hypothesis that conservatives are inherently dogmatic. I’ve encountered similar closed-mindedness among progressives, especially the radical Left, so I suspect it is more a feature of political extremism than of any particular end of the political spectrum.

Neither am I convinced by the Bounded Rationality hypothesis, although I do think there could still be a grain of truth in it. As far as I understand the dual-thinking model (i.e. as an economist with a half-read copy of Thinking, Fast and Slow on his bedside table), System 1 is a fast, associative heuristic: it makes people give the wrong intuitive answer to the Bat and Ball Problem when they don’t have the time or the energy to use the kind of cognitive effort typical of System 2. But climate skeptics or vaccinophobes are not being asked to give their response on the spot. They have Googled the issue, they have read blogs and articles, they have discussed it with friends, relatives, and colleagues. This is not a System 1 activity. But there could still be a mechanism at work that is related to Bounded Rationality – just not in the way suggested by dual-thinking theory.

All knowledge is hearsay

I’m sure there is a theory on this in social psychology or a related field, so I’d be happy to hear about it if anyone who reads this knows more. I suspect that what is going on is that very little of what we know (or think we know, or hold to be true) comes from genuine first-hand experience or logical reasoning. The vast majority of our beliefs depend crucially on information that we got from other people, for example through newspaper articles, scientific publications, radio programmes, and coffee-table chit-chat. Every such piece of information involves a messenger, and whether you accept the information depends on whether you trust the messenger. Moreover, we don’t have time to check all our beliefs, so any time we receive new information we need to decide whether to trust the messenger, to dismiss the message right away, or to verify it with other information – which, again, has to come from other messengers, with varying degrees of trust. There will always be a point where we stop verifying and go with the information we have – in other words, where we trust the messengers of the information we have not been able to verify. So it is not a black-or-white question of using either the heuristics of System 1 or the systematic reasoning of System 2, but rather of finding the mix of believing and verifying that makes efficient use of our cognitive effort.

What makes us trust a given messenger? There used to be messengers with a particular authority, either through formal credentials such as college degrees or through informal ones such as reputations. The views of a professor carried more weight than those of a layman because we assumed that the professor had earned his degree by studying hard, gathering a lot of knowledge, and being really smart; newspapers like The Economist or The Washington Post had solid reputations for honest and well-researched reporting. Both forms of authority have eroded of late, due partly to the increased role of social media and online news outlets, but also to a growing anti-intellectualism.

And this erosion of the authority of traditional messengers also points towards another mechanism: we tend to trust people who are like ourselves. A conservative is more likely to reject the consensus on climate change because the Wall Street Journal does so too, and he is more likely to trust the Wall Street Journal than the New York Times. Likewise, an environmentalist is more likely to believe The Ecologist than The Economist on the merits of free trade and biotechnology.

And of course I’m no exception. On all the issues I mentioned above (pulse trawling, MPAs, climate change, vaccination, and biotechnology) I tend to adopt the scientific consensus. Is that because scientists can objectively be trusted to do their work right, or just because I work in science myself, and am therefore more willing to trust them?