Background
People pick and choose their science, and often they do it in ways that seem rationally inconsistent. One lens through which we can view this is Bryan Caplan's idea of rational irrationality, along with Borland and Pulsinelli's concept of social harassment costs.
According to Caplan:
"...people have preferences over beliefs. Letting emotions or ideology corrupt our thinking is an easy way to satisfy such preferences...Worldviews are more a mental security blanket than a serious effort to understand the world."
This means that:
"Beliefs that are irrational from the standpoint of truth-seeking are rational from the standpoint of utility maximization."
In a previous post (see here and here), I visualized social harassment costs that vary depending on one's peer group. If these costs exceed a certain threshold (k), consumers may express preferences that seem irrational from a scientific standpoint. For example, depending on peer group, a consumer might embrace the scientific evidence on climate change but, facing strong social harassment, reject the views of the broader medical and scientific community on the safety of genetically engineered foods.
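To make that threshold concrete, here is a minimal sketch of the decision rule in Caplan's terms (the notation U, h, and k below is mine, not Borland and Pulsinelli's). Let U(s) be the utility of holding the scientifically supported belief s, U(g) the utility of the group-approved belief g, and h the social harassment cost peers impose for dissenting. The consumer sticks with s only while

$$U(s) - h \;\ge\; U(g) \quad\Longleftrightarrow\quad h \;\le\; U(s) - U(g) \;\equiv\; k.$$

Once peer pressure pushes h above the threshold k, rejecting the scientific consensus becomes the utility-maximizing choice: irrational from the standpoint of truth-seeking, rational from the standpoint of utility maximization, exactly as Caplan describes.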
Identity Protective Cognition
In Misconceptions, Misinformation, and the Logic of Identity-Protective Cognition (Kahan, 2017), the concept of identity-protective cognition adds more to the picture. Below are key aspects of identity-protective cognition:
- What people accept as factual information is shaped primarily by their values and identity
- Identity is a function of group membership, i.e. it's tribal in nature
- If people choose to hold beliefs that are different from what the 'tribe' believes, then they risk being ostracized (i.e. they face social harassment costs)
- As a result, individual thinking and thought patterns evolve to express group membership, and what is held to be factual information is really an expression of 'loyalty to a particular identity-defining affinity group.'
Kahan also discusses some important implications of this sort of epistemic tribalism. Additional education and more accurate information aren't necessarily effective tactics for addressing misinformation and disinformation. In fact, research by Kahan and others has shown that they can actually make the problem worse:
'those highest in science comprehension use their superior scientific-reasoning proficiencies to conform policy-relevant evidence to the position that predominates in their cultural group....persons using this mode of reasoning are not trying to form an accurate understanding of the facts in support of a decision...with the benefit of the best available evidence....Instead they are using their reasoning to cultivate an affective stance that expresses their identity and their solidarity with others who share their commitments.'
In this way, identity protective cognition creates a sort of spurious relationship between what may be perceived as facts and the beliefs we adopt or choices we make. It gives us the impression that our beliefs are being driven by facts when the primary driver may actually be cultural identity.
This sort of tribalism can result in a tragedy of the science communication commons, similar to the tragedy of the commons in economics: what seems rational from an individual standpoint (adopting the beliefs of the group to avoid punishment) is irrational from the standpoint of belief accuracy and has negative consequences for society at large. As a result:
'citizens of a pluralistic democratic society are less likely to converge on the best possible evidence on threats to their collective welfare.'
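The commons logic can be made concrete with a stylized payoff (again my notation, not Kahan's). Suppose each of n citizens chooses a belief b_i and receives a private term, driven by conformity and harassment, plus a 1/n share of collective welfare W, which depends on how many citizens hold accurate beliefs:

$$\pi_i \;=\; \underbrace{U(b_i) - h\,\mathbf{1}[\,b_i \ne g\,]}_{\text{private}} \;+\; \underbrace{\tfrac{1}{n}\,W(\text{share holding accurate beliefs})}_{\text{collective}}.$$

Any single citizen's belief moves the collective term by only about 1/n, so it effectively drops out of the individual calculus, and conforming dominates whenever h exceeds the threshold k from the sketch above. But when all n citizens reason this way, the share of accurate beliefs collapses and W falls for everyone: individually rational, collectively costly, which is the signature of a commons problem.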
Of course this has consequences for elections, the regulatory environment, and decisions by businesses and entrepreneurs about what products to market and where to invest capital and resources. Ultimately it impacts quality of life and our ability to thrive in a world of changing climate, bitter partisanship, and social unrest.
The Problem and the Solution
As discussed above, this sort of tribal epistemology is not easily corrected by providing accurate information or education; in fact, it drives people to seek out misinformation in support of their identity while ignoring what is factually correct. Kahan speaks broadly about the role that 'pollutants' or 'toxins' in the science communication environment play in promoting this tribal mentality. One form of social harassment cost that may be driving this is cancel culture or call-out culture. Cancel culture works like an immune system that scans the network of believers for non-conforming views and tags them to be attacked by others in the group. This drives even the brightest to seek out misinformation instead of avoiding it.
Another pollutant in the science communication environment is troll epistemology and related efforts to produce a 'firehose of falsehood' (see Paul and Matthews, 2016). Whether intentional or not, modern media technology provides the infrastructure to reproduce the effect of propaganda techniques pioneered in Russia: flooding the science communication environment with false or unsubstantiated claims that are high-volume and multichannel, rapid, continuous, and repetitive, and that carry no commitment to objective reality or logical consistency.
Kahan concludes:
'the most effective manner to combat the effect of misconceptions about science and outright misinformation is to protect the science communication environment from this distinctive toxin.'
Related Posts and References
Borland, Melvin V. and Robert W. Pulsinelli. Household Commodity Production and Social Harassment Costs. Southern Economic Journal, Vol. 56, No. 2 (Oct. 1989), pp. 291-301.
Frimer, J. A., Skitka, L. J., & Motyl, M. (2017). Liberals and conservatives are similarly motivated to avoid exposure to one another's opinions. Journal of Experimental Social Psychology, 72, 1–12. https://doi.org/10.1016/j.jesp.2017.04.003
Kahan, Dan M., Misconceptions, Misinformation, and the Logic of Identity-Protective Cognition (May 24, 2017). Cultural Cognition Project Working Paper Series No. 164, Yale Law School, Public Law Research Paper No. 605, Yale Law & Economics Research Paper No. 575, Available at SSRN: https://ssrn.com/abstract=2973067 or http://dx.doi.org/10.2139/ssrn.2973067
Paul, Christopher and Miriam Matthews, The Russian "Firehose of Falsehood" Propaganda Model: Why It Might Work and Options to Counter It. Santa Monica, CA: RAND Corporation, 2016. https://www.rand.org/pubs/perspectives/PE198.html.