Why do old ways of thinking and doing persist even after it becomes obvious they are totally wrong-headed, or at least suboptimal? Here are a few possibilities I’ve already explored:

Satisficing: a mindset that is satisfied with a good-enough result, rather than the optimal solution.

Bureaucratic inertia: the tendency of bureaucratic organizations to perpetuate the established procedures.

Path dependency: when it’s easier or less costly to continue along an already set path than to create an entirely new one that promises even greater returns.

Lock-in effect: when early adoption of a technology, policy, or practice locks in specific pathways that are difficult and costly to escape.

Switching costs: switching to new ways of doing things involves time, money, effort, uncertainty, risk, disruption, feelings of incompetence, and changing roles/relationships. Better to stick to the tried and true.

Habit persistence: habits tend to be sticky because they achieve good-enough results with relative ease. You know what you’re doing, you know what to expect, and outcomes are okay. Change takes us out of our comfort zone.

Sunk costs: when people and institutions continue something just because they have already invested unrecoverable resources. The tyranny of sunk costs keeps us throwing good money after bad, whether that bad is a failed policy or a toxic relationship.

Special interest groups: people and groups with an interest in preserving the status quo resist innovation as a threat to their power and hard-won gains.

This post will explore how ideological commitment undermines self-correction. First, a definition of ideology:

“By ideology, we mean, roughly, a mental model of the world and the social order that is both descriptive (how the world is) and normative (how it should be)” - Cory Clark and Bo Winegard (2020), Tribalism in War and Peace: The Nature and Evolution of Ideological Epistemology and Its Significance for Modern Social Science.

Commitment to an ideology imbues it with a “sacred value”, or “roughly, a value that is held particularly fervidly and that one is incredibly reluctant to relinquish” (Clark and Winegard, 2020). The ideologically committed are therefore akin to True Believers; they’re all in.

I like Clark and Winegard’s take on ideological epistemology, but it leaves out something essential to many ideologies: a mental model of how to get from the world as it is to the world as it should be. This is a multilayered mental model of dynamic processes, encompassing an understanding of change and transformation from the meta to the particular: from the exploitative nature of capitalism to five-year plans, from God’s beneficence to the practice of prayer.

I’m thinking it’s hardest to self-correct at the meta level; pull out a few of those bricks and the whole edifice becomes wobbly. Go soft on capitalism and suffer an identity crisis. Think five-year plans are poorly implemented? Bring it up at a committee meeting.

Then again, a lot of True Believers seem to resist even minor tweaks to their ideas and plans for making the world a better place, even when there’s a strong case that these ideas and plans aren’t working as intended. For example, some trans activists continue to maintain that gender-affirming care for adolescents is an unmitigated good, despite evidence to the contrary. What’s up with that?

In other words, what keeps the ideologically committed from changing their minds? Some possibilities: confidence that dogged persistence will eventually prove the doubters wrong; fear of losing one’s ideological bearings; and the risk of being ostracized by fellow believers or mocked by adversaries.

Reference:

Cory J. Clark & Bo M. Winegard (2020) Tribalism in War and Peace: The Nature and Evolution of Ideological Epistemology and Its Significance for Modern Social Science, Psychological Inquiry, 31:1, 1-22, DOI: 10.1080/1047840X.2020.1721233