The Flip-Flop Problem

In political and public life, there's a particular insult reserved for those who change their positions: the flip-flop. It implies shallowness, opportunism, spinelessness. A person who changes their mind is suspected of having had no real convictions in the first place.

This is, I think, one of the more corrosive ideas in our public discourse. And it doesn't stay in politics. It seeps into everyday intellectual life — into the way we argue with friends, the way we present ourselves online, the way we relate to our own past views.

We have come to treat consistency as a virtue in itself, divorced from whether the consistent position is actually correct.

What Consistency Actually Signals

Consistency means holding the same view over time. That's all. It says nothing about whether the view is well-founded, whether new information has arrived, or whether the original reasoning was sound. A person can be consistently wrong. A person can consistently defend a position they privately doubt, because changing their mind feels like too high a social cost.

Consistency can be a sign of integrity — if it reflects a genuinely held, well-reasoned position that hasn't been undermined by evidence. But consistency can equally be a sign of stubbornness, ego, or tribalism dressed up as principle.

The thing we should actually admire is not consistency. It's calibration — the willingness to hold beliefs in proportion to the evidence, and to update them when the evidence changes.

Why Changing Your Mind Is Hard

If updating your beliefs is intellectually virtuous, why is it so hard in practice? A few reasons:

  • Identity investment. We attach our sense of self to our beliefs. Changing a belief can feel like losing part of who you are.
  • Social cost. In groups, position changes can signal disloyalty. The tribe notices, and not always kindly.
  • Sunk cost. The longer you've held a position — especially one you've argued for publicly — the more psychologically expensive it becomes to abandon it.
  • The appearance of weakness. We conflate certainty with competence. Admitting you were wrong can feel like undermining your credibility, even when it actually demonstrates good judgment.

The Intellectual Courage in "I Was Wrong"

Consider what it actually takes to say "I've changed my mind." You have to acknowledge that your previous view was mistaken or incomplete. You have to risk the social penalty. You have to be willing to look, at least briefly, uncertain or inconsistent in others' eyes.

That is not weakness. That is intellectual honesty at some personal cost. That is exactly what careful thinking looks like in practice.

The people I trust most in any domain are not the ones who have never changed their minds. They're the ones who change their minds for good reasons — and can tell you what those reasons are.

A Standard Worth Adopting

I try to ask myself, with some regularity: what would it take to change my mind about this? If I can't answer that question — if there's no conceivable evidence or argument that would move me — then I'm not holding a belief. I'm holding a loyalty. And loyalties and beliefs are different things, even if they sometimes wear the same clothes.

Changing your mind is how knowledge actually grows. It's how people actually mature. And it might be, right now, one of the more subversive and necessary things any of us can model.