Our world isn’t shaped by reality, but by those utterly convinced they are seeing it clearly. It’s the most dangerous bias and the invisible fuel of institutional failure.


Kirti Tarang Pande is a psychologist, researcher, and brand strategist specialising in the intersection of mental health, societal resilience, and organisational behaviour.
February 3, 2026 at 10:21 AM IST
Have you ever noticed it? In every conflict, someone is certain they are simply “being reasonable.” That person is usually why the conflict escalates. And in our minds, that person is always ‘them’, and never ‘me’.
What was your last serious disagreement? Weren’t you the rational one, the objective one? “I’m just responding to the facts,” you assured yourself. Meanwhile, the other person? Oh! They were biased, emotional and irrational. That very feeling of clear-eyed neutrality, the pride in your own reasoning, that is the psychological trap that escalates arguments, corrupts organizations, and divides nations. It is the reason institutions so often believe the problem lies entirely on the other side.
This isn’t a character flaw. It’s a universal psychological wiring issue. Cognitive psychologists call it the bias blind spot: the tendency to see our own judgments as neutral reality, while viewing everyone else’s as distorted. Smart, experienced, high-status people fall for it daily—managers, judges, editors, ministers, investors. And because these individuals form our institutions, what starts as a minor disagreement scales into entrenched, self-reinforcing escalation.
Do you know why we all have this bias blind spot? It’s because of naive realism: we literally believe that we perceive the world as it is. So, by definition, anyone who disagrees must be seeing it wrong. And that’s how our mind tricks us: slyly, sneakily and effectively.
And that’s why that arrogant senior has confirmation bias, the junior on the team has self-interest bias, and our spouse has hindsight bias. But we? No. We are the poster child of reasonableness.
That is why “just being rational” often blocks solutions and makes institutions brittle.
Think of a room you’ve been in. A conference room. Or a courtroom. A newsroom. A ministry office. Or an investment committee. Someone says (politely, carefully)—“I think this policy is hurting people.” Or, “I think we might be missing something.”
And without anyone saying it out loud, the room responds internally: They’re emotional. They’re ideological. They just don’t understand the facts. The tone remains professional, procedural and civil, yet the conflict hardens. And we come out of the room wondering:
Why don’t we argue like adults?
When we are in a disagreement, why don’t we put our points firmly, evidence first, but with enough humility to treat counter-points as tools for refining our own perspective? Why don’t we look at our thoughts and feelings as data instead of facts, data which, like any data, should be open to scrutiny? And then we wonder why systems fail.
Well, now we know why: the design of human cognition guarantees blind spots. Our certainty becomes policy; our “objectivity” becomes escalation. Those who pride themselves on reason often ensure nothing gets resolved.
Step into any institution, market, or government body, and the pattern is obvious. A regulator insists, “We only follow facts,” yet quietly chooses which facts to highlight, which voices to hear, and which problems to measure. A ministry says, “We are just implementing the law,” while citizens cry, “This is hurting us.” Both sides are convinced of their neutrality. Both see the other as biased. Escalation ensues, not from malice but from our cognitive programming.
In investment committees, the same story repeats. Every investor believes they are sober, rational, and dispassionate. The “crowd,” meanwhile, is panicked, euphoric, manipulated. Yet the blind spot lies in the committee’s own confidence. What feels like common sense, an objective judgment, can be the most dangerous risk of all.
Even in newsrooms or courts, where process and rules are worshipped, the bias blind spot persists. Editors believe they are holding power accountable. Readers believe they are asking honest questions. Judges believe they are strictly procedural. But each interprets challenges as ideology rather than insight. Simple disputes become moralized wars. Every concession feels like surrender, every compromise like appeasement.
The ‘Me’ blind spot becomes a ‘we’ problem, spreading escalation across policy, markets, and geopolitics. Which is why, when two states each interpret their own actions as defensive and the other’s as aggressive, escalation feels like prudence on both sides. Each side can point to its own internal narrative—“we are responding,” “we are stabilizing,” “we are restoring order”—and experience that narrative as neutral description. Meanwhile, the opponent’s narrative is heard as propaganda, which is often true and still incomplete, because propaganda is easiest to spot when it isn’t yours.
Yet this is a solvable problem. We can prevent bias before it takes root by controlling what information enters a decision.
One powerful method is to remain “blind” to irrelevant affiliations by deliberately restricting access to biasing information. Focus on the issue, not the person. Instead of “You are inconsiderate,” say, “I felt overlooked when plans changed without notice.”
Since we live in an age of filters, we can use one for our thinking. Apply the “What if the opposite is true?” filter: ask yourself, “What if the opposite of what I believe is true?” This forces your brain to search for evidence it previously ignored.
The bias blind spot strengthens when we’re triggered. So, when you feel defensiveness rise, impose a mandatory six-second pause before speaking. That moment of “blindness” to impulse lets your analytical brain catch up.
Another tool is pre-commitment: agreeing to criteria before the heat of judgment. Many institutional failures are not failures of intelligence, but of consistency under threat.
This approach generalises: in hiring, procurement, grants, or editorial decisions, design systems that prevent temptation rather than rely on virtue.
Education also matters, of course, not as a one-off seminar that flatters, but as a shared language that makes “I might be missing something” a respected sentence.
However, the deepest intervention is conversational. Conflict spirals are maintained in language. To interrupt them, try a move that feels countercultural in high-status rooms: name your own possible bias before naming theirs. Not as theater, but as a signal that your confidence leaves room for doubt. Often, this alone changes the emotional geometry of a room, making it safe for others to do the same.
It all comes down to a simple diagnostic:
Where do we treat our own judgments as reality and others’ as ideology?
Where do we confuse process with neutrality and expertise with immunity?
Where do we reward certainty and punish nuance?
Asking these questions is what makes institutions truly resilient.
De-escalation doesn’t demand surrendering standards, truth, or rigor. It demands discipline: treat our certainty as data, not proof. It demands an understanding that systems fail because our cognitive design treats certainty as virtue. Recognizing this, leaders, and the institutions they shape, can finally argue like adults: evidence-first, firm, and with just enough humility to prevent escalation from becoming strategy.