Why Smart People Believe Wrong Things

A puzzle sits at the center of modern psychology: the people most capable of reasoning carefully are not noticeably better at holding accurate beliefs about contested questions. On topics like climate science, economic policy, or the safety of vaccines, higher cognitive ability and more education often correlate with stronger conviction — but not with greater accuracy. The well-informed are sometimes the most spectacularly wrong, and they are wrong with footnotes.

The naive theory of belief assumes that intelligence acts like a filter: smarter reasoners catch more errors, so their beliefs drift toward truth. The empirical picture is messier. Dan Kahan's work on what he calls identity-protective cognition found that numerate subjects given a politically loaded statistical problem were more likely than less-numerate subjects to get the answer wrong — but only when the correct answer threatened their political identity. On neutral problems, numeracy helped. On charged ones, it became a weapon.

The mechanism is motivated reasoning, the tendency to evaluate evidence according to whether its conclusion is welcome. Crucially, motivated reasoning is not a failure of effort. It is a deployment of effort in a particular direction. A skilled reasoner asked to assess a study whose conclusion she dislikes will scrutinize its methods, notice its sample-size limitations, flag its confounds, and emerge with a measured rejection. Asked to assess a study whose conclusion she likes, she will accept it more readily. Both assessments may be individually defensible. The asymmetry is the bias.

This points to a deeper structural feature: the asymmetry of skepticism. Skepticism is a finite resource, and we spend it where it stings least. Beliefs that align with our identity, our profession, our social world, or our prior public commitments carry hidden costs if abandoned, so the threshold of evidence required to abandon them rises silently. Beliefs that threaten nothing pass through cheaply. The result is not stupidity but a kind of intelligent stupidity — careful, articulate, well-cited error.

A related trap is the inverse of what has been called epistemic learned helplessness: excessive confidence in one's own reasoning. Someone who has successfully reasoned their way through difficult problems before — a physicist, a lawyer, a doctor — has good evidence that their reasoning is reliable. That track record becomes a license to trust their conclusions in domains where they have no comparable training. The Nobel laureate publishing on nutrition, the brilliant programmer with confident takes on epidemiology: each is extending a tool that worked in one shop to a job it was not built for. The very feedback that calibrated their reasoning in their home domain is absent in the new one, but the confidence persists.

What protects against this? Not more intelligence — that is the diagnosis, not the cure. The interventions that show some empirical traction are unglamorous. Being asked to argue the opposing position seriously, not as a strawman, can soften motivated reasoning. Tracking one's predictions over time — actually writing them down and checking — punctures the illusion that one's track record is better than it is. Cultivating relationships with people who disagree thoughtfully, and granting them genuine epistemic authority rather than treating them as puzzles to solve, raises the cost of comfortable error. None of these are about thinking harder. They are about exposing one's thinking to friction it would not generate on its own.
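
The prediction-tracking habit is concrete enough to sketch in code. What follows is a minimal illustration in Python, not a prescribed tool: the logged claims, confidences, and outcomes are invented for the example, and the Brier score it computes is a standard calibration measure, the mean squared gap between stated confidence and what actually happened (0.0 is perfect; 0.25 is no better than always answering fifty-fifty).

    # Minimal prediction log. Each entry: (claim, confidence it is true, outcome).
    # The entries below are hypothetical examples, not real forecasts.
    predictions = [
        ("Project ships by the end of Q3", 0.90, False),
        ("The paper is accepted at the first venue", 0.70, True),
        ("Interest rates fall this year", 0.60, False),
    ]

    def brier_score(log):
        """Mean squared error between stated confidence and outcome (1 or 0)."""
        return sum((conf - float(hit)) ** 2 for _, conf, hit in log) / len(log)

    # Lower is better: 0.0 is perfect calibration, 0.25 matches coin-flip guessing.
    print(f"Brier score over {len(predictions)} predictions: {brier_score(predictions):.3f}")

Even a few dozen honest entries are usually enough to show whether one's ninety-percent convictions come true anywhere near ninety percent of the time.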

There is a final, uncomfortable implication. If intelligence does not protect against wrong belief, then the feeling of having reasoned carefully is not strong evidence that one has reached the truth. The internal experience of a well-considered conclusion feels the same whether the conclusion is right or wrong; the feeling is generated by the process of reasoning, not by its accuracy. A sophisticated reasoner who notices this can begin to treat her own conviction with the wary respect she would extend to a stranger's — not dismissing it, but not trusting it as much as it asks to be trusted. This is a strange posture to hold toward one's own mind. It is also, on the evidence, the posture most likely to keep it honest.

Vocabulary

identity-protective cognition
The tendency to process information in ways that protect beliefs tied to one's group identity, even at the cost of accuracy.
motivated reasoning
The tendency to evaluate evidence according to whether its conclusion is desirable, applying stricter scrutiny to unwelcome claims than to welcome ones.
asymmetry of skepticism
The pattern of demanding more evidence to overturn beliefs whose loss would be costly than to accept beliefs that threaten nothing.
epistemic learned helplessness
Originally, the stance of refusing to update on clever arguments because one knows one cannot reliably evaluate them; here invoked in its inverse form, where past reasoning success leads to overconfidence in new domains.
numerate
Skilled at quantitative reasoning and the interpretation of numerical evidence.

Check your understanding

Question 1 of 5 (recall)

According to the passage, what did Dan Kahan's research find about numerate subjects facing politically loaded statistical problems?

Closing question

Think of a belief you hold confidently in a domain where you have no formal training. What would it take — concretely — for you to notice if you were wrong?
