Why Smart People Do Dumb Things
There are two versions of you inside your head: one that reacts fast but gets things wrong, and one that thinks clearly but can’t be bothered to show up.
I always assumed smart people don’t do dumb things.
The stronger your logic, the better your judgment should be, right? Then I read Daniel Kahneman’s Thinking, Fast and Slow, and realized: being smart and being rational are not the same thing at all.
Here’s a quick puzzle. A bat and a ball cost $1.10 together. The bat costs $1.00 more than the ball. How much does the ball cost?
If you blurted out “ten cents,” congratulations—you just made the same mistake as over half the students at Harvard and MIT. The correct answer is five cents. The puzzle isn’t hard. What’s hard is that our brains don’t want to bother verifying an answer that already feels right.
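The verification System 2 skips is two lines of algebra. A minimal sketch (the variable names are mine, not from the book):

```python
# Let ball be the price of the ball, in dollars.
# Then the bat costs ball + 1.00, and together: ball + (ball + 1.00) = 1.10
# Solving: 2 * ball = 0.10, so ball = 0.05.
total = 1.10
bat_minus_ball = 1.00
ball = (total - bat_minus_ball) / 2
bat = ball + bat_minus_ball
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05

# The intuitive answer fails the same check:
# 0.10 + 1.10 = 1.20, not 1.10.
```

Checking the gut answer takes one addition: ten cents plus a dollar-ten is a dollar-twenty. The mistake isn’t that the check is hard; it’s that nobody runs it.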
This happens every day. You see “was $899, now just $299” on a shopping site and your first thought is “what a deal”—not “it was never worth $899.” You see a headline that says “Shocking! 90% of people don’t know this” and your finger has already tapped before your brain starts wondering if you’re being clickbaited.
Kahneman divides the mind into two systems. System 1 is fast, automatic, intuitive—a relentless association machine that never stops running. System 2 is slow, effortful, logical—but inherently lazy, skipping out whenever it can. Put simply, there are two versions of you inside your head: one that reacts fast but gets things wrong, and one that thinks clearly but can’t be bothered to show up.
And that’s the whole problem. Smart people have a more powerful System 2—so what? A top-of-the-line computer that never gets turned on is no different from a beat-up old laptop.
You think you’re thinking. You’re really just agreeing.
People who’ve studied probability still can’t get probability right
You might think: okay, ordinary people fall for this, but surely people with rigorous training don’t?
Kahneman and Tversky ran a famous experiment. They gave subjects this description:
Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
Then they asked: Is it more likely that Linda is “a bank teller,” or “a bank teller who is active in the feminist movement”?
89% of undergraduates chose the latter.
But this violates the most basic rule of logic. “Bank teller and feminist” is a subset of “bank teller”—a conjunction can never be more probable than either of its components alone. It’s like asking: Is Zhang San more likely to be from Beijing, or to be from Beijing and living in Haidian District? The answer is obvious.
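The rule itself fits in one inequality: P(A and B) ≤ P(A), always. A tiny sketch with made-up numbers (the probabilities are illustrative assumptions, not measured data):

```python
# Illustrative probabilities, invented for the sketch:
p_teller = 0.05                 # P(Linda is a bank teller)
p_feminist_given_teller = 0.6   # generous: most tellers fitting the description are feminists

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_both = p_teller * p_feminist_given_teller

# The conjunction can never beat its own component:
assert p_both <= p_teller
print(f"P(teller) = {p_teller:.2f}, P(teller and feminist) = {p_both:.2f}")
```

However vivid the description, the second event is carved out of the first, so multiplying by any probability of the extra condition can only shrink it.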
Kahneman gave the same questionnaire to PhD students in Stanford’s decision science program—people who had all taken advanced probability theory and decision theory. 85% made the exact same mistake.
One colleague knew the right answer, but said: “There’s a little man in my head, jumping up and down, yelling at me: ‘She can’t just be a bank teller—look at that description!’”
That little man jumping up and down? That’s System 1.
Think about how this plays out in real life. A friend tells you, “My coworker Xiao Wang—graduated from a top university, worked at a big tech company for three years, just quit.” Then asks: Is he more likely to be “looking for a job,” or “looking for a job while also studying for civil service exams”? You probably want to pick the second one too. But logically, the second is a subset of the first—its probability can only be smaller, never larger.
We’re not being fooled by the world. We’re being fooled by our own intuition.
Knowing the answer doesn’t help
If the Linda problem proves that “smart people still make mistakes,” the next story is even more striking: even when you know you’re wrong, you still won’t change.
Kahneman got hold of performance data for 25 wealth advisors at a major investment firm, covering eight consecutive years. His reasoning: if these people have genuine, stable investment skill, their yearly performance rankings should show some consistency—the correlation should be positive.
The result was 0.01.
Zero. No different from rolling dice.
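To see why “no different from rolling dice” is the right reading, here is a simulation under the assumption that performance is pure luck. This is not Kahneman’s data; it just generates random scores for 25 advisors over 8 years and averages the year-to-year correlations, the same statistic he computed:

```python
import random
import statistics

random.seed(0)
years, advisors = 8, 25

# Each advisor's "performance" each year is pure luck: independent random scores.
scores = [[random.gauss(0, 1) for _ in range(advisors)] for _ in range(years)]

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    norm = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return cov / norm

# Average correlation over all 28 pairs of years.
pairs = [(i, j) for i in range(years) for j in range(i + 1, years)]
avg_r = statistics.mean(pearson(scores[i], scores[j]) for i, j in pairs)
print(f"average year-to-year correlation: {avg_r:.2f}")
```

Run it and the average lands near zero, which is exactly what skill-free ranking data looks like. Genuine skill would show up as a clearly positive number, year after year.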
At the dinner that evening, Kahneman shared his findings with the firm’s senior executives. They nodded politely with blank expressions, then went back to their drinks. The next morning, a director drove him to the airport and said with unshakable confidence: “I’ve done very well at this firm. No one can deny that.”
The entire company continued operating exactly as before, as if nothing had happened.
This kind of thing is everywhere. Look at mutual fund rankings—last year’s star manager is this year’s bottom performer, but someone always thinks “the one I picked is different this time.” Same with friends who trade stocks. When a pick goes up, they have great instincts. When it drops, it’s the market’s fault. Three years later they’re down 30%, but they still believe they “understand stocks.”
Kahneman calls this the “illusion of validity.” As long as someone does something repeatedly and occasionally succeeds—even if that success is pure luck—they’ll be convinced they have real skill. The illusion works like the Müller-Lyer optical illusion: you know the two lines are the same length, but your eyes refuse to agree.
Knowing the answer doesn’t mean you can act on it. Even when you know, you still don’t change. That’s the part that really stings.
Have you ever told yourself before finals, “Two weeks left—plenty of time”? Then two weeks turned into an all-nighter. Have you ever looked at a project and thought, “This isn’t complicated, three days should do it”? Then three days turned into three months. This is the planning fallacy: you only look at what’s in front of you, instinctively ignoring a statistical fact—you said the exact same thing last time.
Even when you’ve heard the statistics with your own ears, System 1 still pounds its chest and says: we’re different.
By now you might be wondering: if being smart doesn’t help, and knowing doesn’t help, then what can you actually do?
At the end of the book, Kahneman admits that decades of studying cognitive biases haven’t made his own intuition any better. He still falls prey to overconfidence. He still commits the planning fallacy. He’s just learned to slow down at certain moments and ask himself: am I standing in a minefield right now?
Psychologist Keith Stanovich put it precisely: high IQ does not equal high rationality. A person can simultaneously have the sharpest mind and the laziest thinking. People who accept their gut answers without question aren’t stupid; they’re lazy. More precisely, it isn’t they who are lazy. It’s their System 2.
Of course, you can’t engage System 2 for everything—you’d spend half an hour agonizing over what to have for breakfast. But at least for the big decisions—changing jobs, making investments, choosing a major, estimating how long a plan will take—try these three steps:
- Pause. Don’t rush to a conclusion.
- Look up how similar things turned out for other people.
- Pretend you’re giving advice to a friend. What would you say?
The third one is remarkably effective. When giving someone else advice, your System 2 is actually willing to clock in. You’re always more rational about your friends’ decisions than your own.
So the next time you feel absolutely certain about a decision, it’s worth pausing to ask yourself: is this certainty the product of careful deliberation, or is it just that little man in your head, jumping up and down, making the call for you?