Behavioral Biases Part 1 - Heuristics and Judgment Traps

Chapter 2 of “Behavioral Finance for Private Banking” is where the book gets really practical. This is where Hens, De Giorgi, and Bachmann lay out the specific mental traps that mess up our investment decisions. And there are a lot of them.

The authors break it down into stages. First, you select information. Then you process it. Then you act on it. And at every stage, your brain takes shortcuts that feel smart but lead you in the wrong direction.

This post covers the first two stages: how we pick information and how we process it. There’s so much material here that I’m splitting Chapter 2 into two parts. This is Part 1.

How We Select Information (Badly)

Before you even start analyzing anything, your brain is already filtering what you see. And that filter is not objective.

Attention Bias: The Invisible Gorilla Problem

There’s a famous experiment where people watch a video of basketball players passing a ball. The task is to count passes by the team in white shirts. While people focus on counting, a person in a gorilla suit walks slowly through the scene, stops in the middle, waves, and walks away.

Most people don’t see the gorilla. They are so focused on counting passes that something completely obvious becomes invisible.

Here’s the thing. This happens to investors all the time. When all the news is about one story, say the US debt ceiling debate in summer 2011, nobody is watching the other stuff. The “gorilla” in that case was a coming US recession. Everyone was watching the political drama. Stock markets crashed right after the debt ceiling was resolved, and people were surprised. But the recession signals were there. They just weren’t looking.

The fix is boring but effective: always check a standard list of factors (economy, politics, valuations, market sentiment) regardless of what’s trending in the news.

Selective Perception: You See What You Expect

In another experiment, people were shown playing cards, some of which were doctored, like a black three of hearts. Most people didn't notice: they saw a normal card because that's what they expected to see. On average, it took people four times longer to recognize a doctored card than a normal one.

For investors, this is dangerous. Before the 2007-2008 crash, traditional indicators like P/E ratios showed no warning signs. Investors who relied on those tools from previous experience missed that the real danger was in the housing market, not stock valuations. Experience made them blind.

The antidote: ask yourself, “What am I expecting to see here? Why do others disagree with me?”

Confirmation Bias: Searching for “Yes”

This one is probably the most common bias. People naturally search for information that confirms what they already believe. In a classic experiment by Wason, participants had to discover the rule behind triples of numbers. The actual rule was simply "any three numbers in increasing order." Most people guessed something more specific, like "numbers increasing by 2," and only tested triples that fit their guess. They never tried examples that could prove their guess wrong.

Same thing with investing. If you own a stock, you’ll read the same news article differently than someone who doesn’t. Studies show it is harder to see bad news about a company when you hold shares of that company. You’re not looking for truth. You’re looking for confirmation.

The solution: talk to people who disagree with you. And actually listen. Ask yourself, “If I had no position right now, would I buy this stock today based on current evidence?”

Availability Bias: Dramatic Equals Likely

Our brains think that things we can easily remember are more common. Most Americans believe car accidents and murders kill more people than diabetes or stomach cancer. The opposite is true. But car crashes and murders are on the news. Diabetes is not.

For investors, this means attention-grabbing stocks get bought more. A study by Barber and Odean showed that individual investors tend to buy stocks that are in the news, have unusual trading volume, or had extreme price moves. Not because those stocks are good investments. Just because they caught their attention. And attention-driven buying does not generate better returns.

How We Process Information (Also Badly)

OK, so you’ve selected your information. Now your brain needs to make sense of it. Here’s where things get even more interesting.

Representativeness Bias: Pattern Matching Gone Wrong

People judge probability by how much something "looks like" what they expect. The authors give a great example. Say a fund manager beats the market in two out of every three years. Which of the following track records is most likely?

  • (a) Beat, Fail, Beat, Beat, Beat
  • (b) Fail, Beat, Fail, Beat, Beat, Beat
  • (c) Fail, Beat, Beat, Beat, Beat, Beat

Most people pick (b) because four wins out of six matches the two-thirds success rate. But here’s the problem: sequence (b) is actually sequence (a) with an extra “Fail” at the beginning. Adding a condition always makes something less likely, not more. This is called the conjunction fallacy.
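The arithmetic behind the conjunction fallacy is easy to check. Here is a quick sketch, assuming independent years and a 2/3 annual chance of beating the market (the rate the example implies):

```python
from math import prod

P_BEAT = 2 / 3  # assumed annual probability of beating the market

def seq_prob(seq, p=P_BEAT):
    """Probability of an exact Beat/Fail sequence, assuming independent years."""
    return prod(p if outcome == "B" else 1 - p for outcome in seq)

a = seq_prob("BFBBB")    # (a) Beat, Fail, Beat, Beat, Beat
b = seq_prob("FBFBBB")   # (b) = sequence (a) with an extra Fail prepended
print(a, b, b < a)       # (b) is strictly less likely than (a)
```

Sequence (b)'s probability is exactly one third of (a)'s, because it is (a) multiplied by the probability of one extra "Fail." Adding a condition can only shrink the probability.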

This bias also makes investors overreact to short streaks. A stock goes up three months in a row and people think “this is a winner.” Research by De Bondt and Thaler showed that past losers actually outperform past winners. People overreact in both directions.

There’s also the base rate fallacy. When people get a personality description that sounds like an engineer, they ignore the actual proportion of engineers versus lawyers in the group. The stereotype overrides the statistics.
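The base rate fallacy is Bayes' rule being ignored. A minimal sketch of the correct calculation, with illustrative numbers (the 0.9/0.1 likelihoods and the group compositions below are assumptions for the example, not figures from the book):

```python
def posterior_engineer(p_desc_given_eng, p_desc_given_law, base_rate_eng):
    """Bayes' rule: P(engineer | description fits the engineer stereotype)."""
    num = p_desc_given_eng * base_rate_eng
    den = num + p_desc_given_law * (1 - base_rate_eng)
    return num / den

# Same stereotypical description, different base rates:
print(posterior_engineer(0.9, 0.1, 0.70))  # engineer-heavy group: ~0.95
print(posterior_engineer(0.9, 0.1, 0.30))  # lawyer-heavy group:   ~0.79
```

The description is identical, yet the correct answer moves substantially with the base rate. Experiment participants give roughly the same answer in both cases, which is exactly the fallacy.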

Conservatism: Ignoring New Evidence

This is the opposite problem. Sometimes people stick too much to their original beliefs and update too slowly when new information arrives.

The classic example is the Monty Hall problem. You pick one of three doors, behind one of which is a prize. The host, who knows where the prize is, opens one of the other doors to reveal a goat. Should you switch to the remaining door? Most people say no, thinking it's 50/50. But switching actually gives you a two-thirds chance of winning. People underweight the new information the host just gave them.
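A few lines of simulation confirm the two-thirds figure. This is a sketch assuming the standard setup, where the host always opens a goat door you didn't pick:

```python
import random

def monty_hall(switch, trials=50_000, rng=random.Random(7)):
    """Estimate the win rate of always switching (or always staying)."""
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # The host knows where the car is and opens a goat door you didn't pick.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += pick == car
    return wins / trials

print(monty_hall(switch=True))   # ~0.667
print(monty_hall(switch=False))  # ~0.333
```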

In finance, this shows up as post-earnings-announcement drift. A company reports surprisingly good earnings. The stock should jump immediately to its new fair value. But it doesn’t. It drifts up slowly over weeks because investors adjust too slowly to the new information. Even professional analysts do this.

Gambler’s Fallacy: The Roulette Trap

Ever wondered why casinos show the last numbers drawn on the roulette display? Because people use that information even though it’s completely useless. If red came up five times in a row, people bet on black. But the wheel has no memory. Each spin is independent.

When researchers ask people to write down fake “random” coin flip sequences, people put in way too many alternations. Real randomness has longer streaks than people expect.
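The claim about streaks is easy to verify with a simulation (an illustrative sketch; the 100-flip sequence length is an arbitrary choice):

```python
import random

def longest_run(flips):
    """Length of the longest streak of identical outcomes."""
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

rng = random.Random(42)
trials = [longest_run([rng.random() < 0.5 for _ in range(100)])
          for _ in range(10_000)]
avg = sum(trials) / len(trials)
print(avg)  # typically close to 7 for 100 fair flips
```

A genuinely random 100-flip sequence usually contains a streak of around seven identical outcomes, far longer than what most hand-written "random" sequences contain.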

In investing, this leads to premature bets on trend reversals. A stock hits an all-time high and people sell because “it must come back down.” This is one reason for the disposition effect, where investors sell winners too early and hold losers too long.

Hot-Hand Bias: Seeing Skill in Luck

This is the flip side of gambler’s fallacy. People see streaks in human performance and assume skill, even when it’s random.

The authors give a powerful example. Imagine 1,000 fund managers with zero skill, just flipping coins. After one year, 500 will have “beaten the market.” After ten years, one will have a perfect ten-year track record. With 10,000 managers, ten will have that record. Purely by chance. Yet investors will pay those managers premium fees, convinced they found a genius.
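The coin-flipping-managers arithmetic is a one-liner: with a 50% chance of beating the market each year, the expected number of perfect records is just the population size times 0.5 raised to the number of years.

```python
def expected_perfect_records(n_managers, years, p_beat=0.5):
    """Expected number of zero-skill managers with an unbroken winning streak."""
    return n_managers * p_beat ** years

print(expected_perfect_records(1_000, 1))    # 500 "beat the market" after one year
print(expected_perfect_records(1_000, 10))   # ~1 perfect ten-year record
print(expected_perfect_records(10_000, 10))  # ~10 such records
```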

Research confirms this. Private investors judging fund performance mostly look at returns and ignore how many competing funds exist. They don’t realize that in a large population of funds, some will look amazing just by luck.

Anchoring: The Number That Sticks

In a famous experiment, people spun a wheel of fortune and then estimated the percentage of African countries in the United Nations. If the wheel landed on 65, the median estimate was 45%. If it landed on 10, the median estimate was 25%. A completely random number changed their answer.

Even experts fall for this. Real estate agents given different listing prices for the same house gave different appraisals. And only 1 out of 10 agents even mentioned the listing price as a factor in their judgment. They didn’t know they were anchored.

For investors, the current price is a powerful anchor. The purchase price is an anchor. Analyst estimates are anchors. One useful counter-strategy: before making a decision, list reasons why the opposite outcome might happen.

Framing: Same Problem, Different Answer

How you present a choice changes what people pick, even when the outcomes are mathematically identical.

Here's the book's example. Choose between (A) a guaranteed $2,400 gain, or (B) a 25% chance of gaining $10,000 (otherwise nothing). Then separately, choose between (C) a guaranteed $7,500 loss, or (D) a 75% chance of losing $10,000 (otherwise nothing). Most people pick A and D. But if you do the math, combining B and C leaves you exactly $100 better off than A and D in every scenario. The gain/loss framing tricks your brain.
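The dominance claim is easy to verify by enumerating the combined outcomes (a sketch using the book's numbers):

```python
from itertools import product

# Decision 1: A = sure +$2,400;  B = 25% chance of +$10,000 (else nothing)
# Decision 2: C = sure -$7,500;  D = 75% chance of -$10,000 (else nothing)
A = [(1.00, 2400)]
B = [(0.25, 10000), (0.75, 0)]
C = [(1.00, -7500)]
D = [(0.75, -10000), (0.25, 0)]

def combine(x, y):
    """All (probability, total payoff) outcomes of playing two lotteries together."""
    return sorted((px * py, vx + vy) for (px, vx), (py, vy) in product(x, y))

print(combine(A, D))  # popular choice:  25% -> +2400, 75% -> -7600
print(combine(B, C))  # dominant choice: 25% -> +2500, 75% -> -7500
```

Branch by branch, B+C pays $100 more than A+D, yet most people choose the dominated combination when the two decisions are framed separately.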

This is connected to prospect theory. People are risk-averse with gains (“give me the sure thing”) but risk-seeking with losses (“let me gamble to avoid the loss”). So the same decision gets different answers depending on whether you frame it as winning or losing.

Another framing trap: when employees pick pension plans, they tend to split money equally across available options. If there are more stock funds than bond funds in the menu, they end up holding more stocks. The menu design frames their allocation.

Overconfidence: The Most Dangerous Bias

More than 50% of drivers think they’re above average. That’s mathematically impossible for a symmetric distribution. This is the “better than average” effect, and it applies to investing too.

There’s also miscalibration. When people are asked to give a range they’re 90% confident contains the right answer, the true answer falls outside their range way more than 10% of the time. Private investors systematically underestimate future market volatility. In one study, 72% of investor volatility forecasts were lower than what the market actually expected.

But here’s the most interesting finding from this section. In a trading experiment, students were given different amounts of inside information. The best-informed students made the most money. That makes sense. But the worst performers were not the least-informed. They were the ones with medium information. Why? Because the medium-informed students tried to exploit the least-informed ones. But the least-informed students knew they had no edge, so they didn’t trade actively. The medium-informed ones, overconfident in their partial knowledge, got exploited by the best-informed traders.

The lesson is sharp: if you know you have no information advantage, index investing is fine. The worst position is thinking you know more than you do.

Studies by Barber and Odean confirm this at scale. Investors who trade the most make the lowest returns.

The Big Picture

So here’s what happened in this first half of Chapter 2. The authors showed that our brains are basically full of shortcuts that worked fine for survival but are terrible for investing.

When selecting information, we focus on what’s dramatic, confirm what we already believe, and miss what we don’t expect. When processing information, we see patterns in randomness, anchor to irrelevant numbers, get fooled by how questions are framed, and think we’re smarter than we are.

The good news: most of these biases have known countermeasures. Statistical thinking, considering opposite viewpoints, using checklists, and being honest about what you don’t know.

Part 2 will cover the emotional biases and the strategies that help reduce all these errors.


Previous: What Is Behavioral Finance? An Introduction

Next: Behavioral Biases Part 2 - Emotions and Debiasing
