Perception Biases That Mess With Your Money Decisions
You know that feeling when you buy a stock right after seeing a scary headline, and then two weeks later you wonder what you were thinking? That’s a perception bias doing its work on your brain.
Chapter 10 of Burton and Shah’s book is about four perception biases that mess with how you see financial decisions. Not how you weigh options. Not how you calculate risk. But how you perceive the problem itself. You’re solving the wrong puzzle, and you don’t even know it.
Let’s walk through all four.
Saliency: Whatever Happened Last Feels Most Important
Saliency is simple. If something happened recently, you overweight its probability. If it hasn’t happened in a while, you treat it like it will never happen.
Nobody buys flood insurance unless there was a flood last month. Nobody buys airplane crash insurance except at the airport, right before boarding. And when the economy has been growing for years, nobody thinks about recessions. Crashes become unthinkable.
Here’s the thing. This works in both directions. After a crash, people act like crashes are everywhere. After a boom, people forget crashes exist.
The 2008 financial crisis is the textbook example. During the housing boom, banks, investors, and regulators all acted like a major collapse was impossible. Not unlikely. Impossible. Researchers Gennaioli, Shleifer, and Vishny built a formal model around this. They found that a model where people simply ignore unlikely-but-possible bad events fits the crisis data better than models where people rationally account for small probabilities.
The best model for what happened wasn’t “people underestimated the risk.” It was “people pretended the risk didn’t exist.”
And then after the crisis? Credit standards went through the roof. Banks wouldn’t lend to anyone. Saliency flipped the switch from one extreme to the other.
For your portfolio, this means: when everything feels safe and you can’t imagine losing money, that’s exactly when you’re most vulnerable. And when everything feels terrifying and you want to sell everything, the danger is probably not as bad as your brain is telling you.
Framing: Same Facts, Different Decision
This one will mess with your head. Framing means that how a question is worded changes your answer, even when the actual content is identical.
Kahneman and Tversky ran a famous experiment. They told people: an unusual disease will kill 600 people. You pick a response program.
Gain frame: Program A saves 200 people for sure. Program B has a 1/3 chance of saving all 600 and a 2/3 chance of saving nobody.
Most people picked A. Save 200 for sure. That’s the safe choice.
Loss frame: Program C means 400 people die for sure. Program D has a 1/3 chance nobody dies and a 2/3 chance all 600 die.
Most people picked D. Take the gamble.
But here’s the punch line. Program A and Program C are the exact same thing. Program B and Program D are the exact same thing. “Save 200 out of 600” is the same as “400 die out of 600.” The math is identical.
So why do people flip their choice? Because when you frame it as saving lives, people play it safe. When you frame it as people dying, people gamble. The reference point changes everything.
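The equivalence is easy to verify with a few lines of arithmetic. A quick sketch, using the counts and probabilities from the experiment above (the variable names are mine):

```python
# Kahneman & Tversky's disease problem: 600 lives at stake.
TOTAL = 600

# Gain frame: expected number of people saved.
program_a = 200                          # 200 saved for sure
program_b = (1/3) * 600 + (2/3) * 0      # gamble on saving all or none

# Loss frame: restated as deaths, converted back to lives saved.
program_c = TOTAL - 400                                   # "400 die for sure"
program_d = (1/3) * (TOTAL - 0) + (2/3) * (TOTAL - 600)   # "nobody dies" or "all die"

print(program_a, program_c)  # identical: 200 and 200
print(program_b, program_d)  # identical: 200.0 and 200.0
```

Outcome for outcome, A is C and B is D; only the wording differs.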
This shows up in investing all the time. A fund manager says “this fund returned 8% last year.” That sounds good. But reframe it: “this fund underperformed the market by 4%.” Suddenly you want to switch funds. Same fact. Different frame. Different decision.
Kahneman and Tversky proved this with money lotteries too. They showed people two identical sets of lottery choices, just split up differently. 73% of people picked the dominated option because of how the choices were framed. Sophisticated respondents fell for it just as hard. Even when people answered both versions within minutes of each other, they still contradicted themselves.
The lesson? When someone presents you with an investment decision, ask yourself: what would this look like framed differently? If a stock dropped from $100 to $60, is that “40% loss” or “buying at a 40% discount”? The frame you choose will push you toward different decisions.
Anchoring: The First Number Wins
Anchoring happens when you have to guess something you don’t know much about, and some random number you heard earlier pulls your guess toward it.
The classic experiment: guess how many jellybeans are in a jar. Right before guessing, one group hears “there are 1,000 stars in the sky.” Another group hears “there are 10,000 stars in the sky.” The star count has absolutely nothing to do with jellybeans. But the group that heard 10,000 consistently guesses higher.
Different rooms with the same jar of jellybeans produce wildly different average estimates, pulled in the direction of whatever anchor they heard.
Same thing with historical facts. Ask people when Galileo lived. First mention Columbus in 1492, and people guess Galileo lived closer to the 1400s. First mention the Magna Carta in 1215, and they place him closer to the 1200s. (He was born in 1564.)
Now think about investing. You’re looking at a stock. Someone mentions it traded at $300 last year. Even if everything about the company has changed, that $300 number is now stuck in your head. It becomes your anchor. You might think $200 is “cheap” because you’re comparing to $300, when the company might actually be worth $120.
Burton and Shah mention something interesting, though. Anchoring isn’t always irrational. There’s a classic math problem called the Secretary Problem where the optimal strategy is to observe an initial batch of options without committing, use the best of that batch as a reference point, and then take the first later option that beats it. Our brains might be wired to anchor because sometimes it’s a smart shortcut.
But here’s the problem. We anchor on random noise too. And when someone deliberately gives you an anchor, like a seller naming a high price first, your “rational” counter-offer will still be pulled toward their number. The anchor works even when you know it’s there.
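The Secretary Problem’s “rational anchoring” can be checked with a simulation. The standard result (not spelled out in the chapter) is that you should observe roughly the first n/e of the candidates, anchor on the best one seen, then take the first candidate who beats that anchor; this wins the single best candidate about 37% of the time. A minimal sketch, with my own function names:

```python
import random

def secretary_trial(n, k):
    """One trial: skip the first k candidates, then take the first
    candidate better than all of them. Return True if that pick
    turns out to be the best candidate overall."""
    ranks = list(range(n))          # higher rank = better candidate
    random.shuffle(ranks)
    best_seen = max(ranks[:k])      # the anchor built from the first k
    for r in ranks[k:]:
        if r > best_seen:
            return r == n - 1       # did we land on the true best?
    return ranks[-1] == n - 1       # never beaten: stuck with the last one

def success_rate(n=100, trials=20000):
    k = round(n / 2.718281828)      # observe roughly n/e candidates
    wins = sum(secretary_trial(n, k) for _ in range(trials))
    return wins / trials

print(success_rate())               # roughly 0.37
```

Randomly guessing one of 100 candidates would win only 1% of the time, so the anchor-then-choose rule is a huge improvement. The trouble the chapter describes is that our brains apply the same mechanism to anchors that carry no information at all.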
Sunk-Cost Bias: You Already Paid, So Now You’re Stuck
You bought a $200 ticket to see your favorite singer on Thursday. Thursday comes, and you no longer want to go. You try to sell the ticket, even give it away. No takers. Do you go?
Most people say yes. You paid $200! You can’t just waste it!
But an economist would say: the $200 is gone. It’s a sunk cost. It doesn’t matter anymore. The only question is: will you enjoy the concert tonight? If the answer is no, stay home. Going to a concert you don’t enjoy doesn’t un-spend the $200.
And here’s a related twist. Say you lose the $200 ticket on the way to the concert. Would you buy another ticket for $200? Most people say no. But if instead of losing the ticket, you lost a $200 bill on the way there, more people would still buy the ticket. Same financial loss. But people mentally file the lost ticket under “concert expenses” and the lost cash under “general losses.” So the ticket loss makes the concert feel like it costs $400, while the cash loss doesn’t.
This matters for investing because sunk-cost thinking is behind some of the worst portfolio decisions. You bought a stock at $50. It drops to $30. You don’t want to sell because selling “locks in the loss.” But the stock doesn’t know what you paid for it. The only question that matters is: given what you know now, is this stock worth more or less than $30?
Burton and Shah point out another version of this: investors who sell during a crash and then refuse to buy back in when prices are higher. They sold at $40, stocks recovered to $60, and now they’re waiting for prices to drop back to $40 so they don’t have to admit their timing was wrong. They’d rather miss the recovery than feel regret. Sometimes they wait so long that the regret fades with time, and then they buy back in at $100.
Why This Matters
These four biases are all about perception. You’re not bad at math. You’re not stupid. Your brain is just wired to see problems in ways that don’t always match reality.
Saliency makes recent events feel more probable than they are. Framing makes identical choices feel different. Anchoring makes random numbers affect your estimates. And sunk costs make past spending affect future decisions.
The common thread? Emotions. Regret, fear, the desire to avoid loss. Traditional economics assumes these don’t matter. But they clearly do.
And as Burton and Shah point out, none of this is good news for the efficient market hypothesis. If investors are systematically seeing problems wrong, prices can’t perfectly reflect true value.
Knowing about these biases won’t make you immune. Even Kahneman himself admitted he still falls for them. But at least you can pause and ask: am I reacting to reality here, or to how reality was presented to me?
That question alone is worth a lot of money.