What Blackjack and Gambling Teach Us About Investing
Chapter 17 starts with a confession that always gets Wilmott in trouble with bank training managers. He wants to call his lecture “Investment Lessons from Blackjack and Gambling.” They want him to change the title because regulators might frown on it. Wilmott thinks this is silly. Investment and gambling share the same mathematical roots. And most professional gamblers he knows understand risk and money management better than most risk managers at banks.
This chapter connects gambling math to investing in a way that is both fun and genuinely useful.
The Rules of Blackjack (Quick Version)
You sit at a table with a dealer. You get two cards, dealer gets two (one face up, one face down). Face cards count as 10, aces as 1 or 11, everything else at face value. Goal: get closer to 21 than the dealer without going over (busting).
You can hit (take another card), stand (stop), double down (double your bet but take only one more card), or split pairs. The dealer has no choices. The dealer must hit on 16 or below and stand on 17 or above.
The house edge comes from one asymmetry: if you bust, you lose immediately, even if the dealer would have busted too. This simple rule gives the casino a 5-6% advantage over a player with no strategy.
Ed Thorp and Beat the Dealer
In 1962, MIT math professor Ed Thorp published “Beat the Dealer,” a book that genuinely changed Las Vegas. He identified three keys to winning at blackjack:
1. Optimal strategy. Know when to hit, stand, double down, or split based on your cards and the dealer’s face-up card. Computer simulations determined the mathematically optimal play for every possible situation. Following this strategy alone cuts the house edge down to roughly even.
2. Card counting. Unlike roulette, blackjack has memory. Cards already dealt change the odds for remaining hands. A deck rich in 10s and aces favors the player (more naturals, more dealer busts). A deck rich in low cards (2 through 6) favors the house.
The simplest counting system: start at zero, subtract 1 for every ace or 10-count card dealt, add 1 for every 2 through 6. When the count (adjusted for remaining cards) is high, the deck favors you.
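The count described above can be sketched in a few lines of Python. This is a minimal illustration of the rule as stated (subtract 1 for aces and 10-count cards, add 1 for 2 through 6), not a full card-counting implementation; the function names are my own.

```python
def count_value(card):
    """Hi-Lo-style value for one card; card is a rank: 2-10, 'J', 'Q', 'K', or 'A'."""
    if card in (10, 'J', 'Q', 'K', 'A'):
        return -1          # high cards leaving the deck hurt the player
    if card in (2, 3, 4, 5, 6):
        return +1          # low cards leaving the deck help the player
    return 0               # 7, 8, 9 are neutral

def running_count(cards_seen):
    return sum(count_value(c) for c in cards_seen)

def true_count(cards_seen, decks_remaining):
    """Adjust the count for cards remaining, as the text suggests."""
    return running_count(cards_seen) / decks_remaining

print(running_count([5, 'K', 2, 9, 'A']))  # 5 and 2 add, K and A subtract, 9 is neutral → 0
```

When the true count is high, more high cards remain in the shoe and the player raises the bet.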
3. Money management. Knowing when the deck is favorable is only half the battle. You also need to know how much to bet. Bet big when the odds favor you, bet small when they do not. This is where the Kelly criterion comes in.
The Kelly Criterion
This is the star of the chapter, and it applies far beyond blackjack. The Kelly criterion tells you the optimal fraction of your bankroll to bet in order to maximize your long-term growth rate.
Say you start with $1,000 and bet fraction f on each hand. After M hands, your wealth is:
W = 1000 times the product over i = 1 to M of (1 + f times phi_i)
where phi_i is the outcome of each hand (win +1, lose -1, etc.).
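The wealth formula above is easy to simulate. The sketch below uses assumed numbers (a 51% win probability over 1,000 even-money hands); only the product formula itself comes from the text.

```python
import random

def simulate_wealth(f, phi_outcomes, w0=1000.0):
    """Wealth after betting fraction f of the bankroll on each hand,
    where each phi is +1 for a win and -1 for a loss."""
    w = w0
    for phi in phi_outcomes:
        w *= (1 + f * phi)
    return w

# Assumed illustration: a slight edge (p = 0.51) over 1,000 hands.
random.seed(0)
outcomes = [1 if random.random() < 0.51 else -1 for _ in range(1000)]
for f in (0.02, 0.2, 1.0):
    print(f, simulate_wealth(f, outcomes))
```

Note what happens at f = 1: the first losing hand multiplies wealth by zero, and no later win can recover it. That is the "one bad hand wipes you out" point in concrete form.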
If f is too large (say f = 1, betting everything), you will eventually go broke, even with positive expected returns. One bad hand wipes you out. If f is too small, your money grows painfully slowly.
The expected long-term growth rate works out to approximately:
g = f times mu minus (1/2) times f squared times sigma squared
where mu is the expected return per hand and sigma is its standard deviation.
To maximize g, take the derivative and set it to zero. The optimal fraction is:
f* = mu / sigma squared
This is the Kelly criterion. It tells you the exact fraction of your wealth to risk on each bet.
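As a quick sanity check on the formula, here is a sketch with an assumed even-money bet won with probability p = 0.51. For win/lose outcomes of +1/-1, mu = 2p - 1 and sigma squared = 1 - mu squared, and the formula lands very close to the classic even-money Kelly bet of 2p - 1.

```python
def kelly_fraction(mu, sigma):
    """Optimal bet fraction f* = mu / sigma^2 from the text."""
    return mu / sigma**2

# Assumed example: even-money bet, win probability p = 0.51.
p = 0.51
mu = 2 * p - 1             # expected return per hand: 0.02
var = 1 - mu**2            # variance of a +/-1 outcome
print(kelly_fraction(mu, var**0.5))  # approximately 0.02, i.e. bet about 2% of bankroll
```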
Why Half-Kelly Is Popular
The growth rate curve peaks at f* and then drops. At twice the Kelly fraction, your expected growth rate is zero. Beyond that, you expect to lose money despite having positive expected returns. That is a scary zone.
Since in real life you never know mu and sigma exactly, there is a real risk of accidentally being in the “crazy zone” above the optimal fraction. For this reason, many practitioners use “half Kelly,” betting half the Kelly amount. This cuts your volatility in half, keeps you safely away from the danger zone, and only reduces your expected growth rate by 25%.
Think about that tradeoff. Half the risk for only 25% less growth. That is a pretty good deal.
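Both claims, zero growth at twice Kelly and a 25% growth sacrifice at half Kelly, follow directly from the growth-rate formula. A sketch with assumed parameters:

```python
def growth_rate(f, mu, sigma):
    """Approximate long-term growth rate g = f*mu - (1/2)*f^2*sigma^2 from the text."""
    return f * mu - 0.5 * f**2 * sigma**2

mu, sigma = 0.02, 1.0            # assumed illustrative parameters
f_star = mu / sigma**2           # the Kelly fraction

print(growth_rate(f_star, mu, sigma))       # full Kelly: the maximum growth rate
print(growth_rate(f_star / 2, mu, sigma))   # half Kelly: 75% of the full-Kelly rate
print(growth_rate(2 * f_star, mu, sigma))   # twice Kelly: zero growth
```

Plugging in the algebra: g(f*) = mu^2 / (2 sigma^2), g(f*/2) = (3/4) g(f*), and g(2f*) = 0, which is exactly the half-Kelly tradeoff described above.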
Roulette: Can You Beat Physics?
Before his blackjack work, Ed Thorp tackled roulette. In 1955, as a physics grad student at UCLA, he wondered if Newtonian mechanics could predict where the ball would land.
He teamed up with Claude Shannon (yes, the father of information theory). They bought a professional roulette wheel, calibrated a mathematical model, and found they could predict a single number with a standard deviation of 10 pockets. This gave them a 44% edge on single-number bets.
Shannon and Thorp built the world’s first wearable computer: a cigarette-pack-sized device with 12 transistors, operated by toe switches, that communicated predictions via an earpiece. Shannon contributed a clever insight: since humans take time proportional to log(n) to choose among n options, betting on octants (groups of 8 numbers) was better than individual numbers.
They tested it in Las Vegas in 1961. It worked, despite hardware problems (broken wires, earpieces falling out). Nevada eventually outlawed such devices in 1985.
Thorp decided the stock market was a bigger casino with nicer working conditions, and took his probability skills there. When Paul Newman asked him how much he could make at blackjack, Thorp said $300,000 a year. “Why aren’t you out there doing it?” Because he could make more on Wall Street, with better colleagues.
Horse Racing and No Arbitrage
The concept of risk-neutral pricing shows up in sports betting too, and Wilmott draws the parallel beautifully.
In horse racing, the bookmaker sets odds not based on which horse is most likely to win, but based on how people bet. The odds are adjusted so that the bookie profits regardless of the outcome. This is exactly analogous to risk-neutral pricing in derivatives: the “probability” used for pricing is not the real probability, but one constructed to ensure no arbitrage.
Consider a soccer match between England and Germany. In England, patriotic bettors wager heavily on England, making the English odds look favorable. In Germany, the opposite happens. In principle, you could bet on Germany in England and on England in Germany and potentially arbitrage. In practice, bookies lay off bets against each other internationally, eliminating such opportunities.
The math: if there are N horses and W_i is wagered on horse i, the bookie sets odds q_i such that:
W_i times (1 + q_i) is less than or equal to total wagers
for every horse i. The odds reflect betting patterns, not real probabilities.
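One way to see the "prices, not probabilities" point numerically: each quoted odds q_i implies a "probability" of 1 / (1 + q_i), and in a real book these implied probabilities sum to more than 1. The excess (the overround) is the bookie's margin. The odds below are hypothetical, not from the text.

```python
# Hypothetical odds q_i (profit per unit staked) for a three-horse race.
odds = [1.0, 2.0, 4.0]

# Implied "probabilities" come from the quoted prices, not from real win
# chances -- the same distinction as risk-neutral probabilities in derivatives.
implied = [1 / (1 + q) for q in odds]
overround = sum(implied)   # > 1 whenever the bookie has built in a margin

print(implied, overround)
```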
How to Bet on Horses (Three Strategies)
Wilmott gives a worked example with a horse race and shows three different betting strategies:
Maximize expected return: Put everything on the horse with the best expected payoff. Result: 40% expected return, but 280% standard deviation. Huge upside, huge downside.
Minimize standard deviation: Find the bet allocation that eliminates all uncertainty. Result: zero standard deviation, but a guaranteed loss of 62%. This is basically what the bookie does (in reverse). Zero risk, guaranteed outcome, but you are on the wrong side.
Maximize return divided by standard deviation (a Sharpe-like ratio): Balance return against risk. Result: 31% expected return with 164% standard deviation. This compromise is natural because, by the Central Limit Theorem, the outcome of many repeated bets is approximately normal, so mean and standard deviation are the right two quantities to trade off.
The parallel to portfolio theory is exact. You face the same choices with stocks: go all-in on the highest-return asset, minimize volatility, or find some risk-adjusted optimum. The math is the same whether you are at the racetrack or on a trading desk.
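The zero-standard-deviation strategy above is a classic Dutch book: stake on each horse in proportion to 1 / (1 + q_i), so that every outcome pays exactly the same amount. A sketch with hypothetical odds (these are not the numbers from Wilmott's worked example):

```python
# Hypothetical odds q_i (profit per unit staked).
odds = [1.0, 2.0, 4.0]

# Stake on horse i in proportion to 1/(1+q_i), normalized to a 1-unit bankroll.
weights = [1 / (1 + q) for q in odds]
total = sum(weights)
stakes = [w / total for w in weights]

# Whichever horse wins, the payout is stakes[i] * (1 + odds[i]) = 1/total.
payouts = [stakes[i] * (1 + odds[i]) for i in range(len(odds))]
print(payouts)   # identical payout for every outcome, and below 1 (a guaranteed loss)
```

With these odds the guaranteed payout is about 0.97 per unit staked: zero risk, certain small loss, because the implied probabilities sum to more than 1. That is the bookie's margin seen from the bettor's side.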
The Connection to Investing
Wilmott is blunt at the end: “The mathematics of gambling is almost identical to the mathematics of investing. The main difference between gambling and investing is that the parameters are usually easier to measure with gambling games.”
At a blackjack table, you can count the cards and compute exact odds. In the stock market, you never know the true expected return or volatility. Everything is estimated, with error.
The Kelly criterion works for both. If you are a technical trader who follows signals (golden crosses, head-and-shoulders patterns), the Kelly framework tells you not just whether a signal is profitable, but how much to bet on it. And crucially, it accounts for how often the signal appears. A rare signal with high return may be less valuable than a frequent signal with moderate return.
Key Takeaways
- Blackjack teaches the three pillars of investing: strategy (what to buy), information (understanding the odds), and money management (how much to risk).
- The Kelly criterion maximizes long-term growth by betting f* = mu / sigma squared of your bankroll. Half-Kelly is a safer practical choice.
- Betting too aggressively is worse than betting too conservatively. Beyond twice Kelly, you expect to lose money even with positive edge.
- Bookmaker odds are set by betting patterns, not real probabilities, exactly like risk-neutral pricing in derivatives.
- The same math works for blackjack, horse racing, and stock market investing. The parameters are just harder to estimate in markets.
- If you cannot handle the math and emotions of gambling, Wilmott suggests you probably should not be working in a bank.