Extending the Non-Probabilistic Model: Cycles and Crashes
The Epstein-Wilmott model from the previous two chapters gives us worst-case prices for interest rate products without assuming any probability distribution. But the basic version is, well, basic. It assumes rates move smoothly within bounds. Real interest rates jump. They follow cycles. They have a stochastic component that looks a lot like Brownian motion on short timescales. Chapter 70 adds bells and whistles to make the model more realistic while keeping its non-probabilistic spirit.
Fitting Forward Rates
The first extension tries to bridge the gap between the Epstein-Wilmott world and classical models. Start with a deterministic mean-reverting equation for rates, similar to Vasicek but without the random term. If markets price according to this model, you can back out the mean-reversion target from the forward rate curve.
Now here is the trick: add a “margin of error” to this deterministic equation. Instead of saying rates follow this exact path, say they follow this path plus or minus some bounded error. That puts us right back into the Epstein-Wilmott framework, but now with a much tighter set of constraints because we have anchored the model to the forward curve.
The margin of error can be estimated by fitting the deterministic model many times across historical forward rate data and seeing how much the fitted parameters wander. This combines the best of both worlds: the forward curve fitting of classical models with the worst-case robustness of the non-probabilistic approach.
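To make the refitting idea concrete, here is a minimal sketch (all names and the synthetic data are mine, not the book's): fit the deterministic mean-reverting equation dr/dt = a(b − r) by least squares over rolling windows and take the spread of the fitted targets as a crude margin of error.

```python
import numpy as np

def fit_mean_reversion(rates, dt):
    """Least-squares fit of the deterministic model dr/dt = a*(b - r).

    Discretized, (r[i+1] - r[i]) / dt = a*b - a*r[i] is a linear
    regression of the rate change on the rate level.
    """
    dr = np.diff(rates) / dt
    X = np.column_stack([np.ones(len(dr)), rates[:-1]])
    coef, *_ = np.linalg.lstsq(X, dr, rcond=None)
    a = -coef[1]
    b = coef[0] / a
    return a, b

# Synthetic stand-in for historical data: noisy reversion toward 5%
rng = np.random.default_rng(0)
dt = 1 / 252
r = [0.08]
for _ in range(2520):
    r.append(r[-1] + 2.0 * (0.05 - r[-1]) * dt + 0.002 * rng.standard_normal())
r = np.array(r)

# Refit over rolling one-year windows; the spread of the fitted
# targets is a crude estimate of the margin of error
targets = [fit_mean_reversion(r[i:i + 252], dt)[1]
           for i in range(0, len(r) - 252, 252)]
margin = max(targets) - min(targets)
```

The wandering of the fitted parameters across windows is the point: that wander is what the bounded error term has to cover.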
Economic Cycles
Interest rates follow economic cycles. Not perfectly, not predictably, but there is a pattern with a period of roughly five to ten years. Can we capture this?
A starting point is simple harmonic motion. The rate oscillates around some mean value with a fixed period. But we do not know the period precisely, and the mean value shifts over time. So we add uncertainty to both.
The result is a model with an extra state variable s (related to the velocity of rate changes) and uncertainty terms in both the rate and its acceleration. The worst-case equation now lives in three dimensions: r, s, and t. This is more expensive computationally but captures something important. Rates do not just wander randomly. They have momentum. An upward trend tends to continue before reversing. The cycle model captures this inertia.
The constraints on r and s define a region in r-s space where the system is allowed to be. This is more restrictive than just bounding r alone, because it limits not just where rates are but where they are heading.
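A minimal sketch of the cycle dynamics, in my own notation (the book's exact formulation differs): r evolves with velocity s, the acceleration pulls r back toward a mean level, and a bounded perturbation u plays the role of the uncertainty term.

```python
import numpy as np

# Sketch of cycle dynamics (my notation, not the book's):
#   dr/dt = s,   ds/dt = -w**2 * (r - rbar) + u,   |u| <= U
# Rates oscillate around rbar with nominal period 2*pi/w.

def step(r, s, u, w=2 * np.pi / 7.0, rbar=0.05, dt=1 / 252):
    """One Euler step of the cycle model under perturbation u."""
    r_new = r + s * dt
    s_new = s + (-w**2 * (r - rbar) + u) * dt
    return r_new, s_new

# An adversary pushing rates up as hard as the bound allows (u = +U):
# the result is a shifted oscillation, not unbounded drift -- the
# momentum of the cycle limits where rates can go.
r, s, U = 0.05, 0.0, 0.01
path = []
for _ in range(252 * 7):            # one nominal seven-year cycle
    r, s = step(r, s, U)
    path.append(r)
```

Even with the perturbation pinned at its worst value, the path stays in a band around the cycle, which is exactly the extra restrictiveness of the r-s region.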
Uncertainty Bands
This is probably the most practical extension in the chapter. The basic model assumes we know the short-term rate exactly. In reality, we observe rates with some error. Maybe we use a one-month rate as a proxy for the instantaneous rate, or maybe market microstructure introduces noise.
The fix is simple. Redefine r as an estimate that is always within some distance delta of the real short-term rate r’. The real rate is what drives discounting and cashflows, but r is what we observe and model.
In the worst case, we have to discount at whichever rate (r + delta or r - delta) is worse for us. If our portfolio has positive value, discounting at a higher rate hurts more. If negative, a lower rate is worse.
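The worst-case choice of discounting rate is just a sign check on the portfolio value. A sketch (the function name is mine):

```python
def worst_case_discount_rate(value, r_observed, delta):
    """Pick the discount rate from the band [r - delta, r + delta]
    that is worst for the holder: a higher rate erodes positive value
    faster, while a lower rate discounts a liability less, making a
    negative value more negative."""
    return r_observed + delta if value > 0 else r_observed - delta
```

Inside a finite-difference solver this choice is made pointwise, at every grid node and timestep, which is what makes the equation nonlinear.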
Wilmott illustrates the richness of this model with a beautiful thought experiment. He shows a possible path for r and r’, working from left to right through various regimes: steady increases, jumps, oscillations, quiet periods, volatile periods. The point is that within the uncertainty band, virtually any behavior is possible. Brownian motion, jumps, mean reversion, smooth trends. The band does not care what the specific dynamics are. It just bounds the gap between model and reality.
The example prices a four-year zero-coupon bond with different band widths. As expected, wider bands mean lower worst-case prices. But static hedging fights back: the hedged worst-case decreases more slowly than the unhedged one as delta increases. The hedge adapts by shifting weight toward longer-maturity instruments as the band widens.
How Wide Should the Band Be?
This is the key practical question. Wilmott uses daily one-month US interest rate data from 1986 to 1995 to find the minimum delta that makes the model consistent with observed rate movements. With the basic model parameters (4% per year growth bound, 3% to 20% range), the minimum band width is delta = 0.0063, about 63 basis points.
That sounds large. But it includes the worst jumps in the data, which mostly happen around year-end due to unusual supply and demand as people liquidate assets. These are not really structural rate movements. Once we model crashes separately (below), the minimum delta drops dramatically.
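One plausible way to back out the minimum band width — my reconstruction of the calibration logic, not necessarily the book's exact procedure — notes that an observed one-step move can be at most the true rate's maximum move plus twice delta:

```python
import numpy as np

def min_band_width(rates, growth_bound=0.04, dt=1 / 252):
    """Smallest delta consistent with an observed rate series.

    The observed rate sits within delta of the true rate, and the true
    rate moves at most growth_bound * dt per step, so an observed move
    can be at most 2*delta + growth_bound * dt in size.  Inverting:
    delta >= (|move| - growth_bound * dt) / 2 for every step.
    """
    moves = np.abs(np.diff(rates))
    return max(np.max(moves - growth_bound * dt), 0.0) / 2

# Synthetic example: a smooth drift needs no band at all, but a
# single 50bp jump forces delta up to roughly half the jump size
smooth = 0.05 + 0.0001 * np.arange(100)
jumpy = smooth.copy()
jumpy[60:] += 0.005
```

Because the minimum delta is set by the single largest unexplained move, excluding a handful of jump days from the calibration is enough to shrink it dramatically.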
Crash Modeling: Limited Number of Crashes
Interest rates do jump. Sometimes by a lot. The basic model cannot handle this because it bounds the rate of change. A jump is an infinite rate of change. So we add crashes explicitly.
The first approach limits the total number of crashes. If at most one crash is allowed, introduce two value functions: V0 (no more crashes possible) and V1 (one crash still possible). V0 follows the standard worst-case equation. V1 follows the same equation with one additional constraint: at any moment, if crashing from r to r-k would lower the portfolio value (that is, if V0 at r-k is below V1 at r), the worst case assumes the crash happens and we drop into the V0 world.
For N possible crashes, you need N+1 value functions linked by constraints. The value of the portfolio today is VN, accounting for all possible crashes.
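The cascade constraint can be sketched on a rate grid (names and the toy value profile are mine): after each timestep of the worst-case equation, cap V_n by V_{n-1} evaluated at the post-crash rate.

```python
import numpy as np

def apply_crash_constraint(V_n, V_prev, r_grid, k=0.01):
    """Worst-case crash constraint linking the cascade (my notation):
    if crashing from r to r - k would lower the value, the adversary
    crashes, so V_n(r) can be no larger than V_{n-1}(r - k).
    Applied after each timestep of the worst-case equation."""
    # np.interp clamps below the grid edge -- a shortcut standing in
    # for the lower rate bound
    V_crashed = np.interp(r_grid - k, r_grid, V_prev)
    return np.minimum(V_n, V_crashed)

# Toy profile where value *increases* with r (think of a short bond
# position), so a downward crash always hurts and the cap binds
r_grid = np.linspace(0.03, 0.20, 171)
V0 = r_grid.copy()
V1 = apply_crash_constraint(r_grid.copy(), V0, r_grid)
```

Iterating this cap through the chain V1, V2, ..., VN is what makes each additional allowed crash lower the unhedged worst case.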
The example allows up to five crashes of maximum 1% magnitude on a four-year zero-coupon bond. Without hedging, the worst case drops significantly with each additional crash. But with optimal hedging, the drop is much smaller because the hedge absorbs most of the crash risk. The catch is that you need more of the hedging instruments as the number of allowed crashes increases.
At high interest rates, crashes have less impact because the rate stays within its bounds either way: if it is already near the 20% upper bound, a 1% crash just brings it to 19%. The worst case converges regardless of crash count.
Crash Modeling: Limited Frequency
The second approach limits how often crashes can happen rather than how many in total. The rate can crash, but only after a specified waiting period omega has elapsed since the previous crash. This introduces a new state variable tau, the time since the last crash.
You need two functions: V0 (crash not yet allowed, tracking tau) and V1 (crash allowed). When tau reaches omega, we transition from V0 to V1. The constraint on V1 is the same: crash happens only if it hurts.
You can combine both approaches. Allow one large crash (say 1%) plus unlimited small jumps (0.1%) that can happen at most once a month. The example shows a four-year bond with this setup. Unhedged worst case: 0.558. Hedged worst case: 0.728. The hedge has done its job.
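The tau/omega bookkeeping reduces to a spacing rule on crash times. A sketch (names are mine; time is measured in days, and it assumes a position that every crash hurts, so the adversary crashes as early and as often as allowed):

```python
def adversarial_crash_times(times, omega):
    """Earliest-possible crash schedule under the frequency limit:
    consecutive crashes must be at least omega apart (tau, the time
    since the last crash, must reach omega before the next one)."""
    crashes, last = [], None
    for t in times:
        if last is None or t - last >= omega:
            crashes.append(t)
            last = t
    return crashes

# Daily grid over four years, small jumps allowed at most monthly
crash_days = adversarial_crash_times(range(4 * 360), 30)
```

On a daily grid over four years with a 30-day waiting period, the worst case packs in one small jump per month, which is precisely the scenario the V0/V1 transition encodes.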
The Payoff: A Much Smaller Band
When we include crashes in the model and then redo the data analysis, the largest historical jumps are explained by the crash mechanism rather than the uncertainty band. The eight biggest one-month rate changes in the 1986-1995 data range from 0.94% to 1.50%, and they cluster around November-December, consistent with year-end effects.
Excluding these points as crashes, the minimum uncertainty band drops dramatically. This is a big deal. A smaller delta means tighter worst-case/best-case spreads, which means more useful prices.
The lesson: model the big moves explicitly as crashes, and the remaining “normal” rate behavior requires a much narrower uncertainty band. The overall model becomes both more realistic and more useful.
Liquidity Effects
The final extension considers what happens when hedging instruments are not perfectly liquid. In reality, bonds have bid-offer spreads, and illiquid bonds have wider spreads.
The model handles this naturally. Just make the cost of the hedging instrument depend on whether you are buying or selling. If you are long (buying), you pay the offer. If short (selling), you receive the bid.
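In code this is a single conditional (the function name is mine): the sign of the hedge position picks which side of the quote you trade on.

```python
def hedge_cost(quantity, bid, offer):
    """Cash cost of a static hedge position with a bid-offer spread:
    buying (quantity > 0) lifts the offer, selling hits the bid."""
    return quantity * (offer if quantity > 0 else bid)
```

As the spread on one instrument widens, the optimization behind the static hedge naturally shifts weight toward cheaper neighbors — the substitution effect seen in the five-year bond example.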
The example makes the five-year bond increasingly illiquid while keeping other bonds liquid. As the five-year spread widens, the yield envelope bulges out near the five-year maturity. The hedge adapts by using less of the five-year bond and more of the four-year and six-year bonds. The worst-case price decreases, but gradually, because the neighboring instruments partially compensate.
This is exactly what you would expect in practice. If one instrument becomes expensive or illiquid, you substitute nearby alternatives. The model formalizes this intuition and quantifies the cost.
Where This Leaves Us
Chapter 70 shows that the non-probabilistic framework is flexible enough to handle realistic features of interest rate behavior. Forward rate fitting, economic cycles, observation uncertainty, jumps, and liquidity all fit within the same basic structure.
Some extensions widen the best-worst spread (crashes, wider bands). Some narrow it (forward rate fitting, tighter bands after crash modeling). The art is in choosing the right combination of features and parameters to get spreads that are tight enough to be useful while still being honest about uncertainty.
Wilmott notes there is still plenty of work to be done. The model is a framework, not a finished product. But the philosophy is sound: assume less, bound more, and let the math find the worst case.