The Harbus team explores a simple approach to rational decision-making under conditions of uncertainty.
A CEO, a quant, and the decorated poker player Phil Ivey walk into a bar. The question at the forefront of all our minds ('what is the punchline to this terrible joke?') is sadly out of scope, but we will attempt to unpack the question that follows it: 'what do these people have in common?' The answer: all three necessarily excel at making decisions under conditions of uncertainty.
Life, in reality, is inherently ambiguous. None of us has access to the perfect information we often crave to drive our decision-making. The most effective among us navigate this ambiguity and make decisions that optimize their own expected payoffs. Sometimes these decisions are relatively intuitive: most of us understand that the rational response to a fair coin toss is indifference. Often, though, such decisions are complex puzzles, requiring a delicate balance of intuition, analysis, and judgment. So how can we think about making these decisions?
One potential solution lies in basic probability and expected values. Returning to the coin toss example, suppose you are offered the chance to flip a fair coin. If you correctly guess the outcome of the toss, you stand to gain $100. If you guess incorrectly, you will lose $100. Given you can expect to guess correctly 50 percent of the time, the expected payoff from this coin toss is $0 (50 percent * +$100 + 50 percent * –$100), and you should be indifferent between playing the game and not playing. While analyzing expected payoffs is a useful way of assessing such decisions, we often lack the information needed to do so. In this example, we have complete information about the possible outcomes, probabilities, and payoffs. In other situations, these parameters may not be as obvious. In those cases, it can be helpful to frame the question in reverse and ask ourselves: what would we need to believe?
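For readers who like to see the arithmetic spelled out, here is a minimal sketch in Python using the coin-toss figures above; the variable names are purely illustrative.

```python
# Expected payoff of the fair coin toss described above (illustrative sketch).
p_win = 0.5          # probability of guessing the toss correctly
gain = 100           # payoff if the guess is correct ($)
loss = -100          # payoff if the guess is wrong ($)

expected_payoff = p_win * gain + (1 - p_win) * loss
print(expected_payoff)  # 0.0 -> indifferent between playing and not playing
```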
To illustrate this, suppose instead that you are offered a bet on the much-anticipated Harvard-Yale football game in November. If Harvard wins, you will receive $200; if Yale wins, you will lose $300. How should you assess this opportunity, assuming you do not know precisely how likely Harvard is to win? You can think of the bet as risking $300 to win $200, implying a gain:loss ratio of 2:3. If Harvard wins exactly 60 percent of the time (3 ÷ (2 + 3), or loss ÷ (gain + loss)), your expected payoff is $0, and you are indifferent between making the bet and passing. Consequently, so long as you believe Harvard wins this game more than 60 percent of the time, you can conclude that making the bet is rational.
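The same break-even logic can be captured in a few lines. The function below is only an illustrative sketch built from the Harvard-Yale figures and the article's loss ÷ (gain + loss) rule; the function name is an assumption, not a standard formula.

```python
# Break-even probability implied by a bet's potential gain and loss (illustrative sketch).
def break_even_probability(gain: float, loss: float) -> float:
    """Probability of winning at which the bet's expected payoff is zero."""
    return loss / (gain + loss)

# Harvard-Yale example: win $200 if Harvard wins, lose $300 if Yale wins.
p = break_even_probability(gain=200, loss=300)
print(p)  # 0.6 -> rational only if you think Harvard wins more than 60% of the time
```

In other words, whatever gain and loss a bet offers, the implied break-even probability is simply the loss divided by the sum of the two.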
This approach is powerful because, while we often will not know exactly how likely a specific event is to occur (e.g., Harvard winning), we generally do have some intuition about the reasonable range of likelihoods, which we can use to inform our decisions. Poker players might call this analyzing the 'pot odds' of a hand, but the approach generalizes well beyond the table.
To abstract one step further from explicit gambling scenarios, consider a question many of us face when booking flights for various treks: 'should I purchase flight insurance?' Suppose our hypothetical flight costs $400, and the associated flight insurance, which offers a full refund for a last-minute cancellation, costs $40. As in the previous example, we can take the following approach (a short numerical sketch follows the steps):
1. Re-frame the question. 'What would I need to believe to make purchasing flight insurance rational?'
2. Identify the range of relevant outcomes. Two outcomes are relevant here: one where we need to make a last-minute cancellation, and one where we do not.
3. Quantify the payoffs of those outcomes. In the first outcome, we are refunded the cost of our flight, for a net gain of $360 ($400 refund less the $40 premium). In the second, we forfeit the cost of the insurance, a loss of $40. Our gain:loss ratio can therefore be expressed as 360:40, or 9:1.
4. Establish the decision rule. As before, our break-even likelihood is 10 percent (1 ÷ (9 + 1)). Therefore, if we believe there is more than a 10 percent chance we will need to cancel at the last minute, purchasing the flight insurance is rational.
5. Intuit the decision. Finally, we use our intuition to make the call: does our instinct tell us there is more than a 10 percent chance of needing to cancel at the last minute?
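Putting the five steps together, here is a minimal sketch of the flight-insurance decision rule. The flight and insurance figures are the ones from the example above; the 'believed' cancellation probability is a purely illustrative assumption standing in for the intuition in step 5.

```python
# Flight-insurance example: does buying the $40 insurance on a $400 flight make sense?
flight_cost = 400
insurance_cost = 40

gain = flight_cost - insurance_cost   # refund net of the premium if we cancel: $360
loss = insurance_cost                 # premium forfeited if we fly as planned: $40

break_even = loss / (gain + loss)     # 40 / 400 = 0.10

# Step 5 is where intuition comes in: pick your own believed cancellation probability.
believed_p_cancel = 0.15              # illustrative assumption only

if believed_p_cancel > break_even:
    print(f"Buy the insurance (break-even is {break_even:.0%}).")
else:
    print(f"Skip the insurance (break-even is {break_even:.0%}).")
```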
One might quickly see how this approach could be applied to a broad range of situations, from day-to-day decisions to strategic choices made in boardrooms (yes, retail travel insurance purchasing and geographical expansion for a Fortune 500 company are clearly decisions of the same order of consequence). Of course, you won't always make the correct call, and hindsight is 20/20. Sometimes you will wish you had purchased the insurance, taken the bet, or folded the hand. But the goal is to be correct on average, and the key is to develop proficiency in: (1) identifying the range of relevant outcomes; (2) quantifying the payoffs of those outcomes; and (3) intuiting the likelihood of each outcome occurring. Like all good things in life, mastering these skills requires repetition; so the next time you are faced with a decision, perhaps ask yourself: 'what would Phil Ivey do?'
Edouard Lyndt (MBA ’25) is from Australia. An example of someone who took the ‘jack-of-all-trades’ thing too far, he has explored a range of career paths spanning M&A, strategy, product management, and even (very briefly) professional fighting. Outside of work, he enjoys reading, cooking, and exercise.