For any gambler with a cursory understanding of math, expectation is everything.
When you calculate pot odds in poker, what you’re calculating is expectation. Let’s say all the cards are out in a hand of Hold’em, you’re heads up against a single opponent, and there’s 90 dollars in the pot. You assess that you have a 20% chance of winning the hand.
If he bets 10 dollars, bringing the pot to $100, then you figure your expectation by multiplying: $100 × 0.20 = $20. This means that by calling, you stand to win $20 on average, so you’re certainly willing to pay $10 for that chance. If instead your expected share of the pot came out to less than the $10 it costs to call, you’d fold.
Simple. You’re willing to pay up to the amount that a bet is worth to you, on average.
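If it helps to see that comparison as code, here’s a minimal sketch (the function and the numbers are just mine, for illustration):

```python
def should_call(win_prob, pot_after_bet, cost_to_call):
    """Call when your expected share of the pot covers the price of calling."""
    expected_winnings = win_prob * pot_after_bet  # e.g. 0.20 * $100 = $20
    return expected_winnings >= cost_to_call

print(should_call(0.20, 100, 10))  # True: $20 of expectation is worth a $10 call
```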
In the language of math, if $$X$$ is a random variable (like the payoffs for outcomes of a bet), the expectation $$E[X]$$ is defined
$$! E[X] = p_1X_1 + p_2X_2 + \cdots + p_nX_n,$$
where for each $$i$$, $$p_i$$ is the probability that the outcome $$X_i$$ occurs, with the probabilities all nonnegative and summing to 1.
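That definition translates directly into a couple of lines of code. Here’s a rough sketch (my own helper, not from any library) that computes $$E[X]$$ from probability/payoff pairs and checks it against the poker hand above:

```python
def expected_value(outcomes):
    """Weighted average of payoffs: p_1*X_1 + p_2*X_2 + ... + p_n*X_n."""
    probs = [p for p, _ in outcomes]
    assert all(p >= 0 for p in probs) and abs(sum(probs) - 1) < 1e-9
    return sum(p * x for p, x in outcomes)

# The heads-up pot from above: win the $100 pot 20% of the time, nothing otherwise.
print(expected_value([(0.20, 100), (0.80, 0)]))  # 20.0
```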
So how much would you pay to play THIS game?
Let’s see how the expectation calculation works on a hypothetical game. Here are the rules:
- You pay an entry fee, which the house keeps.
- The house flips a coin until the first heads comes up, at which point the game ends.
- If that first heads is on the first flip, you win 1 dollar.
- If it’s on the second flip, you win 2 dollars.
- If it’s on the third flip, you win 4 dollars.
- Your payout continues to double in this way with each flip, so that if the first heads comes on the nth flip, you win $$2^{n-1}$$ dollars.
So, how much would you be willing to pay to play this game?
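Before doing any math, you could get a feel for the game by simulating it; here’s a rough sketch (purely illustrative code). The running average payout doesn’t settle down the way averages usually do: it lurches upward whenever a rare long run of tails lands.

```python
import random

def play_once():
    """Flip until the first heads; a first heads on flip n pays 2**(n-1) dollars."""
    payout = 1
    while random.random() < 0.5:  # tails: double the prize and flip again
        payout *= 2
    return payout

random.seed(1)
for trials in (1_000, 100_000, 1_000_000):
    avg = sum(play_once() for _ in range(trials)) / trials
    print(f"{trials:>9,} games: average payout ${avg:.2f}")
```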
Determining the Expectation of the Game
Let’s use the above expectation formula to determine what this game should be worth.
First, the probabilities: The chance of first getting heads on the first flip is 1/2. The chance of first getting heads on the second flip is $$(1/2)(1/2) = 1/4$$. The third flip, 1/8. Continuing in this way, we know that the probability of the first heads coming on the nth flip is $$1/2^n$$.
Multiplying each probability by the corresponding payout, we get
$$!E[X] = (1/2)\cdot 1 + (1/4)\cdot 2 + (1/8)\cdot 4 + \cdots$$ or
$$!E[X] = 1/2 + 1/2 + 1/2 + \cdots = \infty$$
See what happens? Each term in the sum simplifies to 1/2, and there are infinitely many of them. Thus, the expected value of the game is infinite!
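You can see the divergence numerically, too. A quick sketch (illustrative only): truncate the sum after the first N possible flips and you get exactly N/2, so allowing more flips just keeps adding to the expectation with no ceiling.

```python
def truncated_expectation(n_flips):
    """Sum of prob * payout over the first n flips; each term is (1/2**n) * 2**(n-1) = 1/2."""
    return sum((0.5 ** n) * 2 ** (n - 1) for n in range(1, n_flips + 1))

for n in (10, 100, 1000):
    print(n, truncated_expectation(n))  # 5.0, 50.0, 500.0 (no ceiling in sight)
```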
If you’re using expected value as your criterion for deciding whether to play, there’s no amount you’d be unwilling to pay for a single chance to play the game. And you’d be pretty weird, because nobody else in their right mind would pay more than maybe 20 bucks for it.
What’s causing the paradox?
This famous problem is known as the St. Petersburg paradox. For several hundred years, mathematicians and economists have argued about the reasons nobody would pay any significant amount for the chance to play this infinite-expectation game.
One solution uses the fact that people are risk averse—in general, people don’t like taking risks with large amounts of money. So even if a game has a huge expected value, it’s hard to justify betting the farm if you have such a big chance of losing it right away (like if heads comes up on the first flip and you win a measly dollar).
But that solution, which can be made precise by dealing with expected utility gain rather than expected wealth gain, isn’t satisfactory. Why? Because you can always tweak the payouts of the game to create an even more favorable game. As long as the utility function is unbounded, you can always invent a positive-expected-utility game that’s just as unappealing as this one.
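Here’s the standard way to make that concrete (a textbook-style illustration, not part of the original game). Suppose your utility for walking away with $$x$$ dollars is $$\ln x$$. Then the expected utility of playing is finite:

$$! E[\ln X] = \sum_{n=1}^{\infty} \frac{1}{2^n}\ln\left(2^{n-1}\right) = \ln 2\sum_{n=1}^{\infty}\frac{n-1}{2^n} = \ln 2,$$

so, ignoring your starting bankroll, a log-utility gambler values the game at only $$e^{\ln 2} = 2$$ dollars. But swap the payout of $$2^{n-1}$$ dollars for $$e^{2^{n-1}}$$ dollars and the expected utility is once again an infinite sum of halves; the paradox comes right back.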
A better explanation is that no casino in the world has an infinite bank. Nobody could ever pay out the huge sums that occur if heads takes, say, 40 flips to show up.
Wikipedia has an interesting chart that illustrates this. If, say, the casino couldn’t pay you more than $100, then your expectation is quite low ($4.28), because most of the expectation in the game comes from the huge payouts that only an unlimited bank could cover. Even if the casino could pay out a million dollars, your expectation is barely more than $10!
| Banker | Bankroll | Expected value of lottery |
| --- | --- | --- |
| Friendly game | $100 | $4.28 |
| Millionaire | $1,000,000 | $10.95 |
| Billionaire | $1,000,000,000 | $15.93 |
| Bill Gates (2008) | $58,000,000,000 | $18.84 |
| U.S. GDP (2007) | $13.8 trillion | $22.79 |
| World GDP (2007) | $54.3 trillion | $23.77 |
| Googolaire | $$10^{100}$$ dollars | $166.50 |
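If you’re curious where numbers like these come from, here’s a rough sketch of the calculation (my own code, using the simplest convention that the banker pays $$\min(2^{n-1}, \text{bankroll})$$; the last digits can shift a little depending on the convention used):

```python
import math

def capped_expectation(bankroll):
    """Expected payout when the banker can pay at most `bankroll` dollars.

    Every flip the banker can cover in full contributes 1/2 to the expectation,
    exactly as in the original series; all later flips just pay out the whole bankroll.
    """
    covered = math.floor(math.log2(bankroll)) + 1        # flips n with 2**(n-1) <= bankroll
    return 0.5 * covered + bankroll * 2.0 ** (-covered)  # plus the tail that busts the bank

for name, bank in [("Friendly game", 100), ("Millionaire", 10**6), ("Billionaire", 10**9)]:
    print(f"{name}: ${capped_expectation(bank):.2f}")    # $4.28, $10.95, $15.93
```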
So what can we learn from the St. Petersburg paradox that will make us better gamblers?
Not too much, that I can think of. But it’s interesting, nonetheless.
One thing we can learn from the first, unsatisfactory solution: If we are risk averse when it comes to large wagers, then we need to account for the riskiness of a wager in addition to simply the expectation.
So, for example, even though taking odds on a pass line bet is “the best bet in the casino,” that doesn’t mean you should put your entire bankroll on it. Sure, the odds bet is a fair, zero-expectation bet, the only one of its kind in the casino. And making a minimal passline bet and putting everything you have behind it is as close to a fair bet as you can get.
But you don’t do it, because it’s too risky. A single seven-out, and your Vegas trip is over.
Expectation matters. But so does risk, volatility, exposure, or whatever you want to call that mysterious thing that makes gambling so much fun.
Out of curiosity, what about a game where you flip a coin until it comes up heads (let’s say it takes n flips), and you get n dollars for it? The expected value would be
$$!E[X] = 1\cdot 2^{-1} + 2\cdot 2^{-2} + \cdots + k\cdot 2^{-k} + \cdots$$
What would this converge to?
T, I’m not sure what that converges to. It does converge, by the ratio test, to some finite value. But I’m not sure if that value is known or if it can be expressed nicely. At the very least you could approximate it with a computer.
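In the spirit of that last suggestion, here’s a quick computer check (a rough sketch, nothing fancy):

```python
def partial_sum(terms):
    """Partial sum of k * 2**(-k) for k = 1..terms."""
    return sum(k * 2.0 ** (-k) for k in range(1, terms + 1))

for terms in (10, 20, 50):
    print(terms, partial_sum(terms))  # 1.98828125, 1.99997..., 1.99999999999995...
```

The partial sums head straight for 2, which matches the standard closed form $$\sum_{k\ge 1} k\,2^{-k} = 2$$: that gentler version of the game is worth two dollars.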
Great blog! I love your posts!
Great posts. Everything on this site is why I choose poker over most games in the casino. Poker played mathematically can be profitable in the long run, and with a great ability to read other players it can become lucrative. I really think reading about all the math behind most casino games is what pushes me to study poker the most.