Nearly 9 years ago, Steven posted this brainteaser:
Here I have a well shuffled deck of 52 cards, half of them red and half of them black. I plan to slowly turn the cards face up, one at a time. You can raise your hand at any point — either just before I turn over the first card, or the second, or the third, et cetera. When you raise your hand, you win a prize if the next card I turn over is red.
What’s your strategy?
Read no further if you want to try and solve this brainteaser on your own first!
The answer is that since the deck is randomized, at the moment you call “Next”, every remaining card is equally likely to be the one turned over next; in particular, the next card is exactly as likely to be red as the last card in the deck. So the problem can be restated such that when you call “Next”, you just turn over the last card in the deck, which means your odds of winning are 1/2 no matter what “strategy” you use.
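If you want to convince yourself empirically, here is a minimal Monte Carlo sketch (the helper name and the example strategies are my own illustrations, not part of the puzzle):

```python
import random

# A "strategy" sees the remaining (reds, blacks) counts and returns True to
# call "Next" on the upcoming card.
def win_rate(strategy, trials=100_000):
    wins = 0
    for _ in range(trials):
        deck = ['R'] * 26 + ['B'] * 26
        random.shuffle(deck)
        reds = blacks = 26
        for i, card in enumerate(deck):
            # Call when the strategy says so, or when forced onto the last card.
            if strategy(reds, blacks) or i == len(deck) - 1:
                wins += (card == 'R')
                break
            if card == 'R':
                reds -= 1
            else:
                blacks -= 1
    return wins / trials

# Any strategy should come out at roughly 0.5.
print(win_rate(lambda reds, blacks: True))           # call "Next" immediately
print(win_rate(lambda reds, blacks: reds > blacks))  # wait until reds outnumber blacks
```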
But this fact can also be used to prove some non-trivial results. For example: Suppose your “strategy” is to wait until the ratio of remaining reds in the deck is at least 2/3, then call “Next”, unless you never reach the 2/3 ratio and you’re forced to take the last card. Given that you have reached at least 2/3, your chances of winning are actually slightly greater than 2/3, because you might jump from a ratio slightly below 2/3 (say, 7 reds and 4 blacks remaining) to one slightly above 2/3 (7 reds and 3 blacks remaining).
On the other hand, if you never reach the 2/3 ratio, then you lose. (In particular, if the last card is red, then you’ve “reached the 2/3 ratio” before that card is turned over, so you’d call “Next” on that card. So “never reaching the 2/3 ratio” means you go all the way to the last card and it’s black and you lose.)
This means that the probability of winning is

p × (2/3 + δ)

where p is the probability of eventually reaching a ratio of at least 2/3 (that is, the probability of calling “Next” at some point), and δ is some small positive number.
But from the solution to the brainteaser, we also know that the probability of winning is exactly 1/2. So p × (2/3 + δ) = 1/2, which gives p = 3/(4 + 6δ), very close to (and slightly less than) 3/4.
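Here is a small simulation sketch of the 2/3-threshold strategy (the names are mine) that estimates p, the conditional chance of winning given that you called, and the overall chance of winning:

```python
import random

# Simulate the strategy "call Next once the remaining cards are at least 2/3 red".
def simulate(trials=200_000):
    called = wins_given_called = wins = 0
    for _ in range(trials):
        deck = ['R'] * 26 + ['B'] * 26
        random.shuffle(deck)
        reds = blacks = 26
        for i, card in enumerate(deck):
            if reds >= 2 * blacks:       # remaining ratio of reds is at least 2/3
                called += 1
                wins_given_called += (card == 'R')
                wins += (card == 'R')
                break
            if i == len(deck) - 1:       # never reached 2/3: forced onto the last card
                wins += (card == 'R')    # (as argued above, that card must be black)
                break
            if card == 'R':
                reds -= 1
            else:
                blacks -= 1
    print("P(reach 2/3):", called / trials)                  # slightly below 3/4
    print("P(win | reached):", wins_given_called / called)   # slightly above 2/3
    print("P(win):", wins / trials)                          # about 1/2

simulate()
```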
More generally, by the same logic: Suppose the ratio of red cards in a randomized pile is a/b, and you want to know the probability that you’ll at some point reach a ratio of at least c/d. Of course, if c/d ≤ a/b, then the answer is 100%, because the starting ratio already meets the threshold. But if c/d > a/b, the same argument applies: with the threshold strategy your overall chance of winning is exactly a/b (the last card is red with probability a/b), and given that you call “Next” you win with probability slightly more than c/d, so the probability p of ever reaching the threshold satisfies p × (c/d + δ) = a/b. In other words, the answer is slightly less than (a/b)/(c/d) = ad/bc. The larger the deck, the smaller the value of “slightly”.
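As a numerical sanity check (a sketch with illustrative names, not part of the argument above), the probability of ever reaching the threshold can also be computed exactly by recursing over the counts of cards remaining in the pile:

```python
from fractions import Fraction
from functools import lru_cache

# Exact probability that the remaining pile ever becomes at least c/d red,
# starting from the given counts of reds and blacks.
def hit_probability(c, d, reds, blacks):
    @lru_cache(maxsize=None)
    def f(r, b):
        if r + b == 0:
            return Fraction(0)                 # pile exhausted without hitting the ratio
        if r * d >= c * (r + b):
            return Fraction(1)                 # ratio of remaining reds is at least c/d
        n = r + b
        p = Fraction(0)
        if r:
            p += Fraction(r, n) * f(r - 1, b)  # next dealt card is red
        if b:
            p += Fraction(b, n) * f(r, b - 1)  # next dealt card is black
        return p
    return f(reds, blacks)

# a/b = 1/2, c/d = 2/3: should be a little under ad/bc = 3/4.
print(float(hit_probability(2, 3, 26, 26)))
# a/b = 1/3, c/d = 1/2: should be a little under ad/bc = 2/3.
print(float(hit_probability(1, 2, 17, 34)))
```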
I posted this on Reddit in the /r/math forum, and a commenter pointed out that I had rediscovered a result called “Doob’s martingale inequality”, although I think my example is vastly easier to understand than the Wikipedia article.
I was re-thinking this on the plane, and I think I found a few flaws in it. The argument is mostly accurate, but it underestimates the size of “some small delta” in some cases (in particular, in cases where the delta definitely does not shrink in proportion to the size of the deck).
For example, suppose you want to find the odds that the proportion ever reaches 90%. My formula says 5/9 minus some small delta. However, the odds of the remaining proportion of red cards hitting 90% are negligible *except* in the case where the last remaining card is red (imagine running the process in reverse and dealing cards onto a pile: the pile you’ve dealt is essentially never going to be more than 90% red unless the first card you deal is red). In other words, the odds of hitting 90% are about 1/2. My “5/9 minus some small delta” is always going to be off by an amount that does not shrink as the deck gets larger.
At the other end of the range, suppose you’re trying to calculate the odds of the proportion of remaining reds reaching 50.1%. (In other words, what are the odds that the number of red cards remaining will at some point exceed the number of black cards remaining?) The formula gives the odds at almost 100%. But this seems intuitively wrong: it seems plausible that you could deal a run of red cards at the beginning, setting back the number of red cards remaining in the deck, and then never happen to catch up before the end.
The inequality is correct, but I was wrong about “some small delta” (I had assumed it would always shrink in proportion to the size of the deck, but that’s not the case).
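To put rough numbers on the two examples above, the same kind of exact recursion as in the earlier sketch can be run with these thresholds (again, the names are illustrative):

```python
from functools import lru_cache

# Exact probability that a 26-red / 26-black pile is ever at least c/d red.
def hit_probability(c, d):
    @lru_cache(maxsize=None)
    def f(r, b):
        if r + b == 0:
            return 0.0
        if r * d >= c * (r + b):
            return 1.0
        n = r + b
        p = 0.0
        if r:
            p += (r / n) * f(r - 1, b)
        if b:
            p += (b / n) * f(r, b - 1)
        return p
    return f(26, 26)

print(hit_probability(9, 10))      # just over 1/2, versus the formula's 5/9 ≈ 0.556
print(hit_probability(501, 1000))  # roughly 0.96, versus the formula's ≈ 0.998
```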
Very nice! I remember the original post, the trick is great!
I solved the problem using dynamic programming with states (n,r) where n is the number of cards that have been drawn so far and r is the number of those cards that are red. In any state, the probability that you will eventually win the prize equals the probability that a red card will be drawn on the following draw, (26-r)/(52-n). This is true regardless of whether you say “next” or not, with one exception. If 51 cards have been drawn and 25 were red, then you must say “next”, in which case you are guaranteed to win the prize. If you follow this policy, at the start of the game (state n=r=0), your probability of eventually winning the prize is exactly ½. You may as well say “next” before the first draw and save yourself some time.
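A minimal sketch of that dynamic program (the function and variable names are mine) checks the invariant and the final value of ½:

```python
from fractions import Fraction
from functools import lru_cache

# State (n, r): n cards drawn so far, r of them red, and "next" not yet said.
# V(n, r) = probability of eventually winning the prize under the best play.
@lru_cache(maxsize=None)
def V(n, r):
    reds_left = 26 - r
    blacks_left = 26 - (n - r)
    call_now = Fraction(reds_left, 52 - n)   # win probability if we say "next" now
    if n == 51:                              # one card left: we must say "next"
        return call_now
    wait = Fraction(0)
    if reds_left:
        wait += Fraction(reds_left, 52 - n) * V(n + 1, r + 1)
    if blacks_left:
        wait += Fraction(blacks_left, 52 - n) * V(n + 1, r)
    return max(call_now, wait)

# The claimed invariant: in every reachable state, the probability of eventually
# winning is just the chance that the next card drawn is red.
assert all(V(n, r) == Fraction(26 - r, 52 - n)
           for n in range(52) for r in range(max(0, n - 26), min(n, 26) + 1))
print(V(0, 0))   # prints 1/2
```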