# A Game of Throws

I recently had the chance to mentor a high school student who was doing an internship at the Max Planck Institute for Nuclear Physics. Since every project related to my actual work would have required quite a solid knowledge of Quantum Field Theory, we could not do much there together. Still, since I wanted to give him an idea of what it feels like to work in theoretical physics, I decided to have him work on problems related to statistics and Monte Carlo simulations, which are easy enough to understand (at least intuitively) with the math knowledge of a high school student.

When we started reviewing together the basics of combinatorics (possibly one of my favourite topics in mathematics), I began assigning him some exercises, which more often than not I would invent on the spot. At some point, I assigned him the following innocuous-looking problem:

Suppose you are playing a game where you throw a die $n$ times, with $n$ at least two, and you win if in the sequence of throws you obtain at least two sixes in a row. What is the probability of getting such a winning sequence?

For instance, 1**66**45 is a winning sequence,
whereas 65626 is a losing one.

For reasonably small values of $n$, a computer can quickly enumerate every possible sequence of throws, and from that it is possible to calculate the probability exactly. For larger values of $n$, instead, it is easy to find an approximate solution using a Monte Carlo technique: just build a large number of sequences of $n$ throws at random and divide the number of winning ones by the total number of sequences generated.
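This Monte Carlo strategy fits in a few lines of Python; the following is a minimal sketch (the function names are my own, not part of any library):

```python
import random

def has_two_sixes_in_a_row(throws):
    """Return True if the sequence contains two consecutive sixes."""
    return any(a == 6 and b == 6 for a, b in zip(throws, throws[1:]))

def monte_carlo_estimate(n, trials, rng=random):
    """Estimate the winning probability for n throws by random sampling."""
    wins = sum(
        has_two_sixes_in_a_row([rng.randint(1, 6) for _ in range(n)])
        for _ in range(trials)
    )
    return wins / trials

# Example: estimate the winning probability for n = 4 throws.
print(monte_carlo_estimate(4, 100_000))
```

The statistical error of the estimate shrinks like one over the square root of the number of trials, so a few hundred thousand random sequences already give two or three significant digits.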

Is it possible, however, to find an analytical solution to this problem? In order to do that, one should be able to count all the sequences of $n$ throws which contain at least two sixes in a row and divide that number by $6^n$, the total number of sequences of $n$ throws of a die.
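For small $n$, the exhaustive count can be sketched in Python as follows (the function name is my own):

```python
from itertools import product

def count_winning_bruteforce(n):
    """Count the sequences of n throws containing two consecutive sixes,
    by enumerating all 6**n possible sequences."""
    return sum(
        any(a == 6 and b == 6 for a, b in zip(seq, seq[1:]))
        for seq in product(range(1, 7), repeat=n)
    )

# The exact probability is then count_winning_bruteforce(n) / 6**n.
print(count_winning_bruteforce(3), 6**3)  # 11 winning sequences out of 216
```

This is exact but quickly becomes impractical, since the number of sequences grows like $6^n$.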

The first solution my student proposed to me was along the following lines: let us count the number of sequences we obtain when we fix the first two throws to be two sixes, and let the other ones be arbitrary.

We can then move the pair of sixes so that it starts not at the first throw but at the second one, and so on, until the pair sits at the end of the sequence.

Since we fixed two throws of the die, there are $n-2$ throws which can yield arbitrary results (marked in the figures with asterisks), for a total of $6^{n-2}$ possibilities. Since there are $n-1$ possible positions for the pair of sixes, one could think that the total number of winning sequences is $(n-1)\,6^{n-2}$. Thus, the answer to the problem seems to be

$$P_n \overset{?}{=} \frac{(n-1)\,6^{n-2}}{6^n} = \frac{n-1}{36}. \tag{1}$$

This, however, cannot be the true solution, because probabilities must be smaller than one, while $(n-1)/36$ becomes larger than one for $n$ larger than 37. This suggests that the solution is wrong and is, in fact, a symptom of double counting. It is easy to see what goes wrong if we enumerate explicitly all the winning sequences for $n=3$: the naive formula predicts $2 \times 6 = 12$ of them, but the sequence 666 is counted twice (once as 66*, once as *66), so there are really only 11.

By counting some combinations twice, we overestimate our chances of winning.

Thus, we need a different counting strategy. It is not easy to fix the initially proposed method, so let us start from scratch and build the solution inductively! Assume that we already know that for $n-1$ throws there are $W_{n-1}$ possible sequences which contain two sixes in a row; how does this number change when we throw the die one extra time?

Of course, if we already had a winning sequence, whatever outcome we obtain on the new throw will still result in a winning sequence; since there are six possible outcomes, we get $6\,W_{n-1}$ sequences of this kind.

Moreover, in some cases the new throw can create new winning sequences! In fact, if we had a losing sequence which ended with a six, and the new throw results in a six, we obtain a winning sequence.

How many such sequences are there? Here we must be careful not to fall into the double-counting trap again. The last two throws are fixed (two sixes), and the first $n-2$ throws must form a losing sequence; moreover, that losing sequence must not itself end with a six, because otherwise the first $n-1$ throws would already contain two sixes in a row, and the sequence would already be counted in the $6\,W_{n-1}$ term. A losing sequence of size $n-2$ which does not end with a six is simply a losing sequence of size $n-3$ followed by one of the five outcomes different from six, and the number of losing sequences of size $n-3$ is $6^{n-3} - W_{n-3}$; hence there are $5\left(6^{n-3} - W_{n-3}\right)$ new winning sequences.

As a result, we obtain the following recurrence equation for the total number $W_n$ of winning sequences in $n$ throws:

$$W_n = 6\,W_{n-1} + 5\left(6^{n-3} - W_{n-3}\right). \tag{2}$$

This is starting to look quite nice: if we calculate by brute force the first few values of $W_n$ — for instance $W_1 = 0$, $W_2 = 1$ and $W_3 = 11$ — we can get every following value of $W_n$ by successive applications of Equation 2. For large values of $n$, it is **much** easier to evaluate Equation 2 about $n$ times than to build each of the $6^n$ sequences.
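As a sketch of this approach in Python (seeding the recurrence with the brute-force values $W_1 = 0$, $W_2 = 1$, $W_3 = 11$; the function name is my own):

```python
def count_winning_recurrence(n):
    """Number of winning sequences W_n, computed via the recurrence
    W_n = 6*W_{n-1} + 5*(6**(n-3) - W_{n-3}),
    seeded with the brute-force values W_1 = 0, W_2 = 1, W_3 = 11."""
    w = {1: 0, 2: 1, 3: 11}
    for k in range(4, n + 1):
        w[k] = 6 * w[k - 1] + 5 * (6 ** (k - 3) - w[k - 3])
    return w[n]

for n in range(2, 8):
    print(n, count_winning_recurrence(n), 6 ** n)
```

Each step costs a handful of arithmetic operations, so even $n$ in the thousands is instantaneous (Python's integers grow as needed, so there is no overflow).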

The first few values are:

| $n$ | $W_n$ | $6^n$ |
|-----|-------|-------|
| 2 | 1 | 36 |
| 3 | 11 | 216 |
| 4 | 96 | 1296 |
| 5 | 751 | 7776 |
| 6 | 5531 | 46656 |
| 7 | 39186 | 279936 |

But this is not over; we can do better! It is actually possible to “solve” Equation 2 and obtain $W_n$ in closed form, which immediately gives the answer to the starting problem. I will not delve into the details here, but it is easy to check that the solution is

$$W_n = 6^n - \frac{\left(15 + 7\sqrt{5}\right)\varphi_+^{\,n} + \left(15 - 7\sqrt{5}\right)\varphi_-^{\,n}}{30}, \qquad \varphi_\pm = \frac{5 \pm 3\sqrt{5}}{2}. \tag{3}$$
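One can verify numerically that the closed form reproduces the integer values produced by the recurrence; a small Python sketch (rounding away floating-point noise):

```python
from math import sqrt

def count_winning_closed_form(n):
    """Closed-form W_n = 6**n - ((15 + 7*sqrt(5)) * phi_p**n
    + (15 - 7*sqrt(5)) * phi_m**n) / 30, with phi_{p,m} = (5 +- 3*sqrt(5))/2."""
    s5 = sqrt(5)
    phi_p = (5 + 3 * s5) / 2
    phi_m = (5 - 3 * s5) / 2
    value = 6 ** n - ((15 + 7 * s5) * phi_p ** n + (15 - 7 * s5) * phi_m ** n) / 30
    return round(value)  # the exact value is an integer

print([count_winning_closed_form(n) for n in range(2, 8)])
```

For very large $n$ the floating-point evaluation eventually loses precision, but for moderate $n$ it matches the recurrence exactly.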

Notice how we have an apparently complicated combination of sums of powers of irrational
numbers, and yet $W_n$ is always an integer, because it satisfies Eq. (2) with integer
initial values! Small wonders of mathematics ¯\\_(ツ)_/¯

And so, the probability of winning at the Game of Throws is

$$P_n = \frac{W_n}{6^n} = 1 - \frac{\left(15 + 7\sqrt{5}\right)\varphi_+^{\,n} + \left(15 - 7\sqrt{5}\right)\varphi_-^{\,n}}{30 \cdot 6^n}. \tag{4}$$

Notice especially that the asymptotic behaviour is correct this time: since both $\varphi_+ \approx 5.85$ and $|\varphi_-| \approx 0.85$ are smaller than six, the fraction vanishes for very large $n$, thus the probability of winning approaches one, as one could expect intuitively.
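To watch the probability approach one, we can evaluate $W_n/6^n$ directly; a quick Python sketch of the formula above (the constants are $\varphi_\pm = (5 \pm 3\sqrt{5})/2$):

```python
from math import sqrt

def winning_probability(n):
    """P_n = 1 - ((15 + 7*sqrt(5)) * (phi_p/6)**n
    + (15 - 7*sqrt(5)) * (phi_m/6)**n) / 30."""
    s5 = sqrt(5)
    phi_p = (5 + 3 * s5) / 2
    phi_m = (5 - 3 * s5) / 2
    return 1 - ((15 + 7 * s5) * (phi_p / 6) ** n
                + (15 - 7 * s5) * (phi_m / 6) ** n) / 30

for n in (2, 10, 50, 100, 200):
    print(n, winning_probability(n))
```

Working with the ratios $\varphi_\pm/6$, which are smaller than one in absolute value, keeps the evaluation numerically stable even for very large $n$.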

It is also interesting to see that for a small number of throws the above-mentioned double-counting problem is not severe, so for small $n$ the naive “probability” $(n-1)/36$ could still be a useful first estimate.

Overall, I think it was a great exercise: it allowed me (and especially my student) to do some quite nice calculations, both analytical and numerical, on an applied problem, and to learn how to solve recurrence equations, which are much easier for a high school student to understand than differential equations, while sharing quite a few of their properties.