How many matches on average do you need to play in order to win 5 times in a row?

Tl;dr: On average, you will need to play $ \sum_{i=1}^n p^{-i} $ matches to win $n$ times in a row, where $p$ is the winning rate per match.

Introduction

In many games, you need to win consecutive matches to get a “prize”. For example, in Yugioh Duel Links, you need to win consecutive duels to rank up in ranked duels; for the title of King of Games, you even need to win 5 consecutive duels. In job hunting, a candidate needs to pass successive interviews to secure a job; if he fails one, he is rejected by the company in question and has to start from the beginning with another company. In the dating game, a male player needs to succeed on successive dates to win over the female in question; if he fails one, he is eliminated and has to start from the beginning with another female. This requirement raises the question: how many matches does one need to play to win $n$ times in a row?

Of course, the best case is $n$ matches – you are so lucky that you win every match from the start. However, no reasonable person would count on the best case. Therefore, a sensible measure is the average number of matches that you need to play.

Let us suppose that all matches are independent and share the same winning rate $ p \in (0,1] $. One may be tempted to calculate the value of $ p^n $. However, this is the probability of winning $n$ consecutive matches, not the average number of matches you need to play. Taking the inverse $ p^{-n} $ is not correct either: it means that, if you group the matches into blocks of $n$, $ p^{-n} $ is the average number of blocks you need to play before you get a block in which you win every match. One may then resort to the negative binomial distribution, but it is not relevant either: it counts the matches until a fixed number of wins in total, not wins in a row.
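To see how different these quantities are, take the running example of $n=5$ and $p=1/2$ (the numbers here simply preview the formula derived below): \[ p^{-n} = 2^5 = 32, \qquad \text{whereas} \qquad \sum_{i=1}^{5} 2^{i} = 2+4+8+16+32 = 62. \] The block interpretation would even suggest $32 \times 5 = 160$ matches, while the correct expected number of matches is 62.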

Then, how do we solve this question correctly? I have two solutions – one using a Markov chain and the other using a martingale. Before proceeding to the technical details, let us first have a look at the results. The left figure below shows that the number of matches increases exponentially with the length of the required winning streak. The right figure shows that players with a low winning rate are much more affected by this mechanism. The code can be found here.
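The linked code is not reproduced here, but as a minimal sketch (under the same assumptions: independent matches with winning rate $p$), the following Python snippet estimates the average number of matches by simulation and compares it with the closed-form sum:

```python
import random

def expected_matches(n, p):
    """Closed-form expectation: sum of p^{-i} for i = 1..n."""
    return sum(p ** -i for i in range(1, n + 1))

def simulate(n, p, trials=100_000, seed=0):
    """Monte Carlo estimate of the average number of matches until n wins in a row."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        streak = matches = 0
        while streak < n:
            matches += 1
            streak = streak + 1 if rng.random() < p else 0
        total += matches
    return total / trials

print(expected_matches(5, 0.5))  # 62.0
print(simulate(5, 0.5))          # roughly 62
```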

[Figures: expected number of matches versus winning streak length (left) and versus winning rate (right)]

Markov chain

Let us use a $W$ to denote a win, an $L$ to denote a loss, and a string of $W$’s and $L$’s to denote a sequence of match results. For instance, $WLW$ means that you win a match, then lose one, and then win one again. Each such string is a state of the Markov chain. For instance, when your goal is to win 2 consecutive matches ($n=2$), there are 4 states in the Markov chain: $WW$, $WL$, $LW$, $LL$. To keep the notation concise, I will use the power operator to denote consecutive wins or losses. For instance, $W^i$ means $i$ consecutive wins. Furthermore, I use $ * $ to denote an arbitrary sequence.

Let $N$ be the number of matches you still need to play until you win $n$ times in a row. $N$ is a random variable whose distribution obviously depends on the current winning streak. If you already have a streak of $n-1$ wins, $N$ will typically be much smaller than when your streak is 0.

Let $x_i$ denote the expected number of matches you still need to play until you win $n$ times in a row, given that you have already won $i$ matches in a row. Mathematically, \[ x_i := \mathbb{E} [ N \mid * LW^i ], \quad \forall i=0, 1, \ldots, n-1, \] \[ x_n := \mathbb{E} [ N \mid W^n ] = 0. \] Applying the Markov property, we get the following equations: \[ x_i = 1 + p x_{i+1} + (1-p) x_0 , \quad \forall i=0, 1, \ldots, n-1. \] Each equation says that you play one more match; with probability $p$ your streak grows to $i+1$, and with probability $1-p$ it resets to 0.
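For instance, with $n=2$ the system reads \[ x_0 = 1 + p x_1 + (1-p) x_0, \qquad x_1 = 1 + p x_2 + (1-p) x_0, \qquad x_2 = 0, \] and solving these three equations gives $x_1 = p^{-2}$ and $x_0 = p^{-1} + p^{-2}$, in line with the general formula below.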

Combined with the boundary condition above, this gives a linear system with $n+1$ variables and $n+1$ equations. It takes a while to solve this system from scratch, but it is easy to verify by substitution that \[ x_i = \sum_{j=i+1}^n p^{-j}, \quad \forall i=0, 1, \ldots, n \] is the unique solution. In particular, \[ x_0 = \sum_{j=1}^n p^{-j}. \] If $p=1$ (100% winning rate), then $x_0 = n$, the smallest number of matches possible.
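If solving the system by hand feels tedious, a small numerical check is straightforward. The sketch below (an illustration, not the linked code) builds the $(n+1) \times (n+1)$ system with NumPy and compares its solution with the closed-form sum:

```python
import numpy as np

def solve_streak_system(n, p):
    """Solve x_i = 1 + p*x_{i+1} + (1-p)*x_0 for i = 0..n-1, with x_n = 0."""
    A = np.zeros((n + 1, n + 1))
    b = np.zeros(n + 1)
    for i in range(n):
        A[i, i] = 1.0       # coefficient of x_i
        A[i, i + 1] = -p    # coefficient of x_{i+1}
        A[i, 0] -= 1 - p    # coefficient of x_0 (for i = 0 this combines with the first term)
        b[i] = 1.0
    A[n, n] = 1.0           # boundary condition x_n = 0
    return np.linalg.solve(A, b)

x = solve_streak_system(5, 0.5)
print(x[0])                                  # 62.0
print(sum(0.5 ** -j for j in range(1, 6)))   # 62.0, the closed form
```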

Martingale

Suppose that, while you are playing the matches, your friend is running a casino based on your match results. Each time you start a match, a new guest enters the casino. He bets 1 dollar on your winning: if you lose, he loses this dollar and leaves the casino; if you win, he receives $p^{-1}$ dollars for each dollar he bet, and then bets all of this money on your next win. He repeats this process until either you lose or you have won $n$ matches in a row.

This is a martingale, and thus a fair game, for each guest. Consequently, it is also fair for your friend. By the time you win $n$ consecutive matches, your friend will have paid a total of $ \sum_{i=1}^n p^{-i} $ dollars to the last $n$ guests (and kept the stakes of everyone else). Since the game is fair, your friend's expected income must equal this payout; and since he takes in exactly 1 dollar per match (one new guest per match), the expected number of matches played is $ \sum_{i=1}^n p^{-i} $.
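To see where the payout comes from: the guest who entered at the first match of the final streak has let his money ride through $n$ wins and leaves with $p^{-n}$ dollars, the next guest leaves with $p^{-(n-1)}$, and so on down to the last guest with $p^{-1}$. For the running example $n=5$, $p=1/2$, the friend pays out \[ 32 + 16 + 8 + 4 + 2 = 62 \] dollars, the same expected number of matches found with the Markov chain.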


Written on September 10, 2017