
Handicapping a Race to 7

Karl and Richard are playing a series to 7 (no need to win by 2) in a two-player game, and their skill levels are such that Karl has a probability $p$ of winning any individual game in the series.

If we give Karl a handicap of 2 games so that he only needs to win 5 games to win the series, whereas Richard needs to win 7, what is the single-game probability $p$ that gives each player a 50% chance of winning the series?


I've tagged this post with "Billiards" since the question came from the director of my pool league (who I hear is an avid GTM reader). Actually, what he really wants to know is the best way to set up the scoring, tie-breakers, and handicaps in one of the leagues, but in order to answer these questions, I wanted to start by looking at just one series and then extend that to the more general questions. This analysis really has nothing to do with pool though, so it would work for any other game as well.

Binomial Coefficients


To start out, we'll need the binomial coefficients $\binom{n}{k}$, which are read as "$n$ choose $k$." $\binom{n}{k}$ is the number of ways to choose $k$ items from a set of $n$ distinct items, for example the number of $k$-person boards of directors that can be chosen from $n$ candidates. Note that choosing the $k$ candidates who are included in the board is the same as choosing the $n-k$ who are not included, which means that $\binom{n}{k} = \binom{n}{n-k}$. The formula for the binomial coefficients is $$\dbinom{n}{k} = \dfrac{n!}{k! (n-k)!}$$ from which the above-mentioned symmetry is obvious. Note that even though this is a fraction, it always comes out to be an integer.

The binomial coefficients got their name from the fact that they are the coefficients in the expansion of binomials: $$ (x+y)^n = \sum_{k=0}^{n}{\dbinom{n}{k} x^k y^{n-k}}
$$ This is because in the product $(x+y)(x+y)\cdots(x+y)$ (with $n$ factors of $(x+y)$), each factor of $(x+y)$ contributes either an $x$ or a $y$ to each term in the expansion. If a term has $k$ $x$'s, the other $n-k$ factors of $(x+y)$ must have contributed a $y$. There are $\binom{n}{k}$ ways to get $k$ $x$'s and $n-k$ $y$'s, hence the formula. If anyone wants more detail on that, just ask in the comments, and I'll give a fuller explanation.
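
If you want to play with the binomial coefficients numerically, here's a tiny sketch (my own, not part of the original post) using Python's standard-library `math.comb`; it checks the symmetry property and spot-checks the binomial theorem for one choice of $n$, $x$, and $y$:

```python
from math import comb

# "9 choose 3": the number of 3-person boards that can be chosen from 9 candidates
print(comb(9, 3))                 # 84
# Symmetry: choosing the 3 included is the same as choosing the 6 excluded
print(comb(9, 3) == comb(9, 6))   # True

# Spot-check the binomial theorem for n = 9 at the point (x, y) = (2, 3)
n, x, y = 9, 2, 3
lhs = (x + y) ** n
rhs = sum(comb(n, k) * x**k * y**(n - k) for k in range(n + 1))
print(lhs == rhs)                 # True (exact integer arithmetic throughout)
```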

Most identities about binomial coefficients can be proved either by using the formula with the factorials, or via a combinatorial argument. For example, for integers $n$, $m$, and $k$ with $0 \leq k \leq m \leq n$, we have the following identity, known as the subset of a subset identity: $$\dbinom{n}{m} \dbinom{m}{k} = \dbinom{n}{k} \dbinom{n-k}{m-k}
$$Algebraic proof: $$
\begin{align}
\dbinom{n}{m} \dbinom{m}{k} &= \dfrac{n!}{m! (n-m)!} \cdot \dfrac{m!}{k! (m-k)!} \\[3mm]
&= \dfrac{n!}{k!(n-m)!(m-k)!} \\[3mm]
&= \dfrac{n!}{k! (n-k)!} \cdot \dfrac{(n-k)!}{(n-m)! (m-k)!} \\[3mm]
&= \dbinom{n}{k} \dbinom{n-k}{m-k} \tag*{$\square$}
\end{align}
$$Combinatorial proof:
The left side of the identity is the number of ways to choose a board of directors with $m$ members from $n$ candidates, and then choose $k$ executive members from the $m$. The right side counts the number of ways to choose $k$ executive members from the $n$ candidates and then choose the $m-k$ non-executive board members from the $n-k$ remaining candidates. These count the same thing, so the two sides must be equal. $\tag*{$\square$}$
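
And for the skeptical, here's a quick brute-force check of the identity over small values (again my own sketch, not part of the proof):

```python
from math import comb

# Verify the subset-of-a-subset identity for all 0 <= k <= m <= n <= 20
ok = all(
    comb(n, m) * comb(m, k) == comb(n, k) * comb(n - k, m - k)
    for n in range(21)
    for m in range(n + 1)
    for k in range(m + 1)
)
print(ok)  # True
```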

Winning a series to 7


In order to win a series to 7, without needing to win by 2, Karl needs to win 7 games, with Richard winning anywhere from 0 to 6 games in the series. If Karl wins 7 games and Richard wins 3 games (for example), there will be a total of 10 games in the series. The 3 games that Richard does win can come anywhere in the 10 games except the 10th: if Richard won the 10th game, then Karl would already have won his 7 games earlier, and the series would never have made it to a 10th game in the first place. So we can choose from the first $10-1=9$ games where Richard's 3 wins go.

The probability that Karl wins a given game is $p$, which means the probability that Richard beats Karl is $1-p$. Combining all this, we can see that the probability that Karl beats Richard in a race to 7, with Richard winning 3 games, is $$ \binom{10-1}{3} p^7 (1-p)^3
$$Since Karl can win the series with Richard winning anywhere from 0 to 6 games, the total probability that Karl wins the series is the sum over the possible outcomes, with the summation index $k$ being the number of wins Richard gets in the series: $$
\begin{align}
{\Bbb P}(\text{Karl wins the series}) &= \sum_{k=0}^{7-1}{\binom{7+k-1}{k} p^7 (1-p)^k} \\[3mm]
&= \sum_{k=0}^{6}{\binom{6+k}{k} p^7 (1-p)^k}
\end{align}
$$ Here's a graph of the probability that Karl wins the series, for different values of $p$:

 

Not surprisingly, if there's a 50% chance that either player wins an individual game, then there's also a 50% chance that either player wins the series.

Now, let's say we give Karl a handicap of 2 games so that to win the series, Karl needs to win 5 games and Richard needs to win 7. More generally, if we call the handicap $H$, where $0 \leq H \leq 6$, then by the same reasoning as we used above, we get the modified formula: $$
{\Bbb P}(\text{Karl wins the series}) = \sum_{k=0}^{6}{\binom{6-H+k}{k} p^{7-H} (1-p)^k}
$$ Now Karl only needs to win $7-H$ games, so the total number of games in the series, for a given number $k$ of wins by Richard, is $7-H+k$, with Richard's $k$ wins once again being placed anywhere but the last game.
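
Here's the formula as a short Python sketch (my own illustration; the function name `prob_karl_wins_series` is just something I made up). Setting $H=0$ recovers the plain race to 7 from before:

```python
from math import comb

def prob_karl_wins_series(p, H=0, race_to=7):
    """Probability Karl wins when he needs (race_to - H) games and his
    opponent needs race_to games, with single-game win probability p."""
    need = race_to - H          # games Karl must win
    # Sum over k = number of games the opponent wins (0 .. race_to - 1);
    # the opponent's k wins can land anywhere except the final game.
    return sum(comb(need - 1 + k, k) * p**need * (1 - p)**k
               for k in range(race_to))

print(prob_karl_wins_series(0.5))         # 0.5: even game, even series
print(prob_karl_wins_series(0.355, H=3))  # roughly 0.5
```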

Here are the graphs of Karl's probabilities of winning the series given different values of $p$ and $H$ (you can click to expand the picture):


Now, I'd love to be able to say we're done here, but the fact is that for some real Karl and Richard, we have no idea what the value of $p$ is unless we are lucky enough to have a history of, say, 100 games between these two players. And even then, they could have improved over time or gotten rusty or whatever, so that games they played a few months ago aren't so telling now as to the value of $p$.

We do know that every player in the league is assigned a ranking (which directly determines the handicap against any given opponent); the ranking is certainly partly subjective, determined by observation by a few very experienced players who run, and possibly play in, the league. Instead of trying to guess $p$ and then assigning the rankings, which would be useless in the absence of a large history of games between each pair of players, we can use the handicaps to back out the value of $p$ that makes the match 50-50. For example, if Karl and Richard's rankings are such that Karl gets a handicap of 3, we can see from the graph above that the match will be 50-50 if Karl's probability $p$ of winning an individual game is about 35.5%.

Using Excel's Goal Seek functionality, I've backed out the values of $p$ that make a 7-game series 50-50 for different handicaps:


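If you don't have Excel handy, the same values can be backed out with a few lines of code instead of Goal Seek. Here's a sketch (mine, with made-up function names) using simple bisection, which works because the series-win probability is increasing in $p$:

```python
from math import comb

def prob_karl_wins_series(p, H=0, race_to=7):
    # Same formula as in the earlier sketch
    need = race_to - H
    return sum(comb(need - 1 + k, k) * p**need * (1 - p)**k
               for k in range(race_to))

def fifty_fifty_p(H, race_to=7, tol=1e-10):
    """Single-game win probability p that makes the handicapped series 50-50.
    Bisection works because the series-win probability increases from 0
    (at p = 0) to 1 (at p = 1)."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if prob_karl_wins_series(mid, H, race_to) < 0.5:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for H in range(7):
    print(H, round(fifty_fifty_p(H), 4))
```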
To test whether a player's handicap is appropriate, one could take all of that player's games against opponents of different ranks, see what percentage of individual games he wins, and check how far off those percentages are from the table above (perhaps using a chi-square goodness-of-fit test). If there are not enough games to do this analysis for individual players, then one could start by looking at the percentages across all games, look into the ranks furthest away from the table values, and see whether the stats of any particular player(s) are driving the difference. That's a bit of a manual exercise, but it's a start...

GTM Reader Challenge


"Backing out" $p$ basically means finding the inverse of the function $f(p) = \sum_{k=0}^{6}{\binom{6-H+k}{k} p^{7-H} (1-p)^k}$. We know the function has an inverse because if you look at the graphs, they all pass the horizontal line test. To be more rigorous, they are polynomials in $p$ and thus continuous, and $f(0)=0$ and $f(1)=1$, so the intermediate value theorem tells us that $f$ is surjective. Furthermore, $f$ is increasing on the interval $p \in [0,1]$, so it's also one-to-one, and thus has an inverse.

Now, while Excel Goal Seek will certainly work for this, it would be kind of nice to know the inverse function, so I worked for a few hours today trying to figure out how to invert $f$, but couldn't quite figure it out. Maybe one of my more nerdy readers wants to take a crack at it? Otherwise, maybe I'll go post the question on stack exchange...

[Update 7/29/2015: there's been some confusion on the question I'm asking, so just to clarify, for the purposes of finding the inverse of $f(p)$, assume that the $H$ in the formula above is a constant. So technically, there is a different function $f$ for each value of $H$, which I guess you could call $f_{H}(p)$ or something.]

What I was trying (and maybe this isn't the best way to go about it) was to find a not-too-awful formula for the coefficient of $p^n$ in the sum above and then try to use the Lagrange inversion formula, but it gets a bit messy with all the binomial coefficients. I tried to expand the $(1-p)^k$, turning the sum into a double sum, then switch the order of summation (making sure to adjust the summation limits; the Iverson bracket is helpful at this step), and finally simplify somehow using identities of the binomial coefficients such as the subset of a subset identity above, but said simplification proved elusive, so I didn't even bother with the inversion formula.

Anyone have any thoughts on that or maybe a different way to find the inverse of $f(p)$? Let me know in the comments or email me, and I can provide more details of the computation I tried.

Thanks for reading, and I will try to do a follow-up on this post soon. As always, feel free to ask questions in the comments section.

Recursions Part 2: Generating Functions

Prerequisites: Power Series, Recursions Part 1

Ok folks, it's been a long and relaxing vacation for me, which is why you haven't seen any new posts in the past 3 weeks. I had intended to do this one Live aus Berlin, but it turned out there was lots of better stuff to do there (including the literally 5 hours I spent at the absolutely fantastic Deutsches Historisches Museum; and I only left because they were closing. I started at Charlemagne at 1 PM and was only halfway through the WW2 section when they closed at 6; I still had the entire GDR and Cold War to go...), so this one is coming to you Live aus London instead. Hope that's ok...who am I kidding? I'm pretty sure only like 5 people actually read this anyway, so of course it's ok! Also, those of you who know me won't be surprised to learn that I am writing this from the pub, so just a fair warning that the quality may well deteriorate towards the end...

In the last post, I did a brief overview of how we can use a power series to represent a function, and we wanted the series to converge (pointwise) to our function within a neighborhood of $x=0$, or some other central point. In the post before that, I showed you how to solve a simple recursion. In that post, the recursion we came up with was of degree 1, which means that the $n$-th term depended only on the $(n-1)$-th term, and so we could move them both to the same side of the equals sign to get an expression for the $n$-th difference $R(n)-R(n-1)$. We then summed up the differences to get a formula for $R(n)$ in terms of $n$.

That method was pretty fresh, but if the recursive definition of $R(n)$ includes terms earlier than $R(n-1)$, for example $R(n-2)$, then this method won't work anymore. What I'm going to show you in this post is one of the most clever things I've ever seen, and it is a very powerful method for dealing with higher-degree recursions (I will specify what this means soon if you haven't guessed it already). Basically, what we are going to do is find a function whose power series $\sum_{n=0}^{\infty}{c_n z^n}$ has coefficients $c_n$ which are the $n$-th terms of the recursion we want to solve. Once we back out the coefficients using Taylor's theorem, we will have solved the recursion. Whoever thought of this was wicked creative (and that's not a subtle self-call, because it definitely wasn't me).

What's interesting is that with generating functions, we just need the power series, but we really won't care when it converges, because we are just using it to back out the coefficients. I'll go into more detail on this point below, but it's a big departure from how we looked at power series before, where we pretty much only cared about if, when, and how (i.e. pointwise, uniformly, etc.) the series converged to our function.

Without further ado, let's get crackin'.

Types of Recursions


I started a bit on this above, but a recursion of degree $d$ is one in which the $n$-th term $R(n)$ depends only on (some or all of) the terms $R(n-1)$ through $R(n-d)$, with $R(n-d)$ actually appearing. So $R(n) = R(n-1) + 4nR(n-3)$ is a recursion of degree 3, for example. Note that for a recursion of degree $d$, we need to be given the values of the first $d$ terms in order to solve it.

A recursion equation is linear if the formula for $R(n)$ is a linear combination of the previous terms, i.e. (for some degree $d$) $$
R(n) = a_{n-1} R(n-1) + a_{n-2} R(n-2) + ... + a_{n-d} R(n-d) + \alpha (n)
$$ where the $a_i$'s can either be numbers or depend on $n$, but can't contain any earlier terms, and $\alpha (n)$ is some fixed function of $n$ (the non-homogeneous, or "forcing," term). If the latter is just 0, then the recursion (or recurrence, not sure if I'm using the word 100% correctly, but whatever, it's the same thing in my mind) is called homogeneous. If the recursion formula has, for example, products of earlier terms in it, then it isn't linear.

Here are some examples of recursions:

Tower of Hanoi
This one arises when we analyze that game with the 3 rods on which we have discs of varying diameter; you know, this one:
The object of the game is to move all the discs from rod 1 to rod 3 by moving only one at a time from the top of a stack, and with the proviso that we may never place a larger one on top of a smaller one. If there are $n$ discs, then the number of moves $h_n$ it takes to accomplish this is given by the recursion formula $$
h_n = 2h_{n-1} + 1 \\
h_1 = 1
$$See if you can derive this formula on your own by assuming the number of moves for $n-1$ discs is given and then solving the game for $n$ discs. This is a linear, non-homogeneous recursion of degree 1.
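
Here's a tiny sketch (mine, not from the post) that just iterates the recursion; deriving the formula is still your exercise:

```python
def hanoi_moves(n):
    """Minimum number of moves for n discs, via h_n = 2*h_{n-1} + 1, h_1 = 1."""
    h = 1
    for _ in range(n - 1):
        h = 2 * h + 1
    return h

print([hanoi_moves(n) for n in range(1, 8)])  # [1, 3, 7, 15, 31, 63, 127]
```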

Fibonacci Numbers
The Fibonacci numbers are ubiquitous, arising from the problem of reproducing pairs of rabbits, the golden ratio (tile squares whose side lengths are the Fibonacci numbers and draw a quarter-circle arc through each square; the resulting curve is called a Fibonacci spiral), the number of binary strings of length $n$ without consecutive 1's, etc. The recursion formula is $$
f_n = f_{n-1} + f_{n-2} \\
f_1 = 1, f_2 = 1
$$ which is linear (with constant coefficients, i.e. they don't depend on $n$ as the coefficients are both 1) of degree 2 and homogeneous.

Catalan Numbers
These are ubiquitous in the field of combinatorics, arising from all sorts of counting problems. The recursion formula is $$
C_n = C_0 C_{n-1} + C_1 C_{n-2} + ... + C_{n-1} C_0 \\
C_0 = 1
$$ which is non-linear, not of finite degree (because the number of previous terms in the formula depends on $n$), and homogeneous.
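
Because the formula uses all the earlier terms, computing each new term takes a full inner sum; here's a quick sketch (my own):

```python
def catalan(n_max):
    """First n_max + 1 Catalan numbers via the full-history recursion."""
    C = [1]  # C_0 = 1
    for n in range(1, n_max + 1):
        # C_n = C_0*C_{n-1} + C_1*C_{n-2} + ... + C_{n-1}*C_0
        C.append(sum(C[i] * C[n - 1 - i] for i in range(n)))
    return C

print(catalan(7))  # [1, 1, 2, 5, 14, 42, 132, 429]
```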

Generating Functions


Now that we've defined what types of "harder" recursions we can encounter, let's move on to the main topic, which is solving them using generating functions.

A generating function for a recurrence $R(n)$ is a formal power series $\sum_{n=0}^{\infty}{c_n z^n}$ where the $n$-th coefficient $c_n$ is equal to the $n$-th recurrence value $R(n)$ for every value of $n$. Technically, I have defined an ordinary generating function (OGF); there are other types such as exponential and Poisson generating functions which are easier to apply in different situations, but they all have the same idea of the $n$-th coefficient's coinciding with the $n$-th thing that we are looking for. I'm only going to talk about OGF's in this post.

In the definition above, I mentioned a formal power series. A formal power series is just a power series where we don't care at all whether it converges, or for which values of $z$ it does. Basically, we are just going to use the $z$'s as a tool to help keep track of our coefficients $c_n$, without any regard for the convergence properties of the series.

To add and subtract two formal power series, we just add/subtract the coefficients of the terms with the same power of $z$, and to multiply them, we use the distributive property: basically, all the same way that we would do it for a regular power series. The same applies to taking derivatives and integrals of the series.
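
To make the "formal" point concrete, here's a minimal sketch (my own) that treats a truncated formal power series as nothing more than a list of coefficients; multiplying is just the convolution you get from the distributive property:

```python
def add(a, b):
    """Coefficient-wise sum of two (truncated) formal power series."""
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))
    b = b + [0] * (n - len(b))
    return [x + y for x, y in zip(a, b)]

def mul(a, b):
    """Product: the coefficient of z^n is sum_i a_i * b_{n-i} (a convolution)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

print(add([1, 2, 3], [4, 5]))   # [5, 7, 3]
print(mul([1, 1], [1, 1]))      # [1, 2, 1], i.e. (1 + z)^2 = 1 + 2z + z^2
```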

To show how to use OGF's to solve a recursion, let's take as an example the following linear, homogeneous recursion of degree 2: $$
g_n = 5g_{n-1} - 6g_{n-2} \\
g_0 = 1, g_1 = 2
$$ What we need to do is find a function $G(z)$ which has a power series whose $n$-th coefficient is equal to $g_n$.

Here's how it's done. We'll multiply both sides of the recursion equation by $z^n$, sum them from $n = 2$ to $n = \infty$ (starting at 2 because the recursion formula holds from $n=2$ on, with $g_0$ and $g_1$ given), and then get an algebraic expression in terms of $G(z)$. Heeeeeeeeeere we go:$$
\begin{align}
g_n &= 5g_{n-1} - 6g_{n-2} \\[3mm]

g_n z^n &= 5g_{n-1}z^n - 6g_{n-2}z^n \\[3mm]

\sum_{n=2}^{\infty}{g_n z^n} &=
\sum_{n=2}^{\infty}{5g_{n-1}z^n} - \sum_{n=2}^{\infty}{6g_{n-2}z^n} \\[3mm]

\sum_{n=2}^{\infty}{g_n z^n} &=
5z \left( \sum_{n=2}^{\infty}{g_{n-1}z^{n-1}} \right) -
6z^2 \left( \sum_{n=2}^{\infty}{g_{n-2}z^{n-2}} \right) \\[3mm]

\sum_{n=0}^{\infty}{g_n z^n} - g_1 z - g_0 &=
5z \left( \sum_{n=1}^{\infty}{g_{n-1}z^{n-1}} - g_0 \right) -
6z^2 \left( \sum_{n=2}^{\infty}{g_{n-2}z^{n-2}} \right) \\[3mm]

G(z) - g_1 z - g_0 &=
5z \left( G(z) - g_0 \right) -
6z^2 \left( G(z) \right) \\[3mm]

G(z) (1 - 5z + 6z^2) &= g_1z + g_0 - 5g_0 z \\[3mm]

G(z) &= \dfrac{(g_1 - 5g_0)z + g_0}{1- 5z + 6z^2}\\[3mm]

G(z) &= \dfrac{(1 - 3z)}{1- 5z + 6z^2} \\[3mm]

G(z) &= \dfrac{(1 - 3z)}{(1 - 3z)(1 - 2z)} \\[3mm]

G(z) &= \dfrac{1}{1 - 2z} \\[3mm]

G(z) &= \sum_{n=0}^{\infty}{2^n z^n}

\end{align}
$$ Note that in the last step, we used the known power series $$\dfrac{1}{1-x} = \sum_{n=0}^{\infty}{x^n}$$ and substituted in $2z$ for $x$.

Now, as discussed in the Power Series post, this series only converges when $|x| < 1$, i.e. when $|z| < \frac{1}{2}$. But guess what? WE DON'T CARE! Because this was just a way to get the coefficients, and we got them. Thus we have solved the recursion; the solution is $$
g_n = 2^n
$$ for all values of $n$. You can verify this by plugging the answer back into the recursion equation, and you'll see that both sides of the equals sign balance.
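
If you'd rather have the computer do the plugging-in, here's a quick check (my own sketch) that $2^n$ really does satisfy both the recursion and the initial conditions:

```python
# g_n = 5*g_{n-1} - 6*g_{n-2}, with g_0 = 1, g_1 = 2; claimed solution g_n = 2^n
g = [1, 2]
for n in range(2, 20):
    g.append(5 * g[n - 1] - 6 * g[n - 2])

print(all(g[n] == 2**n for n in range(20)))  # True
```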

So that is pretty awesome and illustrates the idea behind OGF's. If you don't think that was wicked cool, then I would have to recommend more..."primitive" forms of entertainment for you, such as MTV's hit series Jersey Shore (which, I must admit, I am a big fan of myself...).

Now, the astute reader probably did think that was wicked cool, but is still just a tad bit bugged by the fact that this example was so obviously contrived so that $G(z)$ had a denominator which factored easily to give a really simple answer.

The astute reader is correct. If we look at the exact same recursion, except change one of the initial conditions from $g_0 = 1$ to $g_0 = 0$, we can repeat the exact same process to arrive at $$
G(z) = \dfrac{2z}{(1-3z)(1-2z)}
$$ which does not easily reduce to a known power series.

But not to worry; we can use a method called partial fraction decomposition to rewrite this quotient as $$
\begin{align}
G(z) &= \dfrac{2}{1-3z} + \dfrac{-2}{1-2z}\\[3mm]
G(z) &= \sum_{n=0}^{\infty}{2 \cdot 3^n z^n} + \sum_{n=0}^{\infty}{(-2) 2^n z^n} \\[3mm]
&\Downarrow \\[3mm]
g_n &= 2 \cdot 3^n - 2^{n+1}
\end{align}
$$ Now, I don't really want to go into detail about partial fraction decomposition, because it's not that exciting, and you can look it up really quickly on Google if you want to know more about it. But I did want to point it out as a useful method when you end up with $G(z)$'s for which you can't just factor the denominator and then use the known power series for $\dfrac{1}{1-x}$.
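
The same sanity check works for the modified initial conditions (again just my own sketch):

```python
# g_n = 5*g_{n-1} - 6*g_{n-2}, now with g_0 = 0, g_1 = 2;
# claimed solution from the partial fractions: g_n = 2*3^n - 2^(n+1)
g = [0, 2]
for n in range(2, 20):
    g.append(5 * g[n - 1] - 6 * g[n - 2])

print(all(g[n] == 2 * 3**n - 2**(n + 1) for n in range(20)))  # True
```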

Another thing I want to point out here is that, although we just looked at an example of a homogeneous recursion, the OGF method also works for non-homogeneous recursions such as the Tower of Hanoi one above.

If you work out the same steps for that one, you'll get the generating function $$
H(z) = \dfrac{z}{(1-z)(1-2z)} = \dfrac{1}{1-2z} - \dfrac{1}{1-z}
$$ and thus $h_n = 2^n - 1$, once again by using the known series of $\dfrac{1}{1-x}$.

Just to sum up here, I'll show you the OGF's of the Fibonacci and Catalan numbers mentioned above.

For the Fibonacci's, the OGF is $$
F(z) = \dfrac{z}{1-z-z^2}
$$ You need to use a partial fraction decomposition to extract the coefficients, which turn out to be $$
f_n = \dfrac{1}{\sqrt 5} \cdot (\gamma^n - {\hat{\gamma}}^n)
$$ where $\gamma$ is the golden mean $\dfrac{1+\sqrt 5}{2}$, and $\hat \gamma$ is its conjugate $\dfrac{1-\sqrt 5}{2}$. This formula for $f_n$ is called the Binet formula for the Fibonacci numbers, named after Jacques Binet, who rediscovered it in 1843 after, you guessed it, our prolific friend Euler had published it in 1765.
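
Here's a quick numerical check of the Binet formula against the recursion (my own sketch; I round to absorb the floating-point error from $\sqrt 5$):

```python
from math import sqrt

gamma = (1 + sqrt(5)) / 2       # golden mean
gamma_hat = (1 - sqrt(5)) / 2   # its conjugate

def binet(n):
    # Binet formula; round to the nearest integer to absorb floating-point error
    return round((gamma**n - gamma_hat**n) / sqrt(5))

# Fibonacci numbers from the recursion, indexed so fib[n] = f_n with f_1 = f_2 = 1
fib = {1: 1, 2: 1}
for n in range(3, 25):
    fib[n] = fib[n - 1] + fib[n - 2]

print(all(binet(n) == fib[n] for n in range(1, 25)))  # True
```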

I also used the Catalan numbers as an example, and you can click here to see the generating function and formula for these, from the very cool math blog of a guy named Mike Spivey. Who knew there was another math blog out there? Well, there you have it, there is! [Update: as I mention in the comments in response to Joe's question, I didn't do any research on the history of the Binet formula above when I first wrote this post, but now that I have looked into it, I found an interesting history of collaboration as well as independent discovery among famous mathematicians in various nations (frequently at war with each other during the period of European history in question) as they encountered these prolific number sequences and developed generating function and other methods in the 18th and 19th centuries. Here is a link to a good history of the Catalan numbers.]

And with that, there you have it: that's the end of this post. I hope you enjoyed it and thought that OGF's are as cool as I think they are. Just another example of how math and creativity are absolutely NOT mutually exclusive (in fact, quite the opposite, though I must admit that I myself have zero creativity and just copy most of this stuff from somewhere else that I've seen it...)!

Thanks for reading, and please post any questions in the comments section.