## Recursions Part 2: Generating Functions

Prerequisites: Power Series, Recursions Part 1

Ok folks, it's been a long and relaxing vacation for me, which is why you haven't seen any new posts in the past 3 weeks. I had intended to do this one Live aus Berlin, but it turned out there was lots of better stuff to do there (including the literally 5 hours I spent at the absolutely fantastic Deutsches Historisches Museum; and I only left because they were closing. I started at Charlemagne at 1 PM and was only half-way through the WW2 section when they closed at 6- I still had the entire GDR and cold war to go...), so this one is coming to you Live aus London instead. Hope that's ok...who am I kidding? I'm pretty sure only like 5 people actually read this anyway, so of course it's ok! Also, those of you who know me won't be surprised to learn that I am writing this from the pub, so just a fair warning that the quality may well deteriorate towards the end...

In the last post, I did a brief overview of how we can use a power series to represent a function, and we wanted the series to converge (pointwise) to our function within a neighborhood of $x=0$, or some other central point. In the post before that, I showed you how to solve a simple recursion. In that post, the recursion we came up with was of degree 1, which means that the $n$-th term depended only on the $(n-1)$-th term, and so we could move them both to the same side of the equals sign to get an expression for the $n$-th difference $R(n)-R(n-1)$. We then summed up the differences to get a formula for $R(n)$ in terms of $n$.

That method was pretty fresh, but if the recursive definition of $R(n)$ includes terms before just $R(n-1)$, for example $R(n-2)$, then this method won't work anymore. What I'm going to show you in this post is one of the most clever things I've ever seen, and it is a very powerful method for dealing with higher-degree recursions (I will specify what this means soon if you haven't guessed it already). Basically, what we are going to do is find a function whose power series $\sum_{n=0}^{\infty}{c_n z^n}$ has coefficients $c_n$ which are the $n$-th terms of the recursion we want to solve. Once we back out the coefficients using Taylor's theorem, we will have solved the recursion. Whoever thought of this was wicked creative (and that's not a subtle self-call, because it definitely wasn't me).

What's interesting is that with generating functions, we just need the power series, but we really won't care when it converges, because we are just using it to back out the coefficients. I'll go into more detail on this point below, but it's a big departure from how we looked at power series before, where we pretty much only cared about if, when, and how (i.e. pointwise, uniformly, etc.) the series converged to our function.

Without further ado, let's get crackin'.

#### Types of Recursions

I started a bit on this above, but a recursion of degree $d$ is one in which the $n$-th term $R(n)$ depends only on some or all of the terms $R(n-1)$ through $R(n-d)$. So $R(n) = R(n-1) + 4nR(n-3)$ is a recursion of degree 3, for example. Note that for a recursion of degree $d$, we need to be given the values of the first $d$ terms in order to solve it.

A recursion equation is linear if the formula for $R(n)$ is a linear combination of the previous terms, i.e. (for some degree $d$) $$R(n) = a_1 R(n-1) + a_2 R(n-2) + ... + a_d R(n-d) + \alpha (n)$$ where the $a_i$'s can either be numbers or depend on $n$, but can't contain any earlier terms, and $\alpha (n)$ is some fixed function of $n$, often called the non-homogeneous (or forcing) term. If the latter is just 0, then the recursion (or recurrence- the two words mean the same thing for our purposes) is called homogeneous. If the recursion formula has, for example, products of earlier terms in it, then it isn't linear.

Here are some examples of recursions:

**Tower of Hanoi**
This one arises when we analyze that classic game with the 3 rods, on which we have discs of varying diameter stacked in order of size.
The object of the game is to move all the discs from rod 1 to rod 3 by moving only one at a time from the top of a stack, and with the proviso that we may never place a larger one on top of a smaller one. If there are $n$ discs, then the number of moves $h_n$ it takes to accomplish this is given by the recursion formula $$h_n = 2h_{n-1} + 1 \\ h_1 = 1$$ See if you can derive this formula on your own by assuming the number of moves for $n-1$ discs is given and then solving the game for $n$ discs. This is a linear, non-homogeneous recursion of degree 1.
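If you want to play with the numbers, here's a quick sketch that just iterates the recursion above (the function name is mine, not standard):

```python
def hanoi_moves(n):
    """Minimum number of moves for n discs, straight from h_n = 2*h_{n-1} + 1, h_1 = 1."""
    h = 1  # h_1
    for _ in range(2, n + 1):
        h = 2 * h + 1
    return h

print([hanoi_moves(n) for n in range(1, 6)])  # [1, 3, 7, 15, 31]
```

You can already guess the closed form from those values; we'll derive it properly with a generating function below.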

**Fibonacci Numbers**
The Fibonacci numbers are ubiquitous, arising from the problem of reproducing pairs of rabbits, the golden ratio (tile a rectangle with squares whose side lengths are Fibonacci numbers- a curve through the squares' corners is called a Fibonacci spiral), the number of binary strings of length $n$ without consecutive 1's, etc. The recursion formula is $$f_n = f_{n-1} + f_{n-2} \\ f_1 = 1, f_2 = 1$$ which is linear (with constant coefficients, i.e. they don't depend on $n$- here both are 1) of degree 2 and homogeneous.
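A minimal sketch of the recursion in code (the function name is mine):

```python
def fib(n):
    """n-th Fibonacci number via f_n = f_{n-1} + f_{n-2}, with f_1 = f_2 = 1."""
    a, b = 1, 1  # f_1, f_2
    for _ in range(n - 1):
        a, b = b, a + b
    return a

print([fib(n) for n in range(1, 9)])  # [1, 1, 2, 3, 5, 8, 13, 21]
```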

**Catalan Numbers**
These are ubiquitous in the field of combinatorics, arising from all sorts of counting problems. The recursion formula is $$C_n = C_0 C_{n-1} + C_1 C_{n-2} + ... + C_{n-1} C_0 \\ C_0 = 1$$ which is non-linear, not of finite degree (because the number of previous terms in the formula depends on $n$), and homogeneous.
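Since the formula for $C_n$ uses all the earlier terms, the natural sketch builds up a whole list of values (again, naming is mine):

```python
def catalan(n):
    """Catalan numbers from the convolution recursion C_n = sum_{k=0}^{n-1} C_k * C_{n-1-k}."""
    C = [1]  # C_0
    for m in range(1, n + 1):
        C.append(sum(C[k] * C[m - 1 - k] for k in range(m)))
    return C[n]

print([catalan(n) for n in range(7)])  # [1, 1, 2, 5, 14, 42, 132]
```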

#### Generating Functions

Now that we've defined what types of "harder" recursions we can encounter, let's move on to the main topic, which is solving them using generating functions.

A generating function for a recurrence $R(n)$ is a formal power series $\sum_{n=0}^{\infty}{c_n z^n}$ where the $n$-th coefficient $c_n$ is equal to the $n$-th recurrence value $R(n)$ for every value of $n$. Technically, I have defined an ordinary generating function (OGF); there are other types such as exponential and Poisson generating functions which are easier to apply in different situations, but they all have the same idea of the $n$-th coefficient's coinciding with the $n$-th thing that we are looking for. I'm only going to talk about OGF's in this post.

In the definition above, I mentioned a formal power series. This is a power series where we don't care at all whether it converges, or for which values of $z$ it does. Basically, we are just going to use the $z$'s as a tool to help keep track of our coefficients $c_n$, without any regard for the convergence properties of the series.

To add and subtract two formal power series, we just add/subtract the coefficients of the terms with the same power of $z$, and to multiply them, we use the distributive property- basically, all the same way that we would do it for a regular power series. The same applies to taking derivatives and integrals of the series.
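To make this concrete, here's a minimal sketch where a (truncated) formal power series is just a list of coefficients, addition is termwise, and multiplication is the convolution you get from the distributive property (function names are mine):

```python
def ps_add(a, b):
    """Add two formal power series given as coefficient lists of equal length."""
    return [x + y for x, y in zip(a, b)]

def ps_mul(a, b):
    """Multiply two formal power series: the coefficient of z^m is sum_k a_k * b_{m-k}."""
    n = len(a)
    return [sum(a[k] * b[m - k] for k in range(m + 1)) for m in range(n)]

# (1 - 3z)(1 - 2z) = 1 - 5z + 6z^2, truncated to 4 terms
print(ps_mul([1, -3, 0, 0], [1, -2, 0, 0]))  # [1, -5, 6, 0]
```

Note that nothing here ever asks where the series converges; the lists are pure bookkeeping.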

To show how to use OGF's to solve a recursion, let's take as an example the following linear, homogeneous recursion of degree 2: $$g_n = 5g_{n-1} - 6g_{n-2} \\ g_0 = 1, g_1 = 2$$ What we need to do is find a function $G(z)$ which has a power series whose $n$-th coefficient is equal to $g_n$.

Here's how it's done. We'll multiply both sides of the recursion equation by $z^n$, sum them from $n = 2$ to $n = \infty$ (starting at 2 because the recursion formula starts at $n=2$, with $g_0$ and $g_1$ given), and then get an algebraic expression in terms of $G(z)$. Heeeeeeeeeere we go:\begin{align} g_n &= 5g_{n-1} - 6g_{n-2} \\[3mm] g_n z^n &= 5g_{n-1}z^n - 6g_{n-2}z^n \\[3mm] \sum_{n=2}^{\infty}{g_n z^n} &= \sum_{n=2}^{\infty}{5g_{n-1}z^n} - \sum_{n=2}^{\infty}{6g_{n-2}z^n} \\[3mm] \sum_{n=2}^{\infty}{g_n z^n} &= 5z \left( \sum_{n=2}^{\infty}{g_{n-1}z^{n-1}} \right) - 6z^2 \left( \sum_{n=2}^{\infty}{g_{n-2}z^{n-2}} \right) \\[3mm] \sum_{n=0}^{\infty}{g_n z^n} - g_1 z - g_0 &= 5z \left( \sum_{n=1}^{\infty}{g_{n-1}z^{n-1}} - g_0 \right) - 6z^2 \left( \sum_{n=2}^{\infty}{g_{n-2}z^{n-2}} \right) \\[3mm] G(z) - g_1 z - g_0 &= 5z \left( G(z) - g_0 \right) - 6z^2 \left( G(z) \right) \\[3mm] G(z) (1 - 5z + 6z^2) &= g_1z + g_0 - 5g_0 z \\[3mm] G(z) &= \dfrac{(g_1 - 5g_0)z + g_0}{1- 5z + 6z^2}\\[3mm] G(z) &= \dfrac{(1 - 3z)}{1- 5z + 6z^2} \\[3mm] G(z) &= \dfrac{(1 - 3z)}{(1 - 3z)(1 - 2z)} \\[3mm] G(z) &= \dfrac{1}{1 - 2z} \\[3mm] G(z) &= \sum_{n=0}^{\infty}{2^n z^n} \end{align} Note that in the last step, we used the known power series $$\dfrac{1}{1-x} = \sum_{n=0}^{\infty}{x^n}$$ and substituted in $2z$ for $x$.

Now, as discussed in the Power Series post, this series only converges when $|x| < 1$ or $|z| < \frac{1}{2}$. But guess what? WE DON'T CARE! Because this was just a way to get the coefficients, and we got them. Thus we have solved the recursion; the solution is $$g_n = 2^n$$ for all values of $n$. You can verify this by plugging the answer back into the recursion equation, and you'll see that both sides of the equals sign balance.
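Here's that verification done by brute force: iterate the recursion and compare against $2^n$ (the function name is mine):

```python
def g(n):
    """Compute g_n straight from g_n = 5*g_{n-1} - 6*g_{n-2}, with g_0 = 1, g_1 = 2."""
    a, b = 1, 2  # g_0, g_1
    for _ in range(n):
        a, b = b, 5 * b - 6 * a
    return a

# The recursion values match the closed form 2^n
print(all(g(n) == 2 ** n for n in range(15)))  # True
```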

So that is pretty awesome and illustrates the idea behind OGF's. If you don't think that was wicked cool, then I would have to recommend more..."primitive" forms of entertainment for you, such as MTV's hit series Jersey Shore (which, I must admit, I am a big fan of myself...).

Now, the astute reader probably did think that was wicked cool, but is still just a tad bit bugged by the fact that this example was so obviously contrived so that $G(z)$ had a denominator which factored easily to give a really simple answer.

The astute reader is correct. If we look at the exact same recursion, except change one of the initial conditions from $g_0 = 1$ to $g_0 = 0$, we can repeat the exact same process to arrive at $$G(z) = \dfrac{2z}{(1-3z)(1-2z)}$$ which does not easily reduce to a known power series.

But not to worry; we can use a method called partial fraction decomposition to rewrite this quotient as \begin{align} G(z) &= \dfrac{2}{1-3z} + \dfrac{-2}{1-2z}\\[3mm] G(z) &= \sum_{n=0}^{\infty}{2 \cdot 3^n z^n} + \sum_{n=0}^{\infty}{(-2) 2^n z^n} \\[3mm] &\Downarrow \\[3mm] g_n &= 2 \cdot 3^n - 2^{n+1} \end{align} Now, I don't really want to go into detail about partial fraction decomposition, because it's not that exciting, and you can look it up really quickly on Google if you want to know more about it. But I did want to point it out as a useful method when you end up with $G(z)$'s for which you can't just factor the denominator and then use the known power series for $\dfrac{1}{1-x}$.
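As a sanity check on that decomposition, here's the same brute-force comparison for the modified initial conditions (function name mine):

```python
def g(n):
    """Same recursion g_n = 5*g_{n-1} - 6*g_{n-2}, but with g_0 = 0, g_1 = 2."""
    a, b = 0, 2  # g_0, g_1
    for _ in range(n):
        a, b = b, 5 * b - 6 * a
    return a

# Matches the closed form from the partial fraction decomposition
print(all(g(n) == 2 * 3 ** n - 2 ** (n + 1) for n in range(15)))  # True
```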

Another thing I want to point out here is that, although we just looked at an example of a homogeneous recursion, the OGF method also works for non-homogeneous recursions such as the Tower of Hanoi one above.

If you work out the same steps for that one, you'll get the generating function $$H(z) = \dfrac{z}{(1-z)(1-2z)} = \dfrac{1}{1-2z} - \dfrac{1}{1-z}$$ and thus $h_n = 2^n - 1$, once again by using the known series of $\dfrac{1}{1-x}$.
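And the same kind of check for the Tower of Hanoi closed form (function name mine):

```python
def h(n):
    """Tower of Hanoi move count from the recursion h_n = 2*h_{n-1} + 1, h_1 = 1."""
    val = 1  # h_1
    for _ in range(2, n + 1):
        val = 2 * val + 1
    return val

# The recursion values match the closed form 2^n - 1
print(all(h(n) == 2 ** n - 1 for n in range(1, 15)))  # True
```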

Just to sum up here, I'll show you the OGF's of the Fibonacci and Catalan numbers mentioned above.

For the Fibonacci's, the OGF is $$F(z) = \dfrac{z}{1-z-z^2}$$ You need to use a partial fraction decomposition to extract the coefficients, which turn out to be $$f_n = \dfrac{1}{\sqrt 5} \cdot (\gamma^n - {\hat{\gamma}}^n)$$ where $\gamma$ is the golden mean $\dfrac{1+\sqrt 5}{2}$, and $\hat \gamma$ is its conjugate $\dfrac{1-\sqrt 5}{2}$. This formula for $f_n$ is called the Binet formula for the Fibonacci numbers, named after Jacques Binet, who rediscovered it in 1843 after, you guessed it, our prolific friend Euler had published it in 1765.
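Because of the irrational $\sqrt{5}$'s, it's fun to check numerically that the Binet formula really spits out integers that match the recursion (function names are mine; the `round` just cleans up floating-point error):

```python
import math

def binet(n):
    """Binet's formula: f_n = (gamma^n - gamma_hat^n) / sqrt(5)."""
    gamma = (1 + math.sqrt(5)) / 2      # golden mean
    gamma_hat = (1 - math.sqrt(5)) / 2  # its conjugate
    return round((gamma ** n - gamma_hat ** n) / math.sqrt(5))

def fib(n):
    """n-th Fibonacci number straight from the recursion."""
    a, b = 1, 1  # f_1, f_2
    for _ in range(n - 1):
        a, b = b, a + b
    return a

print(all(binet(n) == fib(n) for n in range(1, 20)))  # True
```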

I also used the Catalan numbers as an example, and you can click here to see the generating function and formula for these, from the very cool math blog of a guy named Mike Spivey. Who knew there was another math blog out there? Well, there you have it, there is! [Update: as I mention in the comments in response to Joe's question, I didn't do any research on the history of the Binet formula above when I first wrote this post, but now that I have looked into it, I found an interesting history of collaboration as well as independent discovery among famous mathematicians in various nations (frequently at war with each other during the period of European history in question) as they encountered these prolific number sequences and developed generating function and other methods in the 18th and 19th centuries. Here is a link to a good history of the Catalan numbers.]

And with that, there you have it- that's the end of this post. I hope you enjoyed it and thought that OGF's are as cool as I think they are. Just another example of how math and creativity are absolutely NOT mutually exclusive (in fact, quite the opposite, though I must admit that I myself have zero creativity and just copy most of this stuff from somewhere else that I've seen it...)!

1. 1) What does it mean to "rediscover math"? Like did *nobody* remember that Euler wrote that paper and then Binet was just the first person to find it again? Or did he sincerely think he was being original?
2) It's a little unclear to me what you mean when you say you are "solving" the recursion. Does that just mean you are taking the already nifty equation we have and making it even niftier by eliminating its self-referential nature?
3) Generating functions are really cool. We use them a lot in physical chemistry too (for statistical mechanics). You probably used them in physics as well. But we called them "partition functions". Conceptually, it seems very similar: the sum of all states. But can you also use these partition functions as generating functions? That is to say, we would often take partial derivatives of our generating function to calculate expected values. Is there some sort of expected value for these recursions?
4) This also looks pretty similar to my good buddy, Fourier, and my slightly less good buddy, Laplace. Can we use generating function techniques to find the coefficients for those series?
As always, some very cool stuff from G T Math. I look forward to reading the next one :D

1. Good questions.
1) I am having trouble finding the full story for the Binet formula above (honestly, I just copied that fact from where I was reading about the formula and didn't really investigate it further), but upon seeing your question, I looked into it and found a full history of the development of the Catalan numbers also mentioned above. These are some of the most prolific in combinatorics; it turns out that many famous mathematicians including Euler were working on problems in which these numbers arose as early as the mid-18th century, in various countries, and sometimes collaborating via letters. Igor Pak wrote a good summary of this history, to which I have now added a link in the post above- check it out. If you go onto his page on the UCLA math website, he even has links to the letters he mentions, which are in German, French, and Latin, some with English translations. Pretty cool to see Euler's handwriting (it's pretty much impossible to read, but still)...

2. (breaking my answer into a few replies here...)
2) Yes, exactly- to "solve" a recurrence means to find a formula for the $n$-th term which depends on $n$, but not on any of the earlier terms- in other words, to eliminate the self-referential nature as you put it.

3. 3) You're exactly right: partition functions are very similar in the sense that they sort of store a bunch of information we want, which we can extract by taking derivatives or extracting coefficients of the series. Thus partition functions are a type of generating function. To the second part of your question, I wouldn't say the generating function of a recurrence has an expected value since the coefficients aren't probabilities. If you take the derivative of an OGF, you get a new recurrence with the terms shifted over and multiplied by their index ($n$ value) from the old recurrence- can you see why?

2. 4) There is already a formula for the coefficients of a Fourier series, but once again there is a connection in that the Fourier series basically divides a function into its constituent single-frequency signals. There is also a similarity between solving differential equations using integral transforms and solving recurrences using generating functions: in both cases, we turn the problem into an algebraic one which is easier to solve and then extract the answer we were looking for.