Power Series

As the title suggests, this post is going to cover power series. While I haven't listed any prerequisites at the top, this will probably be a bit "hand-wavy" for those who haven't seen sequences/series or calculus before. Sorry, but I'm not going to write a series of posts on those topics; it would take a long time, and they're already covered all over the internet. For a refresher, I'd direct you to Paul's Online Math Notes, a great website written by a math professor at Lamar University, and actually where I originally learned this post-calculus I material. He has a whole section on sequences and series as part of an entire online course on calculus I-III. I'll also address any specific questions in the comments section, so feel free to ask there.

So, why am I writing this post at all? As of today, June 12th, 2015, there have been 83 page views on the previous post about solving a recursion, making it one of the most popular to date given the short time it's been up. In that post, we went through how to solve a simple recursion by shuffling terms around to arrive at a sum in terms of the differences between successive terms. This method, sweet though it is, does not work for more complicated recursions, but there is a super clever method for solving those by finding their "generating function," which is actually so clever that it's going to blow your mind. That method involves a sort of "stepchild" of power series, so this post is really going to serve as a refresher and prerequisite for that one. The idea of a power series (and other series representations of functions, e.g. Fourier series) is also really cool in its own right, and was in fact a reader request (ok well, technically, a request from a guy who not only doesn't read this, but also said he still won't read it even if I make a post about power series), so even if said "reader" doesn't see it, it's still a solid topic.

The sum of an infinite series


To talk about power series, we need to have a notion of the sum of an infinite series. A series is a sum of an infinite list of numbers (said list, without the sum, is called a sequence). If that sequence is the list of numbers $(a_n)_{n \in {\Bbb N}}$ (and notice I've used a common notation for sequences here- this means the list of terms $a_n$ with the index $n$ encompassing the counting numbers ${\Bbb N} = \{ 0,1,2,3,... \}$), then the series would be written as the infinite sum $$
\sum_{n=0}^{\infty}{a_n}
$$ To be precise, this infinite sum, like all infinite things in the study of real numbers, really refers to a limit of finite sums. In particular, the sum $\sum_{n=0}^{\infty}{a_n}$ is the limit as $N$ goes to infinity, of the sequence of partial sums $S_N = \sum_{n=0}^{N}{a_n}$. Get it? The partial sums are a sequence, and the sum of the infinite series is the limit of the sequence. In symbols, $$
\sum_{n=0}^{\infty}{a_n} \buildrel {\scr def} \over{=}
\lim_{N \rightarrow \infty} S_N = \lim_{N \rightarrow \infty} \sum_{n=0}^{N}{a_n}
$$ Note that the symbol $\buildrel {\scr def} \over{=}$ stands for "equals by definition." Clearly, in order for this sequence of $S_N$'s to converge to a (non-infinite) number, the numbers $S_N$ need to "level off" and settle down near some fixed value.
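If you like seeing these things numerically, here's a minimal sketch in Python (the helper name `partial_sum` is just something I made up for illustration) that computes $S_N$ for the series $\sum_{n=0}^{\infty}{\frac{1}{2^n}}$ and shows the partial sums leveling off at 2:

```python
# Compute partial sums S_N = a_0 + a_1 + ... + a_N and watch them level off.
def partial_sum(a, N):
    """Sum the first N+1 terms a(0), a(1), ..., a(N) of a sequence."""
    return sum(a(n) for n in range(N + 1))

a = lambda n: 1 / 2**n  # terms of the series 1 + 1/2 + 1/4 + ...

for N in [1, 5, 10, 20]:
    print(N, partial_sum(a, N))
# The partial sums approach 2, the sum of this particular series.
```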

The difference between $S_{N+1}$ and $S_N$ is $a_{N+1}$, so in order for the partial sums to level off, we need the terms to get smaller, or technically closer to zero (i.e. $|a_n|$ gets smaller), since we could be talking about negative numbers, not just positive. So in order for the sum of the infinite series to exist, it must be the case that $\lim_{n \rightarrow \infty} |a_n| = 0$. By the way, when the partial sums eventually get arbitrarily close to each other, i.e. $|S_M - S_N| \rightarrow 0$ as $M, N \rightarrow \infty$ (and not just for consecutive indices, but for every pair of indices far enough out), we say that the sequence $(S_N)_{N \in {\Bbb N}}$ is a Cauchy sequence, named after Augustin-Louis Cauchy, one of the pioneers of analysis, the field of math that made a lot of our intuition about the number line (and functions thereof, and calculus) rigorous. Without analysis, we wouldn't have a lot of the modern theory that's extensively used in practice today, like the Black-Scholes formula and all the other stochastic calculus, for example. So the guy is a big deal. Now, I'm not going to go into it here, but I did want to point out that there will probably be not just one, but a whole series of posts on the real numbers later, and in those we'll go into more detail about the convergence of sequences. It turns out that sequences of real numbers converge to a limit if and only if they are Cauchy sequences. It kind of makes sense: if the terms of the sequence aren't eventually getting closer and closer together, then how can they possibly be converging to a limit? But that intuition requires some proving which I'm not going to get into right now. Anyway, onwards with the whole series thing.

Now, it turns out that just the fact that $\lim_{n \rightarrow \infty} |a_n| = 0$ isn't enough to guarantee that the series converges. The $a_n$'s need to go to zero fast enough. As an example, take the series $1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + ... + \frac{1}{n} + ...$. Here, the terms certainly approach zero in absolute value since $\frac{1}{n}$ gets smaller and smaller as $n \rightarrow \infty$, but they don't go to zero fast enough for the series to have a finite sum. To prove this, note that $$
\begin{align}
&1 + \frac{1}{2} + (\frac{1}{3} + \frac{1}{4}) + (\frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8}) + (\frac{1}{9} + ...\\[3mm]
> \ &1 + \frac{1}{2} + (\frac{1}{4} + \frac{1}{4}) + (\frac{1}{8} + \frac{1}{8} + \frac{1}{8} + \frac{1}{8}) + (\frac{1}{16} + ...\\[3mm]
= \ &1 + \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + ...
\end{align}$$ which clearly just keeps adding up to bigger and bigger partial sums and thus diverges.
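If you want to see the grouping argument in action numerically, here's a little sketch (purely illustrative) confirming that each parenthesized block of terms contributes more than $\frac{1}{2}$:

```python
# Each block 1/(2^k + 1) + ... + 1/2^(k+1) sums to more than 1/2,
# mirroring the parenthesized groups in the proof above.
for k in range(1, 6):
    block = sum(1 / n for n in range(2**k + 1, 2**(k + 1) + 1))
    print(f"block {k}: sum = {block:.4f} (> 1/2? {block > 0.5})")
```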

On the other hand, the series $$1 + \frac{1}{2^2} + \frac{1}{3^2} + ... + \frac{1}{n^2} + ...$$ does converge, and its sum is $\dfrac{\pi^2}{6}$. The proof of this is pretty slick, and of course the first one to come up with it was Euler in 1735 (though he didn't come up with a truly rigorous proof until 1741). I mean, seriously, this guy was an absolute animal- they didn't even have light bulbs yet, and he came up with this ridiculous sum. Anyway, the series $$\sum_{n=1}^{\infty}{\dfrac{1}{n^p}}$$ (note the sum starts at $n=1$ to avoid dividing by zero) is called a $p$-series and converges if and only if $p>1$. I'm not going to get into the proof, but it's a nice example of the "going to zero fast enough" thing. As it turns out, there are a whole slew of tests to see if a series converges- the integral test, the alternating series test, the comparison test, etc. You might remember them from your calculus II class. They aren't that exciting, but I just wanted to mention all the above, because it is important when it comes to understanding what we're talking about when we speak of a power series and the convergence properties thereof.
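Here's a quick numerical comparison of the two (a sketch, not a proof, obviously): the harmonic partial sums just keep creeping upward, while the $p=2$ partial sums settle down near $\pi^2/6 \approx 1.6449$:

```python
import math

# Compare partial sums of the harmonic series (diverges) with the
# p = 2 series (converges to pi^2 / 6).
for N in [10, 1000, 100000]:
    harmonic = sum(1 / n for n in range(1, N + 1))
    p2 = sum(1 / n**2 for n in range(1, N + 1))
    print(f"N={N}: harmonic={harmonic:.4f}, p=2 sum={p2:.6f}")

print("pi^2/6 =", math.pi**2 / 6)
```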

Speaking of power series, let's move on to those now, but if there are any questions on the above discussion, the comments section is the place to ask.

Approximating a function using polynomials


A polynomial is a function of the form $p(x) = c_0 + c_1 x + c_2 x^2 + ... + c_n x^n$. The $c_i$ numbers are called coefficients, and the degree of the polynomial is the highest power of $x$ with a non-zero coefficient.

Polynomials are pretty easy to work with (and, in particular, to differentiate and integrate), so they are a decent choice if we want to approximate another function. If you look at the animated diagram below, you'll see the function $f(x) = e^x$ in blue, and in red, approximations of $f(x)$ using polynomials of increasing degree. This is a great animation ("borrowed", as usual, from Wikipedia, thank you very much) which really demonstrates the main idea I want to get at.

If we look around $x=0$, the approximation is pretty good, and it gets better as we increase the degree of the polynomial approximation. When $n=0$, it's the dumbest imaginable "approximation" of $f$: a flat line at the level $f(0)$. For $n=1$, we have a line passing through the point $(0, f(0))$ and having the same slope as $f$ at that point. A little better... The higher degree approximations take into account more and more of the curvature properties of $f$ and thus begin to "hug" the curve better around $x=0$, and actually further out as well.
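For the curious, the red curves in the animation are presumably the standard polynomial approximations of $e^x$, whose coefficients turn out to be $\frac{1}{n!}$ (we'll see where those come from when we get to Taylor's Theorem below). Here's a rough sketch (the function name `poly_approx` is mine) showing the error near $x=0$ shrinking as the degree grows:

```python
import math

def poly_approx(x, degree):
    """Degree-n polynomial approximation of e^x: sum of x^k / k! for k = 0..n."""
    return sum(x**k / math.factorial(k) for k in range(degree + 1))

# Near x = 0, each extra degree shrinks the approximation error.
x = 0.5
for degree in range(5):
    approx = poly_approx(x, degree)
    print(f"degree {degree}: {approx:.6f}  (error {abs(math.exp(x) - approx):.6f})")
```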

This line of reasoning raises an interesting question: if we let the degree of the polynomial approximation approach infinity (and we'd have to really think carefully about what that would entail), could we get an "infinite polynomial" (whatever that means) that is exactly equal to our function for all values of $x$, or at least in a neighborhood of $x=0$?

In order to understand these types of objects, we'll need to make use of the section above to actually make sense of such an "infinite polynomial." Because this thing is a series involving powers of $x$, it's called a (yes, this is what we've been waiting for... wait for it...) power series.

Power series


Ok, I may have been a bit hasty with the boldface font up there, because I still haven't told you exactly what a power series is. To define it more precisely, a power series is a function of the form $$
p(x) = \sum_{n=0}^{\infty}{c_n x^n}
$$ where the coefficients $c_n$ are some numbers which may depend on $n$ (but which do not depend on $x$).

For a given value of $x$, the numbers $c_n x^n$ are just some new numbers, which we could just as well call $a_n$, and now we are looking at the exact same thing as in the first section about infinite series of numbers. The same rules apply about convergence, which is really the key topic here, since the whole point was to come up with a power series which converges, at least for certain values of $x$, to the value of some other function $f(x)$.

It's worth noting that we are talking about a particular type of convergence here which, in the field of analysis, is called pointwise convergence: at each point $x$, we want the sequence (it's a different sequence of numbers for each value of $x$) of partial sums $$
p_{N}(x) = \sum_{n=0}^{N}{c_n x^n}
$$ to converge to the limit $f(x)$. There are other types of convergence such as uniform convergence, $L^2$-convergence, etc., which I'm not going to go into, but it is important to be clear about what is meant when one says "convergence" in the context of functions; indeed, you often hear of a sequence of functions $p_{n}(x)$ converging to a (limit) function $f(x)$. This could be interpreted in the sense of pointwise convergence, which would mean that the sequence of numbers $p_{1}(x), p_{2}(x), ...$ converges to the number $f(x)$ for each value of $x$, or it could mean that over some range, say $0 \leq x \leq 1$, the maximum difference between $p_{n}(x)$ and $f(x)$ approaches 0 as $n \rightarrow \infty$ (that one is uniform convergence). Anyway, this is all very detailed, but I'm mentioning it so that you understand we are talking about pointwise convergence and not something else. So with that, let's get back to the discussion at hand.
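To make pointwise convergence concrete, here's a sketch using the well-known power series for $e^x$, whose coefficients are $c_n = \frac{1}{n!}$ (I'm borrowing that fact from the Taylor discussion at the end of this post; the name `p_N` just mirrors the notation above). At each fixed $x$ the partial sums converge to $e^x$, but larger $|x|$ needs more terms, which is exactly the "at each point separately" flavor of pointwise convergence:

```python
import math

def p_N(x, N):
    """Partial sum of the power series for e^x: sum of x^n / n! for n = 0..N."""
    return sum(x**n / math.factorial(n) for n in range(N + 1))

# At each fixed x, the numbers p_0(x), p_1(x), ... converge to e^x,
# but larger |x| converges more slowly.
for x in [0.1, 1.0, 3.0]:
    errors = [abs(p_N(x, N) - math.exp(x)) for N in (2, 5, 10)]
    print(x, ["%.2e" % e for e in errors])
```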

Given a function $f(x)$, we are looking for some coefficients $c_n$ which make the series $p(x) = \sum_{n=0}^{\infty}{c_n x^n}$ converge (pointwise) to $f(x)$ for every value of $x$, or at least for some values of $x$ close enough to $x=0$.

As an example, you might recall from precalculus or a similar class in high school, that a geometric series is a series whose terms are $a_0 = a, \ a_1 = ar, \ a_2 = ar^2, \ ... \ , \ a_n = ar^n, \ ... $ where $r$ is a fixed ratio and $a$ is some fixed first term. Clearly, if $|r| \geq 1$, then the terms of the series $\sum_{n=0}^{\infty}{ar^n}$ do not get closer to zero as $n$ gets larger, which means the series diverges, i.e. does not converge to a finite limit. If $|r| < 1$, then the series does converge, and the sum is $\dfrac{a}{1-r}$. To prove this, note that $$
\begin{align}
\sum_{n=0}^{N}{ar^n} &= a + ar + ar^2 + ar^3 + ... + ar^N \\[3mm]
&= a (1 + r + r^2 + r^3 + ... + r^N) \\[3mm]
&= a \dfrac{1-r^{N+1}}{1-r} \tag{$\clubsuit$} \\[3mm]
\end{align}
$$ where $(\clubsuit)$ is true because if you multiply $(1-r)$ by $(1 + r + r^2 + ... + r^N)$, all of the cross terms cancel and you're left with $1-r^{N+1}$. We're also assuming that $|r| < 1$, which in particular means $r \neq 1$, so that $1-r \neq 0$ and thus it's safe to divide by it in equation $(\clubsuit)$. So anyway, now we have $$
\begin{align}
\sum_{n=0}^{\infty}{ar^n} &= \lim_{N \rightarrow \infty} \sum_{n=0}^{N}{ar^n} \\[3mm]
 & \buildrel (\clubsuit) \over {=} \lim_{N \rightarrow \infty} a \dfrac{1-r^{N+1}}{1-r} \\[3mm]
&= a \lim_{N \rightarrow \infty} \dfrac{1-r^{N+1}}{1-r} \\[3mm]
&= a \dfrac{1}{1-r} \\[3mm]
&= \dfrac{a}{1-r}
\end{align}
$$ since $r^{N+1} \rightarrow 0$ as $N \rightarrow \infty$ (this is where we use $|r| < 1$), and the rest of that fraction is independent of $N$. So that's a neat little formula, but if we rename $r$ to $x$ (and there's no reason we can't do that) and take the particular case of $a=1$, then we have shown that for values of $x$ in the range $|x|<1$ (i.e. $-1 < x < 1$), the power series $$ \sum_{n=0}^{\infty}{x^n} $$ converges pointwise to the function $f(x) = \dfrac{1}{1-x}$. Since the power series representation only works for $-1 < x < 1$, we say that the series has a radius of convergence of 1.
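As a quick sanity check (just a sketch, with a made-up helper name), the partial sums of $\sum{x^n}$ should hug $\frac{1}{1-x}$ inside the radius of convergence and blow up outside it:

```python
def geometric_partial(x, N):
    """Partial sum 1 + x + x^2 + ... + x^N."""
    return sum(x**n for n in range(N + 1))

# Inside the radius of convergence, partial sums match 1/(1-x).
for x in [0.5, -0.9, 0.99]:
    print(f"x={x}: partial={geometric_partial(x, 1000):.6f}, 1/(1-x)={1 / (1 - x):.6f}")

# Outside the radius of convergence, the partial sums never settle down.
print([geometric_partial(1.1, N) for N in (10, 20, 40)])
```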

There are lots of common functions with known power series representations, such as $e^x$, $\cos x$, $\sin x$, etc. In fact, combining those three, you can prove the famous identity involving everyone's favorite math symbols, $e^{i \pi} = -1$. What??? Yup, it's true, as weird as it is.
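You can even watch $e^{i\pi} = -1$ emerge numerically by plugging $z = i\pi$ into the partial sums of the exponential series (Python handles complex arithmetic natively; this is a sketch of the idea, certainly not the proof):

```python
import math

def exp_series(z, N):
    """Partial sum of the exponential series: sum of z^n / n! for n = 0..N."""
    total, term = 0, 1  # term holds z^n / n!, built up incrementally
    for n in range(N + 1):
        total += term
        term = term * z / (n + 1)
    return total

z = 1j * math.pi
for N in (5, 10, 20):
    print(N, exp_series(z, N))
# The partial sums home in on -1 + 0i.
```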

Those readers who are familiar with calculus can go ahead and keep reading here, though the above is all you will need for Recursions Part 2.

Now that we've gone through approximating a function with polynomials of increasing degree, I just want to point out 3 more things.

The first is that if $f(x) = \sum_{n=0}^{\infty}{c_n (x-c)^n}$ has a radius of convergence $R$ (and note here that I've generalized a bit and centered the series around $x = c$ instead of $x=0$), then the derivative of the function is $f'(x) = \sum_{n=1}^{\infty}{n c_n (x-c)^{n-1}}$ (the $n=0$ term vanishes, so the sum starts at $n=1$), and there's a similar formula for $\int f(x) dx$. Furthermore, both of these also have radius of convergence $R$. Note that since we are dealing with infinite series, not finite sums, these two facts are not trivial, and you have to go through a little limit-of-partial-sums argument to prove them.
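Here's a hedged little numerical check of the differentiation claim, using the geometric series from before: since $\frac{1}{1-x} = \sum_{n=0}^{\infty}{x^n}$ for $|x| < 1$, differentiating term by term should give $\frac{1}{(1-x)^2} = \sum_{n=1}^{\infty}{n x^{n-1}}$ on the same interval:

```python
# Differentiate the geometric series term by term and compare against
# the derivative of 1/(1-x), which is 1/(1-x)^2.
def diff_series(x, N):
    """Partial sum of n * x^(n-1) for n = 1..N."""
    return sum(n * x**(n - 1) for n in range(1, N + 1))

for x in [0.3, -0.5, 0.8]:
    print(f"x={x}: series={diff_series(x, 500):.6f}, 1/(1-x)^2={1 / (1 - x)**2:.6f}")
```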

The second point is that if a function $f(x)$ is $k$ times differentiable at a point $c \in {\Bbb R}$, then there exists a remainder function $R_{k}(x)$ such that $$ f(x) = \sum_{n=0}^{k}{\dfrac{f^{(n)}(c)}{n!}(x-c)^n} + R_{k}(x)(x-c)^k $$ where the sum is called the $k$-th order Taylor polynomial of $f$ around $c$, and the remainder $R_{k}(x)$ goes to 0 as $x \rightarrow c$. Here, $f^{(n)}(c)$ denotes the $n$-th derivative of $f$ evaluated at the point $c$. There are a few subtleties, but intuitively, these Taylor polynomials represent increasingly accurate polynomial approximations as in the earlier diagram. If a function is infinitely differentiable at $x=c$, then (ignoring the subtleties once again) you can let $k$ go to $\infty$ to obtain the Taylor series of $f$. For "nice enough" functions, i.e. ones that aren't contrived to be counterexamples, the Taylor series converges to $f$, but there are a few wacky examples where it actually converges to something else. But that's Taylor's Theorem for you, which basically provides us with a formula to find the $c_n$'s we wanted in order to approximate a function using polynomials.
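To make that formula concrete, here's a sketch building the $k$-th order Taylor polynomial of $\sin x$ around $c=0$ (the derivatives of $\sin$ cycle through $\sin, \cos, -\sin, -\cos$, so the values $f^{(n)}(0)$ cycle through $0, 1, 0, -1$), along with a check that the error divided by $x^k$ goes to 0 as $x \rightarrow 0$, just like the theorem says:

```python
import math

def taylor_sin(x, k):
    """k-th order Taylor polynomial of sin around c = 0:
    sum of f^(n)(0) / n! * x^n for n = 0..k."""
    derivs_at_0 = [0, 1, 0, -1]  # sin, cos, -sin, -cos evaluated at 0
    return sum(derivs_at_0[n % 4] / math.factorial(n) * x**n for n in range(k + 1))

k = 3
for x in [1.0, 0.5, 0.1, 0.01]:
    err = abs(math.sin(x) - taylor_sin(x, k))
    # Taylor's Theorem says err / x^k -> 0 as x -> 0.
    print(f"x={x}: error={err:.2e}, error/x^{k}={err / x**k:.2e}")
```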

The third and final point before we wrap up is that there are other types of series that approximate functions as well. A notable example is Fourier series, which are very important in the world of digital signals. A Fourier series approximates a function using sine waves of different frequencies. The details of how this works could be an entire post, but just look at this picture of the first four partial sums of the Fourier series for the square wave function, and it will kind of make sense:


Then there's the sort-of-related Fourier transform, which decomposes a signal into its constituent frequencies, and which can be seen as a sort of limiting case of Fourier series in which the sums turn into integrals.
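If you want to play with the square wave yourself, the standard Fourier series for a $\pm 1$ square wave with period $2\pi$ is $\frac{4}{\pi}\left(\sin x + \frac{\sin 3x}{3} + \frac{\sin 5x}{5} + ...\right)$ (I'm taking that formula as given, since deriving it really would be a whole post). Here's a sketch of its partial sums at a single point:

```python
import math

def square_wave_partial(x, terms):
    """Partial Fourier sum for a +/-1 square wave with period 2*pi:
    (4/pi) * sum of sin((2k+1)x) / (2k+1) for k = 0..terms-1."""
    return (4 / math.pi) * sum(
        math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(terms)
    )

# With more sine terms, the partial sums look more and more like the
# square wave, which equals +1 on the interval (0, pi).
x = math.pi / 2
for terms in (1, 2, 4, 50):
    print(terms, round(square_wave_partial(x, terms), 4))
```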

Ok, that's enough of this topic. I hope you enjoyed it if you made it this far, and I did go through it pretty fast, so feel free to ask any questions that are still lingering in your mind in the comments section.