Power Series
Definition
- A power series is a series of the form $S(x) : = \sum \limits_{k=0}^{\infty} a_{k} ( x - x_{0} )^{k}$, and $x_{0}$ is called the Center of $S(x)$.
- When $S(x)$ converges absolutely for $|x - x_{0}| < R$ and diverges for $|x - x_{0}| > R$, $R$ is called the Radius of Convergence of $S(x)$.
- The largest interval on which $S(x)$ converges is called the Interval of Convergence. (See the example after this list.)
- If for every $x_{0} \in (a,b)$ there exists a power series $\sum \limits_{k=0}^{\infty} a_{k} ( x - x_{0} )^{k} = f(x)$ centered at $x_{0}$ that converges to $f$ on some open interval $(c,d) \subset (a,b)$ containing $x_{0}$, then $f$ is said to be Analytic on $(a,b)$.
- Two power series $\sum \limits_{n=0}^\infty a_n(x-x_0)^n$ and $\sum \limits_{n=0}^\infty b_n(x-x_0)^n$ are considered equal if $a_n = b_n$ for all $n$.
- If $a_0=a_1=a_2=\cdots=0$, then $\sum \limits_{n=0}^\infty a_n(x-x_0)^{n}=0$ for all $x$.
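For example, the geometric series $$ \sum \limits_{k=0}^{\infty} x^{k} = {{1} \over {1-x}} $$ is a power series centered at $x_{0} = 0$: it converges absolutely for $|x| < 1$ and diverges for $|x| > 1$ (as well as at $x = \pm 1$), so its radius of convergence is $R = 1$ and its interval of convergence is $(-1,1)$.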
Explanation
For someone studying Analysis for the first time as an undergraduate, it may be unclear why ‘Calculus’ and ‘Analysis’ are treated as separate subjects, and why Analysis focuses so much on sequences and series. However, anyone who perseveres as far as power series without losing interest should get at least a hint of the answer.
If asked why one studies Analysis, a reasonable answer is ‘to reduce difficult functions to easier ones’. For example, transcendental functions are difficult, but polynomials are easy. If a transcendental function happens to be analytic, that is fortunate: an analytic function is precisely one that can be expanded into a power series, and a function expressed as a power series can in turn be approximated by its polynomial partial sums, as the sketch below illustrates.
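As a minimal sketch of this idea in Python (the helper exp_partial_sum is a hypothetical name introduced only for illustration), the transcendental function $e^x$ is approximated by partial sums of its Maclaurin series $\sum \limits_{k=0}^{\infty} x^{k} / k!$:

```python
import math

def exp_partial_sum(x: float, n_terms: int) -> float:
    """Partial sum of the Maclaurin series of e^x: sum of x^k / k! for k < n_terms."""
    return sum(x**k / math.factorial(k) for k in range(n_terms))

x = 1.5
for n in (2, 5, 10, 20):
    approx = exp_partial_sum(x, n)
    # Each partial sum is just a polynomial, yet the error shrinks rapidly.
    print(f"{n:2d} terms: {approx:.10f}   error = {abs(approx - math.exp(x)):.2e}")
```

At $x = 1.5$, twenty terms already agree with math.exp to roughly machine precision.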
Power series are a key concept in basic Analysis, and many conditions attach to their convergence. Despite being infinite series, they are surprisingly well-behaved and have many good properties.
Theorem
- (a) If $R := \lim_{k \to \infty} {{ | a_{k} | } \over { | a_{k+1} | }}$ exists, then $R$ is the radius of convergence of $S(x)$. (See the numerical sketch after this list.)
- (b) If the radius of convergence $R > 0$ exists, then $S(x)$ diverges for all $x \notin [ x_{0} - R , x_{0} + R ]$.
- (c) If the radius of convergence $R > 0$ exists, then $S(x)$ converges absolutely for all $x \in ( x_{0} - R , x_{0} + R )$.
- (d) If the radius of convergence $R > 0$ exists, then $S(x)$ converges uniformly on every closed interval $[a,b] \subset ( x_{0} - R , x_{0} + R )$.
- (e) If the radius of convergence $R > 0$ exists, then $S(x)$ is continuous on $( x_{0} - R , x_{0} + R )$.
- (f) If the radius of convergence $R > 0$ exists, then $S(x)$ is infinitely differentiable on $( x_{0} - R , x_{0} + R )$, with $$ S^{(k)} (x) = \sum \limits_{n=k}^{\infty} {{n!} \over {(n-k)!}} a_{n} (x - x_{0} )^{n-k} $$
- (g) If $S(x)$ converges on $[a,b]$, then it is integrable on $[a,b]$, and $$ \int_{a}^{b} S(x) dx = \sum \limits_{k=0}^{\infty} a_{k} \int_{a}^{b} (x - x_{0} )^{k} dx $$
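As a numerical sanity check (a sketch only; the helpers a, partial_sum, and derivative_partial_sum are hypothetical names), the following Python snippet applies (a), (c), and (f) to the concrete series with coefficients $a_{k} = 2^{-k}$ centered at $x_{0} = 0$, whose closed form on $(-2,2)$ is the geometric series $S(x) = {{2} \over {2-x}}$:

```python
# Series S(x) = sum of a_k x^k with a_k = 1/2^k, centered at x_0 = 0.
# Closed form (geometric series): S(x) = 2 / (2 - x) for |x| < 2.
def a(k: int) -> float:
    return 2.0 ** (-k)

# (a) Ratio formula: |a_k| / |a_{k+1}| = 2 for every k, so R = 2.
print([a(k) / a(k + 1) for k in (0, 10, 50)])  # -> [2.0, 2.0, 2.0]

def partial_sum(x: float, n: int = 400) -> float:
    """First n terms of S(x)."""
    return sum(a(k) * x**k for k in range(n))

# (c) Absolute convergence on (-2, 2): partial sums match the closed form.
for x in (-1.5, 0.5, 1.9):
    print(x, partial_sum(x), 2 / (2 - x))

# (f) The term-by-term derivative (the k = 1 case of the formula in (f))
#     agrees with d/dx [2/(2-x)] = 2/(2-x)^2 inside the interval.
def derivative_partial_sum(x: float, n: int = 400) -> float:
    return sum(k * a(k) * x ** (k - 1) for k in range(1, n))

x = 0.7
print(derivative_partial_sum(x), 2 / (2 - x) ** 2)
```

Trying a point outside $[-2,2]$, say $x = 2.5$, makes the partial sums blow up, consistent with (b).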