
Cauchy Product: The Product of Two Convergent Power Series

Theorem 1

Suppose the power series $f(x) : = \sum _{k=0}^{\infty} a_{k} x^{k}$ and $g(x) : = \sum_{k=0}^{\infty} b_{k} x^{k}$ both have the convergence interval $(-r,r)$, and define $c_{k} := \sum_{j=0}^{k} a_{j} b_{k-j}$. Then $\sum_{k=0}^{\infty} c_{k} x^{k}$ converges to $f(x)g(x)$ on the convergence interval $(-r,r)$.
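For example, an instance not spelled out in the source: take $a_{k} = b_{k} = 1$, so that $f(x) = g(x) = \sum_{k=0}^{\infty} x^{k} = {{1} \over {1-x}}$ with convergence interval $(-1,1)$. Then $c_{k} = \sum_{j=0}^{k} 1 \cdot 1 = k+1$, and the theorem gives the familiar identity

$$ \sum_{k=0}^{\infty} (k+1) x^{k} = {{1} \over {(1-x)^{2}}} = f(x) g(x) \qquad , x \in (-1,1) $$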

Description

It is quite fascinating in its own right that the series built from the convolved coefficients converges to the product of the two functions. This would be taken for granted for polynomials, but power series have infinitely many terms, after all.
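As a quick numerical illustration (a minimal sketch, not taken from the text; the choice $a_{k} = b_{k} = 1/k!$, i.e. $f = g = e^{x}$, is only an example), one can compare a partial sum of $\sum c_{k} x^{k}$ with $f(x)g(x)$:

```python
from math import exp, factorial

# Minimal sketch: coefficients of f(x) = g(x) = e^x, whose radius of convergence is infinite.
n = 30
a = [1 / factorial(k) for k in range(n + 1)]
b = [1 / factorial(k) for k in range(n + 1)]

# Cauchy product coefficients c_k = sum_{j=0}^{k} a_j * b_{k-j}
c = [sum(a[j] * b[k - j] for j in range(k + 1)) for k in range(n + 1)]

x = 0.7
h_n = sum(c[k] * x**k for k in range(n + 1))  # partial sum of the product series
print(h_n, exp(x) * exp(x))                   # both approximately e^{2x} ≈ 4.0552
```

The two printed values agree to machine precision here, since the tail of the series is negligible for $n = 30$.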

Proof

Fix $x \in (-r,r)$ and, for each $n \in \mathbb{N}$, define the partial sums as follows.

$$ \begin{align*} f_{n} (x) : =& \sum_{k=0}^{n} a_{k} x^{k} \\ g_{n} (x) : =& \sum_{k=0}^{n} b_{k} x^{k} \\ h_{n} (x) : =& \sum_{k=0}^{n} c_{k} x^{k} \end{align*} $$

Since addition over finitely many terms can be freely rearranged, $$ \begin{align*} h_{n} (x) =& \sum_{k=0}^{n} c_{k} x^{k} \\ =& \sum_{k=0}^{n} \sum_{j=0}^{k} a_{j} b_{k-j} x^{j} x^{k-j} \\ =& \sum_{k=0}^{n} \left[ a_{0} b_{k} x^{0} x^{k} + a_{1} b_{k-1} x^{1} x^{k-1} + \cdots + a_{k-1} b_{1} x^{k-1} x^{1} + a_{k} b_{0} x^{k} x^{0} \right] \\ =& \sum_{j=0}^{0} a_{j} b_{0-j} x^{j} x^{0-j} + \sum_{j=0}^{1} a_{j} b_{1-j} x^{j} x^{1-j} + \cdots + \sum_{j=0}^{n} a_{j} b_{n-j} x^{j} x^{n-j} \\ =& \phantom{+} a_{0} b_{0} x^{0} x^{0} \\ & + a_{0} b_{1} x^{0} x^{1} + a_{1} b_{0} x^{1} x^{0} \\ & \vdots \\ & + a_{0} b_{n} x^{0} x^{n} + a_{1} b_{n-1} x^{1} x^{n-1} + \cdots + a_{n} b_{0} x^{n} x^{0} \\ (\text{sum by column}) =& a_{0} x^{0} \sum_{k=0}^{n} b_{k} x^{k} + a_{1} x^{1} \sum_{k=1}^{n} b_{k-1} x^{k-1} + \cdots + a_{n} x^{n} \sum_{k=n}^{n} b_{k-n} x^{k-n} \\ =& \sum_{j=0}^{n} a_{j} x^{j} \sum_{k=j}^{n} b_{k-j} x^{k-j} \\ =& \sum_{j=0}^{n} a_{j} x^{j} g_{n-j} (x) \\ =& \sum_{j=0}^{n} a_{j} x^{j} \left[ g_{n-j} (x) + g(x) - g(x) \right] \\ =& g(x) \sum_{j=0}^{n} a_{j} x^{j} + \sum_{j=0}^{n} a_{j} x^{j} \left[ g_{n-j} (x) - g(x) \right] \\ =& g(x) f_{n} (x) + \sum_{j=0}^{n} a_{j} x^{j} \left[ g_{n-j} (x) - g(x) \right] \end{align*} $$
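The rearrangement above is an exact identity between finite sums and can be checked mechanically; the following sketch (with arbitrary illustrative coefficients, not from the text) verifies $h_{n}(x) = \sum_{j=0}^{n} a_{j} x^{j} g_{n-j}(x)$ using exact rational arithmetic:

```python
from fractions import Fraction

# Arbitrary illustrative coefficients and evaluation point (assumptions, not from the text)
a = [Fraction(1), Fraction(-2), Fraction(3), Fraction(1, 2), Fraction(5)]
b = [Fraction(2), Fraction(1), Fraction(-1), Fraction(4), Fraction(1, 3)]
x = Fraction(1, 3)
n = 4

def g_partial(m):
    """g_m(x) = sum_{k=0}^{m} b_k x^k"""
    return sum(b[k] * x**k for k in range(m + 1))

# Left-hand side: h_n(x) with c_k = sum_{j=0}^{k} a_j b_{k-j}
c = [sum(a[j] * b[k - j] for j in range(k + 1)) for k in range(n + 1)]
lhs = sum(c[k] * x**k for k in range(n + 1))

# Right-hand side: sum_{j=0}^{n} a_j x^j g_{n-j}(x)
rhs = sum(a[j] * x**j * g_partial(n - j) for j in range(n + 1))

print(lhs == rhs)  # True: the rearrangement holds exactly for finite sums
```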

Since $\lim _{n \to \infty} f_{n} (x) = f(x)$, it only remains to show that $\lim _{n \to \infty} \sum_{j=0}^{n} a_{j} x^{j} \left[ g_{n-j} (x) - g(x) \right] = 0$.

Let $\varepsilon > 0$ be given. Since $x$ lies within the convergence interval, $\lim _{n \to \infty} g_{n} (x) = g(x)$ and $f(x)$ converges absolutely, so for all natural numbers $n > j$

$$ | g_{n- j } (x) - g (x) | \le M $$

$$ \sum_{k=0}^{\infty} \left| a_{k} x^{k} \right| < M $$

there exists an $M > 0$ that satisfies both of the above. For the same reason, for this $M$,

$$ l \ge N \implies | g_{ l } (x) - g (x) | < {{\varepsilon} \over {2M}} $$

$$ \sum_{k=N+1}^{\infty} \left| a_{k} x^{k} \right| < {{\varepsilon} \over {2M}} $$

an $N \in \mathbb{N}$ can be chosen.

Now take $n > 2N$. Since $j \le N$ implies $n - j > N$, the bound $\left| g_{n-j} (x) - g(x) \right| < {{\varepsilon} \over {2M}}$ from the choice of $N$ applies to each term with $j \le N$, and therefore

$$ \begin{align*} \left| \sum_{j=0}^{n} a_{j} x^{j} \left[ g_{n-j} (x) - g(x) \right] \right| =& \left| \sum_{j=0}^{N} a_{j} x^{j} \left[ g_{n-j} (x) - g(x) \right] + \sum_{j=N+1}^{n} a_{j} x^{j} \left[ g_{n-j} (x) - g(x) \right] \right| \\ \le & \left| \sum_{j=0}^{N} a_{j} x^{j} \left[ g_{n-j} (x) - g(x) \right] \right| + \left| \sum_{j=N+1}^{n} a_{j} x^{j} \left[ g_{n-j} (x) - g(x) \right] \right| \\ \le & \sum_{j=0}^{N} \left| a_{j} x^{j} \right| \left| g_{n-j} (x) - g(x) \right|+ \sum_{j=N+1}^{n} \left| a_{j} x^{j} \right| \left| g_{n-j} (x) - g(x) \right| \\ \le & {{\varepsilon} \over {2M}} \sum_{j=0}^{N} \left| a_{j} x^{j} \right| + M \sum_{j=N+1}^{n} \left| a_{j} x^{j} \right| \\ \le & {{\varepsilon} \over {2M}} \cdot M + M \cdot {{\varepsilon} \over {2M}} \\ =& {{\varepsilon} \over {2}} + {{\varepsilon} \over {2}} \\ =& \varepsilon \end{align*} $$

Therefore $\lim _{n \to \infty} \sum_{j=0}^{n} a_{j} x^{j} \left[ g_{n-j} (x) - g(x) \right] = 0$, and so $\lim _{n \to \infty} h_{n} (x) = f(x) g(x)$ for every $x \in (-r,r)$.


  1. Wade. (2013). An Introduction to Analysis (4th Edition): p244. ↩︎