Necessary and Sufficient Conditions for Linear Functionals to be Represented by Linearly Independent Combinations
Theorem
Let $f, f_{1} , \cdots , f_{n}$ be linear functionals with a common domain $X$.
(a) There exist $c_{1} , \cdots , c_{n} \in \mathbb{C}$ such that $\displaystyle f = \sum_{i=1}^{n} c_{i} f_{i}$ $\iff$ $\displaystyle \bigcap_{i=1}^{n} \ker ( f_{i} ) \subset \ker (f)$
(b) $f_{1} , \cdots , f_{n}$ are linearly independent $\iff$ there exist $x_{1} , \cdots , x_{n} \in X$ satisfying $f_{j} (x_{i} ) = \delta_{ij}$.
Here, $\delta_{ij}$ is the Kronecker delta.
Explanation
Considering that the kernel is related to the notion of homogeneity, one can infer that this is a useful fact for linear homogeneous differential equations. From a learning perspective, however, it is recommended to first become familiar with the statements themselves, since the proofs can be excessively long, difficult, and complicated.
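To get a feel for the statements, here is a minimal finite-dimensional sketch (our own illustration; the space and functionals below are chosen freely, not given by the theorem): let $X = \mathbb{C}^{2}$ and
$$ f_{1} (a , b) := a , \qquad f_{2} (a , b) := b , \qquad f := 2 f_{1} - f_{2} $$
Then $\ker ( f_{1} ) \cap \ker ( f_{2} ) = \left\{ (0,0) \right\} \subset \ker (f)$ and $f$ is a linear combination of $f_{1} , f_{2}$, consistent with (a); moreover $x_{1} := (1,0)$ and $x_{2} := (0,1)$ satisfy $f_{j} (x_{i} ) = \delta_{ij}$ while $f_{1} , f_{2}$ are linearly independent, consistent with (b).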
Proof
(a)
Strategy: $( \implies )$ can be shown easily using only the definition of the kernel. The substance lies in $( \impliedby )$, which amounts to actually finding $c_{1} , \cdots , c_{n}$.
$(\implies )$
If $\displaystyle x \in \bigcap_{i=1}^{n} \ker ( f_{i} )$, then for every $i$
$$ f_{i} ( x ) = 0 $$
so $\displaystyle f(x) = \sum_{i=1}^{n} c_{i} f_{i} (x) = 0$, that is,
$$ x \in \ker (f) $$
To summarize,
$$ \bigcap_{i=1}^{n} \ker ( f_{i} ) \subset \ker (f) $$
$( \impliedby )$
Define Proposition $ \displaystyle P(n) : \bigcap_{i=1}^{n} \ker ( f_{i} ) \subset \ker (f) \implies f = \sum_{i=1}^{n} c_{i} f_{i}$ and use Mathematical Induction.
It’s trivial if $f = 0$, so assume $f \ne 0$.
Part 1. $n=1$
Since $\ker (f_{1} ) \subset \ker (f)$ and $f \ne 0$ (if $f_{1} = 0$, then $\ker (f_{1}) = X \subset \ker (f)$ would force $f = 0$),
$$ f_{1} \ne 0 $$
so there exists some $x_{1} \in X$ satisfying
$$ f_{1} (x_{1} ) = 1 $$
Now for any $x \in X$, $x - f_{1} (x) x_{1} \in X$, and applying $f_{1}$ yields
$$ f_{1} ( x - f_{1} (x) x_{1} ) = f_{1} ( x ) - f_{1} (x) f_{1} ( x_{1} ) = 0 $$
That is, $x - f_{1} (x) x_{1} \in \ker (f_{1} ) \subset \ker (f)$, and applying $f$ yields
$$ 0 = f ( x - f_{1} (x) x_{1} ) = f(x) - f_{1} (x) f (x_{1}) $$
To summarize, for $c_{1} := f (x_{1}) \in \mathbb{C}$ we have $f(x) = f (x_{1}) f_{1} (x)$ for every $x \in X$, that is, $f = c_{1} f_{1}$, so $P(1)$ holds.
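For instance (a hypothetical illustration with the space and functionals chosen by us): if $X = \mathbb{C}^{2}$, $f_{1} (a , b) := a + b$, and $f := 3 f_{1}$, then $x_{1} := (1 , 0)$ gives $f_{1} (x_{1}) = 1$, and the argument above yields
$$ f (x) = f (x_{1}) f_{1} (x) = 3 f_{1} (x) $$
recovering $c_{1} = 3$.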
Part 2. $n=N-1$
Assume that $P(N-1)$ holds.
Part 3. $n=N$
Case 1. If $f_{1} , \cdots , f_{N}$ are not linearly independent
Since $f_{1} , \cdots , f_{N}$ are not linearly independent, there exist $t_{1} , \cdots , t_{N} \in \mathbb{C}$, not all zero, such that
$$ t_{1} f_{1} + \cdots + t_{N} f_{N} = 0 $$
and some index $i_{0}$ with
$$ t_{i_{0}} \ne 0 $$
Dividing by $t_{i_{0}}$ and rearranging,
$$ \displaystyle f_{i_{0}} = - {{1} \over { t_{i_{0}} }} \left( \sum_{i \ne i_{0}} t_{i} f_{i} \right) =\sum_{i \ne i_{0}} \left( - {{t_{i} } \over { t_{i_{0}} }} \right) f_{i} $$
Thus, by the direction $( \implies )$ already shown,
$$ \bigcap_{i \ne i_{0} } \ker ( f_{i} ) \subset \ker (f_{ i_{0}} ) $$
Since $\displaystyle \bigcap_{i \ne i_{0} } \ker ( f_{i} )$ is contained in $\displaystyle \ker (f_{ i_{0}} )$, intersecting with $\ker (f_{i_{0}} )$ changes nothing, so
$$ \bigcap_{i \ne i_{0} } \ker ( f_{i} ) = \left[ \bigcap_{i \ne i_{0} } \ker ( f_{i} ) \right] \cap \ker (f_{i_{0}} ) = \bigcap_{i=1}^{ N } \ker ( f_{i} ) \subset \ker (f) $$
Then, by the induction hypothesis $P(N-1)$ assumed in Part 2., there exist $c_{i} \in \mathbb{C}$ for $i \ne i_{0}$ satisfying $\displaystyle f = \sum_{ i \ne i_{0}} c_{i} f_{i} + 0 \cdot f_{i_{0}}$, so $f$ is a linear combination of $f_{1} , \cdots , f_{N}$.
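As a quick sanity check (a made-up example with $N = 2$): on $X = \mathbb{C}^{2}$ take $f_{1} (a , b) := a$, $f_{2} := f_{1}$, and $f := 3 f_{1}$. Then $f_{1} , f_{2}$ are not linearly independent, $\ker (f_{1}) = \ker (f_{2}) = \ker (f)$, and dropping the redundant functional gives
$$ f = 3 f_{1} + 0 \cdot f_{2} $$
which is exactly the form obtained above.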
Case 2. If $f_{1} , \cdots , f_{N}$ are linearly independent
Suppose, for some $1 \le i \le N$, that $\displaystyle \bigcap_{ k \ne i } \ker ( f_{k} ) \subset \ker (f_{i} )$. Then, by the induction hypothesis $P(N-1)$ from Part 2., there would exist $\lambda_{k} \in \mathbb{C}$ ($k \ne i$) satisfying $\displaystyle f_{i} = \sum_{ k \ne i } \lambda_{k} f_{k}$, making $f_{1} , \cdots , f_{N}$ not linearly independent, a contradiction. Hence $\displaystyle \bigcap_{ k \ne i } \ker ( f_{k} ) \not\subset \ker (f_{i} )$ for every $i$, so there exist $y_{1} , \cdots , y_{N} \in X$ satisfying $$ \displaystyle y_{i} \in \left[ \bigcap_{ k \ne i } \ker ( f_{k} ) \right] \setminus \ker (f_{i} ) $$
In other words, $f_{k} ( y_{i} ) = 0$ for $k \ne i$ while $f_{i} ( y_{i} ) \ne 0$. Defining $\displaystyle x_{i} := {{ y_{i}} \over {f_{i} ( y_{i} ) }}$,
$$ \begin{cases} \displaystyle f_{j} (x_{i} ) = {{ f_{j} (y_{i}) } \over { f_{i} (y_{i} ) }} = 0 & , j \ne i \\ \displaystyle f_{i} (x_{i} ) = {{ f_{i} (y_{i}) } \over { f_{i} (y_{i} ) }} = 1 & , j = i \end{cases} \implies f_{j} ( x_{ i} ) = \delta_{ij} = \begin{cases} 0 & , i \ne j \\ 1 & , i = j \end{cases} $$
Now for any $x \in X$, applying $f_{i}$ to $\displaystyle x - \sum_{j=1}^{N} f_{j} (x) x_{j}$ for each $i = 1 , \cdots , N$ yields
$$ \begin{align*} f_{i} \left( x - \sum_{j=1}^{N} f_{j} (x) x_{j} \right) =& f_{i} (x) - \sum_{j=1}^{N} f_{j} (x) f_{i} ( x_{j} ) \\ =& f_{i} (x) - \sum_{j=1}^{N} f_{j} (x) \delta_{ij} \\ =& f_{i} (x) - f_{i} (x) \\ =& 0 \end{align*} $$
In other words,
$$ \left( x - \sum_{j=1}^{N} f_{j} (x) x_{j} \right) \in \bigcap_{i=1}^{N} \ker ( f_{i} ) \subset \ker (f) $$
Applying $f$ to $\displaystyle x - \sum_{j=1}^{N} f_{j} (x) x_{j}$, according to the definition of kernel,
$$ f \left( x - \sum_{j=1}^{N} f_{j} (x) x_{j} \right) = f(x) - \sum_{j=1}^{N} f_{j} (x) f ( x_{j} ) = 0 $$
$$ \implies f(x) =\sum_{j=1}^{N} f_{j} (x) f ( x_{j} ) = \left[ \sum_{j=1}^{N} f_{j} f ( x_{j} ) \right] (x) $$
$$ \implies f = \sum_{j=1}^{N} f ( x_{j} ) f_{j} $$
Thus, $f$ is expressed as a linear combination of $f_{1} , \cdots , f_{N}$ with the specific coefficients $f ( x_{1} ) , \cdots , f ( x_{N} ) \in \mathbb{C}$.
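Returning to the earlier $\mathbb{C}^{2}$ illustration (our own example with $f_{1} (a , b) = a$, $f_{2} (a , b) = b$, $f = 2 f_{1} - f_{2}$, $x_{1} = (1,0)$, $x_{2} = (0,1)$), the formula reads
$$ f = f (x_{1}) f_{1} + f (x_{2}) f_{2} = 2 f_{1} + (-1) f_{2} $$
which matches the definition of $f$.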
Therefore, whether $f_{1} , \cdots , f_{N}$ are linearly independent or not, $P(N)$ holds, and by induction the proposition $P(n)$ holds for all $n \in \mathbb{N}$.
■
(b)
Strategy: Essentially a corollary of (a).
$(\implies)$
In fact, in the proof of (a), $(\impliedby)$, Part 3., Case 2., when $f_{1} , \cdots , f_{n}$ are linearly independent, vectors $\displaystyle x_{i} := {{ y_{i}} \over {f_{i} ( y_{i} ) }}$ satisfying $f_{j} (x_{i} ) = \delta_{ij}$ were already constructed.
$(\impliedby)$
Suppose there exist $x_{1} , \cdots , x_{n}$ satisfying $f_{j} (x_{i} ) = \delta_{ij}$, and consider $c_{1} f_{1} + \cdots + c_{n} f_{n} = 0$. Evaluating at $x_{1}$,
$$ c_{1} f_{1} (x_{1} ) + 0 + \cdots + 0 = c_{1} = 0 $$
Similarly, evaluating at $x_{i}$,
$$ 0 + \cdots + 0 + c_{i} f_{i} (x_{i}) + 0 + \cdots + 0 = c_{i} = 0 $$
Therefore, the only way to satisfy $c_{1} f_{1} + \cdots + c_{n} f_{n} = 0$ is $c_{1} = \cdots = c_{n} = 0$, making $f_{1} , \cdots , f_{n}$ linearly independent.
■