
Dual Space

Definition 1¹

The set of all continuous linear functionals on a normed space $X$ is denoted by $X^{ \ast }$ and is called the dual space of $X$, or simply the dual of $X$, as written below.

$$ X^{ \ast }:=\left\{ x^{ \ast }:X\to \mathbb{C}\ |\ x^{ \ast } \text{ is continuous and linear} \right\} $$

$$ X^{ \ast }:=B(X,\mathbb{C}) $$

$B \left( X, \mathbb{C} \right)$ is the set of bounded linear operators whose domain is $X$ and codomain is $\mathbb{C}$.

Definition 2²

For a vector space $X$ over the field $F$, the set of linear functionals on $X$ is called the dual space of $X$ and is denoted by $X^{\ast}$.

$$ X^{\ast} = L(X, F) $$

$L(X, F)$ is the set of all linear transformations from $X$ to $F$.

Explanation

  • By the properties of linear operators, continuity is equivalent to boundedness; this is why $X^{\ast}$ in Definition 1 can also be written as $B(X, \mathbb{C})$.
  • Instead of the symbol $\ast$, the prime $^{\prime}$ is also used to denote the dual space, as in $X^{\prime}$.

It is also possible to consider the dual space of the dual space. It is denoted by $X^{\ast \ast}=\left( X^{ \ast } \right)^{ \ast }$ and is called the bidual, double dual, or second dual of $X$.

With respect to the operator norm $\displaystyle \| f \| = \sup_{\substack{x \in X \\ \| x \| =1}} | f(x) |$, the space $(X^{ \ast } , \| \cdot \| )$ becomes a Banach space.
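Before moving on to the theorem below, here is a quick numerical sketch of this operator norm, assuming for illustration that $X = \mathbb{R}^{3}$ with the Euclidean norm and that the functional has the form $f(x) = a \cdot x$ (the vector `a` and the sampling scheme are arbitrary choices):

```python
import numpy as np

# A sketch assuming X = R^3 with the Euclidean norm and f(x) = a . x;
# the vector a below is an arbitrary choice.
rng = np.random.default_rng(0)
a = np.array([3.0, -4.0, 12.0])
f = lambda x: a @ x

# Estimate ||f|| = sup_{||x|| = 1} |f(x)| by sampling random unit vectors.
samples = rng.normal(size=(100_000, 3))
samples /= np.linalg.norm(samples, axis=1, keepdims=True)
estimate = np.abs(samples @ a).max()

# For the Euclidean norm this supremum equals ||a||_2 = 13.
print(estimate, np.linalg.norm(a))  # ~13.0  13.0
```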

Theorem

If $X$ is a finite-dimensional vector space, then the following holds.

$$ \dim X^{ \ast } = \dim X $$

Proof

Method 1¹

Strategy: From a basis of $X$, construct a basis of $X^{ \ast }$ with the same number of elements.


If we let $\dim X = n$, then since $X$ is finite-dimensional, it has a basis $\left\{ \tilde{e}_{1} , \cdots , \tilde{e}_{n} \right\}$. If we then let $\displaystyle e_{j} := {{ \tilde{e}_{j} } \over { \| \tilde{e}_{j} \| }} \in X$, then $\| e_{j} \| = 1$ and $\left\{ e_{1} , \cdots , e_{n} \right\}$ is still a basis of $X$. Now, define $e_{j}^{ \ast } : (X , \| \cdot \| ) \to ( \mathbb{C} , | \cdot | )$ on this basis as follows, and extend it linearly to all of $X$.

$$ e_{j}^{ \ast } (e_{i}) := \delta_{ij} $$

Properties of Linear Operators

Let $T : (X , \| \cdot \|_{X}) \to ( Y , \| \cdot \|_{Y} )$ be a linear operator. If $X$ is a finite-dimensional space, then $T$ is continuous.

Since we assumed $\dim X = n$, $e_{j}^{ \ast }$ is a continuous linear functional.
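As a quick numerical sanity check of these coordinate functionals, here is a sketch assuming $X = \mathbb{R}^{3}$ (the basis matrix `B` below is an arbitrary choice): $e_{j}^{\ast}(x)$ is just the $j$th coordinate of $x$ with respect to the basis, and $e_{j}^{\ast}(e_{i}) = \delta_{ij}$ can be verified directly.

```python
import numpy as np

# A sketch assuming X = R^3; the columns of B are an arbitrarily chosen basis.
B = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
B = B / np.linalg.norm(B, axis=0)   # normalize each basis vector, as in the proof

# e_j^*(x) is the j-th coordinate of x in this basis, i.e. (B^{-1} x)_j.
def e_star(j, x):
    return np.linalg.solve(B, x)[j]

# Verify e_j^*(e_i) = delta_ij.
table = np.array([[e_star(j, B[:, i]) for i in range(3)] for j in range(3)])
print(np.round(table, 10))          # prints the 3x3 identity matrix
```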

Necessary and Sufficient Conditions for Linear Functionals to be Expressed as Linear Combinations

Let $f_{1} , \cdots , f_{n}$ be linear functionals with domain $X$.

There exist $x_{1} , \cdots , x_{n} \in X$ satisfying $f_{j} (x_{i} ) = \delta_{ij}$ $\iff$ $f_{1} , \cdots , f_{n}$ are linearly independent.

Since $e_{j}^{ \ast } (e_{i}) = \delta_{ij}$, the above theorem implies that $\beta^{\ast} = \left\{ e_{1}^{\ast}, \dots, e_{n}^{\ast} \right\}$ is linearly independent. Now, applying any $f \in X^{ \ast }$ to $\displaystyle x = \sum_{i=1}^{n} t_{i} e_{i} \in X$ gives

$$ f(x) = f\left( \sum_{i=1}^{n} t_{i} e_{i} \right) = \sum_{i=1}^{n} t_{i} f(e_{i} ) = \sum_{i=1}^{n} f(e_{i} ) t_{i} $$

Since $\displaystyle t_{i} = e_{i}^{ \ast } \left( \sum_{k=1}^{n} t_{k} e_{k} \right) = e_{i}^{ \ast } (x)$,

$$ f(x) = \sum_{i=1}^{n} f(e_{i} ) e_{i}^{ \ast } (x) = \left[ \sum_{i=1}^{n} f(e_{i} ) e_{i}^{ \ast } \right] (x) $$

Therefore,

$$ f = \sum_{i=1}^{n} f(e_{i} ) e_{i}^{ \ast } \in \text{span} \left\{ e_{1}^{ \ast } , \cdots , e_{n}^{ \ast } \right\} $$

In other words, $\beta^{\ast} = \left\{ e_{1}^{ \ast } , \cdots , e_{n}^{ \ast } \right\}$ is linearly independent and generates $X^{\ast}$, so it is a basis of $X^{ \ast }$.

$$ \dim X^{ \ast } = n $$

Method 2²

Since $\dim(L(X,F)) = \dim(X)\dim(F)$ and $\dim(F) = 1$ (viewing $F$ as a vector space over itself),

$$ \dim(X^{\ast}) = \dim(L(X,F)) = \dim(X)\dim(F) = \dim(X) $$

Although this already proves the theorem itself, let us find a basis of $X^{\ast}$ concretely. Let $\beta = \left\{ x_{1}, \dots, x_{n} \right\}$ be an ordered basis of $X$, and let $f_{i}$ denote the $i$th coordinate function with respect to $\beta$, that is,

$$ f_{i}(x_{j}) = \delta_{ij} $$

Thus, each $f_{i}$ is a linear functional defined on $X$. Now set $\beta^{\ast} = \left\{ f_{1}, \dots, f_{n} \right\}$.

Claim: $\beta^{\ast}$ is an (ordered) basis of $X^{\ast}$

We already know $\dim (X^{\ast}) = n$, so all we have to show is $\text{span}(\beta^{\ast}) = X^{\ast}$. That is, we have to demonstrate that any $f \in X^{\ast}$ can be represented as a linear combination of the $f_{i}$. Given $f$, define $g := \sum_{i=1}^{n}f(x_{i})f_{i}$. In fact, this $g$ is precisely $f$, which shows that $f$ is represented as a linear combination of the $f_{i}$. For $1 \le j \le n$,

$$ g(x_{j}) = \left( \sum_{i=1}^{n}f(x_{i})f_{i} \right) (x_{j}) = \sum_{i=1}^{n}f(x_{i})f_{i}(x_{j}) = \sum_{i=1}^{n}f(x_{i})\delta_{ij} = f(x_{j}) $$

Therefore, $g=f$ and $f = \sum_{i=1}^{n}f(x_{i})f_{i}$. Thus, $\beta^{\ast}$ generates $X^{\ast}$.
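The reconstruction $f = \sum_{i=1}^{n} f(x_{i}) f_{i}$ can also be checked numerically. The following sketch assumes $X = \mathbb{R}^{3}$; the basis `B`, the functional `f`, and the test vector are all arbitrary choices. The same computation illustrates the expansion $f = \sum_{i} f(e_{i}) e_{i}^{\ast}$ from Method 1.

```python
import numpy as np

# A sketch assuming X = R^3. The columns of B are the ordered basis x_1, x_2, x_3,
# and f(x) = a . x is an arbitrarily chosen linear functional.
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
a = np.array([2.0, -1.0, 5.0])
f = lambda x: a @ x

# Coordinate functionals: f_i(x) = (B^{-1} x)_i, so that f_i(x_j) = delta_ij.
coords = lambda x: np.linalg.solve(B, x)

# g = sum_i f(x_i) f_i, evaluated at an arbitrary test vector x.
x = np.array([3.0, -2.0, 7.0])
g_x = sum(f(B[:, i]) * coords(x)[i] for i in range(3))
print(g_x, f(x))                     # the two values agree: g = f
```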

Dual Basis

With the notation above, the ordered basis $\beta^{\ast} = \left\{ f_{1}, \dots, f_{n} \right\}$ of $X^{\ast}$ is called the dual basis or reciprocal basis of $\beta$.

$$ f_{i} : X \to F \quad \text{ by } \quad f_{i}(x_{j}) = \delta_{ij} $$

Here, $\delta_{ij}$ is the Kronecker delta.
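Concretely, when $X = F^{n}$ and the basis vectors $x_{1}, \dots, x_{n}$ are placed as the columns of an invertible matrix $B$, the dual basis functionals are represented by the rows of $B^{-1}$. A minimal sketch in $\mathbb{R}^{2}$ (the basis below is an arbitrary choice):

```python
import numpy as np

# A sketch in R^2: the columns of B are the basis x_1 = (2, 1), x_2 = (3, 1).
B = np.array([[2.0, 3.0],
              [1.0, 1.0]])

# The rows of B^{-1} represent the dual basis: f_i(x) = (row i of B^{-1}) . x.
B_inv = np.linalg.inv(B)
print(B_inv)                     # f_1 = (-1, 3), f_2 = (1, -2) as row functionals
print(np.round(B_inv @ B, 10))   # f_i(x_j) = delta_ij: the identity matrix
```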


  1. Kreyszig. (1989). Introductory Functional Analysis with Applications: p106.

  2. Stephen H. Friedberg. (2002). Linear Algebra (4th Edition): p119-120.