
Transpose of Linear Transformations Defined by Dual Spaces

Theorem1

Let’s denote the ordered bases of two finite-dimensional vector spaces $V, W$ by $\beta, \gamma$, respectively. For any linear transformation $T : V \to W$, the function $U$ defined below is a linear transformation and satisfies $[U]_{\gamma^{\ast}}^{\beta^{\ast}} = ([T]_{\beta}^{\gamma})^{t}$.

$$ U : W^{\ast} \to V^{\ast} \quad \text{ by } \quad U(g) = gT \quad \forall g \in W^{\ast} $$

Here, $[T]_{\beta}^{\gamma}$ is the matrix representation of $T$, ${}^{t}$ is the transpose of a matrix, $V^{\ast}$ is the dual space of $V$, and $\beta^{\ast}$ is the dual basis of $\beta$.

Definition

The linear transformation $U$ in the above theorem is called the transpose of $T$ and is denoted by $T^{t}$.

Explanation

Since the transpose of the matrix representation of $T$ is exactly the matrix representation of $U$, it is very natural to denote $U$ by $T^{t}$ and call it the transpose of $T$.

$$ [T^{t}]_{\gamma^{\ast}}^{\beta^{\ast}} = ([T]_{\beta}^{\gamma})^{t} $$

Meanwhile, the $gT$ in the definition is the composition of $g$ and $T$: $T : V \to W$ is given and $W^{\ast} \ni g : W \to \mathbb{R}$, so $gT(x) = g(T(x))$.
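
To make the composition concrete, here is a minimal NumPy sketch (not from the source; the names `A`, `U`, `dual_gamma` are illustration choices). Take $V = \mathbb{R}^{n}$ and $W = \mathbb{R}^{m}$ with the standard bases, let $T$ act by a matrix $A$, and store a functional $g \in W^{\ast}$ as a row vector; collecting the row vectors of $U(g_{j}) = g_{j}T$ for the standard dual basis functionals $g_{j}$ gives exactly $A^{t}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2                      # dim V = n, dim W = m
A = rng.integers(-3, 4, (m, n))  # [T]_beta^gamma w.r.t. the standard bases

def U(g_row):
    """U(g) = g∘T, returned as a row vector: (g∘T)(x) = g(Ax) = (g_row @ A) @ x."""
    return g_row @ A

# Dual basis of the standard basis of W: g_j reads off the j-th coordinate,
# so as a row vector it is the j-th standard unit vector.
dual_gamma = np.eye(m)

# j-th column of [U] = coordinates of U(g_j) in the dual basis of V;
# for the standard dual basis these are just the entries of the row vector.
U_matrix = np.column_stack([U(dual_gamma[j]) for j in range(m)])

assert np.array_equal(U_matrix, A.T)
print(U_matrix)
```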

Proof

For any $g \in W^{\ast}$, since $g : W \to \mathbb{R}$ and $T : V \to W$, $U(g) = gT : V \to \mathbb{R}$ and $gT \in V^{\ast}$. Therefore, $U : W^{\ast} \to V^{\ast}$. Moreover, $U$ is linear, since $U(ag + h) = (ag + h)T = a(gT) + hT = aU(g) + U(h)$ for all $g, h \in W^{\ast}$ and scalars $a$.

Let us denote the two ordered bases by $\beta = \left\{ \mathbf{v}_{1}, \dots, \mathbf{v}_{n} \right\}, \gamma = \left\{ \mathbf{w}_{1}, \dots, \mathbf{w}_{m} \right\}$, and their respective dual bases by $\beta^{\ast} = \left\{ f_{1}, \dots, f_{n} \right\}, \gamma^{\ast} = \left\{ g_{1}, \dots, g_{m} \right\}$. For notational convenience, write $A = [T]_{\beta}^{\gamma}$. To find the matrix representation, one needs to see how each basis vector is mapped, so to find the $j$-th column of $[U]_{\gamma^{\ast}}^{\beta^{\ast}}$, let us compute $U(g_{j})$.

Dual Spaces and Dual Bases

Let $\beta = \left\{ \mathbf{v}_{1}, \dots, \mathbf{v}_{n} \right\}$ be an ordered basis of $X$ and $\beta^{\ast} = \left\{ f_{1}, \dots, f_{n} \right\}$ the corresponding dual basis of $X^{\ast}$. Then, for $f \in X^{\ast}$,

$$ f = \sum_{i=1}^{n}f(\mathbf{v}_{i})f_{i} $$
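
As a quick numerical sanity check of this expansion (a sketch, not from the source; `B` and `f` are illustration choices): if the basis vectors $\mathbf{v}_{i}$ are the columns of an invertible matrix $B$, then the dual basis functionals $f_{i}$, written as row vectors, are the rows of $B^{-1}$, and the identity $f = \sum_{i} f(\mathbf{v}_{i}) f_{i}$ can be verified directly.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
B = rng.standard_normal((n, n))        # columns = basis vectors v_1, ..., v_n (a.s. invertible)
B_inv = np.linalg.inv(B)               # rows    = dual basis functionals f_1, ..., f_n
f = rng.standard_normal(n)             # an arbitrary functional f, stored as a row vector

# sum_i f(v_i) f_i, with f(v_i) = f @ B[:, i] and f_i = B_inv[i, :]
reconstruction = sum((f @ B[:, i]) * B_inv[i, :] for i in range(n))

assert np.allclose(reconstruction, f)  # f = sum_i f(v_i) f_i
```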

Since $g_{j}T \in V^{\ast}$, applying this expansion to $g_{j}T$ gives

$$ U(g_{j}) = g_{j}T = \sum_{s}(g_{j}T)(\mathbf{v}_{s})f_{s} $$

$$ [U(g_{j})]_{\beta^{\ast}} = \begin{bmatrix} (g_{j}T)(\mathbf{v}_{1}) \\ (g_{j}T)(\mathbf{v}_{2}) \\ \vdots \\ (g_{j}T)(\mathbf{v}_{n})\end{bmatrix} $$

Therefore, the matrix representation of $U$, $[U]_{\gamma^{\ast}}^{\beta^{\ast}}$, is,

$$ [U]_{\gamma^{\ast}}^{\beta^{\ast}} = \begin{bmatrix} (g_{1}T)(\mathbf{v}_{1}) & (g_{2}T)(\mathbf{v}_{1}) & \cdots & (g_{m}T)(\mathbf{v}_{1}) \\ (g_{1}T)(\mathbf{v}_{2}) & (g_{2}T)(\mathbf{v}_{2}) & \cdots & (g_{m}T)(\mathbf{v}_{2}) \\ \vdots & \vdots & \ddots & \vdots \\ (g_{1}T)(\mathbf{v}_{n}) & (g_{2}T)(\mathbf{v}_{n}) & \cdots & (g_{m}T)(\mathbf{v}_{n}) \end{bmatrix} $$

On the other hand, computing each entry,

$$ \begin{align*} (g_{j}T)(\mathbf{v}_{i}) = g_{j}(T(\mathbf{v}_{i})) &= g_{j}\left( \sum_{k=1}^{m} A_{ki}\mathbf{w}_{k} \right) \\ &= \sum_{k=1}^{m} A_{ki} g_{j}(\mathbf{w}_{k}) \\ &= \sum_{k=1}^{m} A_{ki} \delta_{jk} \\ &= A_{ji} \end{align*} $$

Therefore, $[U]_{\gamma^{\ast}}^{\beta^{\ast}} = ([T]_{\beta}^{\gamma})^{t}$ is satisfied.
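
The whole argument can also be replayed numerically. In the sketch below (not from the source; the names $B$, $C$, $M$ are illustration choices), $\beta$ and $\gamma$ are the columns of invertible matrices $B$ and $C$, $T$ acts on coordinates through a matrix $M$ so that $A = [T]_{\beta}^{\gamma} = C^{-1}MB$, the dual basis functionals $g_{j}$ are the rows of $C^{-1}$, and the matrix with entries $(g_{j}T)(\mathbf{v}_{i})$ indeed equals $A^{t}$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 3, 4
B = rng.standard_normal((n, n))   # columns: ordered basis beta of V = R^n
C = rng.standard_normal((m, m))   # columns: ordered basis gamma of W = R^m
M = rng.standard_normal((m, n))   # T as a map on coordinates: T(x) = M x

C_inv = np.linalg.inv(C)
A = C_inv @ M @ B                 # A = [T]_beta^gamma

# (g_j T)(v_i): g_j is the j-th dual basis functional of gamma (row j of C^{-1}),
# v_i is column i of B.  Assemble the n x m matrix [U]_{gamma*}^{beta*}.
U_matrix = np.array([[C_inv[j] @ M @ B[:, i] for j in range(m)]
                     for i in range(n)])

assert np.allclose(U_matrix, A.T)  # [U] = ([T]_beta^gamma)^t
```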


  1. Stephen H. Friedberg, Linear Algebra (4th Edition, 2002), pp. 121-122 ↩︎