
Transpose of Linear Transformations Defined by Dual Spaces

Theorem¹

Let $\beta, \gamma$ be ordered bases of two finite-dimensional vector spaces $V, W$, respectively. For any linear transformation $T : V \to W$, the function $U$ defined below is a linear transformation and satisfies $[U]_{\gamma^{\ast}}^{\beta^{\ast}} = ([T]_{\beta}^{\gamma})^{t}$.

$$ U : W^{\ast} \to V^{\ast} \quad \text{by} \quad U(g) = gT \quad \forall g \in W^{\ast} $$

Here, $[T]_{\beta}^{\gamma}$ is the matrix representation of $T$, ${}^{t}$ denotes the transpose of a matrix, $V^{\ast}$ is the dual space of $V$, and $\beta^{\ast}$ is the dual basis of $\beta$.

Definition

The linear transformation $U$ given by the above theorem is called the transpose of $T$ and is denoted by $T^{t}$.

Explanation

Since the transpose of the matrix representation of $T$ is exactly the matrix representation of $U$, it is natural to denote $U$ by $T^{t}$ and to call it the transpose of $T$.

$$ [T^{t}]_{\gamma^{\ast}}^{\beta^{\ast}} = ([T]_{\beta}^{\gamma})^{t} $$

Meanwhile, by definition $gT$ is the composition of $g$ and $T$: since $T : V \to W$ and $W^{\ast} \ni g : W \to \mathbb{R}$, we have $gT(x) = g(T(x))$ for every $x \in V$.
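
The following is a small numerical sketch of this statement (not from the source; it assumes $V = \mathbb{R}^{3}$, $W = \mathbb{R}^{2}$ with standard bases, so a functional is identified with its coordinate vector in the dual basis): applying $([T]_{\beta}^{\gamma})^{t}$ to the coordinates of $g$ yields the coordinates of $gT$.

```python
import numpy as np

# A concrete T : R^3 -> R^2, written in the standard bases, so [T] = A (2x3).
A = np.array([[1., 2., 0.],
              [3., -1., 4.]])

# A functional g in W* = (R^2)*, identified with its coordinates (g(w_1), g(w_2))
# in the dual basis, i.e. g(y) = c @ y.
c = np.array([5., -2.])

# U(g) = g o T is the functional x |-> c @ (A @ x) on V = R^3.
# Its coordinates in beta* are ((gT)(v_1), (gT)(v_2), (gT)(v_3)).
coords_of_gT = np.array([c @ A[:, i] for i in range(3)])

# The theorem says these coordinates are obtained by multiplying with A^t.
assert np.allclose(coords_of_gT, A.T @ c)
print(coords_of_gT)  # [-1. 12. -8.]
```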

Proof

For any $g \in W^{\ast}$, since $g : W \to \mathbb{R}$ and $T : V \to W$, we have $U(g) = gT : V \to \mathbb{R}$, so $gT \in V^{\ast}$ and $U : W^{\ast} \to V^{\ast}$. Moreover, $U$ is linear: for $g, h \in W^{\ast}$ and a scalar $a$, $U(ag + h) = (ag + h)T = a(gT) + hT = aU(g) + U(h)$.

Let the two ordered bases be $\beta = \left\{ \mathbf{v}_{1}, \dots, \mathbf{v}_{n} \right\}$ and $\gamma = \left\{ \mathbf{w}_{1}, \dots, \mathbf{w}_{m} \right\}$, and let their respective dual bases be $\beta^{\ast} = \left\{ f_{1}, \dots, f_{n} \right\}$ and $\gamma^{\ast} = \left\{ g_{1}, \dots, g_{m} \right\}$. For notational convenience, write $A = [T]_{\beta}^{\gamma}$. To find a matrix representation, one needs to see where each basis element is mapped, so to obtain the $j$-th column of $[U]_{\gamma^{\ast}}^{\beta^{\ast}}$, let's compute $U(g_{j})$.

Dual Spaces and Dual Bases

Let $\beta = \left\{ \mathbf{v}_{1}, \dots, \mathbf{v}_{n} \right\}$ be an ordered basis of $X$ and $\beta^{\ast} = \left\{ f_{1}, \dots, f_{n} \right\}$ the corresponding dual basis of $X^{\ast}$. Then, for $f \in X^{\ast}$,

$$ f = \sum_{i=1}^{n}f(\mathbf{v}_{i})f_{i} $$
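
As a concrete illustration of this expansion (my own numerical sketch, not from the source; the basis and the functional below are arbitrary choices): if the basis vectors $\mathbf{v}_{i}$ are the columns of an invertible matrix $B$, the dual basis functionals $f_{i}$ act as the rows of $B^{-1}$, since then $f_{i}(\mathbf{v}_{j}) = \delta_{ij}$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Basis beta = {v_1, v_2, v_3} of R^3: the columns of B.
B = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])
# Dual basis f_1, f_2, f_3: f_i(x) is the i-th row of B^{-1} applied to x,
# which gives f_i(v_j) = delta_ij.
B_inv = np.linalg.inv(B)

# An arbitrary functional f on R^3, given by its coordinate row in the standard dual basis.
f_row = np.array([2., -1., 3.])
f = lambda x: f_row @ x

x = rng.standard_normal(3)

# Check f = sum_i f(v_i) f_i by evaluating both sides at x.
expansion = sum(f(B[:, i]) * (B_inv[i] @ x) for i in range(3))
assert np.isclose(f(x), expansion)
```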

By this property of the dual basis and the fact that $g_{j}T \in V^{\ast}$,

$$ U(g_{j}) = g_{j}T = \sum_{s}(g_{j}T)(\mathbf{v}_{s})f_{s} $$

$$ [U(g_{j})]_{\beta^{\ast}} = \begin{bmatrix} (g_{j}T)(\mathbf{v}_{1}) \\ (g_{j}T)(\mathbf{v}_{2}) \\ \vdots \\ (g_{j}T)(\mathbf{v}_{n}) \end{bmatrix} $$

Therefore, the matrix representation $[U]_{\gamma^{\ast}}^{\beta^{\ast}}$ of $U$ is

$$ [U]_{\gamma^{\ast}}^{\beta^{\ast}} = \begin{bmatrix} (g_{1}T)(\mathbf{v}_{1}) & (g_{2}T)(\mathbf{v}_{1}) & \cdots & (g_{m}T)(\mathbf{v}_{1}) \\ (g_{1}T)(\mathbf{v}_{2}) & (g_{2}T)(\mathbf{v}_{2}) & \cdots & (g_{m}T)(\mathbf{v}_{2}) \\ \vdots & \vdots & \ddots & \vdots \\ (g_{1}T)(\mathbf{v}_{n}) & (g_{2}T)(\mathbf{v}_{n}) & \cdots & (g_{m}T)(\mathbf{v}_{n}) \end{bmatrix} $$

Computing each entry explicitly,

$$ \begin{align*} (g_{j}T)(\mathbf{v}_{i}) = g_{j}(T(\mathbf{v}_{i})) &= g_{j}\left( \sum_{k=1}^{m} A_{ki}\mathbf{w}_{k} \right) \\ &= \sum_{k=1}^{m} A_{ki} g_{j}(\mathbf{w}_{k}) \\ &= \sum_{k=1}^{m} A_{ki} \delta_{jk} \\ &= A_{ji} \end{align*} $$

Therefore, $[U]_{\gamma^{\ast}}^{\beta^{\ast}} = ([T]_{\beta}^{\gamma})^{t}$ holds.
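
To make the bookkeeping concrete, here is a numerical check of this identity with non-standard bases (my own sketch, not from the source; the map and bases are arbitrary choices). It builds $A = [T]_{\beta}^{\gamma}$, realizes the dual basis functionals as rows of the inverse basis matrices, and confirms $(g_{j}T)(\mathbf{v}_{i}) = A_{ji}$.

```python
import numpy as np

# T : R^3 -> R^2 as multiplication by M in the standard bases.
M = np.array([[1., 0., 2.],
              [3., 1., 0.]])
T = lambda x: M @ x

# Ordered bases: the columns of Bv (beta, for V = R^3) and Bw (gamma, for W = R^2).
Bv = np.array([[1., 1., 0.],
               [0., 1., 1.],
               [1., 0., 1.]])
Bw = np.array([[2., 1.],
               [1., 1.]])

# A = [T]_beta^gamma: the j-th column holds the gamma-coordinates of T(v_j).
A = np.linalg.inv(Bw) @ M @ Bv

# Dual bases: f_i is the i-th row of Bv^{-1}, g_j is the j-th row of Bw^{-1}.
Bv_inv = np.linalg.inv(Bv)
Bw_inv = np.linalg.inv(Bw)

# Entry (i, j) of [U] is (g_j T)(v_i); the result should be A^t.
U_matrix = np.array([[Bw_inv[j] @ T(Bv[:, i]) for j in range(2)]
                     for i in range(3)])
assert np.allclose(U_matrix, A.T)
print(U_matrix)
```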


  1. Stephen H. Friedberg, Linear Algebra (4th Edition, 2002), pp. 121-122 ↩︎