Transpose of Linear Transformations Defined by Dual Spaces
Theorem

Let $\beta, \gamma$ denote ordered bases of the finite-dimensional vector spaces $V, W$, respectively. For any linear transformation $T : V \to W$, the function $U$ defined below is a linear transformation and satisfies $[U]_{\gamma^{\ast}}^{\beta^{\ast}} = ([T]_{\beta}^{\gamma})^{t}$.
$$
U : W^{\ast} \to V^{\ast} \quad \text{ by } \quad U(g) = gT \quad \forall g \in W^{\ast}
$$
Here, $[T]_{\beta}^{\gamma}$ is the matrix representation of $T$, ${}^{t}$ denotes the transpose of a matrix, $V^{\ast}$ is the dual space of $V$, and $\beta^{\ast}$ is the dual basis of $\beta$.
Definition

The linear transformation $U$ given by the theorem above is called the transpose of $T$ and is denoted by $T^{t}$.
Explanation

Since the transpose of the matrix representation of $T$ is the matrix representation of $U$, it is natural to denote $U$ by $T^{t}$ and to call it the transpose of $T$.
$$
[T^{t}]_{\gamma^{\ast}}^{\beta^{\ast}} = ([T]_{\beta}^{\gamma})^{t}
$$
Meanwhile, in the definition above, $gT$ is the composition of $g$ and $T$: since $T : V \to W$ is given and $g \in W^{\ast}$ means $g : W \to \mathbb{R}$, we have $gT(x) = g(T(x))$.
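This composition can be checked numerically. The sketch below (with an assumed example matrix $A$ representing $T$ in standard bases, and an assumed coefficient vector for $g$; all values are hypothetical) shows that applying $g$ after $T$ agrees with applying the single functional whose row vector is $g$'s row times $A$:

```python
import numpy as np

# Hypothetical example: T : R^3 -> R^2 represented (in standard bases) by A,
# and a functional g in W* stored as its coefficient (row) vector.
A = np.array([[1.0, 2.0, 0.0],
              [3.0, -1.0, 4.0]])   # A = [T], a 2x3 matrix
g = np.array([5.0, -2.0])          # g : R^2 -> R

x = np.array([1.0, 1.0, 2.0])      # an arbitrary vector in V = R^3

lhs = g @ (A @ x)   # gT(x) = g(T(x)): apply T first, then g
rhs = (g @ A) @ x   # the same value from the single functional with row vector gA
print(lhs, rhs)     # both equal -5.0
```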
Proof

For any $g \in W^{\ast}$, since $g : W \to \mathbb{R}$ and $T : V \to W$, we have $U(g) = gT : V \to \mathbb{R}$, that is, $gT \in V^{\ast}$. Therefore $U : W^{\ast} \to V^{\ast}$. Moreover, $U$ is linear: for $g, h \in W^{\ast}$ and a scalar $a$, $U(ag + h) = (ag + h)T = a(gT) + hT = aU(g) + U(h)$.
Denote the two ordered bases by $\beta = \left\{ \mathbf{v}_{1}, \dots, \mathbf{v}_{n} \right\}$ and $\gamma = \left\{ \mathbf{w}_{1}, \dots, \mathbf{w}_{m} \right\}$, and their respective dual bases by $\beta^{\ast} = \left\{ f_{1}, \dots, f_{n} \right\}$ and $\gamma^{\ast} = \left\{ g_{1}, \dots, g_{m} \right\}$. For notational convenience, write $A = [T]_{\beta}^{\gamma}$. To find a matrix representation, one checks how each basis vector is mapped; thus, to obtain the $j$-th column of $[U]_{\gamma^{\ast}}^{\beta^{\ast}}$, let us compute $U(g_{j})$.
Dual Spaces and Dual Bases
Let $\beta = \left\{ \mathbf{v}_{1}, \dots, \mathbf{v}_{n} \right\}$ be an ordered basis of $X$ and $\beta^{\ast} = \left\{ f_{1}, \dots, f_{n} \right\}$ the dual basis of $X^{\ast}$. Then, for $f \in X^{\ast}$,

$$
f = \sum_{i=1}^{n} f(\mathbf{v}_{i}) f_{i}
$$
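This expansion can be verified numerically. In the sketch below (an assumed example in $\mathbb{R}^{2}$; the basis matrix is hypothetical), the basis vectors are the columns of a matrix $B$, so the dual basis functionals are the rows of $B^{-1}$, which guarantees $f_{i}(\mathbf{v}_{j}) = \delta_{ij}$:

```python
import numpy as np

# Assumed example basis of R^2: v_1, v_2 are the columns of B.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])
dual = np.linalg.inv(B)            # rows of B^{-1} are the dual basis f_1, f_2

# Sanity check: f_i(v_j) = delta_ij
assert np.allclose(dual @ B, np.eye(2))

# Expand an arbitrary functional f (stored as a row vector) in the dual basis.
f = np.array([3.0, -1.0])
coeffs = np.array([f @ B[:, i] for i in range(2)])        # the values f(v_i)
reconstructed = sum(coeffs[i] * dual[i] for i in range(2))  # sum_i f(v_i) f_i
assert np.allclose(reconstructed, f)
```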
By this property of dual bases and since $g_{j}T \in V^{\ast}$,
$$
U(g_{j}) = g_{j}T = \sum_{s}(g_{j}T)(\mathbf{v}_{s})f_{s}
$$
$$
[U(g_{j})]_{\beta^{\ast}} = \begin{bmatrix} (g_{j}T)(\mathbf{v}_{1}) \\ (g_{j}T)(\mathbf{v}_{2}) \\ \vdots \\ (g_{j}T)(\mathbf{v}_{n}) \end{bmatrix}
$$
Therefore, the matrix representation $[U]_{\gamma^{\ast}}^{\beta^{\ast}}$ of $U$ is
$$
[U]_{\gamma^{\ast}}^{\beta^{\ast}} = \begin{bmatrix} (g_{1}T)(\mathbf{v}_{1}) & (g_{2}T)(\mathbf{v}_{1}) & \cdots & (g_{m}T)(\mathbf{v}_{1}) \\ (g_{1}T)(\mathbf{v}_{2}) & (g_{2}T)(\mathbf{v}_{2}) & \cdots & (g_{m}T)(\mathbf{v}_{2}) \\ \vdots & \vdots & \ddots & \vdots \\ (g_{1}T)(\mathbf{v}_{n}) & (g_{2}T)(\mathbf{v}_{n}) & \cdots & (g_{m}T)(\mathbf{v}_{n}) \end{bmatrix}
$$
Computing each entry,
$$
\begin{align*}
(g_{j}T)(\mathbf{v}_{i}) = g_{j}(T(\mathbf{v}_{i})) &= g_{j}\left( \sum_{k=1}^{m} A_{ki}\mathbf{w}_{k} \right) \\
&= \sum_{k=1}^{m} A_{ki}\, g_{j}(\mathbf{w}_{k}) \\
&= \sum_{k=1}^{m} A_{ki}\, \delta_{jk} \\
&= A_{ji}
\end{align*}
$$
Therefore, $[U]_{\gamma^{\ast}}^{\beta^{\ast}} = ([T]_{\beta}^{\gamma})^{t}$ is satisfied.
■
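As a sanity check, the theorem can also be verified numerically. The sketch below (with an assumed example matrix $A$, using the standard bases of $\mathbb{R}^{3}$ and $\mathbb{R}^{2}$, so each $g_{j}$ extracts the $j$-th coordinate) builds $[U]_{\gamma^{\ast}}^{\beta^{\ast}}$ column by column from the values $(g_{j}T)(\mathbf{v}_{i})$ and compares it with $A^{t}$:

```python
import numpy as np

# Assumed example: A = [T] for T : R^3 -> R^2 in standard bases (m x n = 2 x 3).
A = np.array([[1.0, 2.0, 0.0],
              [3.0, -1.0, 4.0]])

m, n = A.shape
U = np.zeros((n, m))
for j in range(m):          # the j-th column of [U] comes from U(g_j) = g_j T
    for i in range(n):
        # (g_j T)(v_i) = g_j(T(v_i)) = j-th coordinate of A e_i
        e_i = np.eye(n)[:, i]
        U[i, j] = (A @ e_i)[j]

assert np.allclose(U, A.T)  # [U] = ([T])^t
```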