
Linear Transformation Space

Definition¹

The set of all linear transformations from the vector space $V$ to $W$ is denoted by $L(V, W)$.

$$ L(V, W) = \mathcal{L}(V, W) := \left\{ T : V \to W \enspace|\enspace T \text{ is linear} \right\} $$

It is also written as follows and called the homomorphism space.

$$ \operatorname{Hom}(V, W) = L(V, W) = \left\{ T : V \to W \enspace|\enspace T \text{ is linear} \right\} $$

Additionally, when $W = V$, it is also denoted as follows and called the endomorphism space.

$$ \operatorname{End}(V) = \operatorname{Hom}(V, V) = \operatorname{Hom}(V) = L(V, V) = L(V) $$

The notations used most often are $L$ and $\mathcal{L}$.

Explanation

Homomorphism means, as the word suggests, almost the same thing as isomorphism, but with the invertibility condition dropped.

Let $T, U \in L(V, W)$ and $a \in \mathbb{R}$. The sum $T + U$ of two linear transformations and the scalar multiple $aT$ of a linear transformation are defined as follows.

$$ (T+U)(x) = T(x) + U(x) \quad \text{and} \quad (aT)(x) = aT(x) \qquad \text{for } x \in V,\ a \in \mathbb{R} $$

Then $aT + U$ is again a linear transformation in $L(V, W)$ (that is, $L(V, W)$ is closed under both operations), and $L(V, W)$ becomes a vector space under these operations.
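
As a quick sanity check, the following minimal numpy sketch (with the hypothetical choice $V = \mathbb{R}^{3}$, $W = \mathbb{R}^{2}$ and two arbitrarily chosen matrices standing in for $T$ and $U$) verifies numerically that $aT + U$, defined pointwise as above, is again linear.

```python
import numpy as np

# Hypothetical example: V = R^3, W = R^2, with T and U given by arbitrary matrices.
A_T = np.array([[1., 2., 0.], [0., 1., -1.]])
A_U = np.array([[0., 1., 1.], [2., 0., 3.]])
T = lambda x: A_T @ x
U = lambda x: A_U @ x

a = 4.0
S = lambda x: a * T(x) + U(x)        # the map aT + U, defined pointwise as above

x, y = np.random.randn(3), np.random.randn(3)
c = 2.5

# aT + U is again linear: S(c*x + y) == c*S(x) + S(y)
assert np.allclose(S(c * x + y), c * S(x) + S(y))
```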

Basis

Let $V, W$ be vector spaces of dimensions $n, m$, respectively. Let $\mathcal{B}_{V} = \left\{ v_{1}, \cdots, v_{n} \right\}$ and $\mathcal{B}_{W} = \left\{ w_{1}, \cdots, w_{m} \right\}$ be bases of $V, W$, respectively. Let’s define the linear transformation $\phi_{ij}$ as follows.

$$ \begin{aligned} \phi_{ij} : V &\to W \\ v_{j^{\prime}} &\mapsto \begin{cases} w_{i} & \text{if } j^{\prime} = j \\ 0 & \text{if } j^{\prime} \ne j \end{cases} \end{aligned} \tag{1} $$

Then the set $\left\{ \phi_{ij} : 1 \le i \le m,\ 1 \le j \le n \right\}$ is a basis of $L(V, W)$.
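
In coordinates, $\phi_{ij}$ is simply the $m \times n$ matrix with a $1$ in position $(i, j)$ and $0$ elsewhere, relative to $\mathcal{B}_{V}$ and $\mathcal{B}_{W}$. A minimal sketch, assuming $m = 2$, $n = 3$ purely for illustration (note numpy's $0$-based indexing):

```python
import numpy as np

m, n = 2, 3   # dim W, dim V, chosen only for illustration

def phi(i, j):
    """Matrix of phi_ij w.r.t. the chosen bases: sends v_j to w_i and every other v_j' to 0."""
    E = np.zeros((m, n))
    E[i, j] = 1.0
    return E

e = np.eye(n)                    # coordinate vectors of v_1, ..., v_n
print(phi(0, 2) @ e[:, 2])       # [1. 0.]  -> phi_{1,3}(v_3) = w_1 (0-indexed here)
print(phi(0, 2) @ e[:, 1])       # [0. 0.]  -> phi_{1,3}(v_2) = 0
```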

Linear Independence

$$ \sum_{i,j} \lambda_{ij}\phi_{ij} = 0 \implies \lambda_{ij} = 0 \quad \forall i, j $$

Showing linear independence means showing the implication above. The zero vector of the vector space $L(V, W)$ is the zero transformation $T_{0}$, so suppose $\sum_{i,j} \lambda_{ij}\phi_{ij} = T_{0}$. Substituting $v_{1}$ into both sides gives the following.

$$ \sum_{i,j} \lambda_{ij}\phi_{ij}(v_{1}) = \sum_{i}\lambda_{i1}w_{i} = T_{0}(v_{1}) = \mathbf{0} $$

$$ \implies \sum_{i}\lambda_{i1}w_{i} = \mathbf{0} $$

Since $\left\{ w_{i} \right\}$ is a basis and hence linearly independent, it follows that $\lambda_{i1} = 0$ for all $i$. Similarly, substituting each of the $v_{j}$ into both sides of $\sum_{i,j} \lambda_{ij}\phi_{ij} = T_{0}$ gives the following result.

$$ \lambda_{ij} = 0 \quad \forall i, j $$

Therefore, $\left\{ \phi_{ij} \right\}$ is linearly independent.
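
The same conclusion can be checked numerically: vectorizing each $\phi_{ij}$ and stacking the results gives a matrix of full column rank $mn$, so no nontrivial combination vanishes. A sketch in the same hypothetical $m = 2$, $n = 3$ setting:

```python
import numpy as np

m, n = 2, 3

def phi(i, j):
    E = np.zeros((m, n))
    E[i, j] = 1.0
    return E

# Columns are vec(phi_ij); full column rank mn means no nontrivial combination vanishes.
Phi = np.column_stack([phi(i, j).ravel() for i in range(m) for j in range(n)])
assert np.linalg.matrix_rank(Phi) == m * n   # 6 linearly independent transformations
```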

Span

All that remains to show is that every $T \in L(V, W)$ can be expressed as a linear combination of the $\phi_{ij}$. Suppose the basis $\mathcal{B}_{V}$ is mapped as follows.

$$ T(v_{j}) = \sum_{i=1}^{m} b_{ij} w_{i} \tag{2} $$

Then, by $(1)$, $w_{i} = \sum_{k=1}^{n} \phi_{ik}(v_{j})$, so for $x = \sum\limits_{j=1}^{n}a_{j}v_{j} \in V$, $T(x)$ is as follows.

$$ \begin{aligned} T(x) &= T \Big( {\textstyle\sum_{j}}a_{j}v_{j} \Big) = \sum\limits_{j} a_{j}T(v_{j}) \\ &= \sum\limits_{j} a_{j} \sum\limits_{i}b_{ij}w_{i} = \sum\limits_{i,j} a_{j}b_{ij}w_{i} \\ &= \sum\limits_{i,j} a_{j}b_{ij}\sum_{k}\phi_{ik}(v_{j}) \end{aligned} $$

But since $\phi_{ik}(v_{j}) = 0$ whenever $k \ne j$, the only term surviving in the sum over $k$ is $k = j$, so for a fixed $j$ the following holds.

$$ b_{ij}\sum_{k}\phi_{ik}(v_{j}) = \sum_{k}b_{ik}\phi_{ik}(v_{j}) $$

Therefore,

$$ \begin{aligned} && T(x) &= \sum\limits_{i,j} a_{j}\sum_{k}b_{ik}\phi_{ik}(v_{j}) = \sum\limits_{i,j,k} b_{ik}\phi_{ik}(a_{j}v_{j}) \\ && &= \sum\limits_{i,k} b_{ik}\phi_{ik}(x) = \sum\limits_{i,j} b_{ij}\phi_{ij}(x) \\ && &= \left( \sum\limits_{i,j} b_{ij}\phi_{ij} \right)(x) \\ \implies && T &= \sum\limits_{i,j} b_{ij}\phi_{ij} \end{aligned} $$

Since $\left\{ \phi_{ij} \right\}$ also spans $L(V, W)$, it is a basis. Moreover, $b_{ij}$ here is exactly the $(i, j)$ entry of the matrix representation $\begin{bmatrix} T \end{bmatrix}_{\mathcal{B}_{V}}^{\mathcal{B}_{W}}$ of $T$.

$$ \begin{bmatrix} T \end{bmatrix}_{\mathcal{B}_{V}}^{\mathcal{B}_{W}} = [b_{ij}] $$
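
Concretely, reading the coefficients $b_{ij}$ off an arbitrary matrix that stands in for $[T]$ and summing $b_{ij}\phi_{ij}$ recovers $T$ exactly; a sketch, again assuming $m = 2$, $n = 3$:

```python
import numpy as np

m, n = 2, 3

def phi(i, j):
    E = np.zeros((m, n))
    E[i, j] = 1.0
    return E

B = np.random.randn(m, n)          # stands in for [T] = [b_ij] w.r.t. the chosen bases
T_rebuilt = sum(B[i, j] * phi(i, j) for i in range(m) for j in range(n))
assert np.allclose(T_rebuilt, B)   # T = sum_ij b_ij phi_ij
```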

Representation with respect to Dual Basis

The content is much easier to understand when expressed in terms of the dual basis. Let the dual basis of $\mathcal{B}_{V}$ be $\left\{ v_{1}^{\ast}, \dots, v_{n}^{\ast} \right\}$. Then, for each pair $w_{i}$ and $v_{j}^{\ast}$, one can consider the following linear transformation.

$$ \begin{aligned} w_{i}v_{j}^{\ast} : V &\to W \\ x &\mapsto v_{j}^{\ast}(x)w_{i} \end{aligned} \quad \forall i, j $$

This is a linear transformation of rank $1$: for a fixed $i$ its image consists only of scalar multiples of $w_{i}$, so it is essentially no different from the definition in $(1)$. Since $\left\{ w_{i} \right\}$ is a basis, linear independence over the index $i$ is immediate, and checking $\sum_{j}\lambda_{ij}w_{i}v_{j}^{\ast}(v_{j^{\prime}}) = 0$ in the same manner as above shows independence over the index $j$ as well. It is also easy to show that any $T \in L(V, W)$ can be expressed as a linear combination of $\left\{ w_{i}v_{j}^{\ast} \right\}$, and the coefficients are exactly the components $b_{ij}$ of the matrix representation of $T$. Indeed, for $x = \sum_{j}x_{j}v_{j} \in V$ and $T(v_{j}) = \sum_{i}b_{ij}w_{i}$,

$$ \begin{aligned} T(x) &= T\big(\textstyle{\sum_{j}}x_{j}v_{j}\big) \\ &= \sum_{j} x_{j} T(v_{j}) \\ &= \sum_{j} x_{j} \sum_{i}b_{ij}w_{i} \\ &= \sum_{i,j} b_{ij}x_{j}w_{i} \\ &= \sum_{i,j} b_{ij}v_{j}^{\ast}(x)w_{i} \\ &= \sum_{i,j} b_{ij}\,w_{i}v_{j}^{\ast}(x) \\ &= \Big( \sum_{i,j} b_{ij}w_{i}v_{j}^{\ast} \Big)(x) \end{aligned} $$

$$ \implies T = \sum_{i,j} b_{ij}w_{i}v_{j}^{\ast} $$
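
In coordinates, $w_{i}v_{j}^{\ast}$ is the rank-$1$ outer product of the coordinate vector of $w_{i}$ with the row vector representing $v_{j}^{\ast}$, so the expansion above is the familiar decomposition of a matrix into weighted outer products. A sketch with the standard bases of $\mathbb{R}^{3}$ and $\mathbb{R}^{2}$, the matrix $B$ standing in for $[T]$:

```python
import numpy as np

m, n = 2, 3
w = np.eye(m)          # coordinates of the basis w_1, ..., w_m of W
v_dual = np.eye(n)     # rows act as the dual basis functionals v_1^*, ..., v_n^*

B = np.random.randn(m, n)   # stands in for [T] = [b_ij]

# Each w_i v_j^* is the rank-1 outer product w_i (v_j^*)^T; weighting by b_ij and summing gives T.
T_rebuilt = sum(B[i, j] * np.outer(w[:, i], v_dual[j, :])
                for i in range(m) for j in range(n))

x = np.random.randn(n)
assert np.allclose(T_rebuilt @ x, B @ x)
```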


  1. Stephen H. Friedberg, Linear Algebra (4th Edition, 2002), p. 82