
Linear Transformation Space

Definition¹

The set of all linear transformations from the vector space $V$ to $W$ is denoted as $L(V,W)$.

$$ L(V, W) = \mathcal{L}(V, W) := \left\{ T : V \to W\enspace |\enspace T \text{ is linear } \right\} $$

This space is also written as follows and is referred to as the homomorphism space.

$$ \operatorname{Hom}(V,W) = L(V, W) = \left\{ T : V \to W \text{ is linear} \right\} $$

Additionally, when $W = V$, it is also denoted as follows, referred to as the endomorphism space.

$$ \operatorname{End}(V) = \operatorname{Hom}(V,V) = \operatorname{Hom}(V) = L(V,V) = L(V) $$

The most commonly used notations are $L$ and $\mathcal{L}$.

Explanation

A homomorphism is, as the word suggests, almost an isomorphism: the condition of invertibility is simply dropped.

Let’s say $T, U \in L(V, W)$, $a \in \mathbb{R}$. The sum of two linear transformations $T+U$ and the scalar multiplication of a linear transformation $aT$ are defined as follows.

$$ (T+U) (x) = T(x) + U(x) \quad \text{and} \quad (aT)(x) = aT(x) \quad \text{ for } x \in V, a \in \mathbb{R} $$

Then $aT+U$ is again a linear transformation in $L(V, W)$ (that is, $L(V, W)$ is closed under both operations), and $L(V, W)$ is a vector space under these operations.
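
To see this concretely, here is a minimal NumPy sketch, assuming we identify linear maps $\mathbb{R}^{3} \to \mathbb{R}^{2}$ with their standard matrices (the matrices `A`, `B` and the vector `x` are arbitrary illustrative values):

```python
import numpy as np

# Identify linear maps R^3 -> R^2 with 2x3 matrices acting by x -> A @ x.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))   # matrix of T
B = rng.standard_normal((2, 3))   # matrix of U
a = 1.7
x = rng.standard_normal(3)

# The pointwise sum and scalar multiple correspond to matrix operations,
# so aT + U is again linear and L(V, W) is closed under both operations.
lhs = a * (A @ x) + (B @ x)       # aT(x) + U(x)
rhs = (a * A + B) @ x             # (aT + U)(x)
assert np.allclose(lhs, rhs)
```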

Basis

Let $V, W$ be vector spaces of dimensions $n, m$, respectively. Let $\mathcal{B}_{V} = \left\{ v_{1}, \cdots, v_{n} \right\}$, $\mathcal{B}_{W} = \left\{ w_{1}, \cdots, w_{m} \right\}$ be bases of $V, W$, respectively. Let’s define the linear transformation $\phi_{ij}$ as follows.

$$ \begin{equation} \begin{aligned} \phi_{ij} : V &\to W \\ v_{j^{\prime}} &\mapsto \begin{cases} w_{i} & \text{if } j^{\prime} = j \\ 0 & \text{if } j^{\prime} \ne j \\ \end{cases} \end{aligned} \end{equation} $$

Then, the set $\left\{ \phi_{ij} : 1 \le i \le m, 1 \le j \le n\right\}$ becomes a basis of $L(V, W)$.
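
In coordinates, $\phi_{ij}$ is just the matrix unit with a single $1$ in position $(i,j)$. A small sketch of this identification, assuming standard bases for $V = \mathbb{R}^{3}$ and $W = \mathbb{R}^{2}$ (indices here are 0-based):

```python
import numpy as np

m, n = 2, 3  # dim W = m, dim V = n

def phi(i, j):
    """Matrix of phi_ij w.r.t. the chosen bases: v_j -> w_i, v_j' -> 0 otherwise."""
    E = np.zeros((m, n))
    E[i, j] = 1.0
    return E

e = np.eye(n)  # columns: coordinate vectors of v_1, ..., v_n

# phi_{1,2} sends v_2 to w_1 and kills every other basis vector.
assert np.allclose(phi(1, 2) @ e[:, 2], np.array([0.0, 1.0]))
assert np.allclose(phi(1, 2) @ e[:, 0], np.zeros(2))
```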

Linear Independence

$$ \sum_{i,j} \lambda_{ij}\phi_{ij} = 0 \implies \lambda_{ij} = 0\quad \forall i,j $$

Showing linear independence means proving the implication above. The zero vector of the vector space $L(V,W)$ is the zero transformation $T_{0}$, so suppose $\sum_{i,j} \lambda_{ij}\phi_{ij} = T_{0}$. Substituting $v_{1}$ into both sides gives the following.

$$ \sum_{i,j} \lambda_{ij}\phi_{ij}(v_{1}) = \sum_{i}\lambda_{i1}w_{i} = T_{0}(v_{1}) = \mathbf{0} $$

$$ \implies \sum_{i}\lambda_{i1}w_{i} = \mathbf{0} $$

Since $\left\{ w_{i} \right\}$ is a basis, it follows that $\lambda_{i1} = 0$ for all $i$. Similarly, substituting each $v_{j}$ into both sides of $\sum_{i,j} \lambda_{ij}\phi_{ij} = T_{0}$ yields the following result.

$$ \lambda_{ij} = 0 \quad \forall i, j $$

Therefore, $\left\{ \phi_{ij} \right\}$ is linearly independent.
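
The same conclusion can be checked numerically: flatten each $\phi_{ij}$ into a vector and compute the rank of the stacked family, again assuming standard coordinates:

```python
import numpy as np

m, n = 2, 3
units = []
for i in range(m):
    for j in range(n):
        E = np.zeros((m, n))
        E[i, j] = 1.0
        units.append(E)

# Flatten each phi_ij into a length m*n vector and stack the results as rows.
# Rank m*n means only the trivial combination of the phi_ij is the zero map,
# and in particular dim L(V, W) = m * n.
M = np.stack([E.ravel() for E in units])
assert np.linalg.matrix_rank(M) == m * n
```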

Span

All that remains to show is that any $T \in L(V, W)$ can be expressed as a linear combination of the $\phi_{ij}$. Suppose $T$ maps the basis $\mathcal{B}_{V}$ as follows.

$$ \begin{equation} T(v_{j}) = \sum_{i=1}^{m} b_{ij} w_{i} \end{equation} $$

Then, according to $(1)$, since $w_{i} = \sum_{k=1}^{n} \phi_{ik}(v_{j})$, for $x = \sum\limits_{j=1}^{n}a_{j}v_{j} \in V$, $T(x)$ is as follows.

$$ \begin{align*} T(x) &= T ( {\textstyle\sum_{j}}a_{j}v_{j} ) = \sum\limits_{j} a_{j}T(v_{j}) \\ &= \sum\limits_{j} a_{j} \sum\limits_{i}b_{ij}w_{i} = \sum\limits_{i,j} a_{j}b_{ij}w_{i} \\ &= \sum\limits_{i,j} a_{j}b_{ij}\sum_{k}\phi_{ik}(v_{j}) \end{align*} $$

Here, since $\phi_{ik}(v_{j}) = 0$ whenever $k \ne j$, only the $k = j$ term survives, so for a fixed $j$ the following holds.

$$ b_{ij}\sum_{k}\phi_{ik}(v_{j}) = \sum_{k}b_{ik}\phi_{ik}(v_{j}) $$

Therefore,

$$ \begin{align*} && T(x) &= \sum\limits_{i,j} a_{j}\sum_{k}b_{ik}\phi_{ik}(v_{j}) = \sum\limits_{i,j,k} b_{ik}\phi_{ik}(a_{j}v_{j}) \\ && &= \sum\limits_{i,k} b_{ik}\phi_{ik}(x) = \sum\limits_{i,j} b_{ij}\phi_{ij}(x) \\ && &= \left( \sum\limits_{i,j} b_{ij}\phi_{ij} \right)(x) \\ \implies && T &= \sum\limits_{i,j} b_{ij}\phi_{ij} \end{align*} $$

Since $\left\{ \phi_{ij} \right\}$ also spans $L(V, W)$, it is a basis. Moreover, $b_{ij}$ is the $(i,j)$ entry of the matrix representation $\begin{bmatrix} T \end{bmatrix}_{\mathcal{B}_{V}}^{\mathcal{B}_{W}}$ of $T$.

$$ \begin{bmatrix} T \end{bmatrix}_{\mathcal{B}_{V}}^{\mathcal{B}_{W}} = [b_{ij}] $$
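
Computing $[b_{ij}]$ for concrete bases comes down to solving a linear system for the coordinates of each $T(v_{j})$. A sketch, assuming $V = \mathbb{R}^{3}$, $W = \mathbb{R}^{2}$, and randomly chosen (hence almost surely invertible) basis matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2
BV = rng.standard_normal((n, n))     # columns: basis v_1, ..., v_n of V = R^n
BW = rng.standard_normal((m, m))     # columns: basis w_1, ..., w_m of W = R^m
T_std = rng.standard_normal((m, n))  # T in standard coordinates

# Column j of [T] holds the coordinates b_1j, ..., b_mj of T(v_j) in the w_i:
# solve BW @ b = T(v_j) for every j at once.
B = np.linalg.solve(BW, T_std @ BV)  # B[i, j] = b_ij

# Sanity check: T(v_j) = sum_i b_ij w_i, e.g. for j = 2.
assert np.allclose(T_std @ BV[:, 2], BW @ B[:, 2])
```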

Representation with respect to Dual Basis

The content is much easier to understand when expressed in terms of dual bases. Let’s say the dual basis of $\mathcal{B}_{V}$ is $\left\{ v_{1}^{\ast}, \dots, v_{n}^{\ast} \right\}$. Then, it is possible to consider the following linear transformation corresponding to $w_{i}$ and $v_{j}^{\ast}$.

$$ \begin{align*} w_{i}v_{j}^{\ast} : V &\to W \\ x &\mapsto v_{j}^{\ast}(x)w_{i} \end{align*} \quad \forall i, j $$

Each $w_{i}v_{j}^{\ast}$ is a rank-$1$ linear transformation whose image is the span of $w_{i}$; indeed, $w_{i}v_{j}^{\ast}(v_{j^{\prime}}) = \delta_{jj^{\prime}}w_{i}$, so it is essentially the same as the definition in $(1)$. Since $\left\{ w_{i} \right\}$ is a basis, $\left\{ w_{i}v_{j}^{\ast} \right\}$ is clearly linearly independent in the index $i$, and checking $\sum_{j}\lambda_{ij}w_{i}v_{j}^{\ast}(v_{j^{\prime}}) = 0$ in the manner shown above establishes independence in the index $j$ as well. It is also easy to show that any $T \in L(V, W)$ can be expressed as a linear combination of $\left\{ w_{i}v_{j}^{\ast} \right\}$, and the coefficients are exactly the components $b_{ij}$ of the matrix representation of $T$. For $x = \sum_{j}x_{j}v_{j} \in V$ with $T(v_{j}) = \sum_{i}b_{ij}w_{i}$,

$$ \begin{align*} T(x) &= T(\textstyle{\sum_{j}}x_{j}v_{j}) \\ &= \sum_{j} x_{j} T(v_{j}) \\ &= \sum_{j} x_{j} \sum_{i}b_{ij}w_{i} \\ &= \sum_{i,j} b_{ij}x_{j}w_{i} \\ &= \sum_{i,j} b_{ij}v_{j}^{\ast}(x)w_{i} \\ &= \sum_{i,j} b_{ij}w_{i}v_{j}^{\ast}(x) \\ &= (\textstyle{\sum_{i,j} b_{ij}w_{i}v_{j}^{\ast}})(x) \end{align*} $$

$$ \implies T = \sum_{i,j} b_{ij}w_{i}v_{j}^{\ast} $$
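
In coordinates, $w_{i}v_{j}^{\ast}$ is the rank-$1$ outer product of the column vector $w_{i}$ with the row vector $v_{j}^{\ast}$, which is the $j$-th row of the inverse basis matrix. A sketch verifying the reconstruction above, under the same identifications as before:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2
BV = rng.standard_normal((n, n))     # columns: v_1, ..., v_n
BW = rng.standard_normal((m, m))     # columns: w_1, ..., w_m
T_std = rng.standard_normal((m, n))  # T in standard coordinates
B = np.linalg.solve(BW, T_std @ BV)  # matrix representation [b_ij] as above

BV_inv = np.linalg.inv(BV)           # row j is the dual functional v_j^*

# Each w_i v_j^* is the rank-1 outer product of the column w_i with the row v_j^*;
# the b_ij-weighted sum of these rank-1 maps reconstructs T exactly.
T_rebuilt = sum(B[i, j] * np.outer(BW[:, i], BV_inv[j, :])
                for i in range(m) for j in range(n))
assert np.allclose(T_rebuilt, T_std)
```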


  1. Stephen H. Friedberg, Linear Algebra (4th Edition, 2002), p. 82