
Matrix Representation of the Sum and Scalar Multiplication of Linear Transformations 📂Linear Algebra


Theorem

Let $V$ and $W$ be finite-dimensional vector spaces with ordered bases $\beta$ and $\gamma$, respectively. Also, let $T, U : V \to W$ be linear transformations and let $a$ be a scalar. Then, the following hold:

  • $[T + U]_{\beta}^{\gamma} = [T]_{\beta}^{\gamma} + [U]_{\beta}^{\gamma}$

  • $[aT]_{\beta}^{\gamma} = a[T]_{\beta}^{\gamma}$

Here, $[T]_{\beta}^{\gamma}$ is the matrix representation of $T$.
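The theorem can be checked numerically. The sketch below is a hypothetical example, not part of the original text: $T, U : \mathbb{R}^{2} \to \mathbb{R}^{3}$ are given by matrices in the standard bases, `beta` and `gamma` are arbitrarily chosen (non-standard) ordered bases, and `matrix_rep` builds $[L]_{\beta}^{\gamma}$ column by column from the $\gamma$-coordinates of $L(\mathbf{v}_{j})$.

```python
import numpy as np

# Hypothetical maps T, U : R^2 -> R^3, written as matrices in the standard bases.
T = np.array([[1.0, 2.0], [0.0, 1.0], [3.0, -1.0]])
U = np.array([[0.0, 1.0], [2.0, 2.0], [1.0, 0.0]])

# Ordered bases: columns are the basis vectors v_1, v_2 (resp. w_1, w_2, w_3).
beta = np.column_stack([[1.0, 0.0], [1.0, 2.0]])
gamma = np.column_stack([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0], [0.0, 0.0, 1.0]])

def matrix_rep(L, beta, gamma):
    """Column j holds the gamma-coordinates of L(v_j),
    i.e. the solution x of gamma @ x = L @ v_j for each basis vector v_j."""
    return np.linalg.solve(gamma, L @ beta)

# [T + U] = [T] + [U]
assert np.allclose(matrix_rep(T + U, beta, gamma),
                   matrix_rep(T, beta, gamma) + matrix_rep(U, beta, gamma))

# [aT] = a[T], here with a = 3
assert np.allclose(matrix_rep(3.0 * T, beta, gamma),
                   3.0 * matrix_rep(T, beta, gamma))
```

Both identities follow because taking $\gamma$-coordinates (solving against the matrix whose columns are $\gamma$) is itself a linear operation.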

Proof

Since the proofs are similar, we prove only the first equation. Let $\beta = \left\{ \mathbf{v}_{1}, \dots, \mathbf{v}_{n} \right\}$ and $\gamma = \left\{ \mathbf{w}_{1}, \dots, \mathbf{w}_{m} \right\}$. Then, by the uniqueness of basis representation, there exist unique scalars $a_{ij}, b_{ij}$ such that, for each $j$:

$$ T(\mathbf{v}_{j}) = \sum_{i=1}^{m}a_{ij}\mathbf{w}_{i} \quad \text{and} \quad U(\mathbf{v}_{j}) = \sum_{i=1}^{m}b_{ij}\mathbf{w}_{i} $$

Therefore,

$$ (T + U)(\mathbf{v}_{j}) = T(\mathbf{v}_{j}) + U(\mathbf{v}_{j}) = \sum_{i=1}^{m}a_{ij}\mathbf{w}_{i} + \sum_{i=1}^{m}b_{ij}\mathbf{w}_{i} = \sum_{i=1}^{m}(a_{ij} + b_{ij})\mathbf{w}_{i} $$

Hence, the $(i,j)$ entry of each side agrees:

$$ ([T + U]_{\beta}^{\gamma})_{ij} = a_{ij} + b_{ij} = ([T]_{\beta}^{\gamma})_{ij} + ([U]_{\beta}^{\gamma})_{ij} $$

■