
Matrix Representation of the Sum and Scalar Multiplication of Linear Transformations 📂Linear Algebra


Theorem

Let $V, W$ be finite-dimensional vector spaces with given ordered bases $\beta, \gamma$, respectively. Also, let $T, U : V \to W$ be linear transformations. Then the following hold:

  • $[T + U]_{\beta}^{\gamma} = [T]_{\beta}^{\gamma} + [U]_{\beta}^{\gamma}$

  • $[aT]_{\beta}^{\gamma} = a[T]_{\beta}^{\gamma}$

Here, $[T]_{\beta}^{\gamma}$ is the matrix representation of $T$.
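To make the notation concrete, here is a minimal sketch of building a matrix representation column by column, using the derivative map $D : P_{3} \to P_{2}$ with the monomial bases $\beta = \{1, x, x^{2}, x^{3}\}$ and $\gamma = \{1, x, x^{2}\}$ (a standard textbook example, chosen here for illustration; it is not taken from the text above):

```python
import numpy as np

def derivative_rep():
    """Matrix representation [D]_beta^gamma of the derivative map
    D : P_3 -> P_2, where beta = {1, x, x^2, x^3}, gamma = {1, x, x^2}.
    Since D(x^j) = j * x^(j-1), column j holds the gamma-coordinates
    of D(v_j): a single entry j in row j-1."""
    M = np.zeros((3, 4))
    for j in range(1, 4):
        M[j - 1, j] = j
    return M

M = derivative_rep()
# Columns are the coordinate vectors of D(1), D(x), D(x^2), D(x^3):
# [[0, 1, 0, 0],
#  [0, 0, 2, 0],
#  [0, 0, 0, 3]]
```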

Proof

Since the proofs are similar, we will only prove the first equation. Let $\beta = \left\{ \mathbf{v}_{1}, \dots, \mathbf{v}_{n} \right\}$ and $\gamma = \left\{ \mathbf{w}_{1}, \dots, \mathbf{w}_{m} \right\}$. Then, by the uniqueness of basis representation, there exist unique scalars $a_{ij}, b_{ij}$ such that:

$$ T(\mathbf{v}_{j}) = \sum_{i=1}^{m}a_{ij}\mathbf{w}_{i} \quad \text{and} \quad U(\mathbf{v}_{j}) = \sum_{i=1}^{m}b_{ij}\mathbf{w}_{i} $$

Therefore,

$$ (T + U)(\mathbf{v}_{j}) = T(\mathbf{v}_{j}) + U(\mathbf{v}_{j}) = \sum_{i=1}^{m}a_{ij}\mathbf{w}_{i} + \sum_{i=1}^{m}b_{ij}\mathbf{w}_{i} = \sum_{i=1}^{m}(a_{ij} + b_{ij})\mathbf{w}_{i} $$

Hence,

$$ \left( [T + U]_{\beta}^{\gamma} \right)_{ij} = a_{ij} + b_{ij} = \left( [T]_{\beta}^{\gamma} \right)_{ij} + \left( [U]_{\beta}^{\gamma} \right)_{ij} $$

Since this holds for every entry, $[T + U]_{\beta}^{\gamma} = [T]_{\beta}^{\gamma} + [U]_{\beta}^{\gamma}$. ■
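Both identities can also be checked numerically. The sketch below (a hypothetical example; the maps $T, U$ and the bases are chosen arbitrarily, not taken from the text) represents $T, U : \mathbb{R}^{3} \to \mathbb{R}^{2}$ by their standard matrices and computes $[\,\cdot\,]_{\beta}^{\gamma}$ by solving for $\gamma$-coordinates columnwise:

```python
import numpy as np

# T, U : R^3 -> R^2, given by their matrices in the standard bases.
T = np.array([[1., 2., 0.],
              [0., 1., 3.]])
U = np.array([[4., 0., 1.],
              [2., 2., 0.]])

# Ordered bases: columns of beta span R^3, columns of gamma span R^2.
beta = np.array([[1., 1., 0.],
                 [0., 1., 1.],
                 [0., 0., 1.]])
gamma = np.array([[1., 1.],
                  [0., 1.]])

def matrix_rep(L, beta, gamma):
    """[L]_beta^gamma: column j is the gamma-coordinate vector of L(v_j),
    obtained by solving gamma @ X = L @ beta for X."""
    return np.linalg.solve(gamma, L @ beta)

# First identity: [T + U]_beta^gamma = [T]_beta^gamma + [U]_beta^gamma
lhs = matrix_rep(T + U, beta, gamma)
rhs = matrix_rep(T, beta, gamma) + matrix_rep(U, beta, gamma)
assert np.allclose(lhs, rhs)

# Second identity: [aT]_beta^gamma = a [T]_beta^gamma, here with a = 5
assert np.allclose(matrix_rep(5.0 * T, beta, gamma),
                   5.0 * matrix_rep(T, beta, gamma))
```

Linearity of `np.linalg.solve` in its right-hand side is exactly what makes both checks pass, mirroring the coordinate argument in the proof.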