
Direct Sum of Matrices

Definition¹

The direct sum of two matrices $B \in M_{m\times n}$ and $C \in M_{p\times q}$ is defined as the following $(m+p) \times (n+q)$ matrix $A$, and is denoted by $B \oplus C$.

$$A = B \oplus C := \begin{bmatrix} b_{11} & \cdots & b_{1n} & 0 & \cdots & 0 \\ \vdots & \ddots & \vdots & \vdots & \ddots & \vdots \\ b_{m1} & \cdots & b_{mn} & 0 & \cdots & 0 \\ 0 & \cdots & 0 & c_{11} & \cdots & c_{1q} \\ \vdots & \ddots & \vdots & \vdots & \ddots & \vdots \\ 0 & \cdots & 0 & c_{p1} & \cdots & c_{pq} \end{bmatrix}$$

$$A_{ij} := \begin{cases} [B]_{ij} & \text{for } 1\le i \le m,\ 1\le j \le n \\ [C]_{(i-m),(j-n)} & \text{for } m+1\le i \le p+m,\ n+1\le j \le q+n \\ 0 & \text{otherwise} \end{cases}$$

Expressed in block matrix form,

$$A = \begin{bmatrix} B & O_{mq} \\ O_{pn} & C \end{bmatrix}$$

Here, $O$ is the zero matrix.
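In code, the definition translates directly: write $B$ into the top-left corner and $C$ into the bottom-right corner of an $(m+p)\times(n+q)$ zero matrix. Below is a minimal sketch in Python with NumPy; the name `direct_sum` is chosen here for illustration, not standard.

```python
import numpy as np

def direct_sum(B: np.ndarray, C: np.ndarray) -> np.ndarray:
    """Direct sum A = B ⊕ C, following the block definition above."""
    m, n = B.shape
    p, q = C.shape
    A = np.zeros((m + p, n + q), dtype=np.result_type(B, C))
    A[:m, :n] = B   # top-left block is B
    A[m:, n:] = C   # bottom-right block is C
    return A
```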

Generalization

The direct sum of matrices $B_{1}, B_{2}, \dots, B_{k}$ is defined recursively as follows.

$$B_{1} \oplus B_{2} \oplus \cdots \oplus B_{k} := (B_{1} \oplus B_{2} \oplus \cdots \oplus B_{k-1}) \oplus B_{k}$$

If $A = B_{1} \oplus B_{2} \oplus \cdots \oplus B_{k}$, then

$$A = \begin{bmatrix} B_{1} & O & \cdots & O \\ O & B_{2} & \cdots & O \\ \vdots & \vdots & \ddots & \vdots \\ O & O & \cdots & B_{k} \end{bmatrix}$$
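Mirroring the recursive definition, the $k$-ary direct sum is just a left fold of the two-matrix version. A short sketch, assuming the `direct_sum` helper defined above:

```python
from functools import reduce

def direct_sum_k(*matrices: np.ndarray) -> np.ndarray:
    """B1 ⊕ B2 ⊕ ... ⊕ Bk, folding left as in the recursive definition."""
    return reduce(direct_sum, matrices)
```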

Explanation

Simply put, the direct sum assembles matrices into a block diagonal matrix.

$$B_{1} \oplus B_{2} \oplus \cdots \oplus B_{k} = \href{../2048}{\diag} \begin{bmatrix} B_{1} \\ B_{2} \\ \vdots \\ B_{k} \end{bmatrix}$$

For a concrete example, if $B_{1} = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix}$, $B_{2} = \begin{bmatrix} 2 \end{bmatrix}$, and $B_{3} = \begin{bmatrix} 3 & 3 & 3 \\ 3 & 3 & 3 \\ 3 & 3 & 3 \end{bmatrix}$, then

$$B_{1} \oplus B_{2} \oplus B_{3} = \begin{bmatrix} 1 & 1 & 1 & 0 & 0 & 0 & 0 \\ 1 & 1 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 2 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 3 & 3 & 3 \\ 0 & 0 & 0 & 0 & 3 & 3 & 3 \\ 0 & 0 & 0 & 0 & 3 & 3 & 3 \end{bmatrix}$$
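SciPy ships exactly this construction as `scipy.linalg.block_diag`, so the example above can be checked directly (a quick sanity check, assuming SciPy is available):

```python
import numpy as np
from scipy.linalg import block_diag

B1 = np.ones((2, 3))        # 2x3 matrix of ones
B2 = np.array([[2.0]])      # 1x1 matrix
B3 = 3 * np.ones((3, 3))    # 3x3 matrix of threes

A = block_diag(B1, B2, B3)
print(A.shape)  # (6, 7), i.e. (2+1+3) x (3+1+3)
print(A)
```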

In many cases one encounters the direct sum of subspaces before the direct sum of matrices, but the theorem below is enough to see why this construction deserves the name. When a linear transformation $T : V \to V$ is given and $V = W_{1} \oplus \cdots \oplus W_{k}$, the matrix representation of $T$ is the direct sum of the matrix representations of the restrictions $T|_{W_{i}}$, so there is no reason not to call this operation a direct sum.

Theorem

Let $T : V \to V$ be a linear transformation on a finite-dimensional vector space $V$. Let $W_{1}, \dots, W_{k}$ be $T$-invariant subspaces, and let $V$ be their direct sum.

$$V = W_{1} \oplus \cdots \oplus W_{k}$$

Let $\beta_{i}$ be an ordered basis of $W_{i}$, and let $\beta = \beta_{1} \cup \cdots \cup \beta_{k}$ (then $\beta$ is a basis for $V$). If $A = \begin{bmatrix} T \end{bmatrix}_{\beta}$ and $B_{i} = \begin{bmatrix} T|_{W_{i}} \end{bmatrix}_{\beta_{i}}$, then the following holds.

$$A = B_{1} \oplus B_{2} \oplus \cdots \oplus B_{k} = \begin{bmatrix} B_{1} & O & \cdots & O \\ O & B_{2} & \cdots & O \\ \vdots & \vdots & \ddots & \vdots \\ O & O & \cdots & B_{k} \end{bmatrix}$$

Proof

The proof is by mathematical induction.

  • It holds when $k=2$.

    Let $\mathbf{v} \in \beta_{1}$. Since $\beta$ is a basis for $V$, $T \mathbf{v} \in V$ is expressed as a linear combination of $\beta$. But since $W_{1}$ is an invariant subspace, $T \mathbf{v} \in W_{1}$ holds. Therefore, in the linear combination for $T \mathbf{v}$, the coefficients of the elements of $\beta_{2}$ are all $0$. This means that, with $n = \dim(W_{1})$, the components of the coordinate vector $\begin{bmatrix} T \mathbf{v} \end{bmatrix}_{\beta}$ are all $0$ from the $(n+1)$-th position onward. Therefore,

    $$\begin{bmatrix} T|_{W_{1}}\mathbf{v}\end{bmatrix}_{\beta_{1}} = \begin{bmatrix} b_{1} \\ \vdots \\ b_{n} \end{bmatrix} \quad \text{and} \quad \begin{bmatrix} T \mathbf{v} \end{bmatrix}_{\beta} = \begin{bmatrix} b_{1} \\ \vdots \\ b_{n} \\ 0 \\ \vdots \\ 0 \end{bmatrix}$$

    Similarly, if $\mathbf{v} \in \beta_{2}$ and $m = \dim(W_{2})$, then $T \mathbf{v} \in W_{2}$ and the coordinate vectors are as follows.

    $$\begin{bmatrix} T|_{W_{2}}\mathbf{v}\end{bmatrix}_{\beta_{2}} = \begin{bmatrix} b_{n+1} \\ \vdots \\ b_{n+m} \end{bmatrix} \quad \text{and} \quad \begin{bmatrix} T \mathbf{v} \end{bmatrix}_{\beta} = \begin{bmatrix} 0 \\ \vdots \\ 0 \\ b_{n+1} \\ \vdots \\ b_{n+m} \end{bmatrix}$$

    Therefore,

    $$\begin{bmatrix} T \end{bmatrix}_{\beta} = \begin{bmatrix} \begin{bmatrix} T|_{W_{1}}\end{bmatrix}_{\beta_{1}} & O \\ O & \begin{bmatrix} T|_{W_{2}}\end{bmatrix}_{\beta_{2}} \end{bmatrix}$$

  • If it holds for $k-1$, it also holds for $k$.

    Let $W = W_{1} \oplus \cdots \oplus W_{k-1}$ and $\beta_{W} = \beta_{1} \cup \cdots \cup \beta_{k-1}$. Assuming the claim holds for $k-1$,

    $$\begin{bmatrix} T|_{W} \end{bmatrix}_{\beta_{W}} = \begin{bmatrix} \begin{bmatrix} T|_{W_{1}}\end{bmatrix}_{\beta_{1}} & \cdots & O \\ \vdots & \ddots & \vdots \\ O & \cdots & \begin{bmatrix} T|_{W_{k-1}}\end{bmatrix}_{\beta_{k-1}} \end{bmatrix}$$

    But $V = W \oplus W_{k}$ and $\beta = \beta_{W} \cup \beta_{k}$, and $W$ is $T$-invariant because it is a sum of $T$-invariant subspaces, so the case $k=2$ applies:

    $$\begin{bmatrix} T \end{bmatrix}_{\beta} = \begin{bmatrix} \begin{bmatrix} T|_{W}\end{bmatrix}_{\beta_{W}} & O \\ O & \begin{bmatrix} T|_{W_{k}}\end{bmatrix}_{\beta_{k}} \end{bmatrix} = \begin{bmatrix} \begin{bmatrix} T|_{W_{1}}\end{bmatrix}_{\beta_{1}} & \cdots & O & O \\ \vdots & \ddots & \vdots & \vdots \\ O & \cdots & \begin{bmatrix} T|_{W_{k-1}}\end{bmatrix}_{\beta_{k-1}} & O \\ O & \cdots & O & \begin{bmatrix} T|_{W_{k}}\end{bmatrix}_{\beta_{k}} \end{bmatrix}$$
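The theorem also lends itself to a numerical sanity check (a sketch with made-up data, not part of the original proof): pick a block diagonal matrix $D = B_{1} \oplus B_{2}$ and an invertible $P$ whose columns are the basis $\beta = \beta_{1} \cup \beta_{2}$, and let $T$ have standard matrix $P D P^{-1}$. Each $W_{i} = \operatorname{span}(\beta_{i})$ is then $T$-invariant, and $[T]_{\beta} = P^{-1}(P D P^{-1})P$ recovers $D$:

```python
import numpy as np

B1 = np.array([[2.0]])                     # [T|_{W1}] with respect to beta_1
B2 = np.array([[0.0, 1.0], [-1.0, 0.0]])   # [T|_{W2}] with respect to beta_2
D = np.block([[B1, np.zeros((1, 2))],
              [np.zeros((2, 1)), B2]])     # D = B1 ⊕ B2

P = np.array([[1.0, 1.0, 0.0],             # columns: beta_1 followed by beta_2
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
T_std = P @ D @ np.linalg.inv(P)           # matrix of T in the standard basis

# The matrix of T in the basis beta is block diagonal, as the theorem claims.
T_beta = np.linalg.inv(P) @ T_std @ P
assert np.allclose(T_beta, D)
```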


  1. Stephen H. Friedberg, Linear Algebra (4th Edition, 2002), pp. 320-321 ↩︎