
Block Matrices

Definition

Let $A$ be an $m \times n$ matrix.

$$ A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \\ \end{bmatrix} $$

Consider cutting the matrix with arbitrary vertical and horizontal lines, as shown below.

$$ A = \left[ \begin{array}{cc|ccc|c|cc} a_{11} & a_{12} & a_{13} & a_{14} & a_{15} & \cdots & a_{1,n-1} & a_{1n} \\ a_{21} & a_{22} & a_{23} & a_{24} & a_{25} & \cdots & a_{2,n-1} & a_{2n} \\ \hline a_{31} & a_{32} & a_{33} & a_{34} & a_{35} & \cdots & a_{3,n-1} & a_{3n} \\ a_{41} & a_{42} & a_{43} & a_{44} & a_{45} & \cdots & a_{4,n-1} & a_{4n} \\ \hline \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\ \hline a_{m1} & a_{m2} & a_{m3} & a_{m4} & a_{m5} & \cdots & a_{m,n-1} & a_{mn} \end{array} \right] $$

The pieces separated by these lines are called the blocks of $A$.

$$ A_{11} = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix},\quad A_{12} = \begin{bmatrix} a_{13} & a_{14} & a_{15}\\ a_{23} & a_{24} & a_{25} \end{bmatrix},\quad \cdots,\quad A_{kl} = \begin{bmatrix} a_{m,n-1} & a_{mn}\end{bmatrix} $$

A matrix $A$ written in terms of its blocks, as follows, is called a block matrix.

$$ A = \begin{bmatrix} A_{11} & A_{12} & \cdots & A_{1l} \\ A_{21} & A_{22} & \cdots & A_{2l} \\ \vdots & \vdots & \ddots & \vdots \\ A_{k1} & A_{k2} & \cdots & A_{kl} \\ \end{bmatrix} $$
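
As a concrete illustration, the sketch below (assuming NumPy; the particular $2 \times 2$ partition and values are arbitrary) assembles a matrix from four blocks with `np.block` and recovers the blocks again by slicing along the partition lines.

```python
import numpy as np

# Four blocks of a 4x5 matrix, partitioned after row 2 and column 2
A11 = np.arange(1, 5).reshape(2, 2)    # 2x2 block
A12 = np.arange(5, 11).reshape(2, 3)   # 2x3 block
A21 = np.arange(11, 15).reshape(2, 2)  # 2x2 block
A22 = np.arange(15, 21).reshape(2, 3)  # 2x3 block

# Assemble the block matrix A = [[A11, A12], [A21, A22]]
A = np.block([[A11, A12], [A21, A22]])
print(A.shape)  # (4, 5)

# Slicing along the partition lines recovers the blocks
assert np.array_equal(A[:2, :2], A11)
assert np.array_equal(A[2:, 2:], A22)
```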

Explanation

Treating a matrix as a block matrix simplifies many calculations. In fact, as long as the block sizes are compatible, block matrices can be multiplied just like ordinary matrices, with the blocks playing the role of entries.

$$ A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}\quad \text{and} \quad B = \begin{bmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{bmatrix} \\[1em] \implies AB = \begin{bmatrix} A_{11}B_{11} + A_{12}B_{21} & A_{11}B_{12} + A_{12}B_{22} \\ A_{21}B_{11} + A_{22}B_{21} & A_{21}B_{12} + A_{22}B_{22} \end{bmatrix} $$
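
This blockwise multiplication rule can be checked numerically. The following is a minimal sketch (NumPy assumed; the block sizes are arbitrary but chosen to be compatible).

```python
import numpy as np

rng = np.random.default_rng(0)

# Compatible block sizes: A is (2+3) x (2+4), B is (2+4) x (3+1)
A11, A12 = rng.random((2, 2)), rng.random((2, 4))
A21, A22 = rng.random((3, 2)), rng.random((3, 4))
B11, B12 = rng.random((2, 3)), rng.random((2, 1))
B21, B22 = rng.random((4, 3)), rng.random((4, 1))

A = np.block([[A11, A12], [A21, A22]])
B = np.block([[B11, B12], [B21, B22]])

# Blockwise product, exactly as in the formula above
AB_blocks = np.block([
    [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
    [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
])

assert np.allclose(A @ B, AB_blocks)
```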

Therefore, if a matrix can be partitioned so that some of its blocks are zero matrices or identity matrices, the product is easy to compute.

$$ A = \begin{bmatrix} A_{11} & I \\ O & A_{22} \end{bmatrix}\quad \text{and} \quad B = \begin{bmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{bmatrix} \\[1em] \implies AB = \begin{bmatrix} A_{11}B_{11} + B_{21} & A_{11}B_{12} + B_{22} \\ A_{22}B_{21} & A_{22}B_{22} \end{bmatrix} $$
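
The simplified formula can be verified in the same way; a short sketch (NumPy assumed, square blocks of an arbitrary size $n = 3$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3  # block size chosen for illustration

A11, A22 = rng.random((n, n)), rng.random((n, n))
B11, B12 = rng.random((n, n)), rng.random((n, n))
B21, B22 = rng.random((n, n)), rng.random((n, n))

I, O = np.eye(n), np.zeros((n, n))
A = np.block([[A11, I], [O, A22]])
B = np.block([[B11, B12], [B21, B22]])

# Simplified blockwise product from the formula above
AB_blocks = np.block([
    [A11 @ B11 + B21, A11 @ B12 + B22],
    [A22 @ B21,       A22 @ B22],
])

assert np.allclose(A @ B, AB_blocks)
```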

A matrix partitioned into its row vectors or its column vectors is also a block matrix.

$$ \begin{align*} A =& \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} = \begin{bmatrix} \mathbf{r}_{1} \\ \mathbf{r}_{2} \\ \vdots \\ \mathbf{r}_{m} \end{bmatrix} \\ =& \begin{bmatrix} \mathbf{c}_{1} & \mathbf{c}_{2} & \cdots & \mathbf{c}_{n} \end{bmatrix} \end{align*} $$

Therefore, $A \mathbf{x}$ can be represented as follows.

$$ \begin{align*} A \mathbf{x} &= \begin{bmatrix} \mathbf{c}_{1} & \mathbf{c}_{2} & \cdots & \mathbf{c}_{n} \end{bmatrix} \begin{bmatrix} x_{1} \\ x_{2} \\ \vdots \\ x_{n} \end{bmatrix} \\ &= \sum_{i=1}^{n} x_{i}\mathbf{c}_{i} \end{align*} $$
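
In NumPy terms this says that $A \mathbf{x}$ is the linear combination of the columns of $A$ with weights $x_{i}$; a small sketch with an arbitrary matrix:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])
x = np.array([7., 8., 9.])

# A @ x as a weighted sum of the columns of A
combo = sum(x[i] * A[:, i] for i in range(A.shape[1]))
assert np.allclose(A @ x, combo)
```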

Furthermore, let $A$ be an $m \times p$ matrix with row vectors $\mathbf{a}_{i}$, and let $B$ be a $p \times n$ matrix with column vectors $\mathbf{b}_{j}$. Then the product of the two matrices is as follows.

$$ AB = \begin{bmatrix} \mathbf{a}_{1} \\ \mathbf{a}_{2} \\ \vdots \\ \mathbf{a}_{m} \end{bmatrix} \begin{bmatrix} \mathbf{b}_{1} & \mathbf{b}_{2} & \cdots & \mathbf{b}_{n} \end{bmatrix} = \begin{bmatrix} \mathbf{a}_{1} \mathbf{b}_{1} & \mathbf{a}_{1} \mathbf{b}_{2} & \cdots & \mathbf{a}_{1} \mathbf{b}_{n} \\ \mathbf{a}_{2} \mathbf{b}_{1} & \mathbf{a}_{2} \mathbf{b}_{2} & \cdots & \mathbf{a}_{2} \mathbf{b}_{n} \\ \vdots & \vdots & \ddots & \vdots \\ \mathbf{a}_{m} \mathbf{b}_{1} & \mathbf{a}_{m} \mathbf{b}_{2} & \cdots & \mathbf{a}_{m} \mathbf{b}_{n} \\ \end{bmatrix} $$
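
This entrywise description of $AB$, where each entry is a row of $A$ times a column of $B$, can likewise be checked. A minimal sketch (NumPy assumed, arbitrary sizes):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.random((4, 3))   # m x p
B = rng.random((3, 5))   # p x n

# Entry (i, j) of AB is the dot product of row i of A and column j of B
AB_entries = np.array([[A[i, :] @ B[:, j] for j in range(B.shape[1])]
                       for i in range(A.shape[0])])
assert np.allclose(A @ B, AB_entries)
```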

Theorem

Let $A = \begin{bmatrix} A_{1} & A_{2} \\ O & A_{3} \end{bmatrix}$ be a block matrix whose diagonal blocks $A_{1}$ and $A_{3}$ are square. Then the following holds for its determinant.

$$ \det A = \det A_{1} \det A_{3} $$
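
Numerically, the identity can be sanity-checked as follows (a sketch with NumPy; the block sizes are arbitrary, but $A_{1}$ and $A_{3}$ must be square).

```python
import numpy as np

rng = np.random.default_rng(3)
A1 = rng.random((2, 2))   # square diagonal block
A3 = rng.random((3, 3))   # square diagonal block
A2 = rng.random((2, 3))   # arbitrary off-diagonal block
O  = np.zeros((3, 2))

A = np.block([[A1, A2], [O, A3]])
assert np.isclose(np.linalg.det(A), np.linalg.det(A1) * np.linalg.det(A3))
```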

Corollary

  • The determinant of the block matrix $A = \begin{bmatrix} A_{1} & A_{2} \\ O & I \end{bmatrix}$ equals $\det A_{1}$.
  • A matrix $A$ that can be brought into the following form by permuting its rows and columns is called a reducible matrix, and its determinant is $\det A = \pm \det B \det D$, with the sign determined by the permutations used (see the sketch after this list). $$ \widetilde{A} = \begin{bmatrix} B & O \\ C & D \end{bmatrix} $$
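
The second corollary can be illustrated numerically. The sketch below (NumPy assumed) permutes the rows of a matrix already in the form $\begin{bmatrix} B & O \\ C & D \end{bmatrix}$ and confirms that the determinant agrees with $\det B \det D$ up to sign.

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.random((2, 2))
D = rng.random((3, 3))
C = rng.random((3, 2))
O = np.zeros((2, 3))

A_tilde = np.block([[B, O], [C, D]])   # the reduced form
perm = rng.permutation(5)              # an arbitrary row permutation
A = A_tilde[perm, :]                   # a matrix reducible to A_tilde

# det A agrees with det B * det D up to the sign of the permutation
assert np.isclose(abs(np.linalg.det(A)),
                  abs(np.linalg.det(B) * np.linalg.det(D)))
```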

Proof

The block matrix $A$ can be decomposed into the product of three block matrices as follows.

$$ \begin{align*} A &= \begin{bmatrix} A_{1} & A_{2} \\ O & A_{3} \end{bmatrix} \\ &= \begin{bmatrix} IA_{1} + OO & IA_{2} + OI \\ OA_{1} + A_{3}O & OA_{2} + A_{3}I \end{bmatrix} \\ &= \begin{bmatrix} I & O \\ O & A_{3} \end{bmatrix} \begin{bmatrix} A_{1} & A_{2} \\ O & I \end{bmatrix} \\ &= \begin{bmatrix} I & O \\ O & A_{3} \end{bmatrix} \begin{bmatrix} IA_{1} + A_{2}O & IO + A_{2}I \\ OA_{1} + IO & OO + II \end{bmatrix} \\ &= \begin{bmatrix} I & O \\ O & A_{3} \end{bmatrix} \begin{bmatrix} I & A_{2} \\ O & I \end{bmatrix} \begin{bmatrix} A_{1} & O \\ O & I \end{bmatrix} \end{align*} $$

Since the determinant of a product equals the product of the determinants,

$$ \begin{align*} \det A &= \det \left( \begin{bmatrix} I & O \\ O & A_{3} \end{bmatrix} \begin{bmatrix} I & A_{2} \\ O & I \end{bmatrix} \begin{bmatrix} A_{1} & O \\ O & I \end{bmatrix} \right) \\ &= \det \left( \begin{bmatrix} I & O \\ O & A_{3} \end{bmatrix} \right) \det \left( \begin{bmatrix} I & A_{2} \\ O & I \end{bmatrix} \right) \det \left( \begin{bmatrix} A_{1} & O \\ O & I \end{bmatrix} \right) \end{align*} $$

Considering the Laplace expansion of the determinant,

$$ \det \left( \begin{bmatrix} I & O \\ O & A_{3} \end{bmatrix} \right) = \det A_{3},\quad \det \left( \begin{bmatrix} A_{1} & O \\ O & I \end{bmatrix} \right) = \det A_{1}, $$

$$ \text{and} \quad \det \left( \begin{bmatrix} I & A_{2} \\ O & I \end{bmatrix} \right)=1 $$

we can see that

$$ \det A = \det A_{1} \det A_{3} $$