

Matrix Operations: Scalar Multiplication, Addition, and Multiplication

Scalar Multiplication

The multiplication of an arbitrary matrix $A$ of size $m \times n$ by a scalar $k$ is defined as multiplying each element of $A$ by $k$ and is denoted as follows:

$$ kA = k\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} := \begin{bmatrix} ka_{11} & ka_{12} & \cdots & ka_{1n} \\ ka_{21} & ka_{22} & \cdots & ka_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ ka_{m1} & ka_{m2} & \cdots & ka_{mn} \end{bmatrix} $$

By definition, the multiplication of a scalar and a matrix is commutative, although the scalar is usually written in front.

$$ kA = Ak $$
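As a quick illustration, here is a minimal NumPy sketch; the matrix and the scalar $k = 3$ are arbitrary example values:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])  # an arbitrary 2x3 matrix
k = 3                      # an arbitrary scalar

# Scalar multiplication acts on every entry, and the order does not matter.
print(k * A)
# [[ 3  6  9]
#  [12 15 18]]
print(np.array_equal(k * A, A * k))  # True: kA = Ak
```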

Addition

The addition of two matrices $A$ and $B$, each of size $m \times n$, is defined by adding the entries in the same position (the same row and column) of each matrix and is denoted as follows:

$$ \begin{align*} A+B &= \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} + \begin{bmatrix} b_{11} & b_{12} & \cdots & b_{1n} \\ b_{21} & b_{22} & \cdots & b_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ b_{m1} & b_{m2} & \cdots & b_{mn} \end{bmatrix} \\ &:=\begin{bmatrix} a_{11} + b_{11} & a_{12} + b_{12} & \cdots & a_{1n} + b_{1n} \\ a_{21} + b_{21} & a_{22} + b_{22} & \cdots & a_{2n} + b_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} + b_{m1} & a_{m2} + b_{m2} & \cdots & a_{mn} + b_{mn} \end{bmatrix} \end{align*} $$

As can be seen from the definition, the addition of two matrices is only defined for matrices of the same size and is commutative.

$$ A+B=B+A $$
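A minimal NumPy sketch with arbitrary example matrices:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Entries in the same position are added, and the order does not matter.
print(A + B)
# [[ 6  8]
#  [10 12]]
print(np.array_equal(A + B, B + A))  # True: A + B = B + A
```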

Multiplication

While multiplying a matrix by a scalar or adding two matrices is intuitive, matrix multiplication works a bit differently. Let’s first look at the product of a row vector and a column vector.

The multiplication of a row vector $A=\begin{bmatrix} a_{1} & a_{2} & \cdots & a_{n} \end{bmatrix}$ of size $1\times n$ and a column vector $B= \begin{bmatrix} b_{1} \\ b_{2} \\ \vdots \\ b_{n} \end{bmatrix}$ of size $n \times 1$ is defined as follows:

$$ \begin{align*} AB =\begin{bmatrix} a_{1} & a_{2} & \cdots & a_{n} \end{bmatrix}\begin{bmatrix} b_{1} \\ b_{2} \\ \vdots \\ b_{n} \end{bmatrix} &:= a_{1}b_{1}+a_{2}b_{2} + \cdots +a_{n}b_{n} \\ &= \sum \limits_{i=1}^{n}a_{i}b_{i} \end{align*} $$

In words, this definition reads as “the sum of the products of the entries in matching positions,” which is conceptually the same as the dot product of two vectors learned in high school.

$$ \begin{align*} \vec{a} &=(a_{1},a_{2},a_{3}) \\ \vec{b} &= (b_{1},b_{2},b_{3}) \end{align*},\quad \vec{a} \cdot \vec{b} = a_{1}b_{1} + a_{2}b_{2} + a_{3}b_{3} $$
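A minimal sketch computing the same value both ways, once directly from the sum-of-products definition and once with NumPy’s dot product; the vectors are arbitrary example values:

```python
import numpy as np

a = [1, 2, 3]  # entries of a 1x3 row vector
b = [4, 5, 6]  # entries of a 3x1 column vector

# Sum of products of entries in matching positions.
print(sum(a_i * b_i for a_i, b_i in zip(a, b)))  # 32
print(np.dot(a, b))                              # 32, the same value
```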

The multiplication of two matrices can be thought of as an extension of this concept. The product of an $m\times n$ matrix $A$ and an $n\times k$ matrix $B$ is defined as follows:

$$ \begin{align*} AB &= \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} \begin{bmatrix} b_{11} & b_{12} & \cdots & b_{1k} \\ b_{21} & b_{22} & \cdots & b_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ b_{n1} & b_{n2} & \cdots & b_{nk} \end{bmatrix} \\ &:= \begin{bmatrix} \sum_{i=1}^{n} a_{1i}b_{i1} & \sum_{i=1}^{n} a_{1i}b_{i2} & \cdots & \sum_{i=1}^{n} a_{1i}b_{ik} \\ \sum_{i=1}^{n} a_{2i}b_{i1} & \sum_{i=1}^{n} a_{2i}b_{i2} & \cdots & \sum_{i=1}^{n} a_{2i}b_{ik} \\ \vdots & \vdots & \ddots & \vdots \\ \sum_{i=1}^{n} a_{mi}b_{i1} & \sum_{i=1}^{n} a_{mi}b_{i2} & \cdots & \sum_{i=1}^{n} a_{mi}b_{ik} \end{bmatrix} \end{align*} $$
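In code, this definition translates directly into nested loops over the rows of $A$ and the columns of $B$. Below is a minimal pure-Python sketch; the helper name matmul and the sample matrices are illustrative choices, not part of the definition:

```python
def matmul(A, B):
    # Multiply an m x n matrix A by an n x k matrix B from the definition:
    # entry (r, c) of AB is the sum over i of A[r][i] * B[i][c].
    m, n, k = len(A), len(B), len(B[0])
    assert all(len(row) == n for row in A), "columns of A must equal rows of B"
    return [[sum(A[r][i] * B[i][c] for i in range(n)) for c in range(k)]
            for r in range(m)]

A = [[1, 2, 3],
     [4, 5, 6]]   # 2 x 3
B = [[7, 8],
     [9, 10],
     [11, 12]]    # 3 x 2

print(matmul(A, B))  # [[58, 64], [139, 154]]
```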

Although the formula might look complicated, it is, as the code sketch above suggests, merely several multiplications of row vectors by column vectors carried out at once: the entry in the $i$th row and $j$th column of the product $AB$ is the dot product of the $i$th row of $A$ and the $j$th column of $B$. Therefore, the number of columns of $A$ and the number of rows of $B$ must be the same for the multiplication to be defined. Also, the multiplication of two matrices generally does not satisfy the commutative law.

$$ AB \ne BA $$

This can be easily verified with a simple example. If we let $A=\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$ and $B=\begin{bmatrix} 2 & -1 \\ 1 & 1 \end{bmatrix}$, then

$$ \begin{align*} AB &=\begin{bmatrix} 2+1 & -1+1 \\ 0+1 & 0+1 \end{bmatrix}=\begin{bmatrix} 3 & 0 \\ 1 & 1 \end{bmatrix} \\ BA &=\begin{bmatrix} 2+0 & 2-1 \\ 1+0 & 1+1 \end{bmatrix} = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix} \end{align*} $$

Therefore,

$$ AB\ne BA $$
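The same check in NumPy, using the @ operator for matrix multiplication, with the values of $A$ and $B$ from the example above:

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])
B = np.array([[2, -1],
              [1, 1]])

print(A @ B)  # [[3 0]
              #  [1 1]]
print(B @ A)  # [[2 1]
              #  [1 2]]
print(np.array_equal(A @ B, B @ A))  # False: AB != BA
```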

(Figure: the process of matrix multiplication, represented visually.)

Properties¹

Let’s assume that $A$, $B$, and $C$ are arbitrary matrices of size $m \times n$, and let $r$ and $s$ be arbitrary scalars. The following properties of matrix operations hold:

(a) Commutative law for addition: $A + B = B + A$

(b) Associative law for addition: $A + (B + C) = (A + B) + C$

(c) $(r + s)A = rA + sA$

(d) $r(A + B) = rA + rB $

(e) $(rs)A = r(sA)$

Assuming $A$, $B$, and $C$ are arbitrary matrices of size $n\times n$:

(f) Associative law for multiplication: $A(BC) = (AB)C$

(g) Distributive law for multiplication: $A(B+C) = AB + AC$ and $(A+B)C = AC + BC$

It should be noted again that the commutative law does not hold for matrix multiplication.
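All of these identities can be spot-checked numerically. Here is a minimal sketch with randomly chosen $3\times 3$ matrices, where np.allclose is used to tolerate floating-point rounding:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.random((3, 3)) for _ in range(3))
r, s = 2.0, -1.5

assert np.allclose(A + B, B + A)                # (a) commutativity of addition
assert np.allclose(A + (B + C), (A + B) + C)    # (b) associativity of addition
assert np.allclose((r + s) * A, r * A + s * A)  # (c)
assert np.allclose(r * (A + B), r * A + r * B)  # (d)
assert np.allclose((r * s) * A, r * (s * A))    # (e)
assert np.allclose(A @ (B @ C), (A @ B) @ C)    # (f) associativity of multiplication
assert np.allclose(A @ (B + C), A @ B + A @ C)  # (g) left distributivity
assert np.allclose((A + B) @ C, A @ C + B @ C)  # (g) right distributivity
print("all properties hold on this random sample")
```

Of course, a passing check on one random sample is only evidence, not a proof; the identities themselves follow from the definitions, entry by entry.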


  1. Jim Hefferon, Linear Algebra (4th Edition), 2020, p. 235 ↩︎