
Equivalence Conditions for Orthogonal Matrices 📂Matrix Algebra


Theorem

For a real $n \times n$ matrix $A$, the following propositions are all equivalent.

(a) $A$ is an orthogonal matrix.

(b) The set of row vectors of $A$ forms an orthonormal set in $\mathbb{R}^n$.

(c) The set of column vectors of $A$ forms an orthonormal set in $\mathbb{R}^n$.

(d) $A$ preserves inner product, i.e., for all $\mathbf{x},\mathbf{y}\in \mathbb{R}^{n}$, the following holds:

$$ (A \mathbf{x}) \cdot (A\mathbf{y}) = \mathbf{x} \cdot \mathbf{y} $$

(e) $A$ preserves length, i.e., for all $\mathbf{x}\in \mathbb{R}^{n}$, the following holds:

$$ \left\| A \mathbf{x} \right\| = \left\| \mathbf{x} \right\| $$
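Before the proof, a quick numerical sanity check (not part of the argument itself): a $2 \times 2$ rotation matrix is a standard example of an orthogonal matrix, and all five conditions can be verified with NumPy. The angle and test vectors below are arbitrary choices for illustration.

```python
import numpy as np

theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation matrix, known to be orthogonal

# (a) A A^T = I  and  (b)/(c) rows and columns are orthonormal
assert np.allclose(A @ A.T, np.eye(2))
assert np.allclose(A.T @ A, np.eye(2))

# (d) inner product preserved and (e) length preserved, on random vectors
rng = np.random.default_rng(0)
x, y = rng.standard_normal(2), rng.standard_normal(2)
assert np.isclose((A @ x) @ (A @ y), x @ y)
assert np.isclose(np.linalg.norm(A @ x), np.linalg.norm(x))
```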

Proof

(a)$\iff$(b) and (a)$\iff$(c) are proven in the same way, so only the former is shown.

(a) $\iff$ (b)

Let $A$ be an $n\times n$ matrix and denote its row vectors by $\mathbf{r}_{i}$:

$$ A = \begin{bmatrix} \mathbf{r}_{1} \\ \mathbf{r}_{2} \\ \vdots \\ \mathbf{r}_{n} \end{bmatrix} $$

Then $A A^{T}$ is as follows:

$$ AA^{T} = \begin{bmatrix} \mathbf{r}_{1} \\ \mathbf{r}_{2} \\ \vdots \\ \mathbf{r}_{n} \end{bmatrix} \begin{bmatrix} \mathbf{r}_{1}^{T} & \mathbf{r}_{2}^{T} & \cdots & \mathbf{r}_{n}^{T} \end{bmatrix} = \begin{bmatrix} \mathbf{r}_{1} \cdot \mathbf{r}_{1} & \mathbf{r}_{1} \cdot \mathbf{r}_{2} & \cdots & \mathbf{r}_{1} \cdot \mathbf{r}_{n} \\ \mathbf{r}_{2} \cdot \mathbf{r}_{1} & \mathbf{r}_{2} \cdot \mathbf{r}_{2} & \cdots & \mathbf{r}_{2} \cdot \mathbf{r}_{n} \\ \vdots & \vdots & \ddots & \vdots \\ \mathbf{r}_{n} \cdot \mathbf{r}_{1} & \mathbf{r}_{n} \cdot \mathbf{r}_{2} & \cdots & \mathbf{r}_{n} \cdot \mathbf{r}_{n} \end{bmatrix} $$
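The identity above holds entry by entry: $\left( AA^{T} \right)_{ij} = \mathbf{r}_{i} \cdot \mathbf{r}_{j}$. A small NumPy sketch confirms this; the matrix here is arbitrary, chosen only for illustration.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])  # an arbitrary (not orthogonal) 3x3 matrix

# Build A A^T entry by entry from dot products of the rows r_i
gram = np.array([[A[i] @ A[j] for j in range(3)] for i in range(3)])
assert np.allclose(gram, A @ A.T)
```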

  • (a) $\implies$ (b)

    Assume $A$ is an orthogonal matrix. Then $AA^{T}=I$, which leads to:

    $$ \mathbf{r}_{i} \cdot \mathbf{r}_{j} = \begin{cases} 1, & i=j \\ 0, & i\ne j \end{cases} $$

    Hence, the set of row vectors $\left\{ \mathbf{r}_{i} \right\}$ of $A$ forms an orthonormal set.

  • (a) $\Longleftarrow$ (b)

    Suppose $\left\{ \mathbf{r}_{i} \right\}$ forms an orthonormal set. Then,

    $$ \mathbf{r}_{i} \cdot \mathbf{r}_{j} = \begin{cases} 1, & i=j \\ 0, & i\ne j \end{cases} $$

    leads to

    $$ AA^{T} = \begin{bmatrix} \mathbf{r}_{1} \cdot \mathbf{r}_{1} & \mathbf{r}_{1} \cdot \mathbf{r}_{2} & \cdots & \mathbf{r}_{1} \cdot \mathbf{r}_{n} \\ \mathbf{r}_{2} \cdot \mathbf{r}_{1} & \mathbf{r}_{2} \cdot \mathbf{r}_{2} & \cdots & \mathbf{r}_{2} \cdot \mathbf{r}_{n} \\ \vdots & \vdots & \ddots & \vdots \\ \mathbf{r}_{n} \cdot \mathbf{r}_{1} & \mathbf{r}_{n} \cdot \mathbf{r}_{2} & \cdots & \mathbf{r}_{n} \cdot \mathbf{r}_{n} \end{bmatrix} = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix} =I $$
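This direction can also be illustrated numerically: stacking any orthonormal rows into a matrix yields $AA^{T} = I$. One standard way to manufacture an orthonormal set (an arbitrary choice here, not something the proof depends on) is to take the $Q$ factor of a QR decomposition of a random matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # columns of Q are orthonormal
A = Q.T  # so the rows of A form an orthonormal set

# Pairwise dot products of the rows give the Kronecker delta ...
for i in range(4):
    for j in range(4):
        expected = 1.0 if i == j else 0.0
        assert np.isclose(A[i] @ A[j], expected)

# ... which is exactly the statement A A^T = I
assert np.allclose(A @ A.T, np.eye(4))
```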

(a) $\iff$ (d) $\iff$ (e)

  • (a) $\implies$ (d)

    Assume $A$ is an orthogonal matrix and let $\mathbf{x}$, $\mathbf{y} \in \mathbb{R}^{n}$. Then, by the properties of the transpose and the assumption $A^{T}A = I$, the following holds:

    $$ \begin{align*} \left( A \mathbf{x} \right) \cdot \left( A \mathbf{y} \right) &= \left( A \mathbf{x} \right)^{T} \left( A \mathbf{y} \right) \\ &= \mathbf{x}^{T} A^{T} A \mathbf{y} \\ &= \mathbf{x}^{T} \mathbf{y} \\ &= \mathbf{x} \cdot \mathbf{y} \end{align*} $$
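The chain of equalities can be traced step by step in code. The matrix below is a reflection (axis swap), a simple orthogonal matrix chosen for illustration; the vectors are arbitrary.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])  # reflection swapping the axes; A^T A = I
x = np.array([3.0, -1.0])
y = np.array([2.0, 5.0])

step1 = (A @ x) @ (A @ y)   # (Ax) . (Ay)
step2 = x @ (A.T @ A) @ y   # x^T A^T A y
step3 = x @ y               # x . y, since A^T A = I

assert np.isclose(step1, step2)
assert np.isclose(step2, step3)
```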

  • (d) $\implies$ (e)

    Assume $A$ preserves the inner product. Then the following holds by assumption:

    $$ \begin{align*} \left\| A \mathbf{x} \right\| &= \sqrt{(A \mathbf{x}) \cdot (A \mathbf{x})} \\ &= \sqrt{\mathbf{x} \cdot \mathbf{x}} \\ &= \left\| \mathbf{x} \right\| \end{align*} $$

  • (e) $\implies$ (a)

    Assume $\left\| A \mathbf{x} \right\| = \left\| \mathbf{x}\right\|$ holds for all $\mathbf{x}$. Then:

    $$ \begin{align*} && \left\| A \mathbf{x} \right\| &= \left\| \mathbf{x}\right\| \\ \implies && \left\| A \mathbf{x} \right\|^{2} &= \left\| \mathbf{x}\right\|^{2} \\ \implies && A \mathbf{x} \cdot A \mathbf{x} &= \mathbf{x} \cdot \mathbf{x} \end{align*} $$

    By the property of the inner product $A \mathbf{u} \cdot \mathbf{v} = \mathbf{u} \cdot A^{T} \mathbf{v}$, the above equation is:

    $$ A \mathbf{x} \cdot A \mathbf{x} = \mathbf{x} \cdot A^{T} A \mathbf{x} = \mathbf{x} \cdot \mathbf{x} $$

    Rearranging gives:

    $$ \mathbf{x} \cdot \left( A^{T}A\mathbf{x} -\mathbf{x}\right) = 0 $$

    Set $B = A^{T}A - I$, so that the above says $\mathbf{x} \cdot B\mathbf{x} = 0$ for all $\mathbf{x}$, where $B$ is symmetric. Replacing $\mathbf{x}$ by $\mathbf{x} + \mathbf{y}$ and expanding, the symmetry of $B$ gives $\mathbf{y} \cdot B\mathbf{x} = 0$ for all $\mathbf{x}, \mathbf{y}$; choosing $\mathbf{y} = B\mathbf{x}$ yields $\left\| B\mathbf{x} \right\|^{2} = 0$, hence $A^{T}A\mathbf{x} - \mathbf{x} = B\mathbf{x} = \mathbf{0}$. Therefore,

    $$ \begin{align*} && \left( A^{T}A\mathbf{x} -\mathbf{x} \right) &= 0 \\ \implies && \left( A^{T}A -I\right) \mathbf{x} &= 0 \end{align*} $$

    Since this holds for all $\mathbf{x}$, it follows that:

    $$ A^{T}A-I = 0 \implies A^{T}A=I $$

    Hence, $A$ is an orthogonal matrix.