Necessary and Sufficient Conditions for a Quadratic Form to Be Zero 📂Linear Algebra

Theorem

Matrix Form

Let $A \in \mathbb{C}^{n \times n}$ be a matrix and $\mathbf{x} \in \mathbb{C}^{n}$ a vector.

The necessary and sufficient condition for the quadratic form $\mathbf{x}^{\ast} A \mathbf{x}$ to be $0$ for all $\mathbf{x} \in \mathbb{C}^{n}$ is that $A$ is a zero matrix: $$ \mathbf{x}^{*} A \mathbf{x} = 0 , \forall \mathbf{x} \in \mathbb{C}^{n} \iff A = O $$
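As a numerical sketch (assuming NumPy; the $2 \times 2$ skew-symmetric matrix is an illustrative choice, not from the source), one can check that the equivalence genuinely depends on complex scalars: a nonzero real skew-symmetric matrix has $\mathbf{x}^{T} A \mathbf{x} = 0$ for every real $\mathbf{x}$, yet over $\mathbb{C}$ its quadratic form does not vanish identically.

```python
import numpy as np

rng = np.random.default_rng(0)

# A nonzero real skew-symmetric matrix: x^T A x = 0 for every real x,
# so the real analogue of the theorem fails.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
x_real = rng.standard_normal(2)
print(x_real @ A @ x_real)   # 0.0 exactly, by skew-symmetry, although A != O

# Over C the quadratic form x* A x is not identically zero:
x = np.array([1.0, 1.0j])
print(np.conj(x) @ A @ x)    # 2j, nonzero
```

This shows why the theorem is stated over $\mathbb{C}$: the complex quadratic form carries enough information to pin down $A$, while the real one does not.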

Linear Transformation Form

Let $\left( V, \mathbb{C} \right)$ be a finite-dimensional complex inner product space, let $T : V \to V$ be a linear transformation, and let $v \in V$ be a vector.

The necessary and sufficient condition for the quadratic form $\left< T v , v \right>$ to be $0$ for all $v \in V$ is that $T$ is a zero transformation $T_{0}$: $$ \left< T v , v \right> = 0 , \forall v \in V \iff T = T_{0} $$
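One standard route to the transformation form (not quoted in the source, but a classical tool) is the complex polarization identity, which recovers $\left< T u , v \right>$ from values of the quadratic form alone: $4 \left< T u , v \right> = \sum_{k=0}^{3} i^{k} \left< T \left( u + i^{k} v \right) , u + i^{k} v \right>$, so if the quadratic form vanishes everywhere, so does every $\left< T u , v \right>$, forcing $T = T_{0}$. A hedged NumPy sketch, with the matrix playing the role of $T$ and the convention that the inner product is linear in its first slot:

```python
import numpy as np

# <v, w> = w* v : linear in the first slot, conjugate-linear in the second.
def inner(v, w):
    return np.conj(w) @ v

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))  # role of T
u = rng.standard_normal(n) + 1j * rng.standard_normal(n)
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Polarization: 4 <Tu, v> = sum_{k=0}^{3} i^k <T(u + i^k v), u + i^k v>.
# If <Tw, w> = 0 for all w, the right side is 0, hence <Tu, v> = 0 for all u, v.
s = sum((1j ** k) * inner(A @ (u + (1j ** k) * v), u + (1j ** k) * v)
        for k in range(4))
assert np.isclose(s, 4 * inner(A @ u, v))
```

Note that the identity fails over $\mathbb{R}$ (where only $k = 0, 2$ are available), which is consistent with the real counterexample to the theorem.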


Proof

Since the proofs of the two forms are essentially the same, only the matrix form, which does not appear in the references, is shown.

$(\implies)$

Suppose, for reductio ad absurdum, that $A \ne O$.

Since $\mathbf{x}^{\ast} A \mathbf{x} = 0$ holds for all $\mathbf{x}$, the same result is obtained when both sides are multiplied by any scalar $\overline{\lambda} \in \mathbb{C}$, that is, $\overline{\lambda} \mathbf{x}^{\ast} A \mathbf{x} = 0$. In particular, this still holds when $\mathbf{x}$ is an eigenvector corresponding to an eigenvalue $\lambda$ of $A$, and expressed as a matrix inner product, $$ \begin{align*} & 0 \\ =& \overline{\lambda} \mathbf{x}^{\ast} A \mathbf{x} \\ =& \left( \lambda \mathbf{x} \right)^{\ast} \left( A \mathbf{x} \right) \\ =& \left( \lambda \mathbf{x} \right)^{\ast} \left( \lambda \mathbf{x} \right) \\ =& \left( \lambda \mathbf{x} \right) \cdot \left( \lambda \mathbf{x} \right) \end{align*} $$ By the positive definiteness of the inner product, $\mathbf{v} \cdot \mathbf{v} = 0 \iff \mathbf{v} = \mathbf{0}$, so $\lambda \mathbf{x} = \mathbf{0}$; since an eigenvector satisfies $\mathbf{x} \ne \mathbf{0}$, all the eigenvalues of $A$ must be $0$.
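The identity $\overline{\lambda} \, \mathbf{x}^{\ast} A \mathbf{x} = \left( \lambda \mathbf{x} \right)^{\ast} \left( \lambda \mathbf{x} \right)$ used above can be checked numerically for any eigenpair; a sketch assuming NumPy, with a randomly generated complex matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# Take one eigenpair (lam, x) of A, so that A @ x == lam * x.
eigvals, eigvecs = np.linalg.eig(A)
lam, x = eigvals[0], eigvecs[:, 0]

lhs = np.conj(lam) * (np.conj(x) @ A @ x)   # conj(lam) x* A x
rhs = np.conj(lam * x) @ (lam * x)          # (lam x)* (lam x) = |lam|^2 ||x||^2
assert np.isclose(lhs, rhs)
```

So whenever the quadratic form at the eigenvector is $0$, the squared norm $\| \lambda \mathbf{x} \|^{2}$ is $0$ as well, which is exactly how the proof extracts $\lambda = 0$.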

Nilpotent matrices and eigenvalues: For a square matrix $A \in \mathbb{C}^{n \times n}$, all eigenvalues of $A$ being $0$ is equivalent to $A$ being a nilpotent matrix.

In other words, $A$ is a nilpotent matrix. Meanwhile, if $A \ne O$, there must be at least one vector $\mathbf{x} \in \mathbb{C}^{n}$ such that $\mathbf{y} := A \mathbf{x} \ne \mathbf{0}$. Since $A$ has already been shown to be nilpotent, let $k \ge 2$ be its index, so that $A^{k} = O$ but $A^{k-1} \ne O$; without loss of generality, choose $\mathbf{x} = A^{k-2} \mathbf{w}$ for some $\mathbf{w}$ with $A^{k-1} \mathbf{w} \ne \mathbf{0}$, so that $\mathbf{y} = A \mathbf{x} \ne \mathbf{0}$ while $A \mathbf{y} = A^{k} \mathbf{w} = \mathbf{0}$. Then $$ \begin{align*} & 0 \\ =& \left( \mathbf{x} + \mathbf{y} \right)^{\ast} A \left( \mathbf{x} + \mathbf{y} \right) & \because 0 = \mathbf{z}^{\ast} A \mathbf{z}, \forall \mathbf{z} \in \mathbb{C}^{n} \\ =& \left( \mathbf{x} + \mathbf{y} \right)^{\ast} \left( A \mathbf{x} + A \mathbf{y} \right) \\ =& \left( \mathbf{x} + \mathbf{y} \right)^{\ast} \left( \mathbf{y} + \mathbf{0} \right) & \because A \mathbf{y} = \mathbf{0} \\ =& \left( \mathbf{x}^{\ast} + \mathbf{y}^{\ast} \right) \mathbf{y} \\ =& \mathbf{x}^{\ast} \mathbf{y} + \mathbf{y}^{\ast} \mathbf{y} \\ =& \mathbf{x}^{\ast} A \mathbf{x} + \mathbf{y}^{\ast} \mathbf{y} & \because \mathbf{y} = A \mathbf{x} \implies \mathbf{x}^{\ast} \mathbf{y} = \mathbf{x}^{\ast} A \mathbf{x} \\ =& 0 + \mathbf{y}^{\ast} \mathbf{y} \end{align*} $$ That means $\mathbf{y} \cdot \mathbf{y} = 0$, so by positive definiteness once again $\mathbf{y} = \mathbf{0}$, contradicting $\mathbf{y} \ne \mathbf{0}$. Consequently, we arrive at $A = O$.
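The contradiction step can be made concrete with a minimal example (assuming NumPy; the $2 \times 2$ nilpotent matrix and the vectors below are illustrative choices, not from the source): a nonzero nilpotent $A$ with $\mathbf{y} = A \mathbf{x} \ne \mathbf{0}$ and $A \mathbf{y} = \mathbf{0}$, for which the quadratic form at $\mathbf{x} + \mathbf{y}$ equals $\| \mathbf{y} \|^{2} \ne 0$.

```python
import numpy as np

# A nonzero nilpotent matrix A, with vectors x and y = A x such that A y = 0.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # A != O but A @ A == O (nilpotent of index 2)
x = np.array([0.0, 1.0])
y = A @ x                    # y = e1 != 0, and A @ y == 0

# The chain in the proof: (x+y)* A (x+y) = x* A x + y* y = 0 + ||y||^2
z = x + y
q = np.conj(z) @ A @ z
print(q)                     # 1.0, not 0: the quadratic form cannot vanish everywhere
```

So for any nonzero nilpotent $A$, the quadratic form fails to vanish at $\mathbf{x} + \mathbf{y}$, which is precisely the contradiction the proof exploits.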

$(\impliedby)$

It is immediate: if $A = O$, then $\mathbf{x}^{\ast} A \mathbf{x} = \mathbf{x}^{\ast} O \mathbf{x} = 0$ for every $\mathbf{x} \in \mathbb{C}^{n}$.