

Derivatives of Vectors and Matrices

Gradient of a Scalar Function

The gradient of a scalar function $f : \mathbb{R}^{n} \to \mathbb{R}$ is defined as follows. $$ \frac{ \partial f(\mathbf{x})}{ \partial \mathbf{x} } := \nabla f(\mathbf{x}) = \begin{bmatrix} \dfrac{ \partial f(\mathbf{x})}{ \partial x_{1} } & \dfrac{ \partial f(\mathbf{x})}{ \partial x_{2} } & \cdots & \dfrac{ \partial f(\mathbf{x})}{ \partial x_{n} } \end{bmatrix}^{T} $$

Here, $\dfrac{ \partial f(\mathbf{x})}{ \partial x_{i} }$ is the partial derivative of $f$ with respect to $x_{i}$.
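As a quick numerical sanity check of this definition, each partial derivative can be approximated by a central difference. The following is a minimal sketch in NumPy; the test function and the step size `h` are illustrative choices, not part of the definition.

```python
import numpy as np

def numerical_gradient(f, x, h=1e-6):
    """Approximate each partial derivative of f at x by a central difference."""
    grad = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (f(x + e) - f(x - e)) / (2 * h)
    return grad

# Test function f(x) = x_1^2 + 3 x_2, whose gradient is (2 x_1, 3)^T.
f = lambda x: x[0] ** 2 + 3 * x[1]
print(numerical_gradient(f, np.array([1.0, 2.0])))  # approximately [2. 3.]
```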

Inner Product

For a fixed $\mathbf{w} \in \mathbb{R}^{n}$, let $f(\mathbf{x}) = \mathbf{w}^{T}\mathbf{x}$. Then the following holds.

$$ \frac{ \partial f(\mathbf{x})}{ \partial \mathbf{x} } = \frac{ \partial (\mathbf{w}^{T}\mathbf{x})}{ \partial \mathbf{x} } = \frac{ \partial (\mathbf{x}^{T}\mathbf{w})}{ \partial \mathbf{x} } = \mathbf{w} $$
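This identity can be spot-checked with central differences; a minimal sketch, where the dimension, random vectors, and step size are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(5)
x = rng.standard_normal(5)

f = lambda x: w @ x  # f(x) = w^T x

# The central-difference gradient should equal w itself.
h = 1e-6
grad = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(5)])
print(np.allclose(grad, w))  # True
```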

Norm

Let $f(\mathbf{x}) = \left\| \mathbf{x} \right\|^{2}$. Then the following holds.

$$ \nabla f(\mathbf{x}) = \dfrac{\partial \left\| \mathbf{x} \right\|^{2}}{\partial \mathbf{x}} = 2\mathbf{x} $$
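A corresponding numerical check, again with illustrative random data and step size:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(4)

f = lambda x: x @ x  # f(x) = ||x||^2 = x^T x

# The central-difference gradient should equal 2x.
h = 1e-6
grad = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(4)])
print(np.allclose(grad, 2 * x))  # True
```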

Quadratic Form

For an $n \times n$ matrix $\mathbf{R}$, let $f(\mathbf{x}) = \mathbf{x}^{T}\mathbf{R}\mathbf{x}$. Then the following holds.

$$ \dfrac{\partial f(\mathbf{x})}{\partial \mathbf{x}} = \dfrac{\partial (\mathbf{x}^{T}\mathbf{R}\mathbf{x})}{\partial \mathbf{x}} = (\mathbf{R} + \mathbf{R}^{T})\mathbf{x} $$
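Note that $\mathbf{R}$ need not be symmetric here. A minimal numerical check, with illustrative random data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
R = rng.standard_normal((n, n))  # a general, not necessarily symmetric, matrix
x = rng.standard_normal(n)

f = lambda x: x @ R @ x  # f(x) = x^T R x

# The central-difference gradient should equal (R + R^T) x.
h = 1e-6
grad = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(n)])
print(np.allclose(grad, (R + R.T) @ x))  # True
```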

Proof

Inner Product

$$ \begin{align*} \frac{ \partial f(\mathbf{x})}{ \partial \mathbf{x} } =\frac{ \partial (\mathbf{w}^{T}\mathbf{x})}{ \partial \mathbf{x} } &=\begin{bmatrix} \dfrac{ \partial \left( \sum _{i=1} ^{n} w_{i}x_{i}\right)}{ \partial x_{1} } & \dfrac{ \partial \left( \sum _{i=1} ^{n} w_{i}x_{i}\right)}{ \partial x_{2} } & \cdots & \dfrac{ \partial \left( \sum _{i=1} ^{n} w_{i}x_{i}\right)}{ \partial x_{n} } \end{bmatrix}^{T} \\ &= \begin{bmatrix} w_{1} & w_{2} & \cdots & w_{n} \end{bmatrix}^{T} \\ &= \mathbf{w} \end{align*} $$

Also, since $\mathbf{w}^{T}\mathbf{x} = \mathbf{x}^{T}\mathbf{w}$,

$$ \frac{ \partial \mathbf{x}^{T}\mathbf{w}}{ \partial \mathbf{x} } = \mathbf{w} $$

Norm

As in the proof for the inner product,

$$ \begin{align*} \frac{ \partial f(\mathbf{x})}{ \partial \mathbf{x} } = \frac{ \partial (\mathbf{x}^{T}\mathbf{x})}{ \partial \mathbf{x} } &=\begin{bmatrix} \dfrac{ \partial \left( \sum _{i=1} ^{n} x_{i}x_{i}\right)}{ \partial x_{1} } & \dfrac{ \partial \left( \sum _{i=1} ^{n} x_{i}x_{i}\right)}{ \partial x_{2} } & \cdots & \dfrac{ \partial \left( \sum _{i=1} ^{n} x_{i}x_{i}\right)}{ \partial x_{n} } \end{bmatrix}^{T} \\ &=\begin{bmatrix} \dfrac{ \partial \left( \sum _{i=1} ^{n} x_{i}^{2}\right)}{ \partial x_{1} } & \dfrac{ \partial \left( \sum _{i=1} ^{n} x_{i}^{2}\right)}{ \partial x_{2} } & \cdots & \dfrac{ \partial \left( \sum _{i=1} ^{n} x_{i}^{2}\right)}{ \partial x_{n} } \end{bmatrix}^{T} \\ &= \begin{bmatrix} 2x_{1} & 2x_{2} & \cdots & 2x_{n} \end{bmatrix}^{T} \\ &= 2\mathbf{x} \end{align*} $$

Quadratic Form

To make the differentiation easier, first expand $f(\mathbf{x})$. For a fixed $k\in\left\{1,\dots,n\right\}$, separating out the terms that contain $x_{k}$ gives the following.

$$ \begin{align*} f(\mathbf{x}) &= \mathbf{x}^{T}\mathbf{R}\mathbf{x} =\sum \limits _{j=1} ^{n} x_{j} \sum \limits _{i=1} ^{n}r_{ji}x_{i} \\ & = x_{k} \sum \limits _{i=1} ^{n}r_{ki}x_{i} +\sum \limits _{j\ne k}x_{j} \sum \limits _{i=1} ^{n}r_{ji}x_{i} \\ &= x_{k}\left(r_{kk}x_{k} + \sum \limits _{i\ne k} r_{ki}x_{i}\right) +\sum \limits _{j\ne k}x_{j} \left(r_{jk}x_{k}+ \sum \limits _{i \ne k} r_{ji}x_{i} \right) \\ &= x_{k}^{2}r_{kk} + x_{k}\sum \limits _{i\ne k} r_{ki}x_{i} + \sum \limits _{j\ne k}x_{j}r_{jk}x_{k}+ \sum \limits _{j\ne k}\sum \limits _{i \ne k}x_{j} r_{ji}x_{i} \end{align*} $$

Computing $\dfrac{ \partial f(\mathbf{x})}{ \partial x_{k} }$ then gives the following.

$$ \begin{align*} \frac{ \partial f(\mathbf{x})}{ \partial x_{k} }&=\frac{ \partial }{ \partial x_{k} } \left( x_{k}^{2}r_{kk} + x_{k}\sum \limits _{i\ne k} r_{ki}x_{i} + \sum \limits _{j\ne k}x_{j}r_{jk}x_{k}+ \sum \limits _{j\ne k}\sum \limits _{i \ne k}x_{j} r_{ji}x_{i} \right) \\ &=2x_{k}r_{kk} + \sum \limits _{i\ne k} r_{ki}x_{i} + \sum \limits _{j\ne k}x_{j}r_{jk} \\ &=\sum \limits _{i=1}^{n} r_{ki}x_{i} + \sum \limits _{j=1}^{n}r_{jk}x_{j} \end{align*} $$

Therefore, calculating the gradient of $f$ results in the following.

$$ \begin{align*} \frac{ \partial f(\mathbf{x})}{ \partial \mathbf{x} } = \dfrac{\partial \left( \mathbf{x}^{T}\mathbf{R}\mathbf{x} \right)}{\partial \mathbf{x}} &= \begin{bmatrix} \dfrac{ \partial f (\mathbf{x})}{ \partial x_{1} } & \dfrac{ \partial f (\mathbf{x})}{ \partial x_{2} } & \dots & \dfrac{ \partial f (\mathbf{x})}{ \partial x_{n} } \end{bmatrix}^{T} \\[1em] &= \begin{bmatrix} \sum _{i=1} ^{n} r_{1i}x_{i} + \sum _{j=1}^{n} r_{j1}x_{j} \\[0.5em] \sum _{i=1} ^{n} r_{2i}x_{i} + \sum _{j=1}^{n} r_{j2}x_{j} \\[0.5em] \vdots \\[0.5em] \sum _{i=1} ^{n} r_{ni}x_{i} + \sum _{j=1}^{n} r_{jn}x_{j} \end{bmatrix} \\[3.5em] &= \begin{bmatrix} \sum _{i=1} ^{n} r_{1i}x_{i} \\[0.5em] \sum _{i=1} ^{n} r_{2i}x_{i} \\[0.5em] \vdots \\[0.5em] \sum _{i=1} ^{n} r_{ni}x_{i} \end{bmatrix} + \begin{bmatrix} \sum _{j=1}^{n} r_{j1}x_{j} \\[0.5em] \sum _{j=1}^{n} r_{j2}x_{j} \\[0.5em] \vdots \\[0.5em] \sum _{j=1}^{n} r_{jn}x_{j} \end{bmatrix} \\[3.5em] &=\mathbf{R} \mathbf{x} + \mathbf{R}^{T}\mathbf{x}=\left( \mathbf{R}+\mathbf{R}^{T} \right)\mathbf{x} \end{align*} $$
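The componentwise formula for $\partial f / \partial x_{k}$ derived above can also be spot-checked for a single index; the random data and the choice of $k$ below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 4, 2  # index k fixed for illustration
R = rng.standard_normal((n, n))
x = rng.standard_normal(n)

f = lambda x: x @ R @ x

# Central difference in the k-th coordinate vs. the summation formula.
h = 1e-6
e = np.eye(n)[k]
diff = (f(x + h * e) - f(x - h * e)) / (2 * h)
formula = sum(R[k, i] * x[i] for i in range(n)) + sum(R[j, k] * x[j] for j in range(n))
print(np.isclose(diff, formula))  # True
```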

Matrix-Vector Multiplication

The last equality in the proof above follows from the componentwise form of matrix-vector multiplication: for $\mathbf{X} \in M_{n\times n}$ and $\mathbf{y} \in M_{n \times 1}$, the following holds.

$$ \mathbf{X}\mathbf{y} =\begin{bmatrix} \sum _{i=1} ^{n}x_{1i}y_{i} \\ \sum _{i=1} ^{n}x_{2i}y_{i} \\ \vdots \\ \sum _{i=1} ^{n}x_{ni}y_{i} \end{bmatrix},\qquad \mathbf{X}^{T}\mathbf{y} =\begin{bmatrix} \sum _{i=1} ^{n}x_{i1}y_{i} \\ \sum _{i=1} ^{n}x_{i2}y_{i} \\ \vdots \\ \sum _{i=1} ^{n}x_{in}y_{i} \end{bmatrix} $$
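These summation formulas can be confirmed directly in NumPy (the matrix, vector, and dimension are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
X = rng.standard_normal((n, n))
y = rng.standard_normal(n)

# Entries built directly from the summation formulas above.
Xy  = np.array([sum(X[k, i] * y[i] for i in range(n)) for k in range(n)])
XTy = np.array([sum(X[i, k] * y[i] for i in range(n)) for k in range(n)])

print(np.allclose(Xy, X @ y), np.allclose(XTy, X.T @ y))  # True True
```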

In particular, if $\mathbf{R}$ is a symmetric matrix, then $\mathbf{R} = \mathbf{R}^{T}$, and therefore

$$ \frac{ \partial }{ \partial \mathbf{x} }\left( \mathbf{x}^{T}\mathbf{R}\mathbf{x} \right)=2\mathbf{R}\mathbf{x} $$
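A numerical check of the symmetric case; the symmetrization and the random inputs below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
A = rng.standard_normal((n, n))
R = (A + A.T) / 2  # symmetrize so that R = R^T
x = rng.standard_normal(n)

f = lambda x: x @ R @ x

# The central-difference gradient should equal 2 R x.
h = 1e-6
grad = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(n)])
print(np.allclose(grad, 2 * R @ x))  # True
```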