

Derivatives of Vectors and Matrices

Gradient of a Scalar Function

The gradient of a scalar function $f : \mathbb{R}^{n} \to \mathbb{R}$ is defined as follows.

$$
\frac{ \partial f(\mathbf{x})}{ \partial \mathbf{x} } := \nabla f(\mathbf{x}) = \begin{bmatrix} \dfrac{ \partial f(\mathbf{x})}{ \partial x_{1} } & \dfrac{ \partial f(\mathbf{x})}{ \partial x_{2} } & \cdots & \dfrac{ \partial f(\mathbf{x})}{ \partial x_{n} } \end{bmatrix}^{T}
$$

Here, $\dfrac{ \partial f(\mathbf{x})}{ \partial x_{i} }$ is the partial derivative of $f$ with respect to $x_{i}$.
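As a supplementary check (not part of the original article), the definition above can be approximated numerically with central differences. The sketch below assumes NumPy, and the helper name `numerical_gradient` is purely illustrative.

```python
import numpy as np

def numerical_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient, one coordinate at a time."""
    return np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(x.size)])

# Example: f(x) = sin(x1) + x2 * x3 has gradient [cos(x1), x3, x2]^T
f = lambda x: np.sin(x[0]) + x[1] * x[2]
x = np.array([0.3, -1.2, 2.0])

print(numerical_gradient(f, x))              # approximate gradient
print(np.array([np.cos(x[0]), x[2], x[1]]))  # analytic gradient for comparison
```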

Inner Product

For a fixed $\mathbf{w} \in \mathbb{R}^{n}$, let $f(\mathbf{x}) = \mathbf{w}^{T}\mathbf{x}$. Then the following holds.

$$
\frac{ \partial f(\mathbf{x})}{ \partial \mathbf{x} } = \frac{ \partial (\mathbf{w}^{T}\mathbf{x})}{ \partial \mathbf{x} } = \frac{ \partial (\mathbf{x}^{T}\mathbf{w})}{ \partial \mathbf{x} } = \mathbf{w}
$$
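A minimal NumPy sketch, assuming random test vectors, that compares a central-difference gradient of $f(\mathbf{x}) = \mathbf{w}^{T}\mathbf{x}$ against $\mathbf{w}$; all variable names and tolerances here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
w = rng.standard_normal(n)
x = rng.standard_normal(n)

f = lambda v: w @ v                      # f(x) = w^T x
h = 1e-6
num_grad = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(n)])

print(np.allclose(num_grad, w, atol=1e-6))   # expected: True, the gradient equals w
```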

Norm

Let $f(\mathbf{x}) = \left\| \mathbf{x} \right\|^{2} = \mathbf{x}^{T}\mathbf{x}$. Then the following holds.

$$
\nabla f(\mathbf{x}) = \dfrac{\partial \left\| \mathbf{x} \right\|^{2}}{\partial \mathbf{x}} = 2\mathbf{x}
$$
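The same kind of numerical check (again a sketch, assuming NumPy and arbitrary test data) confirms that the gradient of the squared norm is $2\mathbf{x}$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
x = rng.standard_normal(n)

f = lambda v: v @ v                      # f(x) = ||x||^2 = x^T x
h = 1e-6
num_grad = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(n)])

print(np.allclose(num_grad, 2 * x, atol=1e-6))   # expected: True, the gradient equals 2x
```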

Quadratic Form

For an $n \times n$ matrix $\mathbf{R}$, let $f(\mathbf{x}) = \mathbf{x}^{T}\mathbf{R}\mathbf{x}$. Then the following holds.

$$
\dfrac{\partial f(\mathbf{x})}{\partial \mathbf{x}} = \dfrac{\partial (\mathbf{x}^{T}\mathbf{R}\mathbf{x})}{\partial \mathbf{x}} = (\mathbf{R} + \mathbf{R}^{T})\mathbf{x}
$$
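A quick NumPy sketch (illustrative names, random and not necessarily symmetric $\mathbf{R}$) that checks the quadratic-form gradient against $(\mathbf{R} + \mathbf{R}^{T})\mathbf{x}$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
R = rng.standard_normal((n, n))          # a general (not necessarily symmetric) matrix
x = rng.standard_normal(n)

f = lambda v: v @ R @ v                  # f(x) = x^T R x
h = 1e-6
num_grad = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(n)])

print(np.allclose(num_grad, (R + R.T) @ x, atol=1e-5))   # expected: True
```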

Proof

Inner Product

$$
\begin{align*} \frac{ \partial f(\mathbf{x})}{ \partial \mathbf{x} } = \frac{ \partial (\mathbf{w}^{T}\mathbf{x})}{ \partial \mathbf{x} } &= \begin{bmatrix} \dfrac{ \partial \left( \sum _{i=1} ^{n} w_{i}x_{i}\right)}{ \partial x_{1} } & \dfrac{ \partial \left( \sum _{i=1} ^{n} w_{i}x_{i}\right)}{ \partial x_{2} } & \cdots & \dfrac{ \partial \left( \sum _{i=1} ^{n} w_{i}x_{i}\right)}{ \partial x_{n} } \end{bmatrix}^{T} \\ &= \begin{bmatrix} w_{1} & w_{2} & \cdots & w_{n} \end{bmatrix}^{T} \\ &= \mathbf{w} \end{align*}
$$

Also, since $\mathbf{w}^{T}\mathbf{x} = \mathbf{x}^{T}\mathbf{w}$,

$$
\frac{ \partial \mathbf{x}^{T}\mathbf{w}}{ \partial \mathbf{x} } = \mathbf{w}
$$

Norm

Similar to the proof for the inner product,

$$
\begin{align*} \frac{ \partial f(\mathbf{x})}{ \partial \mathbf{x} } = \frac{ \partial (\mathbf{x}^{T}\mathbf{x})}{ \partial \mathbf{x} } &= \begin{bmatrix} \dfrac{ \partial \left( \sum _{i=1} ^{n} x_{i}x_{i}\right)}{ \partial x_{1} } & \dfrac{ \partial \left( \sum _{i=1} ^{n} x_{i}x_{i}\right)}{ \partial x_{2} } & \cdots & \dfrac{ \partial \left( \sum _{i=1} ^{n} x_{i}x_{i}\right)}{ \partial x_{n} } \end{bmatrix}^{T} \\ &= \begin{bmatrix} \dfrac{ \partial \left( \sum _{i=1} ^{n} x_{i}^{2}\right)}{ \partial x_{1} } & \dfrac{ \partial \left( \sum _{i=1} ^{n} x_{i}^{2}\right)}{ \partial x_{2} } & \cdots & \dfrac{ \partial \left( \sum _{i=1} ^{n} x_{i}^{2}\right)}{ \partial x_{n} } \end{bmatrix}^{T} \\ &= \begin{bmatrix} 2x_{1} & 2x_{2} & \cdots & 2x_{n} \end{bmatrix}^{T} \\ &= 2\mathbf{x} \end{align*}
$$

Quadratic Form

To make the differentiation easier, we first expand $f(\mathbf{x})$. For any given $k \in \left\{ 1, \dots, n \right\}$, the following holds.

$$
\begin{align*} f(\mathbf{x}) &= \mathbf{x}^{T}\mathbf{R}\mathbf{x} = \sum \limits _{j=1} ^{n} x_{j} \sum \limits _{i=1} ^{n}r_{ji}x_{i} \\ &= x_{k} \sum \limits _{i=1} ^{n}r_{ki}x_{i} + \sum \limits _{j\ne k}x_{j} \sum \limits _{i=1} ^{n}r_{ji}x_{i} \\ &= x_{k}\left(r_{kk}x_{k} + \sum \limits _{i\ne k} r_{ki}x_{i}\right) + \sum \limits _{j\ne k}x_{j} \left(r_{jk}x_{k} + \sum \limits _{i \ne k} r_{ji}x_{i} \right) \\ &= x_{k}^{2}r_{kk} + x_{k}\sum \limits _{i\ne k} r_{ki}x_{i} + \sum \limits _{j\ne k}x_{j}r_{jk}x_{k} + \sum \limits _{j\ne k}\sum \limits _{i \ne k}x_{j} r_{ji}x_{i} \end{align*}
$$

Computing $\dfrac{ \partial f(\mathbf{x})}{ \partial x_{k} }$ then gives the following.

$$
\begin{align*} \frac{ \partial f(\mathbf{x})}{ \partial x_{k} } &= \frac{ \partial }{ \partial x_{k} } \left( x_{k}^{2}r_{kk} + x_{k}\sum \limits _{i\ne k} r_{ki}x_{i} + \sum \limits _{j\ne k}x_{j}r_{jk}x_{k} + \sum \limits _{j\ne k}\sum \limits _{i \ne k}x_{j} r_{ji}x_{i} \right) \\ &= 2x_{k}r_{kk} + \sum \limits _{i\ne k} r_{ki}x_{i} + \sum \limits _{j\ne k}x_{j}r_{jk} \\ &= \sum \limits _{i=1}^{n} r_{ki}x_{i} + \sum \limits _{j=1}^{n}r_{jk}x_{j} \end{align*}
$$

Therefore, computing the gradient of $f$ gives the following.

$$
\begin{align*} \frac{ \partial f(\mathbf{x})}{ \partial \mathbf{x} } = \dfrac{\partial \left( \mathbf{x}^{T}\mathbf{R}\mathbf{x} \right)}{\partial \mathbf{x}} &= \begin{bmatrix} \dfrac{ \partial f (\mathbf{x})}{ \partial x_{1} } & \dfrac{ \partial f (\mathbf{x})}{ \partial x_{2} } & \dots & \dfrac{ \partial f (\mathbf{x})}{ \partial x_{n} } \end{bmatrix}^{T} \\[1em] &= \begin{bmatrix} \sum _{i=1} ^{n} r_{1i}x_{i} + \sum _{j=1}^{n} r_{j1}x_{j} \\[0.5em] \sum _{i=1} ^{n} r_{2i}x_{i} + \sum _{j=1}^{n} r_{j2}x_{j} \\[0.5em] \vdots \\[0.5em] \sum _{i=1} ^{n} r_{ni}x_{i} + \sum _{j=1}^{n} r_{jn}x_{j} \end{bmatrix} \\[3.5em] &= \begin{bmatrix} \sum _{i=1} ^{n} r_{1i}x_{i} \\[0.5em] \sum _{i=1} ^{n} r_{2i}x_{i} \\[0.5em] \vdots \\[0.5em] \sum _{i=1} ^{n} r_{ni}x_{i} \end{bmatrix} + \begin{bmatrix} \sum _{j=1}^{n} r_{j1}x_{j} \\[0.5em] \sum _{j=1}^{n} r_{j2}x_{j} \\[0.5em] \vdots \\[0.5em] \sum _{j=1}^{n} r_{jn}x_{j} \end{bmatrix} \\[3.5em] &= \mathbf{R} \mathbf{x} + \mathbf{R}^{T}\mathbf{x} = \left( \mathbf{R}+\mathbf{R}^{T} \right)\mathbf{x} \end{align*}
$$
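As a concrete illustration of the result (a supplementary example, not part of the original proof), take $n = 2$:

$$
\mathbf{x}^{T}\mathbf{R}\mathbf{x}
= \begin{bmatrix} x_{1} & x_{2} \end{bmatrix}
\begin{bmatrix} r_{11} & r_{12} \\ r_{21} & r_{22} \end{bmatrix}
\begin{bmatrix} x_{1} \\ x_{2} \end{bmatrix}
= r_{11}x_{1}^{2} + (r_{12} + r_{21})x_{1}x_{2} + r_{22}x_{2}^{2}
$$

so that

$$
\frac{\partial f(\mathbf{x})}{\partial \mathbf{x}}
= \begin{bmatrix} 2r_{11}x_{1} + (r_{12} + r_{21})x_{2} \\ (r_{12} + r_{21})x_{1} + 2r_{22}x_{2} \end{bmatrix}
= \left( \mathbf{R} + \mathbf{R}^{T} \right)\mathbf{x}
$$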

Matrix-Vector Multiplication

For $\mathbf{X} \in M_{n\times n}$ and $\mathbf{y} \in M_{n \times 1}$, the matrix-vector products used in the last equality above are

$$
\mathbf{X}\mathbf{y} = \begin{bmatrix} \sum _{i=1} ^{n}x_{1i}y_{i} \\ \sum _{i=1} ^{n}x_{2i}y_{i} \\ \vdots \\ \sum _{i=1} ^{n}x_{ni}y_{i} \end{bmatrix},\qquad \mathbf{X}^{T}\mathbf{y} = \begin{bmatrix} \sum _{i=1} ^{n}x_{i1}y_{i} \\ \sum _{i=1} ^{n}x_{i2}y_{i} \\ \vdots \\ \sum _{i=1} ^{n}x_{in}y_{i} \end{bmatrix}
$$

If $\mathbf{R}$ is a symmetric matrix, then $\mathbf{R} = \mathbf{R}^{T}$, and therefore

$$
\frac{ \partial }{ \partial \mathbf{x} }\left( \mathbf{x}^{T}\mathbf{R}\mathbf{x} \right) = 2\mathbf{R}\mathbf{x}
$$
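A final sketch (assuming NumPy, with an illustrative way of symmetrizing a random matrix) that checks the symmetric case numerically.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n))
R = (A + A.T) / 2                        # force R to be symmetric
x = rng.standard_normal(n)

f = lambda v: v @ R @ v                  # f(x) = x^T R x
h = 1e-6
num_grad = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(n)])

print(np.allclose(num_grad, 2 * R @ x, atol=1e-5))   # expected: True in the symmetric case
```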