Proving the Invariance of the Laplace Equation with Respect to Orthogonal Transformations

Theorem¹

Suppose $u$ satisfies the Laplace equation, and define $v(x)$ as follows.

$$ v(x) := u(Rx) $$

Here, $R$ is a rotation transformation. Then $v(x)$ also satisfies the Laplace equation.

$$ \Delta v = 0 $$

Explanation

In fact, the statement above holds for every orthogonal transformation. Thus, the invariance of the Laplace equation under rotations is a special case of its invariance under orthogonal transformations.
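As a quick sanity check, the following is a minimal sympy sketch of the two-dimensional rotation case; the harmonic function $u(x_{1}, x_{2}) = e^{x_{1}}\sin x_{2}$ and the symbolic angle $\theta$ are arbitrary choices made only for this illustration.

```python
import sympy as sp

x1, x2, theta = sp.symbols('x1 x2 theta', real=True)

def u(a, b):
    # An arbitrary harmonic function: u_aa + u_bb = exp(a)*sin(b) - exp(a)*sin(b) = 0
    return sp.exp(a) * sp.sin(b)

# Rotation matrix R(theta)
R = sp.Matrix([[sp.cos(theta), -sp.sin(theta)],
               [sp.sin(theta),  sp.cos(theta)]])

# v(x) = u(Rx)
y = R * sp.Matrix([x1, x2])
v = u(y[0], y[1])

# The Laplacian of v simplifies to 0, as the theorem predicts
laplacian_v = sp.diff(v, x1, 2) + sp.diff(v, x2, 2)
print(sp.simplify(laplacian_v))  # 0
```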

Proof

Assume that $u$ satisfies the Laplace equation and let $O$ be an arbitrary orthogonal transformation. Then the goal of the proof is the following.

$$ v(x) = u(Ox) \implies \Delta v = 0 $$

Specifically, write $O$ as follows.

$$ O = [o_{ij}] = \begin{pmatrix} o_{11} & o_{12} & \cdots & o_{1n} \\ o_{21} & o_{22} & \cdots & o_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ o_{n1} & o_{n2} & \cdots & o_{nn} \end{pmatrix} $$

Then, the following holds.

$$ Ox = \begin{pmatrix} o_{11} & o_{12} & \cdots & o_{1n} \\ o_{21} & o_{22} & \cdots & o_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ o_{n1} & o_{n2} & \cdots & o_{nn} \end{pmatrix} \begin{pmatrix} x_{1} \\ x_{2} \\ \vdots \\ x_{n} \end{pmatrix} = \begin{pmatrix} o_{11}x_{1} + o_{12}x_{2} + \cdots + o_{1n}x_{n} \\ o_{21}x_{1} + o_{22}x_{2} + \cdots + o_{2n}x_{n} \\ \vdots \\ o_{n1}x_{1} + o_{n2}x_{2} + \cdots + o_{nn}x_{n} \end{pmatrix} $$

Now set $Ox = y$, that is,

$$ Ox = \begin{pmatrix} o_{11}x_{1} + o_{12}x_{2} + \cdots + o_{1n}x_{n} \\ o_{21}x_{1} + o_{22}x_{2} + \cdots + o_{2n}x_{n} \\ \vdots \\ o_{n1}x_{1} + o_{n2}x_{2} + \cdots + o_{nn}x_{n} \end{pmatrix} = \begin{pmatrix} y_{1} \\ y_{2} \\ \vdots \\ y_{n} \end{pmatrix} = y $$

Then the definition $v(x) = u(Ox)$ is the same as the following equation.

$$ v(x) = u(y) $$

Computing the total derivative of $v$ gives the following.

$$ \begin{align*} dv &= \dfrac{\partial u}{\partial y_{1}}dy_{1} + \dfrac{\partial u}{\partial y_{2}}dy_{2} + \cdots + \dfrac{\partial u}{\partial y_{n}}dy_{n} \\ &= u_{y_{1}}dy_{1} + u_{y_{2}}dy_{2} + \cdots + u_{y_{n}}dy_{n} \end{align*} $$

Since $y_{j} = o_{j1}x_{1} + o_{j2}x_{2} + \cdots + o_{jn}x_{n}$, we have $\dfrac{\partial y_{j}}{\partial x_{i}} = o_{ji}$. Therefore, $\dfrac{\partial v}{\partial x_{i}} = v_{x_{i}}$ is as follows.

$$ v_{x_{i}} = u_{y_{1}}o_{1i} + u_{y_{2}}o_{2i} + \cdots + u_{y_{n}}o_{ni} = \sum \limits_{j=1}^{n} u_{y_{j}}o_{ji} $$

Differentiating once more with respect to $x_{i}$ in the same manner, we get the following.

$$ v_{x_{i}x_{i}} = \sum \limits_{k=1}^{n} \sum \limits_{j=1}^{n} u_{y_{j}y_{k}}o_{ji}o_{ki} $$
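In more detail, this follows by applying the chain rule once more to each $u_{y_{j}}$ in $v_{x_{i}} = \sum_{j=1}^{n} u_{y_{j}}o_{ji}$, again using $\dfrac{\partial y_{k}}{\partial x_{i}} = o_{ki}$.

$$ v_{x_{i}x_{i}} = \dfrac{\partial}{\partial x_{i}} \left( \sum \limits_{j=1}^{n} u_{y_{j}}o_{ji} \right) = \sum \limits_{j=1}^{n} \left( \sum \limits_{k=1}^{n} u_{y_{j}y_{k}} \dfrac{\partial y_{k}}{\partial x_{i}} \right) o_{ji} = \sum \limits_{k=1}^{n} \sum \limits_{j=1}^{n} u_{y_{j}y_{k}}o_{ji}o_{ki} $$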

Here, since $O$ is an orthogonal matrix, we have $OO^{T} = I$, and thus the following equation holds.

$$ \sum \limits_{i=1}^{n} o_{ji}o_{ki} = \delta_{jk} $$
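Numerically, this identity is just the entrywise statement of $OO^{T} = I$. The short sketch below checks it for a random orthogonal matrix obtained from a QR decomposition; the size $4$ and the random seed are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random orthogonal matrix via QR decomposition (illustrative choice)
O, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# sum_i o_{ji} o_{ki} is the (j, k) entry of O O^T, which should be delta_{jk}
print(np.allclose(O @ O.T, np.eye(4)))  # True
```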

Therefore, we obtain the following result.

$$ \begin{align*} \Delta v = \sum_{i=1}^{n} v_{x_{i}x_{i}} &= \sum \limits_{i=1}^{n} \sum \limits_{k=1}^{n} \sum \limits_{j=1}^{n} u_{y_{j}y_{k}}o_{ji}o_{ki} \\ &= \sum \limits_{k=1}^{n} \sum \limits_{j=1}^{n} u_{y_{j}y_{k}}\delta_{jk} \\ &= \sum \limits_{j=1}^{n} u_{y_{j}y_{j}} \\ &= \Delta u = 0 \end{align*} $$
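Since the proof only uses orthogonality, the conclusion also covers orthogonal transformations that are not rotations. The sympy sketch below checks a reflection; the harmonic function $u = x_{1}^{2} + x_{2}^{2} - 2x_{3}^{2}$ and the Householder reflection across the plane with normal $(1, 1, 1)$ are arbitrary illustrative choices.

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3', real=True)

def u(a, b, c):
    # An arbitrary harmonic function: Delta u = 2 + 2 - 4 = 0
    return a**2 + b**2 - 2*c**2

# Householder reflection across the plane with normal (1, 1, 1):
# orthogonal with determinant -1, so it is not a rotation
w = sp.Matrix([1, 1, 1])
O = sp.eye(3) - 2 * (w * w.T) / (w.T * w)[0]
print(O * O.T == sp.eye(3), O.det())  # True -1

# v(x) = u(Ox) is again harmonic
y = O * sp.Matrix([x1, x2, x3])
v = u(y[0], y[1], y[2])
laplacian_v = sum(sp.diff(v, xi, 2) for xi in (x1, x2, x3))
print(sp.simplify(laplacian_v))  # 0
```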


1. Lawrence C. Evans, Partial Differential Equations (2nd Edition, 2010), p. 85 (Problem 2) ↩︎