Proving the Invariance of the Laplace Equation with Respect to Orthogonal Transformations 📂Partial Differential Equations

Theorem1

Suppose $u$ satisfies the Laplace equation, and define $v$ by

$$ v(x) :=u(Rx) $$

where $R$ is a rotation transformation. Then $v$ also satisfies the Laplace equation.

$$ \Delta v=0 $$

Explanation

In fact, the statement above holds for every orthogonal transformation, so the invariance of the Laplace equation under rotations is a special case of its invariance under orthogonal transformations.
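Before the formal proof, the claim can be checked numerically. The sketch below (not from the source) takes the harmonic function $u(x_{1},x_{2})=x_{1}^{2}-x_{2}^{2}$ and a rotation $R$ by $30^{\circ}$, then estimates the Laplacian of $v(x)=u(Rx)$ at a sample point with central finite differences.

```python
import numpy as np

def u(p):
    # A harmonic function: u_{x1 x1} + u_{x2 x2} = 2 - 2 = 0
    return p[0]**2 - p[1]**2

# Rotation by 30 degrees (an orthogonal matrix)
theta = np.pi / 6
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def v(p):
    return u(R @ p)

def laplacian(f, p, h=1e-4):
    # Central second differences in each coordinate direction
    total = 0.0
    for i in range(len(p)):
        e = np.zeros(len(p))
        e[i] = h
        total += (f(p + e) - 2 * f(p) + f(p - e)) / h**2
    return total

x0 = np.array([0.7, -1.3])
print(abs(laplacian(v, x0)))  # close to 0
```

The point `x0` is arbitrary; the Laplacian of `v` vanishes everywhere.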

Proof

Assume that $u$ satisfies the Laplace equation, and let $O$ be an arbitrary orthogonal transformation. The goal is to show the following.

$$ v(x)=u(Ox)\ \implies \Delta v=0 $$

Write $O$ in components as follows.

$$ O=[o_{ij}]=\begin{pmatrix} o_{11} & o_{12} & \cdots &o_{1n} \\ o_{21} & o_{22} & \cdots & o_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ o_{n1} & o_{n2} & \cdots & o_{nn} \end{pmatrix} $$

Then, the following holds.

$$ Ox=\begin{pmatrix} o_{11} & o_{12} & \cdots &o_{1n} \\ o_{21} & o_{22} & \cdots & o_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ o_{n1} & o_{n2} & \cdots & o_{nn} \end{pmatrix} \begin{pmatrix} x_{1} \\ x_{2} \\ \vdots \\ x_{n} \end{pmatrix} =\begin{pmatrix} o_{11}x_{1} + o_{12}x_{2} + \cdots +o_{1n}x_{n} \\ o_{21}x_{1}+ o_{22}x_{2}+ \cdots + o_{2n}x_{n} \\ \vdots \\ o_{n1}x_{1}+ o_{n2}x_{2}+ \cdots +o_{nn}x_{n} \end{pmatrix} $$

Setting $Ox=y$, we label the components as follows.

$$ Ox=\begin{pmatrix} o_{11}x_{1} + o_{12}x_{2} + \cdots +o_{1n}x_{n} \\ o_{21}x_{1}+ o_{22}x_{2}+ \cdots + o_{2n}x_{n} \\ \vdots \\ o_{n1}x_{1}+ o_{n2}x_{2}+ \cdots +o_{nn}x_{n} \end{pmatrix} =\begin{pmatrix} y_{1} \\ y_{2} \\ \vdots \\ y_{n} \end{pmatrix}=y $$

In this notation, the definition of $v$ reads as follows.

$$ v(x)=u(y) $$

Calculating the total derivative of $v$, we get the following.

$$ \begin{align*} dv &=\dfrac{\partial u}{\partial y_{1}}dy_{1}+\dfrac{\partial u}{\partial y_{2}}dy_{2}+\cdots + \dfrac{\partial u}{\partial y_{n}}dy_{n} \\ &= u_{y_{1}}dy_{1} + u_{y_{2}}dy_{2} + \cdots +u_{y_{n}}dy_{n} \end{align*} $$

Since $y_{j}=\sum\limits_{i=1}^{n} o_{ji}x_{i}$ implies $\dfrac{\partial y_{j}}{\partial x_{i}}=o_{ji}$, the chain rule gives $\dfrac{\partial v}{\partial x_{i}}=v_{x_{i}}$ as follows.

$$ v_{x_{i}} = u_{y_{1}}o_{1i}+u_{y_{2}}o_{2i}+\cdots + u_{y_{n}}o_{ni}=\sum \limits_{j=1}^{n} u_{y_{j}}o_{ji} $$

Differentiating once more with respect to $x_{i}$ in the same manner, we get the following.

$$ v_{x_{i}x_{i}}=\sum \limits_{k=1}^{n} \sum \limits_{j=1}^{n} u_{y_{j}y_{k}}o_{ji}o_{ki} $$

Now, since $O$ is an orthogonal matrix, we have $OO^{T}=I$, which in components reads as follows.

$$ \sum \limits_{i=1}^{n} o_{ji}o_{ki}=\delta_{jk} $$
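This identity is easy to check numerically; the sketch below (an illustration, not from the source) builds a random orthogonal matrix via QR decomposition and compares its row inner products $\sum_{i} o_{ji}o_{ki}$, i.e. the entries of $OO^{T}$, against the identity.

```python
import numpy as np

rng = np.random.default_rng(0)

# QR decomposition of a random matrix yields an orthogonal factor Q
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Entry (j, k) of Q @ Q.T is sum_i o_{ji} o_{ki}, which should be delta_{jk}
gram = Q @ Q.T
print(np.allclose(gram, np.eye(4)))  # True
```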

Therefore, we obtain the following result.

$$ \begin{align*} \Delta v=\sum_{i=1}^{n} v_{x_{i}x_{i}} &= \sum \limits_{i=1}^{n}\sum \limits_{k=1}^{n} \sum \limits_{j=1}^{n} u_{y_{j}y_{k}}o_{ji}o_{ki} \\ &= \sum \limits_{k=1}^{n} \sum \limits_{j=1}^{n} u_{y_{j}y_{k}}\delta_{jk} \\ &= \sum \limits_{j=1}^{n} u_{y_{j}y_{j}} \\ &= \Delta u=0 \end{align*} $$
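The whole computation can also be replayed symbolically for $n=2$. The sketch below (an illustration, not from the source) takes the harmonic function $u = x_{1}^{2} - x_{2}^{2}$ and a rotation by a symbolic angle $t$, and confirms that the Laplacian of $v(x)=u(Ox)$ is exactly zero.

```python
import sympy as sp

x1, x2, t = sp.symbols('x1 x2 t', real=True)

# Rotation by a symbolic angle t (orthogonal for every t)
O = sp.Matrix([[sp.cos(t), -sp.sin(t)],
               [sp.sin(t),  sp.cos(t)]])
y = O * sp.Matrix([x1, x2])

# A harmonic function of two arguments, composed with the rotation
u = lambda a, b: a**2 - b**2
v = u(y[0], y[1])

# Laplacian of v in the x-coordinates
lap_v = sp.diff(v, x1, 2) + sp.diff(v, x2, 2)
print(sp.simplify(lap_v))  # 0
```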


  1. Lawrence C. Evans, Partial Differential Equations (2nd Edition, 2010), p. 85 (Problem 2) ↩︎