
Graph Laplacian

Definition

Let $G$ be a graph with degree matrix $D$ and adjacency matrix $A$. Then, the graph Laplacian $L$ of $G$ is defined as follows:

$$ L := D - A $$
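The definition is immediate to compute. As a sketch, take a path graph on four vertices (the graph here is only an illustrative choice):

```python
import numpy as np

# Adjacency matrix of an undirected path graph on 4 vertices
# (an illustrative choice, not anything from the definition itself).
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
])

# Degree matrix: diagonal matrix of row sums of A.
D = np.diag(A.sum(axis=1))

# Graph Laplacian L = D - A.
L = D - A
print(L)
```

Note that every row of $L$ sums to zero, since each diagonal entry $D_{ii}$ equals the number of $1$s in row $i$ of $A$.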

Explanation

Unlike the Laplacian $\Delta = \sum\limits_{i} \frac{\partial^{2}}{\partial x_{i}^{2}}$ defined for functions on Euclidean space, the graph Laplacian depends on the given graph.

For the graph $G = (V, E)$, let $f : V \to \mathbb{R}$ be a node signal and represent it as the vector $\mathbf{f} = \begin{bmatrix} f(v_{1}) & \cdots & f(v_{n}) \end{bmatrix}^{\mathsf{T}}$. Then, each component of $L\mathbf{f}$ indicates how much the value at that vertex differs from the values at its adjacent vertices.
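Concretely, $(L\mathbf{f})_{i} = \deg(v_{i}) f(v_{i}) - \sum_{j : v_{j} \sim v_{i}} f(v_{j})$. A small sketch, again on a path graph of four vertices with a signal chosen arbitrarily for illustration:

```python
import numpy as np

# Path graph on 4 vertices (illustrative choice).
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
])
D = np.diag(A.sum(axis=1))
L = D - A

# A node signal f: one arbitrary value per vertex.
f = np.array([1.0, 3.0, 2.0, 5.0])

# (L f)_i = deg(v_i) * f(v_i) - sum of f over the neighbors of v_i.
Lf = L @ f
print(Lf)  # -> [-2.  3. -4.  3.]
```

For instance, vertex $v_{2}$ has degree $2$ and neighbors with values $1$ and $2$, so $(L\mathbf{f})_{2} = 2 \cdot 3 - (1 + 2) = 3$.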

Derivation

For a multivariable function $g : \mathbb{R}^{n} \to \mathbb{R}$, the Laplacian is defined as follows:

$$ \nabla^{2} g = \Delta g = \dfrac{\partial^{2} g}{\partial x_{1}^{2}} + \cdots + \dfrac{\partial^{2} g}{\partial x_{n}^{2}} $$

The derivative of a function $f : \mathbb{R} \to \mathbb{R}$ defined on the real line can be numerically approximated by finite differences:

$$ f^{\prime}(x) \approx \dfrac{f(x+h) - f(x)}{h} $$

$$ f^{\prime \prime}(x) \approx \dfrac{f(x+h) - 2f(x) + f(x-h)}{h^{2}} \tag{1} $$

Let's consider the above equation on a graph. If we view $x$ as a vertex, then $x+h$ and $x-h$ can be viewed as the vertices connected to $x$, and the coefficient $2$ in front of $f(x)$ counts those two adjacent points, i.e., the degree of $x$. From the graph perspective, $(1)$ therefore sums the values at the adjacent vertices and subtracts the value at the reference vertex multiplied by its degree. Writing this in matrix form yields $A - D$, but reversing the sign yields a positive semidefinite matrix, so the Laplacian is defined as follows:

$$ L := D - A $$
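This correspondence can be checked numerically: on a path graph, every interior vertex has exactly two neighbors, playing the roles of $x+h$ and $x-h$ with $h = 1$, so $-L\mathbf{f}$ reproduces the numerator of $(1)$. A sketch (the path graph and the test function $f(x) = x^{2}$ are illustrative choices):

```python
import numpy as np

# Path graph on n vertices: each interior vertex has exactly two
# neighbors, the graph analogue of x+h and x-h with h = 1.
n = 6
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
L = np.diag(A.sum(axis=1)) - A

# Sample f(x) = x^2 at the vertices; its exact second derivative is 2.
x = np.arange(n, dtype=float)
f = x ** 2

# At interior vertices, (-L f)_i = f(i+1) - 2 f(i) + f(i-1),
# the finite-difference formula (1) with h = 1.
print((-L @ f)[1:-1])  # -> [2. 2. 2. 2.]
```

So $-L$ acts as a discrete second derivative, which is why the sign is flipped in the definition $L := D - A$.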

Properties

  1. The graph Laplacian $L$ is a positive semidefinite matrix.

Proof

1.

Let $D = [D_{ij}]$ and $A = [A_{ij}]$ be the degree matrix and adjacency matrix, respectively, and let $\mathbf{x} \in \mathbb{R}^{n}$ be arbitrary. Then:

$$ \begin{align*} \mathbf{x}^{\mathsf{T}} L \mathbf{x} &= \mathbf{x}^{\mathsf{T}} (D - A) \mathbf{x} \\ &= \mathbf{x}^{\mathsf{T}} D \mathbf{x} - \mathbf{x}^{\mathsf{T}} A \mathbf{x} \\ &= \sum\limits_{i=1}^{n} D_{ii} x_{i}^{2} - \sum\limits_{i,j = 1}^{n} A_{ij} x_{i} x_{j} \\ &= \sum\limits_{i=1}^{n} \left( \sum\limits_{j=1}^{n} A_{ij} \right) x_{i}^{2} - \sum\limits_{i,j = 1}^{n} A_{ij} x_{i} x_{j} \\ \end{align*} $$

Here, $i$ and $j$ are dummy indices, and since $A$ is a symmetric matrix, $\sum_{i,j} A_{ij} x_{i}^{2} = \sum_{i,j} A_{ij} x_{j}^{2}$ holds. Therefore, we obtain the following:

$$ \begin{align*} \mathbf{x}^{\mathsf{T}} L \mathbf{x} &= \sum\limits_{i,j = 1}^{n} A_{ij} \left( x_{i}^{2} - x_{i} x_{j} \right)\\ &= \dfrac{1}{2}\sum\limits_{i,j = 1}^{n} A_{ij} \left( 2x_{i}^{2} - 2x_{i} x_{j} \right)\\ &= \dfrac{1}{2}\sum\limits_{i,j = 1}^{n} A_{ij} \left( x_{i}^{2} + x_{j}^{2} - 2x_{i} x_{j} \right)\\ &= \dfrac{1}{2}\sum\limits_{i,j = 1}^{n} A_{ij} \left( x_{i} - x_{j} \right)^{2} \end{align*} $$

Since $A_{ij} \ge 0$, we have $\mathbf{x}^{\mathsf{T}} L \mathbf{x} \ge 0$ for every $\mathbf{x}$, so $L$ is a positive semidefinite matrix. It is not positive definite in general: for the constant vector $\mathbf{x} = \mathbf{1}$, every term $(x_{i} - x_{j})^{2}$ vanishes and $\mathbf{x}^{\mathsf{T}} L \mathbf{x} = 0$.
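The quadratic-form identity above is easy to verify numerically. A sketch on a random graph (the random construction here is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric 0/1 adjacency matrix with zero diagonal.
n = 5
U = np.triu(rng.integers(0, 2, size=(n, n)), 1)
A = U + U.T
L = np.diag(A.sum(axis=1)) - A

x = rng.standard_normal(n)

# Quadratic form vs. the weighted sum of squared differences.
quad = x @ L @ x
diffs = 0.5 * sum(A[i, j] * (x[i] - x[j]) ** 2
                  for i in range(n) for j in range(n))
print(np.isclose(quad, diffs))

# The constant vector witnesses semidefiniteness only:
# x^T L x = 0 even though x is nonzero.
ones = np.ones(n)
print(ones @ L @ ones)
```

Since every summand $A_{ij}(x_{i} - x_{j})^{2}$ is nonnegative, `quad` is nonnegative for any choice of `x`, while the constant vector always gives exactly zero.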