
Hard Thresholding and Soft Thresholding as Functions

Definition 1

A threshold $\lambda \in \mathbb{R}$ is given.

Hard Thresholding

We define Hard Thresholding $\eta _{H} \left( x ; \lambda \right) : \mathbb{R} \to \mathbb{R}$ as follows: $$ \begin{align*} \eta _{H} \left( x ; \lambda \right) =& x \cdot \mathbf{1}_{\left\{ \left| x \right| \ge \lambda \right\}} \\ =& \begin{cases} 0 & , \text{if } \left| x \right| < \lambda \\ x & , \text{if } \left| x \right| \ge \lambda \end{cases} \end{align*} $$ Here, $\mathbf{1}$ is an indicator function.
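
As a concrete illustration, a minimal NumPy sketch of this definition might look like the following (the name `hard_threshold` is just an illustrative choice, not from the source):

```python
import numpy as np

def hard_threshold(x, lam):
    """Hard thresholding: keep entries with |x| >= lam, zero out the rest."""
    x = np.asarray(x, dtype=float)
    return x * (np.abs(x) >= lam)
```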

Soft Thresholding

We define Soft Thresholding $\eta _{S} \left( x ; \lambda \right) : \mathbb{R} \to \mathbb{R}$ as follows: $$ \begin{align*} \eta _{S} \left( x ; \lambda \right) =& \operatorname{sign} (x) \cdot \operatorname{ReLU} \left( \left| x \right| - \lambda \right) \\ =& \begin{cases} \lambda - \left| x \right| & , \text{if } x < - \lambda \\ 0 & , \text{if } x \in [-\lambda, \lambda] \\ \left| x \right| - \lambda & , \text{if } x > \lambda \end{cases} \end{align*} $$ Here, $\operatorname{sign}$ is the sign function and $\operatorname{ReLU}$ is the ReLU function $\operatorname{ReLU}(x) = \max (x, 0)$.
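
Likewise, a minimal NumPy sketch of soft thresholding, following the sign/ReLU form above (the name `soft_threshold` is illustrative):

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft thresholding: sign(x) * ReLU(|x| - lam)."""
    x = np.asarray(x, dtype=float)
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)
```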

Description

The shapes of the two functions introduced above are as follows2.
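
Here is a rough matplotlib sketch of the two curves (assuming the `hard_threshold` and `soft_threshold` functions from the sketches above and an arbitrary threshold $\lambda = 1$):

```python
import numpy as np
import matplotlib.pyplot as plt

lam = 1.0
x = np.linspace(-3, 3, 601)

fig, axes = plt.subplots(1, 2, figsize=(8, 3), sharey=True)
axes[0].plot(x, hard_threshold(x, lam))
axes[0].set_title("Hard thresholding")
axes[1].plot(x, soft_threshold(x, lam))
axes[1].set_title("Soft thresholding")
for ax in axes:
    # Mark the thresholds +-lambda where the two functions differ most visibly.
    ax.axvline(-lam, linestyle="--", color="gray")
    ax.axvline(lam, linestyle="--", color="gray")
plt.show()
```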

From an algorithmic point of view, thresholding means removing values that are not significant enough, which amounts to a form of denoising.

Mathematically, the difference between hard and soft thresholding stands out mainly in the behavior at $\pm \lambda$: hard thresholding is discontinuous there, while soft thresholding is continuous. Otherwise the two are much alike, for instance in that both send the interval $\left( - \lambda , \lambda \right)$ to $0$ and have similar overall shapes. Usually one of the two is chosen according to the purpose, so it is rare to use both at once; the notation is then often abbreviated to $\eta_{\lambda} (x)$, dropping the $H$ or $S$ and writing the threshold $\lambda$ as a subscript instead.
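
As a quick numerical check of the continuity remark, using the sketch functions above with $\lambda = 1$:

```python
for x in (0.999, 1.0, 1.001):
    print(x, float(hard_threshold(x, 1.0)), float(soft_threshold(x, 1.0)))
# Hard thresholding jumps from 0 to about 1 as x crosses the threshold,
# while soft thresholding stays near 0 on both sides.
```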


  1. Gavish. (2014). The Optimal Hard Threshold for Singular Values is 4/√3: https://doi.org/10.1109/TIT.2014.2323359 ↩︎

  2. https://www.mathworks.com/help/wavelet/ref/wthresh.html ↩︎