We define $\eta_H(x;\lambda) : \mathbb{R} \to \mathbb{R}$, called hard thresholding, as follows:
$$\eta_H(x;\lambda) = x \cdot \mathbf{1}\{|x| \ge \lambda\} = \begin{cases} 0, & \text{if } x \in (-\lambda, \lambda) \\ x, & \text{if } x \notin (-\lambda, \lambda) \end{cases}$$
Here, $\mathbf{1}\{\cdot\}$ denotes the indicator function.
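As a concrete illustration, hard thresholding is an elementwise one-liner. A minimal sketch in Python (the function name and the use of NumPy are my own choices, not from the text):

```python
import numpy as np

def hard_threshold(x, lam):
    """Hard thresholding: eta_H(x; lam) = x * 1{|x| >= lam}."""
    x = np.asarray(x, dtype=float)
    # The indicator |x| >= lam keeps large entries and zeroes the rest.
    return x * (np.abs(x) >= lam)

# Values inside (-lam, lam) are set to 0; the rest pass through unchanged.
print(hard_threshold([-2.0, -0.5, 0.5, 2.0], lam=1.0))
```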
Soft Thresholding
We define $\eta_S(x;\lambda) : \mathbb{R} \to \mathbb{R}$, called soft thresholding, as follows:
$$\eta_S(x;\lambda) = \operatorname{sign}(x) \cdot \operatorname{ReLU}(|x| - \lambda) = \begin{cases} \lambda - |x|, & \text{if } x < -\lambda \\ 0, & \text{if } x \in [-\lambda, \lambda] \\ |x| - \lambda, & \text{if } x > \lambda \end{cases}$$
Here, $\operatorname{sign}$ is the sign function, and $\operatorname{ReLU}(t) = \max(t, 0)$ is the rectified linear unit.
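Following the definition directly, soft thresholding can be sketched the same way (again a minimal NumPy sketch with names of my own choosing):

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft thresholding: eta_S(x; lam) = sign(x) * ReLU(|x| - lam)."""
    x = np.asarray(x, dtype=float)
    # np.maximum(., 0.0) plays the role of ReLU here.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Entries inside [-lam, lam] go to 0; larger entries are shrunk toward 0 by lam.
print(soft_threshold([-2.0, -0.5, 0.5, 2.0], lam=1.0))
```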
Description
The shapes of the functions introduced above are as follows.
From the algorithm's perspective, thresholding removes insignificant values, which amounts to a form of denoising.
The difference between hard and soft thresholding stands out mathematically in the continuity at $\pm\lambda$; otherwise the two are quite similar, e.g. both map $[-\lambda, \lambda]$ to $0$ and are built from the same auxiliary functions. In practice one is chosen over the other according to the purpose, so it is rare to use both together. In notation, the subscript $H$ or $S$ is often omitted and the threshold $\lambda$ is written as a subscript instead, as in $\eta_\lambda(x)$.
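The continuity difference at $\pm\lambda$ can also be checked numerically. A small sketch, reusing hypothetical implementations of both rules (not from the text):

```python
import numpy as np

def hard_threshold(x, lam):
    x = np.asarray(x, dtype=float)
    return x * (np.abs(x) >= lam)

def soft_threshold(x, lam):
    x = np.asarray(x, dtype=float)
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

lam, eps = 1.0, 1e-9
# Hard thresholding jumps from 0 to lam as x crosses lam from below...
left  = hard_threshold(lam - eps, lam)   # 0.0
right = hard_threshold(lam, lam)         # 1.0
# ...while soft thresholding stays continuous: both sides are ~0.
s_left  = soft_threshold(lam - eps, lam)  # 0.0
s_right = soft_threshold(lam + eps, lam)  # ~1e-9
print(left, right, s_left, s_right)
```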