Softmax Function in Deep Learning


Definition

Let $\mathbf{x} := (x_{1} , \cdots , x_{n}) \in \mathbb{R}^{n}$.

For $\displaystyle \sigma_{j} ( \mathbf{x} ) = {{ e^{x_{j}} } \over {\sum_{i=1}^{n} e^{x_{i}} }}$, the function $\sigma : \mathbb{R}^{n} \to (0,1)^{n}$ given by $\sigma ( \mathbf{x} ) := \left( \sigma_{1} (\mathbf{x}) , \cdots , \sigma_{n} (\mathbf{x} ) \right)$ is called the softmax function.
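
As a quick illustration, here is a minimal NumPy sketch of this definition. The function name `softmax` is chosen for the example, and the subtraction of $\max ( \mathbf{x} )$ is an implementation detail not in the definition itself: the factor $e^{-\max(\mathbf{x})}$ cancels in the ratio, so the result is unchanged, but it prevents overflow when some $x_{j}$ is large.

```python
import numpy as np

def softmax(x):
    """Softmax of a vector x, following the definition above.

    Subtracting max(x) before exponentiating does not change the
    ratio e^{x_j} / sum_i e^{x_i}, but keeps np.exp from overflowing.
    """
    z = np.exp(x - np.max(x))
    return z / np.sum(z)
```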

Explanation

The softmax function is a type of activation function characterized by having the domain $\mathbb{R}^{n}$; it normalizes the vector it receives as input. For any $\mathbf{x} \in \mathbb{R}^{n}$, every component of $\sigma ( \mathbf{x} )$ lies between $0$ and $1$, and the components sum to exactly $1$.
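
These two properties can be checked numerically with the `softmax` sketch above (the input values here are arbitrary, chosen just for the demonstration):

```python
x = np.array([2.0, -1.0, 0.5])
s = softmax(x)
print(s)        # approximately [0.786, 0.039, 0.175] -- each entry in (0, 1)
print(s.sum())  # 1.0, up to floating-point rounding
```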

These properties are exactly those of a probability distribution, so in practice the softmax is commonly used as the output layer of artificial neural networks for classification problems.