Multilayer Perceptron (MLP), Fully Connected Neural Network (FCNN)
Definition
Let $T_{i} : \mathbb{R}^{n_{i}} \to \mathbb{R}^{n_{i+1}}$ be a fully connected layer, and let $\sigma : \mathbb{R} \to \mathbb{R}$ be an activation function. The following composition is called a multilayer perceptron.
$$ \operatorname{MLP}(\mathbf{x}) = T_{N} \circ \overline{\sigma} \circ T_{N-1} \circ \overline{\sigma} \circ \cdots \circ T_{1} (\mathbf{x}) $$
Here, $\overline{\sigma}$ is a function that applies $\sigma$ to each component.
$$ \overline{\sigma}(\mathbf{x}) = \begin{bmatrix} \sigma(x_{1}) \\ \sigma(x_{2}) \\ \vdots \\ \sigma(x_{n}) \end{bmatrix} \qquad \text{where } \mathbf{x} = \begin{bmatrix} x_{1} \\ x_{2} \\ \vdots \\ x_{n} \end{bmatrix} $$
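The following is a minimal sketch of this definition in NumPy. It assumes each fully connected layer $T_{i}$ is an affine map $\mathbf{x} \mapsto W_{i}\mathbf{x} + \mathbf{b}_{i}$ and takes $\sigma$ to be ReLU for concreteness; the layer sizes at the bottom are arbitrary illustrations, not part of the definition.

```python
import numpy as np

def relu(x):
    # sigma applied to each component (the componentwise map denoted by sigma-bar)
    return np.maximum(x, 0.0)

def mlp(x, weights, biases):
    """Evaluate T_N ∘ σ̄ ∘ T_{N-1} ∘ ... ∘ σ̄ ∘ T_1 (x).

    weights[i], biases[i] parameterize the fully connected layer T_{i+1};
    the activation is applied after every layer except the last.
    """
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(W @ x + b)          # σ̄(T_i(x)) for i = 1, ..., N-1
    W, b = weights[-1], biases[-1]
    return W @ x + b                 # final layer T_N, no activation

# Hypothetical example with n_1 = 3, n_2 = 5, n_3 = 2
rng = np.random.default_rng(0)
sizes = [3, 5, 2]
weights = [rng.standard_normal((m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]
print(mlp(rng.standard_normal(3), weights, biases))
```

Whether an activation follows the final layer is a matter of convention; the sketch above omits it, matching the formula, where the outermost map is $T_{N}$.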
Explanation
It is called a multilayer perceptron because it is obtained by composing single-layer perceptrons, built primarily from fully connected layers, several times; for the same reason it is also called a fully connected neural network. Both names refer to the same neural network.