The Meanings and Differences of Artificial Neural Networks (ANN), Deep Neural Networks (DNN), and Feedforward Neural Networks (FNN) in Deep Learning
Overview
This document summarizes terms used in deep learning, such as artificial neural networks, deep neural networks, and feedforward neural networks. These terms are often used interchangeably without clear definitions and can be confusing for beginners, but essentially, they can be considered the same.
The origins and historical contexts of the terms explained below are not based on exhaustive research and are the author’s own hypotheses.
Artificial Neural Networks and Deep Neural Networks
A composition of layers and activation functions is called an artificial neural network.
A deep neural network is one in which this composition of layers and activation functions is repeated many times.
The first proposed ANN was a very simple model composed of a linear function and a step function, called the (single-layer) perceptron. A more advanced model proposed later, the multi-layer perceptron, composes linear and step functions more extensively. Terms like layer and deep seem to have come into use during this development phase. These words carry no special meaning; they were coined intuitively because visualizing multiple compositions shows layers stacked deeply. For example, when $f_i = \sigma \circ L_i$ for a linear function $L_i$ and a step function $\sigma$, visualizing the composed function $f_n \circ \cdots \circ f_1$ shows the layers $f_1, \dots, f_n$ stacked one on top of another.
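The single-layer perceptron described above can be sketched in a few lines. This is a minimal illustration, assuming NumPy; the hand-picked weights realizing a logical AND are the author's own example, not from the original text.

```python
import numpy as np

# A (single-layer) perceptron: a linear function followed by a step function.
def step(x):
    return np.where(x >= 0.0, 1.0, 0.0)

def perceptron(x, w, b):
    # Linear function w @ x + b, then the step activation.
    return step(w @ x + b)

# Illustrative weights chosen by hand so the perceptron computes logical AND.
w = np.array([1.0, 1.0])
b = -1.5

print(perceptron(np.array([1.0, 1.0]), w, b))  # 1.0
print(perceptron(np.array([0.0, 1.0]), w, b))  # 0.0
```

A single perceptron like this can only represent linearly separable functions, which is exactly what motivated composing several of them into deeper models.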
Initially, the term DNN was used to distinguish it from a single-layer ANN, but now that distinction is meaningless. They can be considered the same. In the early days of neural network theory, DNN was often synonymous with MLP, as evident in some historical papers. Nowadays, ANN and DNN refer to concepts that include not just MLP but also CNN, GAN, RNN, GNN, and other neural networks.
Past: ANN = DNN = MLP
Recent: ANN = DNN ⊇ {MLP, CNN, RNN, GAN, GNN, …}
The past and recent can be thought of roughly as the 20th and 21st centuries, respectively.
Multi-Layer Perceptron and Fully-Connected Neural Network
A multi-layer perceptron is the result of composing multiple single-layer perceptrons. A fully-connected neural network is a network that composes fully connected layers with activation functions. These two terms refer to the same type of neural network.
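The "composition of fully connected layers with activation functions" can be made concrete with a short sketch. This assumes NumPy, randomly initialized weights, and a ReLU activation; the layer sizes and function names are illustrative choices, not part of the original text.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def fc_layer(in_dim, out_dim):
    # A fully connected layer: a linear map followed by an activation.
    W = rng.standard_normal((out_dim, in_dim))
    b = np.zeros(out_dim)
    return lambda x: relu(W @ x + b)

# Composing three fully connected layers: 4 -> 8 -> 8 -> 2.
layers = [fc_layer(4, 8), fc_layer(8, 8), fc_layer(8, 2)]

def network(x):
    for layer in layers:
        x = layer(x)  # repeated composition: layer, activation, layer, ...
    return x

out = network(np.ones(4))
print(out.shape)  # (2,)
```

Whether one calls this an MLP or a fully-connected network, the structure is the same: each layer's output feeds the next layer's input, with an activation in between.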
Feedforward Neural Networks
Networks whose output does not feed back into the input in any form are called feedforward neural networks, in contrast to recurrent neural networks (RNN) and other architectures that handle time-series data. In the past, when there were fewer kinds of neural networks, FNN seems to have been used synonymously with MLP, and it still appears in this narrower sense in some places.
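The feedforward/recurrent distinction can be shown in miniature. This is an illustrative sketch assuming NumPy and tanh activations; the weights and function names are hypothetical, not from the original text.

```python
import numpy as np

def feedforward(x, W):
    # Feedforward: the output depends only on the current input.
    return np.tanh(W @ x)

def recurrent_step(x, h, W_x, W_h):
    # Recurrent: the previous state h feeds back into the computation.
    return np.tanh(W_x @ x + W_h @ h)

W = np.eye(2) * 0.5
x = np.array([1.0, -1.0])

# Feedforward: the same input always yields the same output.
y1 = feedforward(x, W)
y2 = feedforward(x, W)

# Recurrent: the same input can yield different outputs as the state evolves.
h0 = np.zeros(2)
h1 = recurrent_step(x, h0, W, W)
h2 = recurrent_step(x, h1, W, W)
```

Here `h1` and `h2` differ even though the input `x` is unchanged, which is exactly the feedback behavior that disqualifies a network from being called feedforward.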