
Dropout in Deep Learning

Definition


Dropout is a technique for preventing overfitting that stochastically deactivates neurons in an artificial neural network during training.
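The definition above can be sketched in a few lines of NumPy. This is the common "inverted dropout" variant, not any particular library's implementation: each activation is zeroed with probability `p`, and the survivors are rescaled by `1/(1-p)` so the expected activation is unchanged; the function and parameter names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, training=True):
    """Inverted dropout: zero each activation with probability p and
    scale survivors by 1/(1-p) so the expected value is unchanged."""
    if not training or p == 0.0:
        return x  # at inference time, all neurons are used as-is
    mask = rng.random(x.shape) >= p  # keep each neuron with probability 1-p
    return x * mask / (1.0 - p)

activations = np.ones(10_000)
dropped = dropout(activations, p=0.5)
# Roughly half the activations are zeroed; the survivors are scaled up,
# so the mean stays close to the original value of 1.0.
```

Because of the rescaling, no correction is needed at inference time: the network simply runs with dropout turned off.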

Explanation

At first glance this may seem like simply learning less, and to some extent that is true. By disabling neurons with a certain probability, the network avoids relying on neurons that are "too influential." Being too influential can be read as being overly confident about the training data: a network with many such neurons may solve the training examples well, but its performance on real data can suffer.
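A small numerical experiment, under toy assumptions, illustrates why an overly influential neuron cannot be relied on: take a single linear unit with one weight much larger than the rest, and apply a fresh dropout mask on every forward pass. The weights and the keep probability below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# One linear unit y = w . x, where the first input is "too influential".
w = np.array([5.0, 0.5, 0.5, 0.5])
x = np.ones(4)

def forward(p=0.5):
    """One stochastic forward pass with inverted dropout on the inputs."""
    mask = rng.random(w.shape) >= p
    return (w * x * mask).sum() / (1.0 - p)

outputs = [forward() for _ in range(20_000)]
# On passes where the dominant input is dropped, the unit must make do
# with the small weights alone, while the average over many passes
# still recovers the full output w.sum() = 6.5 in expectation.
```

In other words, because any individual pass may lose the dominant neuron, training pressure is spread across the remaining weights instead of concentrating on one.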

To use a human analogy, it is like not immediately solving a familiar-looking problem on sight, but reading it carefully first, since even a problem you know may contain details you had not considered. The greatest strength of dropout is, above all, its simplicity. Among the many techniques for preventing overfitting, dropout is conceptually simple and easy to implement, which makes its effects easy to observe and the method convenient to use.