Generative Model

Overview

Determining the exact probability distribution that our data follows is a crucial yet very challenging problem in many application fields. For instance, if we precisely knew the probability distribution of human face photographs and how to sample from it, we could obtain a plausible human face image every time we draw a sample. Obviously, this is nearly impossible to do exactly. Much like how many difficult problems are tackled by first solving simpler ones, a generative model is a method that approximates the desired (unknown and complex) distribution starting from a given (known and simpler) distribution.

Definition

Let the unknown probability distribution that the dataset (random sample) $\left\{ y_{j} \right\}$ follows be denoted as $Y$. Assume that the dataset $\left\{ x_{i} \right\} \sim X$ follows a well-known probability distribution $X$. A generative model is a function $f$, or a methodology to discover such an $f$.

$$ f : \left\{ x_{i} \right\} \rightarrow \left\{ y_{j} \right\} $$
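
As a toy illustration of this definition, suppose the target $Y$ happens to be Gaussian with known mean and standard deviation; then $f$ can be written down in closed form. The sketch below is only illustrative (the parameters and code are assumptions, not from the source); in practice $Y$ is unknown and $f$ must be learned from the dataset $\left\{ y_{j} \right\}$.

```python
import numpy as np

# Toy case (illustrative): if Y = N(mu, sigma^2) with known mu and sigma,
# the map f(x) = mu + sigma * x sends samples of X = N(0, 1) to samples of Y.
rng = np.random.default_rng(0)
mu, sigma = 3.0, 0.5

x = rng.standard_normal(10_000)   # dataset {x_i} ~ X, the known simple distribution
y = mu + sigma * x                # {y_i} = {f(x_i)}, distributed as Y

print(y.mean(), y.std())          # approximately 3.0 and 0.5
```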

Explanation

The normal distribution is the most common choice for the simple distribution $X$. Thus, a generative model can be described plainly as a method for deriving data that follows another, unknown distribution from the normal distribution. Recently, generative models built on neural networks have been studied extensively; when it is clear that a generative model employs neural networks and deep learning, it is sometimes called a deep generative model.

If one can find an $f$ that satisfies the definition, sampling $x_{i}$ is straightforward, so new data $y_{i} = f(x_{i})$ can be generated, as in the sketch below.
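
In a deep generative model, $f$ is typically parameterized as a neural network that maps noise sampled from the normal distribution to data. The sketch below shows only this sampling step with an illustrative (untrained) PyTorch network; the layer sizes, data dimensions, and choice of framework are assumptions, not from the source, and in practice the weights of $f$ would first be trained with an objective such as a GAN or VAE loss.

```python
import torch
import torch.nn as nn

# Illustrative generator f (untrained): dimensions and architecture are assumed.
latent_dim, data_dim = 64, 784          # e.g., 28x28 grayscale images, flattened

f = nn.Sequential(
    nn.Linear(latent_dim, 256),
    nn.ReLU(),
    nn.Linear(256, data_dim),
    nn.Sigmoid(),                       # pixel intensities in [0, 1]
)

# Once f has been trained, generation is exactly y_i = f(x_i):
x = torch.randn(16, latent_dim)         # x_i drawn from the normal distribution X
with torch.no_grad():
    y = f(x)                            # samples intended to follow Y

print(y.shape)                          # torch.Size([16, 784])
```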

Types

  • Autoencoder
    • Variational Autoencoder (VAE)
  • Generative Adversarial Network (GAN)
  • Diffusion Models or Diffusion Probabilistic Models
    • Denoising Diffusion Probabilistic Models (DDPM)
  • Deep Image Prior