
Multinomial Distribution

Definition

Let $\left( X_{1} , \cdots , X_{k} \right)$ denote a random vector of $k \in \mathbb{N}$ random variables counting the outcomes of $n \in \mathbb{N}$ trials. $$ \sum_{i=1}^{k} X_{i} = n \qquad \& \qquad \sum_{i=1}^{k} p_{i} = 1 $$ For $\mathbf{p} = \left( p_{1} , \cdots , p_{k} \right) \in [0,1]^{k}$ satisfying these conditions, the multivariate probability distribution $M_{k} \left( n, \mathbf{p} \right)$ with the following probability mass function is called the Multinomial Distribution. $$ p \left( x_{1} , \cdots , x_{k} \right) = {{ n! } \over { x_{1} ! \cdots x_{k}! }} p_{1}^{x_{1}} \cdots p_{k}^{x_{k}} \qquad , x_{1} , \cdots , x_{k} \in \mathbb{N}_{0} $$


  • $[0,1]^{k} = [0,1] \times \cdots \times [0,1]$ is a $k$-cell.
  • $\mathbb{N}_{0} = \left\{ 0 \right\} \cup \mathbb{N}$ is the set of natural numbers together with $0$.

Description

Reading the definition literally, $\left( X_{1} , \cdots , X_{k} \right)$ is a random vector indicating how many of $n$ elements actually fall into each of $k$ categories when each element falls into the $i$-th category with probability $p_{i}$, with probability mass function $$ \begin{align*} p \left( x_{1} , \cdots , x_{k} \right) =& P \left( X_{1} = x_{1} , \cdots , X_{k} = x_{k} \right) \\ =& {{ n! } \over { x_{1} ! \cdots x_{k}! }} p_{1}^{x_{1}} \cdots p_{k}^{x_{k}} \end{align*} $$ In particular, when $k = 2$ it reduces to the binomial distribution; the multinomial distribution is precisely its generalization.
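
As a quick sanity check, here is a minimal Python sketch that evaluates this probability mass function directly from the formula and compares it against scipy.stats.multinomial. The helper name multinomial_pmf and the example numbers are illustrative choices, not anything fixed by the definition.

```python
from math import factorial

from scipy.stats import multinomial

def multinomial_pmf(x, p):
    """PMF of M_k(n, p) at a count vector x, computed straight from the formula."""
    n = sum(x)
    coef = factorial(n)
    for xi in x:
        coef //= factorial(xi)       # multinomial coefficient n! / (x_1! ... x_k!)
    prob = 1.0
    for xi, pi in zip(x, p):
        prob *= pi ** xi             # p_1^{x_1} ... p_k^{x_k}
    return coef * prob

x = [3, 2, 5]          # counts per category, summing to n = 10
p = [0.2, 0.3, 0.5]    # category probabilities, summing to 1

print(multinomial_pmf(x, p))                # direct formula
print(multinomial.pmf(x, n=sum(x), p=p))    # scipy reference value, should agree
```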

Basic Properties

Mean and Covariance

  • [1]: If $\mathbf{X} := \left( X_{1} , \cdots , X_{k} \right) \sim M_{k} \left( n, \mathbf{p} \right)$, the expected value of the $i$-th component $X_{i}$ is $$ E \left( X_{i} \right) = n p_{i} $$ and the covariance matrix is as follows. $$ \operatorname{Cov} \left( \mathbf{X} \right) = n \begin{bmatrix} p_{1} \left( 1 - p_{1} \right) & - p_{1} p_{2} & \cdots & - p_{1} p_{k} \\ - p_{2} p_{1} & p_{2} \left( 1 - p_{2} \right) & \cdots & - p_{2} p_{k} \\ \vdots & \vdots & \ddots & \vdots \\ - p_{k} p_{1} & - p_{k} p_{2} & \cdots & p_{k} \left( 1 - p_{k} \right) \end{bmatrix} $$
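
Both formulas in [1] can be checked by simulation. The sketch below (assuming NumPy is available; the seed, sample size, and parameters are arbitrary) estimates the mean and covariance from simulated draws and compares them with $n p_{i}$ and $n \left( \operatorname{diag} \mathbf{p} - \mathbf{p} \mathbf{p}^{T} \right)$, which is the matrix written out above.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 10, np.array([0.2, 0.3, 0.5])
samples = rng.multinomial(n, p, size=200_000)   # each row is one draw of (X_1, ..., X_k)

mean_theory = n * p                              # E(X_i) = n p_i
cov_theory = n * (np.diag(p) - np.outer(p, p))   # covariance matrix from [1]

print(samples.mean(axis=0), mean_theory)               # empirical vs. theoretical mean
print(np.cov(samples, rowvar=False).round(3))          # empirical covariance
print(cov_theory.round(3))                             # theoretical covariance
```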

Theorem

Lumping Property

For $i \ne j$, $X_{i} + X_{j}$ follows the binomial distribution $\text{Bin} \left( n , p_{i} + p_{j} \right)$. $$ X_{i} + X_{j} \sim \text{Bin} \left( n , p_{i} + p_{j} \right) $$ This is called the Lumping Property.
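
To see the lumping property empirically, one can simulate multinomial draws and compare the empirical distribution of $X_{1} + X_{2}$ with the $\text{Bin} \left( n , p_{1} + p_{2} \right)$ PMF. A rough sketch, with arbitrary parameters:

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(1)

n, p = 10, [0.2, 0.3, 0.5]
samples = rng.multinomial(n, p, size=200_000)

lumped = samples[:, 0] + samples[:, 1]                      # X_1 + X_2 for each draw
empirical = np.bincount(lumped, minlength=n + 1) / len(lumped)
theoretical = binom.pmf(np.arange(n + 1), n, p[0] + p[1])   # Bin(n, p_1 + p_2)

print(np.abs(empirical - theoretical).max())    # small for large sample sizes
```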

Proof

Mean

Viewed on its own, each component $X_{i}$ counts whether each of the $n$ trials falls into category $i$ (with probability $p_{i}$) or not, so $X_{i} \sim \text{Bin} \left( n , p_{i} \right)$ and its expected value is $E \left( X_{i} \right) = n p_{i}$.
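
This marginal claim can also be checked exactly rather than by simulation: for $k = 3$, summing the joint PMF over the other coordinates should reproduce the binomial PMF. A small sketch, with illustrative parameters:

```python
from scipy.stats import binom, multinomial

n, p = 10, [0.2, 0.3, 0.5]

# Marginal P(X_1 = x), obtained by summing the joint PMF over (x_2, x_3)
for x in range(n + 1):
    marginal = sum(
        multinomial.pmf([x, y, n - x - y], n=n, p=p)
        for y in range(n - x + 1)
    )
    assert abs(marginal - binom.pmf(x, n, p[0])) < 1e-10   # X_1 ~ Bin(n, p_1)
```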

Covariance

It is deduced directly from the lumping property below. Since $X_{i} + X_{j} \sim \text{Bin} \left( n , p_{i} + p_{j} \right)$ for $i \ne j$, $$ \operatorname{Var} \left( X_{i} + X_{j} \right) = n \left( p_{i} + p_{j} \right) \left( 1 - p_{i} - p_{j} \right) $$ On the other hand, $$ \operatorname{Var} \left( X_{i} + X_{j} \right) = \operatorname{Var} X_{i} + \operatorname{Var} X_{j} + 2 \operatorname{Cov} \left( X_{i} , X_{j} \right) = n p_{i} \left( 1 - p_{i} \right) + n p_{j} \left( 1 - p_{j} \right) + 2 \operatorname{Cov} \left( X_{i} , X_{j} \right) $$ Comparing the two expressions gives $\operatorname{Cov} \left( X_{i} , X_{j} \right) = - n p_{i} p_{j}$, which yields the off-diagonal entries of the matrix in [1].

Lumping Property

In the case $n = 1$, that is, a single trial, $X_{i} + X_{j}$ equals $1$ exactly when the outcome of that trial belongs to either category $i$ or category $j$, and $0$ otherwise, so it follows the Bernoulli distribution $\text{Bin} \left( 1, p_{i} + p_{j} \right)$.

Addition of Binomial Distributions: Suppose the random variables $Y_{1} , \cdots , Y_{m}$ are mutually independent with $Y_{i} \sim \text{Bin} \left( n_{i}, p \right)$. Then $$ \displaystyle \sum_{i=1}^{m} Y_{i} \sim \text{Bin} \left( \sum_{i=1}^{m} n_{i} , p \right) $$

Since the $n$ trials are conducted independently, each trial contributes an independent $\text{Bin} \left( 1 , p_{i} + p_{j} \right)$ variable, and by the addition of binomial distributions $$ X_{i} + X_{j} \sim \text{Bin} \left( \sum_{t=1}^{n} 1 , p_{i} + p_{j} \right) = \text{Bin} \left( n , p_{i} + p_{j} \right) $$
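
The addition property used in this last step can itself be illustrated numerically: convolving the $\text{Bin} \left( 1 , q \right)$ PMF with itself $n$ times reproduces $\text{Bin} \left( n , q \right)$. A sketch, with arbitrary $n$ and $q = p_{i} + p_{j}$:

```python
import numpy as np
from scipy.stats import binom

n, q = 10, 0.2 + 0.3           # q = p_i + p_j
pmf = np.array([1.0])          # distribution of an empty sum
for _ in range(n):
    pmf = np.convolve(pmf, [1 - q, q])   # add one independent Bin(1, q) trial

print(np.abs(pmf - binom.pmf(np.arange(n + 1), n, q)).max())  # ~0 up to float error
```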