Summary of Sums of Random Variables Following Specific Distributions

Theorem

Suppose the random variables $X_{1} , \cdots , X_{n}$ are mutually independent. (In [1] the number of summands is written as $m$ to avoid clashing with the binomial parameters $n_{i}$.)

  • [1] Binomial distribution: If $X_i \sim \text{Bin} ( n_{i}, p)$, then $$ \sum_{i=1}^{m} X_{i} \sim \text{Bin} \left( \sum_{i=1}^{m} n_{i} , p \right) $$
  • [2] Poisson distribution: If $X_i \sim \text{Poi}( m_{i} )$, then $$ \sum_{i=1}^{n} X_{i} \sim \text{Poi} \left( \sum_{i=1}^{n} m_{i} \right) $$
  • [3] Gamma distribution: If $X_i \sim \Gamma ( k_{i}, \theta)$, then $$ \sum_{i=1}^{n} X_{i} \sim \Gamma \left( \sum_{i=1}^{n} k_{i} , \theta \right) $$
  • [4] Chi-squared distribution: If $X_i \sim \chi^2 ( r_{i} )$, then $$ \sum_{i=1}^{n} X_{i} \sim \chi ^2 \left( \sum_{i=1}^{n} r_{i} \right) $$
  • [5] Normal distribution: If $X_i \sim N( \mu_{i}, \sigma_{i}^{2} )$, then for the given vector $(a_{1} , \cdots , a_{n}) \in \mathbb{R}^{n}$, $$ \sum_{i=1}^{n} a_{i} X_{i} \sim N \left( \sum_{i=1}^{n} a_{i } \mu_{i} , \sum_{i=1}^{n} a_{i }^2 \sigma_{i}^2 \right) $$

Proof

Strategy: Derive using the moment generating function. The condition that the variables are mutually independent is essential for applying the following theorem.

If $X_{1} , \cdots , X_{n}$ are mutually independent and each $X_{i}$ has moment generating function $M_{i}(t)$ for $-h_{i} < t < h_{i}$, then the moment generating function of their linear combination $\displaystyle T := \sum_{i=1}^{n} a_{i} X_{i}$ is $$ M_{T} (t) = \prod_{i=1}^{n} M_{i} \left( a_{i} t \right) \qquad , -\min_{i=1, \cdots, n} h_{i} < t < \min_{i=1, \cdots, n} h_{i} $$
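For reference, the product form above comes straight from independence: $$ M_{T}(t) = E \left[ e^{t \sum_{i=1}^{n} a_{i} X_{i}} \right] = E \left[ \prod_{i=1}^{n} e^{(a_{i} t) X_{i}} \right] = \prod_{i=1}^{n} E \left[ e^{(a_{i} t) X_{i}} \right] = \prod_{i=1}^{n} M_{i} \left( a_{i} t \right) $$ where the third equality is exactly where mutual independence is used.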

[1]1

Moment generating function of the binomial distribution: $$ m(t) = \left[ (1-p) + pe^{t} \right]^{n} \qquad , t \in \mathbb{R} $$

Let $\displaystyle Y := \sum_{i=1}^{m} X_{i}$. Since $X_{1} , \cdots , X_{m}$ are mutually independent, $$ \begin{align*} M_{Y} (t) =& M_{1} (t) \cdots M_{m} (t) \\ =& \left[ (1-p) + pe^{t} \right]^{n_{1}} \cdots \left[ (1-p) + pe^{t} \right]^{n_{m}} \\ =& \left[ (1-p) + pe^{t} \right]^{\sum_{i=1}^{m} n_{i}} \end{align*} $$ Therefore, $$ Y \sim \text{Bin} \left( \sum_{i=1}^{m} n_{i} , p \right) $$
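As a quick numerical sanity check (separate from the proof), the following sketch, assuming NumPy is available, compares the empirical mean and variance of a sum of independent $\text{Bin}(n_{i}, p)$ samples with the mean $Np$ and variance $Np(1-p)$ of $\text{Bin}(N, p)$, where $N = \sum_{i} n_{i}$; the values of $n_{i}$, $p$, and the sample size are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n_list, p, reps = [3, 7, 12], 0.4, 200_000   # arbitrary illustrative parameters

# Draw reps samples from each Bin(n_i, p) independently and add them up.
samples = sum(rng.binomial(n_i, p, size=reps) for n_i in n_list)

N = sum(n_list)                              # total number of trials, N = 22
print(samples.mean(), N * p)                 # both should be close to 8.8
print(samples.var(), N * p * (1 - p))        # both should be close to 5.28
```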

[2]2

Moment generating function of the Poisson distribution: $$ m(t) = \exp \left[ \lambda \left( e^{t} - 1 \right) \right] \qquad , t \in \mathbb{R} $$

Let $\displaystyle Y := \sum_{i=1}^{n} X_{i}$. Since $X_{1} , \cdots , X_{n}$ are mutually independent, $$ \begin{align*} M_{Y} (t) =& M_{1} (t) \cdots M_{n} (t) \\ =& \exp \left[ m_{1} \left( e^{t} - 1 \right) \right] \cdots \exp \left[ m_{n} \left( e^{t} - 1 \right) \right] \\ =& \exp \left[ \sum_{i=1}^{n} m_{i} \left( e^{t} - 1 \right) \right] \end{align*} $$ Therefore, $$ Y \sim \text{Poi} \left( \sum_{i=1}^{n} m_{i} \right) $$
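The same kind of sanity check works for [2]; assuming NumPy, with arbitrary rates $m_{i}$, both the mean and the variance of the summed samples should be close to $\sum_{i} m_{i}$.

```python
import numpy as np

rng = np.random.default_rng(0)
m_list, reps = [0.5, 2.0, 3.5], 200_000      # arbitrary rates and sample size

# Sum of independent Poisson(m_i) samples.
samples = sum(rng.poisson(m_i, size=reps) for m_i in m_list)

lam = sum(m_list)                            # rate of the summed Poisson, 6.0
print(samples.mean(), lam)                   # both should be close to 6.0
print(samples.var(), lam)                    # Poisson variance equals its mean
```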

[3]3

Moment generating function of the gamma distribution: $$ m(t) = \left( 1 - \theta t\right)^{-k} \qquad , t < {{ 1 } \over { \theta }} $$

Let $\displaystyle Y := \sum_{i=1}^{n} X_{i}$. Since $X_{1} , \cdots , X_{n}$ are mutually independent, $$ \begin{align*} M_{Y} (t) =& M_{1} (t) \cdots M_{n} (t) \\ =& \left( 1 - \theta t\right)^{-k_{1}} \cdots \left( 1 - \theta t\right)^{-k_{n}} \\ =& \left( 1 - \theta t\right)^{-\sum_{i=1}^{n} k_{i}} \end{align*} $$ Therefore, $$ Y \sim \Gamma \left( \sum_{i=1}^{n} k_{i} , \theta \right) $$
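Again as a rough check, assuming NumPy (whose `gamma` sampler takes shape and scale in the same order as $\Gamma(k, \theta)$ here), the moments of the summed samples should match those of $\Gamma \left( \sum_{i} k_{i}, \theta \right)$; the parameters below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
k_list, theta, reps = [1.5, 2.0, 4.5], 2.0, 200_000   # arbitrary shapes and scale

# Sum of independent Gamma(k_i, theta) samples (NumPy takes shape, scale).
samples = sum(rng.gamma(k_i, theta, size=reps) for k_i in k_list)

k_total = sum(k_list)                        # 8.0
print(samples.mean(), k_total * theta)       # Gamma mean k*theta = 16
print(samples.var(), k_total * theta**2)     # Gamma variance k*theta^2 = 32
```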

[4]4

Relationship between gamma distribution and chi-squared distribution: $$ \Gamma \left( { r \over 2 } , 2 \right) \iff \chi ^2 (r) $$

Let $\displaystyle Y := \sum_{i=1}^{n} X_{i}$. Setting $k_{i} := {{ r_{i} } \over { 2 }}$ and $\theta := 2$, by [3] $$ Y \sim \Gamma \left( \sum_{i=1}^{n} {{ r_{i} } \over { 2 }} , 2 \right) $$ which, by the relationship above, means $$ Y \sim \chi^{2} \left( \sum_{i=1}^{n} r_{i} \right) $$
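A corresponding check for [4], assuming NumPy and with arbitrary degrees of freedom $r_{i}$: the summed samples should have mean $\sum_{i} r_{i}$ and variance $2 \sum_{i} r_{i}$.

```python
import numpy as np

rng = np.random.default_rng(0)
r_list, reps = [2, 3, 5], 200_000            # arbitrary degrees of freedom

# Sum of independent chi-squared(r_i) samples.
samples = sum(rng.chisquare(r_i, size=reps) for r_i in r_list)

r_total = sum(r_list)                        # 10
print(samples.mean(), r_total)               # chi-squared mean: r
print(samples.var(), 2 * r_total)            # chi-squared variance: 2r
```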

[5]5

Moment generating function of the normal distribution: $$ m(t) = \exp \left( \mu t + {{ \sigma^{2} t^{2} } \over { 2 }} \right) \qquad , t \in \mathbb{R} $$

Let $\displaystyle Y := \sum_{i=1}^{n} a_{i} X_{i}$. Since $X_{1} , \cdots , X_{n}$ are mutually independent, $$ \begin{align*} M_{Y} (t) =& M_{1} (a_{1} t) \cdots M_{n} (a_{n} t) \\ =& \prod_{i=1}^{n} \exp \left[ t a_{i} \mu_{i} + {{ t^{2} a_{i}^{2} \sigma_{i}^{2} } \over { 2 }} \right] \\ =& \exp \left[ t \sum_{i=1}^{n} a_{i} \mu_{i} + {{ t^{2} \sum_{i=1}^{n} a_{i}^{2} \sigma_{i}^{2} } \over { 2 }} \right] \end{align*} $$ Therefore, $$ Y \sim N \left( \sum_{i=1}^{n} a_{i } \mu_{i} , \sum_{i=1}^{n} a_{i }^2 \sigma_{i}^2 \right) $$
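Finally, a sketch for [5] that also checks the shape of the distribution, not just its moments, via a Kolmogorov–Smirnov test; it assumes NumPy and SciPy are available, and the coefficients $a_{i}$, means $\mu_{i}$, and standard deviations $\sigma_{i}$ are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = np.array([1.0, -2.0, 0.5])               # arbitrary coefficients a_i
mu = np.array([0.0, 1.0, 3.0])               # arbitrary means mu_i
sigma = np.array([1.0, 0.5, 2.0])            # arbitrary standard deviations sigma_i
reps = 100_000

# Each row is one independent draw of (X_1, X_2, X_3); Y = sum_i a_i X_i.
X = rng.normal(mu, sigma, size=(reps, 3))
Y = X @ a

mu_Y = a @ mu                                # = -0.5
sigma_Y = np.sqrt(a**2 @ sigma**2)           # = sqrt(3)
print(Y.mean(), mu_Y)
print(Y.std(), sigma_Y)
# Large p-value: no evidence against Y ~ N(mu_Y, sigma_Y^2).
print(stats.kstest(Y, "norm", args=(mu_Y, sigma_Y)))
```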

Note

Strictly speaking, there is no separate theory for the “addition” of random variables; a sum is simply the special case of a linear combination of random variables in which every coefficient equals $1$. Naturally, a stronger condition such as iid makes it even easier to identify the resulting distribution.


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p145. ↩︎

  2. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p155. ↩︎

  3. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p163. ↩︎

  4. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p163. ↩︎

  5. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p176. ↩︎