
Summary of Sums of Random Variables Following Specific Distributions

Theorem

Suppose the random variables $X_{1} , \cdots , X_{n}$ are mutually independent.

  • [1] Binomial distribution: If $X_i \sim \text{Bin} ( n_{i}, p)$, then $$\sum_{i=1}^{m} X_{i} \sim \text{Bin} \left( \sum_{i=1}^{m} n_{i} , p \right)$$
  • [2] Poisson distribution: If $X_i \sim \text{Poi}( m_{i} )$, then $$\sum_{i=1}^{n} X_{i} \sim \text{Poi} \left( \sum_{i=1}^{n} m_{i} \right)$$
  • [3] Gamma distribution: If $X_i \sim \Gamma ( k_{i}, \theta)$, then $$\sum_{i=1}^{n} X_{i} \sim \Gamma \left( \sum_{i=1}^{n} k_{i} , \theta \right)$$
  • [4] Chi-squared distribution: If $X_i \sim \chi^{2} ( r_{i} )$, then $$\sum_{i=1}^{n} X_{i} \sim \chi^{2} \left( \sum_{i=1}^{n} r_{i} \right)$$
  • [5] Normal distribution: If $X_i \sim N( \mu_{i}, \sigma_{i}^{2} )$, then for a given vector $(a_{1} , \cdots , a_{n}) \in \mathbb{R}^{n}$, $$\sum_{i=1}^{n} a_{i} X_{i} \sim N \left( \sum_{i=1}^{n} a_{i} \mu_{i} , \sum_{i=1}^{n} a_{i}^{2} \sigma_{i}^{2} \right)$$

Proof

Strategy: Derive using the moment generating function. The condition that the variables are mutually independent is essential for applying the following theorem.

If $X_{1} , \cdots , X_{n}$ are mutually independent and each $X_{i}$ has moment generating function $M_{i}(t)$ for $-h_{i} < t < h_{i}$, then the moment generating function of their linear combination $\displaystyle T := \sum_{i=1}^{n} a_{i} X_{i}$ is $$M_{T} (t) = \prod_{i=1}^{n} M_{i} \left( a_{i} t \right) \qquad , -\min_{i=1, \cdots, n} h_{i} < t < \min_{i=1, \cdots, n} h_{i}$$
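Although not part of the proof, this product formula is easy to check numerically. The sketch below assumes NumPy is installed; the mix of a Poisson and a normal variable, the coefficients $a_{1}, a_{2}$, and all parameter values are made-up illustration choices. It compares the empirical mgf of $T = a_{1}X_{1} + a_{2}X_{2}$ with the product $M_{1}(a_{1}t) M_{2}(a_{2}t)$.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000

# Hypothetical example: T = a1*X1 + a2*X2 with X1 ~ Poi(lam), X2 ~ N(mu, sigma^2)
lam, mu, sigma = 2.0, 1.0, 0.5
a1, a2 = 0.3, -0.7

T = a1 * rng.poisson(lam, N) + a2 * rng.normal(mu, sigma, N)

t = 0.4
empirical = np.exp(t * T).mean()                               # sample estimate of E[exp(tT)]
product = (np.exp(lam * (np.exp(a1 * t) - 1))                  # M1(a1*t), Poisson mgf
           * np.exp(mu * a2 * t + (sigma * a2 * t) ** 2 / 2))  # M2(a2*t), normal mgf
print(empirical, product)  # should agree up to Monte Carlo error
```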

[1]1

Moment generating function of the binomial distribution: $$m(t) = \left[ (1-p) + pe^{t} \right]^{n} \qquad , t \in \mathbb{R}$$

Let $\displaystyle Y := \sum_{i=1}^{m} X_{i}$. Since $X_{1} , \cdots , X_{m}$ are mutually independent, $$\begin{align*} M_{Y} (t) =& M_{1} (t) \cdots M_{m} (t) \\ =& \left[ (1-p) + pe^{t} \right]^{n_{1}} \cdots \left[ (1-p) + pe^{t} \right]^{n_{m}} \\ =& \left[ (1-p) + pe^{t} \right]^{\sum_{i=1}^{m} n_{i}} \end{align*}$$ Therefore, $$Y \sim \text{Bin} \left( \sum_{i=1}^{m} n_{i} , p \right)$$
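A quick way to convince oneself of [1] is simulation. The following is a rough sketch assuming NumPy and SciPy are available; the sizes $n_{i}$ and the common probability $p$ are arbitrary illustration values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
ns, p = [4, 7, 9], 0.3                                        # arbitrary n_i and a common p
samples = sum(rng.binomial(n, p, size=100_000) for n in ns)   # X1 + X2 + X3

# Empirical frequencies vs. the pmf of Bin(sum(ns), p)
for k in range(4, 9):
    print(k, (samples == k).mean(), stats.binom.pmf(k, sum(ns), p))
```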

[2]2

Moment generating function of the Poisson distribution: $$m(t) = \exp \left[ \lambda \left( e^{t} - 1 \right) \right] \qquad , t \in \mathbb{R}$$

Let $\displaystyle Y := \sum_{i=1}^{n} X_{i}$. Since $X_{1} , \cdots , X_{n}$ are mutually independent, $$\begin{align*} M_{Y} (t) =& M_{1} (t) \cdots M_{n} (t) \\ =& \exp \left[ m_{1} \left( e^{t} - 1 \right) \right] \cdots \exp \left[ m_{n} \left( e^{t} - 1 \right) \right] \\ =& \exp \left[ \sum_{i=1}^{n} m_{i} \left( e^{t} - 1 \right) \right] \end{align*}$$ Therefore, $$Y \sim \text{Poi} \left( \sum_{i=1}^{n} m_{i} \right)$$
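[2] can be checked the same way. Again a rough sketch with NumPy and SciPy, where the means $m_{i}$ are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
means = [1.5, 2.0, 0.7]                                       # arbitrary m_i
samples = sum(rng.poisson(m, size=100_000) for m in means)

# Empirical frequencies vs. the pmf of Poi(sum(means))
for k in range(2, 7):
    print(k, (samples == k).mean(), stats.poisson.pmf(k, sum(means)))
```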

[3]3

Moment generating function of the gamma distribution: $$m(t) = \left( 1 - \theta t \right)^{-k} \qquad , t < {{ 1 } \over { \theta }}$$

Let $\displaystyle Y := \sum_{i=1}^{n} X_{i}$. Since $X_{1} , \cdots , X_{n}$ are mutually independent, $$\begin{align*} M_{Y} (t) =& M_{1} (t) \cdots M_{n} (t) \\ =& \left( 1 - \theta t \right)^{-k_{1}} \cdots \left( 1 - \theta t \right)^{-k_{n}} \\ =& \left( 1 - \theta t \right)^{-\sum_{i=1}^{n} k_{i}} \end{align*}$$ Therefore, $$Y \sim \Gamma \left( \sum_{i=1}^{n} k_{i} , \theta \right)$$
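A similar simulation sketch for [3], assuming NumPy and SciPy; the shapes $k_{i}$ and the scale $\theta$ are arbitrary. NumPy's gamma sampler uses the same scale parameterization as the mgf above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
shapes, theta = [0.5, 2.0, 1.3], 2.5                          # arbitrary k_i and a common scale theta
samples = sum(rng.gamma(k, theta, size=100_000) for k in shapes)

# KS test against Gamma(sum(shapes)) with scale theta; a large p-value means no evidence of mismatch
print(stats.kstest(samples, stats.gamma(sum(shapes), scale=theta).cdf))
```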

[4]4

Relationship between the gamma distribution and the chi-squared distribution: $$\Gamma \left( { r \over 2 } , 2 \right) \iff \chi^{2} (r)$$

Let $\displaystyle Y := \sum_{i=1}^{n} X_{i}$. Since $X_{i} \sim \chi^{2}(r_{i})$ means $X_{i} \sim \Gamma \left( {{ r_{i} } \over { 2 }} , 2 \right)$, applying Theorem [3] with $k_{i} := {{ r_{i} } \over { 2 }}$ and $\theta := 2$ gives $$Y \sim \Gamma \left( \sum_{i=1}^{n} {{ r_{i} } \over { 2 }} , 2 \right)$$ By the relationship above, this is exactly $$Y \sim \chi^{2} \left( \sum_{i=1}^{n} r_{i} \right)$$
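The same kind of check for [4], with arbitrary degrees of freedom $r_{i}$ and assuming NumPy and SciPy.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
dfs = [3, 5, 2]                                               # arbitrary degrees of freedom r_i
samples = sum(rng.chisquare(r, size=100_000) for r in dfs)

# KS test against chi^2(sum(dfs))
print(stats.kstest(samples, stats.chi2(sum(dfs)).cdf))
```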

[5]5

Moment generating function of the normal distribution: $$m(t) = \exp \left( \mu t + {{ \sigma^{2} t^{2} } \over { 2 }} \right) \qquad , t \in \mathbb{R}$$

Let $\displaystyle Y := \sum_{i=1}^{n} a_{i} X_{i}$. Since $X_{1} , \cdots , X_{n}$ are mutually independent, $$\begin{align*} M_{Y} (t) =& M_{1} (a_{1} t) \cdots M_{n} (a_{n} t) \\ =& \prod_{i=1}^{n} \exp \left[ t a_{i} \mu_{i} + {{ t^{2} a_{i}^{2} \sigma_{i}^{2} } \over { 2 }} \right] \\ =& \exp \left[ t \sum_{i=1}^{n} a_{i} \mu_{i} + {{ t^{2} \sum_{i=1}^{n} a_{i}^{2} \sigma_{i}^{2} } \over { 2 }} \right] \end{align*}$$ Therefore, $$Y \sim N \left( \sum_{i=1}^{n} a_{i} \mu_{i} , \sum_{i=1}^{n} a_{i}^{2} \sigma_{i}^{2} \right)$$
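Finally, a simulation sketch of [5], assuming NumPy and SciPy; the coefficients $a_{i}$ and the parameters $\mu_{i}, \sigma_{i}$ are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
mus, sigmas, a = [1.0, -2.0], [0.5, 1.5], [3.0, -0.5]         # arbitrary mu_i, sigma_i, a_i
samples = sum(ai * rng.normal(mu, s, size=100_000)
              for ai, mu, s in zip(a, mus, sigmas))

mean = sum(ai * mu for ai, mu in zip(a, mus))                 # sum of a_i * mu_i
std = sum(ai**2 * s**2 for ai, s in zip(a, sigmas)) ** 0.5    # sqrt of sum of a_i^2 * sigma_i^2
print(stats.kstest(samples, stats.norm(mean, std).cdf))
```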

Note

Strictly speaking, there is no separate theorem for the "addition" of random variables; a sum is simply a special case of a linear combination of random variables. Naturally, a stronger condition such as iid makes it even easier to identify the resulting distribution.


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p145. ↩︎

  2. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p155. ↩︎

  3. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p163. ↩︎

  4. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p163. ↩︎

  5. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p176. ↩︎