Summary of Sums of Random Variables Following Specific Distributions
Theorem
Let’s say the random variables $X_{1}, \cdots, X_{n}$ are mutually independent.

- [1] Binomial distribution: If $X_{i} \sim \text{Bin}(n_{i}, p)$, then
$$
\sum_{i=1}^{m} X_{i} \sim \text{Bin} \left( \sum_{i=1}^{m} n_{i}, p \right)
$$
- [2] Poisson distribution: If $X_{i} \sim \text{Poi}(m_{i})$, then
$$
\sum_{i=1}^{n} X_{i} \sim \text{Poi} \left( \sum_{i=1}^{n} m_{i} \right)
$$
- [3] Gamma distribution: If $X_{i} \sim \Gamma(k_{i}, \theta)$, then
$$
\sum_{i=1}^{n} X_{i} \sim \Gamma \left( \sum_{i=1}^{n} k_{i}, \theta \right)
$$
- [4] Chi-squared distribution: If $X_{i} \sim \chi^{2}(r_{i})$, then
$$
\sum_{i=1}^{n} X_{i} \sim \chi^{2} \left( \sum_{i=1}^{n} r_{i} \right)
$$
- [5] Normal distribution: If $X_{i} \sim N(\mu_{i}, \sigma_{i}^{2})$, then for a given vector $(a_{1}, \cdots, a_{n}) \in \mathbb{R}^{n}$,
$$
\sum_{i=1}^{n} a_{i} X_{i} \sim N \left( \sum_{i=1}^{n} a_{i} \mu_{i}, \sum_{i=1}^{n} a_{i}^{2} \sigma_{i}^{2} \right)
$$
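These closure properties are easy to sanity-check by simulation. Below is a minimal Python sketch for case [1], assuming numpy is available; the parameters $p = 0.3$, $n_{i} \in \{2, 5, 7\}$, and the seed are arbitrary choices, and the remaining cases can be checked the same way.

```python
# Monte Carlo sanity check for [1]: a sum of independent Bin(n_i, p) draws
# should have the mean and variance of Bin(sum n_i, p).
import numpy as np

rng = np.random.default_rng(seed=0)
p, ns = 0.3, [2, 5, 7]                        # X_i ~ Bin(n_i, 0.3), m = 3
samples = sum(rng.binomial(n, p, size=100_000) for n in ns)

n_tot = sum(ns)                               # theory: Bin(14, 0.3)
print(samples.mean(), n_tot * p)              # both ≈ 4.2
print(samples.var(), n_tot * p * (1 - p))     # both ≈ 2.94
```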
Proof
Strategy: Derive using the moment generating function. The condition that the variables are mutually independent is essential for applying the following theorem.
If $X_{1}, \cdots, X_{n}$ are mutually independent and each $X_{i}$ has moment generating function $M_{i}(t)$ for $-h_{i} < t < h_{i}$, then the moment generating function of their linear combination $\displaystyle T := \sum_{i=1}^{n} a_{i} X_{i}$ is
$$
M_{T}(t) = \prod_{i=1}^{n} M_{i}(a_{i} t), \qquad -\min_{i=1,\cdots,n} h_{i} < t < \min_{i=1,\cdots,n} h_{i}
$$
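As a concrete illustration (not a substitute for the proof), this product rule can be spot-checked numerically with sympy. The sketch below uses the Poisson MGF from case [2] with $a_{1} = a_{2} = 1$; note that `.equals` verifies the identity by evaluation rather than by a symbolic proof.

```python
# Spot check of the MGF product rule for two independent Poi(m_i) variables
# with a_1 = a_2 = 1: M_1(t) M_2(t) equals the MGF of Poi(m_1 + m_2).
import sympy as sp

t, m1, m2 = sp.symbols('t m1 m2', positive=True)
M = lambda m: sp.exp(m * (sp.exp(t) - 1))     # Poisson MGF exp[m(e^t - 1)]

# .equals() checks the identity by evaluating both sides
print((M(m1) * M(m2)).equals(M(m1 + m2)))     # True
```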
[1]
Moment generating function of the binomial distribution:
$$
m(t) = \left[ (1-p) + pe^{t} \right]^{n}, \qquad t \in \mathbb{R}
$$
Let’s denote $\displaystyle Y := \sum_{i=1}^{m} X_{i}$. Given that $X_{1}, \cdots, X_{m}$ are mutually independent,
$$
\begin{align*}
M_{Y}(t) &= M_{1}(t) \cdots M_{m}(t) \\
&= \left[ (1-p) + pe^{t} \right]^{n_{1}} \cdots \left[ (1-p) + pe^{t} \right]^{n_{m}} \\
&= \left[ (1-p) + pe^{t} \right]^{\sum_{i=1}^{m} n_{i}}
\end{align*}
$$
Therefore,
$$
Y \sim \text{Bin} \left( \sum_{i=1}^{m} n_{i}, p \right)
$$
■
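Because the binomial distribution is discrete, [1] can also be verified exactly, up to floating point, by convolving PMFs. A minimal sketch assuming scipy; the values $n_{1} = 4$, $n_{2} = 6$, $p = 0.25$ are arbitrary.

```python
# Exact check for [1] with m = 2: the convolution of the Bin(4, p) and
# Bin(6, p) pmfs coincides with the Bin(10, p) pmf.
import numpy as np
from scipy import stats

p, n1, n2 = 0.25, 4, 6
pmf1 = stats.binom.pmf(np.arange(n1 + 1), n1, p)
pmf2 = stats.binom.pmf(np.arange(n2 + 1), n2, p)

pmf_sum = np.convolve(pmf1, pmf2)             # pmf of X_1 + X_2
pmf_target = stats.binom.pmf(np.arange(n1 + n2 + 1), n1 + n2, p)
print(np.allclose(pmf_sum, pmf_target))       # True
```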
[2]
Moment generating function of the Poisson distribution:
$$
m(t) = \exp \left[ \lambda (e^{t} - 1) \right], \qquad t \in \mathbb{R}
$$
Let’s denote $\displaystyle Y := \sum_{i=1}^{n} X_{i}$. Given that $X_{1}, \cdots, X_{n}$ are mutually independent,
$$
\begin{align*}
M_{Y}(t) &= M_{1}(t) \cdots M_{n}(t) \\
&= \exp \left[ m_{1} (e^{t} - 1) \right] \cdots \exp \left[ m_{n} (e^{t} - 1) \right] \\
&= \exp \left[ \sum_{i=1}^{n} m_{i} (e^{t} - 1) \right]
\end{align*}
$$
Therefore,
$$
Y \sim \text{Poi} \left( \sum_{i=1}^{n} m_{i} \right)
$$
■
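An analogous empirical check, assuming numpy and scipy, with arbitrary rates $m_{i}$:

```python
# Empirical check for [2]: the histogram of a sum of independent Poissons
# matches the Poi(sum m_i) pmf.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
rates = [0.5, 1.2, 2.3]                        # X_i ~ Poi(m_i)
y = sum(rng.poisson(m, size=200_000) for m in rates)

ks = np.arange(15)
empirical = np.bincount(y, minlength=ks.size)[:ks.size] / y.size
theory = stats.poisson.pmf(ks, mu=sum(rates))  # Poi(4.0)
print(np.abs(empirical - theory).max())        # small, on the order of 1e-3
```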
[3]
Moment generating function of the gamma distribution:
$$
m(t) = (1 - \theta t)^{-k}, \qquad t < \frac{1}{\theta}
$$
Let’s denote $\displaystyle Y := \sum_{i=1}^{n} X_{i}$. Given that $X_{1}, \cdots, X_{n}$ are mutually independent,
$$
\begin{align*}
M_{Y}(t) &= M_{1}(t) \cdots M_{n}(t) \\
&= (1 - \theta t)^{-k_{1}} \cdots (1 - \theta t)^{-k_{n}} \\
&= (1 - \theta t)^{-\sum_{i=1}^{n} k_{i}}
\end{align*}
$$
Therefore,
$$
Y \sim \Gamma \left( \sum_{i=1}^{n} k_{i}, \theta \right)
$$
■
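For continuous distributions, a Kolmogorov–Smirnov test is a convenient check. A sketch assuming scipy; the shapes $k_{i}$ and scale $\theta$ below are arbitrary.

```python
# Monte Carlo check for [3]: a sum of independent Gamma(k_i, theta) draws
# is consistent with Gamma(sum k_i, theta) under a KS test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=2)
theta, shapes = 2.0, [0.5, 1.5, 3.0]          # X_i ~ Γ(k_i, θ = 2)
y = sum(rng.gamma(shape=k, scale=theta, size=50_000) for k in shapes)

# scipy's gamma uses shape a and scale, matching the (k, θ) parametrization
print(stats.kstest(y, stats.gamma(a=sum(shapes), scale=theta).cdf))
```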
[4]
Relationship between gamma distribution and chi-squared distribution:
$$
\Gamma \left( \frac{r}{2}, 2 \right) \iff \chi^{2}(r)
$$
With $\displaystyle Y := \sum_{i=1}^{n} X_{i}$, each $X_{i} \sim \chi^{2}(r_{i})$ can be written as $X_{i} \sim \Gamma \left( \dfrac{r_{i}}{2}, 2 \right)$, i.e. we set $k_{i} := \dfrac{r_{i}}{2}$ and $\theta := 2$. According to [3],
$$
Y \sim \Gamma \left( \sum_{i=1}^{n} \frac{r_{i}}{2}, 2 \right)
$$
which, by the relationship above, is precisely
$$
Y \sim \chi^{2} \left( \sum_{i=1}^{n} r_{i} \right)
$$
■
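The identification $\Gamma \left( \frac{r}{2}, 2 \right) \iff \chi^{2}(r)$ used above is also easy to confirm numerically; a sketch assuming scipy, with $r = 7$ chosen arbitrarily.

```python
# Numerical check of the Γ(r/2, 2) ⟺ χ²(r) identification used in [4]:
# the two densities agree pointwise.
import numpy as np
from scipy import stats

r = 7
x = np.linspace(0.1, 30.0, 200)
print(np.allclose(stats.gamma.pdf(x, a=r / 2, scale=2),
                  stats.chi2.pdf(x, df=r)))   # True
```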
[5]
Moment generating function of the normal distribution:
$$
m(t) = \exp \left( \mu t + \frac{\sigma^{2} t^{2}}{2} \right), \qquad t \in \mathbb{R}
$$
Let’s denote $\displaystyle Y := \sum_{i=1}^{n} a_{i} X_{i}$. Given that $X_{1}, \cdots, X_{n}$ are mutually independent, the theorem above applies with the arguments $a_{i} t$:
$$
\begin{align*}
M_{Y}(t) &= M_{1}(a_{1} t) \cdots M_{n}(a_{n} t) \\
&= \prod_{i=1}^{n} \exp \left[ t a_{i} \mu_{i} + \frac{t^{2} a_{i}^{2} \sigma_{i}^{2}}{2} \right] \\
&= \exp \left[ t \sum_{i=1}^{n} a_{i} \mu_{i} + \frac{t^{2} \sum_{i=1}^{n} a_{i}^{2} \sigma_{i}^{2}}{2} \right]
\end{align*}
$$
Therefore,
$$
Y \sim N \left( \sum_{i=1}^{n} a_{i} \mu_{i}, \sum_{i=1}^{n} a_{i}^{2} \sigma_{i}^{2} \right)
$$
■
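Finally, the linear combination case [5] can be checked the same way; a sketch assuming numpy and scipy, with all coefficients and parameters chosen arbitrarily.

```python
# Monte Carlo check for [5]: Y = sum a_i X_i with X_i ~ N(mu_i, sigma_i^2)
# is consistent with N(sum a_i mu_i, sum a_i^2 sigma_i^2) under a KS test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=3)
a = np.array([1.0, -2.0, 0.5])
mu = np.array([0.0, 1.0, -3.0])
sigma = np.array([1.0, 0.5, 2.0])

x = rng.normal(mu, sigma, size=(100_000, 3))  # each row is (X_1, X_2, X_3)
y = x @ a                                     # Y = a_1 X_1 + a_2 X_2 + a_3 X_3

m = float(a @ mu)                             # theoretical mean
s = float(np.sqrt(a**2 @ sigma**2))           # theoretical standard deviation
print(stats.kstest(y, stats.norm(loc=m, scale=s).cdf))
```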
Note
It’s important to note that, strictly speaking, there is no standalone theory for the "addition" of random variables: each statement above is simply a special case of a linear combination of random variables (with every $a_{i} = 1$ in [1]–[4]). Naturally, a stronger condition such as iid makes it even easier to identify the distribution of the sum.