Derivation of the Beta Distribution from Two Independent Gamma Distributions
Theorem
If two random variables $X_1, X_2$ are independent and $X_1 \sim \Gamma(\alpha_1, 1)$, $X_2 \sim \Gamma(\alpha_2, 1)$, then
$$
\frac{X_1}{X_1 + X_2} \sim \text{Beta}(\alpha_1, \alpha_2)
$$
Explanation
If two independent data points each follow a gamma distribution, the ratio of one of them to their sum can be described within probability distribution theory. In particular, the gamma distribution connects relatively freely to various other probability distributions, so this is a useful fact to know.
Definition of the beta distribution: For $\alpha, \beta > 0$, the continuous probability distribution $\text{Beta}(\alpha, \beta)$ with the following probability density function is called the beta distribution.
$$
f(x) = \frac{1}{B(\alpha, \beta)} x^{\alpha - 1} (1 - x)^{\beta - 1}, \qquad x \in [0, 1]
$$
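As a quick sanity check of this definition, the sketch below (my own illustration; the parameter values $\alpha = 2$, $\beta = 3$ are arbitrary) evaluates $B(\alpha, \beta)$ via the identity $B(\alpha, \beta) = \Gamma(\alpha)\Gamma(\beta)/\Gamma(\alpha + \beta)$ and confirms numerically that the density integrates to $1$ over $[0, 1]$:

```python
from math import gamma

def beta_fn(a, b):
    """Beta function via the gamma-function identity B(a, b) = Γ(a)Γ(b)/Γ(a+b)."""
    return gamma(a) * gamma(b) / gamma(a + b)

def beta_pdf(x, a, b):
    """Density of Beta(a, b) on [0, 1]."""
    return x ** (a - 1) * (1 - x) ** (b - 1) / beta_fn(a, b)

# Midpoint-rule check that the density integrates to 1,
# for the (arbitrary) sample choice a = 2, b = 3.
a, b = 2.0, 3.0
n = 100_000
total = sum(beta_pdf((i + 0.5) / n, a, b) for i in range(n)) / n
print(total)  # should be very close to 1.0
```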
Derivation
Since $X_1, X_2$ are independent, the joint density function $h$ for $x_1, x_2 \in (0, \infty)$ is as follows.
$$
h(x_1, x_2) = \frac{1}{\Gamma(\alpha_1)\Gamma(\alpha_2)} x_1^{\alpha_1 - 1} x_2^{\alpha_2 - 1} e^{-x_1 - x_2}
$$
Now, if we let $Y_1 := X_1 + X_2$ and $Y_2 := X_1 / (X_1 + X_2)$, then $x_1 = y_1 y_2$ and $x_2 = y_1 (1 - y_2)$, so the Jacobian is
$$
J = \begin{vmatrix} y_2 & y_1 \\ 1 - y_2 & -y_1 \end{vmatrix} = -y_1 y_2 - y_1 (1 - y_2) = -y_1 \neq 0
$$
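The determinant above can be cross-checked numerically. The sketch below (my own illustration; the test point $(y_1, y_2) = (2.5,\, 0.3)$ is arbitrary) approximates each partial derivative of the transform $(y_1, y_2) \mapsto (y_1 y_2,\; y_1(1 - y_2))$ by central differences and confirms the determinant equals $-y_1$:

```python
def transform(y1, y2):
    """The change of variables (y1, y2) -> (x1, x2) used in the derivation."""
    return y1 * y2, y1 * (1 - y2)

def jacobian_det(y1, y2, h=1e-6):
    """Jacobian determinant of `transform`, via central finite differences."""
    x1_p, x2_p = transform(y1 + h, y2)
    x1_m, x2_m = transform(y1 - h, y2)
    dx1_dy1, dx2_dy1 = (x1_p - x1_m) / (2 * h), (x2_p - x2_m) / (2 * h)
    x1_p, x2_p = transform(y1, y2 + h)
    x1_m, x2_m = transform(y1, y2 - h)
    dx1_dy2, dx2_dy2 = (x1_p - x1_m) / (2 * h), (x2_p - x2_m) / (2 * h)
    return dx1_dy1 * dx2_dy2 - dx1_dy2 * dx2_dy1

y1, y2 = 2.5, 0.3  # arbitrary test point with y1 > 0, y2 in (0, 1)
print(jacobian_det(y1, y2))  # approximately -y1 = -2.5
```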
Therefore, the joint density function $g$ of $Y_1, Y_2$ for $y_1 \in (0, \infty)$, $y_2 \in (0, 1)$ is
$$
\begin{aligned}
g(y_1, y_2) &= |{-y_1}| \frac{1}{\Gamma(\alpha_1)\Gamma(\alpha_2)} (y_1 y_2)^{\alpha_1 - 1} \left[ y_1 (1 - y_2) \right]^{\alpha_2 - 1} e^{-y_1}
\\ &= \frac{1}{\Gamma(\alpha_1)\Gamma(\alpha_2)} y_1^{\alpha_1 + \alpha_2 - 1} e^{-y_1} \cdot y_2^{\alpha_1 - 1} (1 - y_2)^{\alpha_2 - 1}
\\ &= \frac{1}{\Gamma(\alpha_1 + \alpha_2)} y_1^{\alpha_1 + \alpha_2 - 1} e^{-y_1} \cdot \frac{\Gamma(\alpha_1 + \alpha_2)}{\Gamma(\alpha_1)\Gamma(\alpha_2)} y_2^{\alpha_1 - 1} (1 - y_2)^{\alpha_2 - 1}
\end{aligned}
$$
The marginal density functions $g_1, g_2$ of $Y_1, Y_2$ are
$$
g_1(y_1) = \frac{1}{\Gamma(\alpha_1 + \alpha_2)} y_1^{\alpha_1 + \alpha_2 - 1} e^{-y_1}
$$
$$
g_2(y_2) = \frac{\Gamma(\alpha_1 + \alpha_2)}{\Gamma(\alpha_1)\Gamma(\alpha_2)} y_2^{\alpha_1 - 1} (1 - y_2)^{\alpha_2 - 1}
$$
Thus,
$$
Y_1 \sim \Gamma(\alpha_1 + \alpha_2, 1), \qquad Y_2 \sim \text{Beta}(\alpha_1, \alpha_2)
$$
■
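Both conclusions can be illustrated with a quick Monte Carlo simulation. The sketch below (my own illustration; the shape parameters $\alpha_1 = 2$, $\alpha_2 = 3$ and the sample size are arbitrary) draws independent gamma variates with Python's standard-library `random.gammavariate` and checks the sample means of $X_1/(X_1 + X_2)$ and $X_1 + X_2$ against the theoretical means $\alpha_1/(\alpha_1 + \alpha_2)$ and $\alpha_1 + \alpha_2$:

```python
import random

random.seed(42)
a1, a2 = 2.0, 3.0
n = 200_000

# Draw independent X1 ~ Γ(a1, 1) and X2 ~ Γ(a2, 1).
# random.gammavariate takes (shape, scale); scale is 1 here.
ratios, sums = [], []
for _ in range(n):
    x1 = random.gammavariate(a1, 1.0)
    x2 = random.gammavariate(a2, 1.0)
    ratios.append(x1 / (x1 + x2))
    sums.append(x1 + x2)

mean_ratio = sum(ratios) / n
mean_sum = sum(sums) / n
# Beta(a1, a2) has mean a1/(a1+a2) = 0.4; Γ(a1+a2, 1) has mean a1+a2 = 5.
print(mean_ratio, mean_sum)  # close to 0.4 and 5.0 respectively
```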
Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p164-165. ↩︎