
Derivation of Beta Distribution from F-Distribution

Theorem 1

If a random variable $X \sim F \left( r_{1}, r_{2} \right)$ follows an F-distribution with degrees of freedom $r_{1} , r_{2}$, then the random variable $Y$ defined below follows a beta distribution $\text{Beta} \left( {{ r_{1} } \over { 2 }} , {{ r_{2} } \over { 2 }} \right)$. $$ Y := {{ \left( r_{1} / r_{2} \right) X } \over { 1 + \left( r_{1} / r_{2} \right) X }} \sim \text{Beta} \left( {{ r_{1} } \over { 2 }} , {{ r_{2} } \over { 2 }} \right) $$
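Before the proof, a quick numerical sanity check may help. The sketch below is a minimal illustration, assuming NumPy and SciPy are available; the parameter values `r1 = 5`, `r2 = 8` and the sample size are arbitrary choices for the example. It samples from $F(r_{1}, r_{2})$, applies the transformation defining $Y$, and compares the transformed sample against $\text{Beta}(r_{1}/2, r_{2}/2)$ with a Kolmogorov–Smirnov test; a large p-value is consistent with the theorem.

```python
import numpy as np
from scipy import stats

# Arbitrary degrees of freedom for this illustration
r1, r2 = 5, 8
rng = np.random.default_rng(0)

# Sample X ~ F(r1, r2)
x = stats.f.rvs(dfn=r1, dfd=r2, size=100_000, random_state=rng)

# Y := (r1/r2) X / (1 + (r1/r2) X)
y = (r1 / r2) * x / (1 + (r1 / r2) * x)

# If the theorem holds, Y should be indistinguishable from Beta(r1/2, r2/2)
print(stats.kstest(y, "beta", args=(r1 / 2, r2 / 2)))
```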

Proof

Strategy: Direct derivation using the probability density function.

Definition of F-distribution: A continuous probability distribution $F \left( r_{1} , r_{2} \right)$ with the following probability density function for degrees of freedom $r_{1}, r_{2} > 0$ is called an F-distribution. $$ f(x) = {{ 1 } \over { B \left( r_{1}/2 , r_{2} / 2 \right) }} \left( {{ r_{1} } \over { r_{2} }} \right)^{r_{1} / 2} x^{r_{1} / 2 - 1} \left( 1 + {{ r_{1} } \over { r_{2} }} x \right)^{-(r_{1} + r_{2}) / 2} \qquad , x \in (0, \infty) $$

Definition of Beta Distribution: A continuous probability distribution $\text{Beta}(\alpha,\beta)$ with the following probability density function for $\alpha , \beta > 0$ is called a Beta distribution. $$ f(x) = {{ 1 } \over { B(\alpha,\beta) }} x^{\alpha - 1} (1-x)^{\beta - 1} \qquad , x \in [0,1] $$
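As a side check of the two definitions, the sketch below is an assumption-laden illustration using SciPy; the parameters `r1`, `r2`, `a`, `b` and the evaluation points are arbitrary test values. It evaluates the two density formulas above directly and compares them with `scipy.stats.f.pdf` and `scipy.stats.beta.pdf`.

```python
import numpy as np
from scipy import stats
from scipy.special import beta as B  # the beta function B(a, b)

# Arbitrary test parameters and evaluation points
r1, r2, a, b = 5.0, 8.0, 2.5, 4.0
x = np.linspace(0.05, 0.95, 5)

# F(r1, r2) density formula from the definition above
f_pdf = (1 / B(r1 / 2, r2 / 2)) * (r1 / r2) ** (r1 / 2) * x ** (r1 / 2 - 1) \
        * (1 + (r1 / r2) * x) ** (-(r1 + r2) / 2)

# Beta(a, b) density formula from the definition above
beta_pdf = (1 / B(a, b)) * x ** (a - 1) * (1 - x) ** (b - 1)

print(np.allclose(f_pdf, stats.f.pdf(x, r1, r2)))      # expected: True
print(np.allclose(beta_pdf, stats.beta.pdf(x, a, b)))  # expected: True
```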


From the definition of $Y$,
$$ \begin{align*} & Y = {{ \left( r_{1} / r_{2} \right) X } \over { 1 + \left( r_{1} / r_{2} \right) X }} \\ \implies & Y \left( 1 + \left( r_{1} / r_{2} \right) X \right) = \left( r_{1} / r_{2} \right) X \\ \implies & Y = \left( r_{1} / r_{2} \right) X (1 - Y) \\ \implies & \left( r_{1} / r_{2} \right) X = {{ Y } \over { 1 - Y }} \end{align*} $$
and differentiating $y$ with respect to $x$ gives
$$ \begin{align*} dy =& \left[ {{ \left( r_{1} / r_{2} \right) } \over { 1 + \left( r_{1} / r_{2} \right) x }} - \left( r_{1} / r_{2} \right) {{ \left( r_{1} / r_{2} \right) x } \over { \left[ 1 + \left( r_{1} / r_{2} \right) x \right]^{2} }} \right] dx \\ =& {{ \left( r_{1} / r_{2} \right) } \over { 1 + \left( r_{1} / r_{2} \right) x }} \left[ {{ 1 + \left( r_{1} / r_{2} \right) x } \over { 1 + \left( r_{1} / r_{2} \right) x }} - {{ \left( r_{1} / r_{2} \right) x } \over { 1 + \left( r_{1} / r_{2} \right) x }} \right] dx \\ =& {{ \left( r_{1} / r_{2} \right) } \over { \left[ 1 + \left( r_{1} / r_{2} \right) x \right]^{2} }} dx \end{align*} $$
Therefore, by the change of variables formula $f_{Y}(y) = f_{X}(x) \dfrac{dx}{dy}$, the probability density function $f_{Y}$ of $Y$ satisfies
$$ \begin{align*} & B \left( r_{1}/2 , r_{2} / 2 \right) f_{Y} (y) \\ =& \left( {{ r_{1} } \over { r_{2} }} \right)^{r_{1} / 2} x^{r_{1} / 2 - 1} \left( 1 + {{ r_{1} } \over { r_{2} }} x \right)^{-(r_{1} + r_{2}) / 2} \cdot {{ \left[ 1 + \left( r_{1} / r_{2} \right) x \right]^{2} } \over { \left( r_{1} / r_{2} \right) }} \\ =& \left( {{ r_{1} } \over { r_{2} }} \right)^{r_{1} / 2 - 1} x^{r_{1} / 2 - 1} \left( 1 + {{ r_{1} } \over { r_{2} }} x \right)^{2-(r_{1} + r_{2}) / 2} \\ =& \left( {{ r_{1} } \over { r_{2} }} x \right)^{r_{1} / 2 - 1} \left( 1 + {{ r_{1} } \over { r_{2} }} x \right)^{2-(r_{1} + r_{2}) / 2} \\ =& y^{r_{1} / 2 - 1} \left( 1 + {{ r_{1} } \over { r_{2} }} x \right)^{r_{1} / 2 - 1} \left( 1 + {{ r_{1} } \over { r_{2} }} x \right)^{2-(r_{1} + r_{2}) / 2} \\ =& y^{r_{1} / 2 - 1} \left( 1 + {{ r_{1} } \over { r_{2} }} x \right)^{1 - r_{2} / 2} \\ =& y^{r_{1} / 2 - 1} \left( 1 + {{ y } \over { 1 - y }} \right)^{1 - r_{2} / 2} \\ =& y^{r_{1} / 2 - 1} \left( {{ 1 } \over { 1 - y }} \right)^{1 - r_{2} / 2} \\ =& y^{r_{1} / 2 - 1} \left( 1 - y \right)^{r_{2} / 2 - 1} \end{align*} $$
Summarizing, $Y$ has the following probability density function, which is exactly that of $\text{Beta} \left( {{ r_{1} } \over { 2 }} , {{ r_{2} } \over { 2 }} \right)$.
$$ f_{Y} (y) = {{ 1 } \over { B \left( r_{1}/2 , r_{2} / 2 \right) }} y^{r_{1} / 2 - 1} \left( 1 - y \right)^{r_{2} / 2 - 1} $$
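The change-of-variables algebra above can also be checked symbolically. The following SymPy sketch is a minimal illustration: the kernels drop the common constant $1/B(r_{1}/2, r_{2}/2)$, which cancels in the ratio, and the numeric parameter values at the end are arbitrary spot-check points. It substitutes $x = (r_{2}/r_{1})\, y/(1-y)$ into the F-density kernel, multiplies by $dx/dy$, and confirms that the result agrees with the $\text{Beta}(r_{1}/2, r_{2}/2)$ kernel.

```python
import sympy as sp

x, y, r1, r2 = sp.symbols('x y r1 r2', positive=True)

# F(r1, r2) density kernel (the 1/B(r1/2, r2/2) constant cancels in the ratio below)
kernel_X = (r1 / r2) ** (r1 / 2) * x ** (r1 / 2 - 1) * (1 + (r1 / r2) * x) ** (-(r1 + r2) / 2)

# Inverse transformation from the proof: (r1/r2) x = y / (1 - y)
x_of_y = (r2 / r1) * y / (1 - y)

# Change of variables: f_Y(y) = f_X(x(y)) * dx/dy  (dx/dy > 0 on (0, 1))
kernel_Y = kernel_X.subs(x, x_of_y) * sp.diff(x_of_y, y)

# Beta(r1/2, r2/2) kernel the proof arrives at
kernel_beta = y ** (r1 / 2 - 1) * (1 - y) ** (r2 / 2 - 1)

# Spot-check the ratio at a few arbitrary points; each value should print as 1.0
for vals in [{r1: 3, r2: 5, y: 0.2}, {r1: 10, r2: 4, y: 0.7}]:
    print((kernel_Y / kernel_beta).subs(vals).evalf())
```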


  1. Casella. (2001). Statistical Inference (2nd Edition): p225. ↩︎