
t-Distribution

Definition 1

[Figure: probability density functions of the t-distribution]

A continuous probability distribution $t(\nu)$, known as the t-distribution, is defined for degrees of freedom $\nu > 0$ as having the following probability density function: $$ f(x) = {{ \Gamma \left( {{ \nu + 1 } \over { 2 }} \right) } \over { \sqrt{\nu \pi} \Gamma \left( {{ \nu } \over { 2 }} \right) }} \left( 1 + {{ x^{2} } \over { \nu }} \right)^{- {{ \nu + 1 } \over { 2 }}} \qquad , x \in \mathbb{R} $$
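As a quick sanity check, the density above can be transcribed directly into code and compared against the `TDist` implementation from Distributions.jl, which this post uses later for plotting; the `SpecialFunctions` package is assumed to be installed for `gamma`:

```julia
using SpecialFunctions  # provides gamma; assumed installed
using Distributions     # provides TDist; also used in the plotting code below

# Density of t(ν), transcribed from the definition above
t_pdf(x, ν) = gamma((ν + 1) / 2) / (sqrt(ν * π) * gamma(ν / 2)) * (1 + x^2 / ν)^(-(ν + 1) / 2)

# Spot-check against Distributions.jl at ν = 5
for x in -3.0:0.5:3.0
    @assert isapprox(t_pdf(x, 5), pdf(TDist(5), x); atol = 1e-12)
end
```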


Description

The t-distribution is widely known for its discovery and publication by William S. Gosset, who worked at the Guinness brewery, still famous today for its beer. Bound by company policy at the time, he published under the pseudonym Student, which is why it is also called the Student's t-distribution. Statistics students first encounter it when working with small samples, that is, samples assumed to be drawn from a normal distribution but containing no more than about 30 observations. The t-distribution is considered close enough to the normal distribution when $\nu \ge 30$.

Meanwhile, the distribution when $\nu = 1$ is called the Cauchy distribution.
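Both claims are easy to verify numerically with Distributions.jl: at $\nu = 1$ the `TDist` density matches the `Cauchy` density exactly, and at $\nu = 30$ it is already very close to the standard normal density:

```julia
using Distributions

xs = -4:0.1:4

# ν = 1: the t-density coincides with the Cauchy density
@assert all(isapprox.(pdf.(TDist(1), xs), pdf.(Cauchy(), xs); atol = 1e-12))

# ν = 30: the largest pointwise gap to the standard normal density is already tiny
gap = maximum(abs.(pdf.(TDist(30), xs) .- pdf.(Normal(), xs)))
@assert gap < 0.01
```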

Basic Properties

Moment Generating Function

  • [1]: The moment generating function does not exist for the $t$-distribution.

Mean and Variance

  • [2]: If $X \sim t(\nu)$ then $$ \begin{align*} E(X) =& 0 & \qquad , \nu > 1 \\ \operatorname{Var}(X) =& {{ \nu } \over { \nu - 2 }} & \qquad , \nu > 2 \end{align*} $$
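With Distributions.jl these values can be read off directly; for example with $\nu = 5$:

```julia
using Distributions

ν = 5
@assert mean(TDist(ν)) == 0            # E(X) = 0, defined for ν > 1
@assert var(TDist(ν)) ≈ ν / (ν - 2)    # = 5/3, defined for ν > 2
```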

Theorem

Let us assume two random variables $W, V$ are independent with $W \sim N(0,1)$ and $V \sim \chi^{2}(r)$.

$k$th Moment

  • [a]: If $k < r$, then the $k$th moment of $\displaystyle T := { {W} \over {\sqrt{V/r} } }$ exists and $$ E T^{k} = E W^{k} {{ 2^{-k/2} \Gamma \left( {{ r } \over { 2 }} - {{ k } \over { 2 }} \right) } \over { \Gamma \left( {{ r } \over { 2 }} \right) r^{-k/2} }} $$
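For $k = 2$ and $W \sim N(0,1)$ we have $E W^{2} = 1$, so the formula should reduce to the variance $r/(r-2)$ from [2]; a quick numerical check (`SpecialFunctions` assumed installed for `gamma`):

```julia
using Distributions, SpecialFunctions

r, k = 7, 2
# E T^k from the formula, with E W² = 1
ETk = 2.0^(-k / 2) * gamma(r / 2 - k / 2) / (gamma(r / 2) * r^(-k / 2))
@assert ETk ≈ r / (r - 2)        # = 7/5, i.e. Var(X) for X ~ t(r)
@assert ETk ≈ var(TDist(r))
```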

Derived from Standard Normal and Chi-square Distributions

  • [b]: $\displaystyle { {W} \over {\sqrt{V/r} } } \sim t(r)$
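A Monte Carlo sketch of this construction (the sample size and seed are arbitrary choices for illustration):

```julia
using Distributions, Random, Statistics

Random.seed!(1)
r, n = 5, 10^5
W = rand(Normal(), n)       # W ~ N(0,1)
V = rand(Chisq(r), n)       # V ~ χ²(r), independent of W
T = W ./ sqrt.(V ./ r)      # should be distributed as t(r)

# The empirical cdf of T should track the cdf of t(r)
for q in (-2.0, 0.0, 2.0)
    @assert abs(mean(T .< q) - cdf(TDist(r), q)) < 0.01
end
```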

Deriving Standard Normal Distribution as the Limiting Distribution of Student’s t-Distribution

  • [c]: If $T_n \sim t(n)$ then $$ T_n \ \overset{D}{\to} N(0,1) $$

Deriving the F-Distribution

  • [d]: If a random variable $X \sim t(\nu)$ follows a t-distribution with degrees of freedom $\nu > 0$, then $Y := X^{2}$ follows the F-distribution $F(1, \nu)$: $$ Y := X^{2} \sim F (1,\nu) $$
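Equivalently, $P(X^{2} \le q) = P(-\sqrt{q} \le X \le \sqrt{q})$, which gives a direct numerical check against `FDist`:

```julia
using Distributions

ν, q = 4, 2.5
# P(X² ≤ q) for X ~ t(ν), computed from the t-distribution cdf
lhs = cdf(TDist(ν), sqrt(q)) - cdf(TDist(ν), -sqrt(q))
@assert lhs ≈ cdf(FDist(1, ν), q)    # matches the F(1, ν) cdf
```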

  • $N \left( \mu , \sigma^{2} \right)$ is the normal distribution with mean $\mu$ and variance $\sigma^{2}$.
  • $\chi^{2} \left( r \right)$ is the chi-square distribution with degrees of freedom $r$.

Proof

[1]

The existence of the moment generating function for a random variable implies the existence of the $k$th moment for every $k \in \mathbb{N}$. However, as theorem [a] states, the $k$th moment of the t-distribution exists only when $k < r$, thus the moment generating function cannot exist.

[2]

Apply the moment formula [a] with $k = 1$ and $k = 2$.

[a]

Chi-square distribution moments: Let $X \sim \chi^{2}(r)$. If $k > -r/2$, then the $k$th moment exists and $$ E X^{k} = {{ 2^{k} \Gamma (r/2 + k) } \over { \Gamma (r/2) }} $$

Multiplying both sides of $k < r$ by $-1/2$ gives $-k/2 > -r/2$, so the chi-square moment formula applies with exponent $-k/2$. Since $W$ and $V$ are independent, $$ \begin{align*} E T^{k} =& E \left[ W^{k} \left( {{ V } \over { r }} \right)^{-k/2} \right] \\ =& E W^{k} E \left( {{ V } \over { r }} \right)^{-k/2} \\ =& E W^{k} {{ 2^{-k/2} \Gamma \left( {{ r } \over { 2 }} - {{ k } \over { 2 }} \right) } \over { \Gamma \left( {{ r } \over { 2 }} \right) r^{-k/2} }} \end{align*} $$
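The quoted chi-square moment formula can be checked against the known moments $E V = r$ and $E V^{2} = r(r+2)$, and it also accepts the negative exponent $-k/2$ used in the derivation:

```julia
using Distributions, SpecialFunctions

r = 6
chisq_moment(k) = 2.0^k * gamma(r / 2 + k) / gamma(r / 2)   # valid for k > -r/2

@assert chisq_moment(1) ≈ mean(Chisq(r))              # E V  = r
@assert chisq_moment(2) ≈ var(Chisq(r)) + r^2         # E V² = r(r + 2)
@assert isfinite(chisq_moment(-0.5))                  # negative exponents also allowed
```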

[b]

Derived directly from the joint density function.

[c]

Using the Stirling approximation on the probability density function.

[d]

Follows by expressing $X^{2}$ as a ratio of chi-square variables, since $W^{2} \sim \chi^{2}(1)$.

Code

Below is Julia code displaying the probability density functions of the Cauchy distribution, the t-distribution, and the standard normal distribution.

@time using LaTeXStrings
@time using Distributions
@time using Plots

cd(@__DIR__)

# Overlay the densities of the Cauchy (= t(1)), t(3), t(30), and standard normal
x = -4:0.1:4
plot(x, pdf.(Cauchy(), x),
 color = :red,
 label = "Cauchy", size = (400,300))
plot!(x, pdf.(TDist(3), x),
 color = :orange,
 label = "t(3)", size = (400,300))
plot!(x, pdf.(TDist(30), x),
 color = :black, linestyle = :dash,
 label = "t(30)", size = (400,300))
plot!(x, pdf.(Normal(), x),
 color = :black,
 label = "Standard Normal", size = (400,300))

xlims!(-4,5); ylims!(0,0.5); title!(L"\mathrm{pdf\,of\, t}(\nu)")
png("pdf")

  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p191. ↩︎