The sample mean $\bar{X}$ and sample variance $S^{2}$ are defined as the following random variables.
$$\bar{X} := \frac{1}{n} \sum_{k=1}^{n} X_{k} \qquad S^{2} := \frac{1}{n-1} \sum_{k=1}^{n} \left( X_{k} - \bar{X} \right)^{2}$$
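As a quick numerical illustration of these definitions (a sketch using NumPy; the data values below are arbitrary and not from the text), both estimators can be computed directly and checked against NumPy's built-ins:

```python
import numpy as np

# arbitrary illustrative sample (not from the text)
x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
n = len(x)

x_bar = x.sum() / n                      # sample mean: (1/n) * sum of X_k
s2 = ((x - x_bar) ** 2).sum() / (n - 1)  # sample variance with the 1/(n-1) factor

# ddof=1 gives the same unbiased 1/(n-1) estimator
assert np.isclose(x_bar, x.mean())
assert np.isclose(s2, x.var(ddof=1))
print(x_bar, s2)
```

Note the `ddof=1` argument: NumPy's default `var()` divides by $n$, not $n-1$, so it would not match the definition of $S^{2}$ above.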
Description
This result is used so routinely among statisticians that it is easy to forget it actually has a name: Student's theorem. Since the theorem is divided into four parts (a)–(d), it is often difficult to cite one part specifically.
(b) is a fact that seems obvious if you find it obvious and strange if you find it strange: although the sample mean and the sample variance are both computed from the same data, they are independent.
Inference about the population mean for small samples
(b)

Now, writing $\mathbf{X} := \left( X_{1} , \cdots , X_{n} \right)^{T}$, defining the random vector $Y := \left( X_{1} - \bar{X} , \cdots , X_{n} - \bar{X} \right)^{T}$, and letting $\mathbf{v} := \left( 1/n , \cdots , 1/n \right)^{T}$ so that $\mathbf{v}^{T} \mathbf{X} = \bar{X}$, the random vector $W$ can be represented as follows.
$$W = \begin{bmatrix} \bar{X} \\ Y \end{bmatrix} = \begin{bmatrix} \mathbf{v}^{T} \\ I - \mathbf{1} \mathbf{v}^{T} \end{bmatrix} \mathbf{X}$$
Since $W$ is a linear transformation of a random vector following a multivariate normal distribution, it still follows a multivariate normal distribution. Taking the expected value, its mean vector is
$$E W = \begin{bmatrix} \mathbf{v}^{T} \\ I - \mathbf{1} \mathbf{v}^{T} \end{bmatrix} \mu \mathbf{1} = \begin{bmatrix} \mu \\ \mathbf{0}_{n} \end{bmatrix}$$
and its covariance matrix $\Sigma$ is
$$\begin{align*} \Sigma =& \begin{bmatrix} \mathbf{v}^{T} \\ I - \mathbf{1} \mathbf{v}^{T} \end{bmatrix} \sigma^{2} I \begin{bmatrix} \mathbf{v}^{T} \\ I - \mathbf{1} \mathbf{v}^{T} \end{bmatrix}^{T} \\ =& \sigma^{2} \begin{bmatrix} 1/n & \mathbf{0}_{n}^{T} \\ \mathbf{0}_{n} & I - \mathbf{1} \mathbf{v}^{T} \end{bmatrix} \end{align*}$$
Since the off-diagonal blocks of $\Sigma$ are zero, $\bar{X}$ is independent of $Y$, and since
$$S^{2} = \frac{1}{n-1} \sum_{k=1}^{n} \left( X_{k} - \bar{X} \right)^{2} = \frac{1}{n-1} Y^{T} Y$$
is a function of $Y$ alone, $\bar{X} \perp S^{2}$ follows.
■
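The block structure of $\Sigma$ can also be checked numerically. The sketch below (assuming $\mathbf{v} = (1/n, \cdots, 1/n)^{T}$, as required for $\mathbf{v}^{T} \mathbf{X} = \bar{X}$; the values of $n$ and $\sigma^{2}$ are arbitrary) builds the transformation matrix and verifies that the off-diagonal blocks of $A \left( \sigma^{2} I \right) A^{T}$ vanish:

```python
import numpy as np

n, sigma2 = 5, 2.0
v = np.full((n, 1), 1.0 / n)   # v = (1/n, ..., 1/n)^T
ones = np.ones((n, 1))

# A stacks v^T (the row producing X-bar) on I - 1 v^T (the rows producing X_k - X-bar)
A = np.vstack([v.T, np.eye(n) - ones @ v.T])

Sigma = A @ (sigma2 * np.eye(n)) @ A.T  # covariance of W = A X

# top-left block is sigma^2 / n; off-diagonal blocks are zero
assert np.isclose(Sigma[0, 0], sigma2 / n)
assert np.allclose(Sigma[0, 1:], 0.0)
assert np.allclose(Sigma[1:, 0], 0.0)
# bottom-right block equals sigma^2 (I - 1 v^T), since I - 1 v^T is symmetric idempotent
assert np.allclose(Sigma[1:, 1:], sigma2 * (np.eye(n) - ones @ v.T))
print("block structure verified")
```

A zero covariance block only implies independence because $W$ is multivariate normal, which is exactly the point of the proof above.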
(c)
Let $\displaystyle V := \sum_{i=1}^{n} \left( \frac{X_{i} - \mu}{\sigma} \right)^{2}$. Since $\frac{X_{i} - \mu}{\sigma} \sim N(0,1)$, we have $V \sim \chi^{2}(n)$. Meanwhile,
$$\sum_{i=1}^{n} \left( \frac{X_{i} - \bar{X}}{\sigma} \right)^{2} = \frac{n-1}{\sigma^{2}} \sum_{i=1}^{n} \frac{ \left( X_{i} - \bar{X} \right)^{2} }{n-1} = \frac{(n-1) S^{2}}{\sigma^{2}}$$
Writing $X_{i} - \mu = \left( X_{i} - \bar{X} \right) + \left( \bar{X} - \mu \right)$ and noting that the cross term vanishes because $\sum_{i=1}^{n} \left( X_{i} - \bar{X} \right) = 0$, this summarizes to
$$V = \frac{(n-1) S^{2}}{\sigma^{2}} + \left( \frac{\bar{X} - \mu}{\sigma / \sqrt{n}} \right)^{2}$$
Here $V \sim \chi^{2}(n)$, and by (a) of Student's theorem
$$\frac{\bar{X} - \mu}{\sigma / \sqrt{n}} \sim N(0,1)$$
Since the square of a standard normal random variable follows a chi-squared distribution with one degree of freedom,
$$\left( \frac{\bar{X} - \mu}{\sigma / \sqrt{n}} \right)^{2} \sim \chi^{2}(1)$$
In (b) of Student's theorem it was shown that $\bar{X}$ and $S^{2}$ are independent, so the moment generating function of $V$ factors into the product of the moment generating functions of the two terms:
$$\left( 1 - 2t \right)^{-n/2} = E \left\{ \exp \left( \frac{(n-1) S^{2}}{\sigma^{2}} t \right) \right\} \left( 1 - 2t \right)^{-1/2}$$
Therefore the moment generating function of $\frac{(n-1) S^{2}}{\sigma^{2}}$ is $\left( 1 - 2t \right)^{-(n-1)/2}$, which is exactly the moment generating function of $\chi^{2}(n-1)$, so $\frac{(n-1) S^{2}}{\sigma^{2}} \sim \chi^{2}(n-1)$.

■
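A Monte Carlo sanity check of this conclusion (a sketch; the parameters, sample size, and seed below are arbitrary): for normal data, $(n-1) S^{2} / \sigma^{2}$ should have mean $n-1$ and variance $2(n-1)$, the moments of $\chi^{2}(n-1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 3.0, 2.0, 10, 200_000

# reps independent samples of size n from N(mu, sigma^2)
data = rng.normal(mu, sigma, size=(reps, n))
s2 = data.var(axis=1, ddof=1)    # sample variance of each row
stat = (n - 1) * s2 / sigma**2   # claimed to be chi^2(n-1)

# chi^2(n-1) = chi^2(9) has mean 9 and variance 18
print(stat.mean(), stat.var())   # close to 9 and 18
```

This only checks two moments, of course; the moment generating function argument above is what pins down the full distribution.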
(d)

Define
$$T := \frac{\bar{X} - \mu}{S / \sqrt{n}} = \frac{ \left( \bar{X} - \mu \right) / \left( \sigma / \sqrt{n} \right) }{ \sqrt{ (n-1) S^{2} / \left( \sigma^{2} (n-1) \right) } }$$
In (a) of Student's theorem it was shown that $\bar{X} \sim N \left( \mu , \frac{\sigma^{2}}{n} \right)$, and in (c) that $\frac{(n-1) S^{2}}{\sigma^{2}} \sim \chi^{2}(n-1)$. Since the numerator is standard normal and independent of the denominator by (b), $T$ is a standard normal divided by the square root of an independent chi-squared variable over its degrees of freedom, hence
$$T \sim t(n-1)$$
■
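Finally, a simulation sketch of (d) (again with arbitrary parameters and seed): the studentized mean $T = \left( \bar{X} - \mu \right) / \left( S / \sqrt{n} \right)$ should match $t(n-1)$, which has mean $0$ and variance $(n-1)/(n-3)$:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 5.0, 3.0, 8, 200_000

data = rng.normal(mu, sigma, size=(reps, n))
x_bar = data.mean(axis=1)
s = data.std(axis=1, ddof=1)

t_stat = (x_bar - mu) / (s / np.sqrt(n))  # studentized mean, one per sample

# t(n-1) = t(7): mean 0, variance (n-1)/(n-3) = 7/5 = 1.4
print(t_stat.mean(), t_stat.var())        # close to 0 and 1.4
```

Note that $\sigma$ appears only in the data-generating step, not in `t_stat`: the whole point of the $t$ statistic is that the unknown $\sigma$ cancels out of the ratio.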
Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p195. ↩︎