Stochastically Increasing and Decreasing Functions and Confidence Intervals

Theorem 1

Definition of Stochastically Monotone Functions

If the cumulative distribution function $F(t; \theta)$ is a monotone (increasing or decreasing) function of $\theta$, it is called a stochastically increasing (decreasing) function.

Pivoting a Continuous Cumulative Distribution Function

Let a statistic $T$ have a continuous cumulative distribution function $F_{T}(t; \theta)$. For a fixed $\alpha \in (0,1)$, let $\alpha_{1} + \alpha_{2} = \alpha$, and for every $t \in \mathcal{T}$ in the support $\mathcal{T}$ of $T$, define $\theta_{L}(t)$ and $\theta_{U}(t)$ as follows:

  • (1): If $F_{T}(t; \theta)$ is a stochastically decreasing function of $\theta$,
    $$ F_{T} \left( t ; \theta_{U}(t) \right) = \alpha_{1} \quad \& \quad F_{T} \left( t ; \theta_{L}(t) \right) = 1 - \alpha_{2} $$
  • (2): If $F_{T}(t; \theta)$ is a stochastically increasing function of $\theta$,
    $$ F_{T} \left( t ; \theta_{U}(t) \right) = 1 - \alpha_{2} \quad \& \quad F_{T} \left( t ; \theta_{L}(t) \right) = \alpha_{1} $$

In this case, the random interval $\left[ \theta_{L}(T), \theta_{U}(T) \right]$ is a $1-\alpha$ confidence interval for $\theta$.

Explanation

For example, if $T \sim \exp(\theta)$, that is, $T$ follows an exponential distribution, its cumulative distribution function $F(t; \theta) = 1 - e^{-t/\theta}$ is a stochastically decreasing function: for every fixed $t$, its value decreases as $\theta$ increases. This satisfies condition (1) of the theorem, so the $1-\alpha$ confidence interval is easy to obtain, as the derivation below shows.
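Concretely, if we choose the common split $\alpha_{1} = \alpha_{2} = \alpha/2$ (one option among many satisfying $\alpha_{1} + \alpha_{2} = \alpha$), the two equations in (1) can be solved in closed form:
$$
\begin{align*}
1 - e^{-t/\theta_{U}(t)} = \frac{\alpha}{2} \implies& \theta_{U}(t) = \frac{t}{-\ln \left( 1 - \alpha/2 \right)}
\\ 1 - e^{-t/\theta_{L}(t)} = 1 - \frac{\alpha}{2} \implies& \theta_{L}(t) = \frac{t}{\ln \left( 2/\alpha \right)}
\end{align*}
$$
so the observed interval is $\left[ t / \ln(2/\alpha), \; t / \left( -\ln(1 - \alpha/2) \right) \right]$.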

The term pivoting in the name of the theorem originates from the term pivot.
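In practice, $F_{T}(t;\theta)$ may not be invertible by hand, in which case the two defining equations can be solved with a numerical root finder. The following is a minimal sketch, assuming SciPy is available and that the bracket $[\texttt{lo}, \texttt{hi}]$ contains both solutions; the helper name `pivot_interval` and the exponential check at the end are illustrative choices, not part of the theorem.

```python
from scipy.optimize import brentq
from scipy.stats import expon

def pivot_interval(cdf, t, alpha1, alpha2, lo=1e-8, hi=1e8):
    """Invert a CDF that is stochastically decreasing in theta (case (1)).

    Solves F_T(t; theta_U) = alpha1 and F_T(t; theta_L) = 1 - alpha2
    by root finding; the bracket [lo, hi] is assumed to contain both roots.
    """
    theta_U = brentq(lambda th: cdf(t, th) - alpha1, lo, hi)
    theta_L = brentq(lambda th: cdf(t, th) - (1 - alpha2), lo, hi)
    return theta_L, theta_U

# Exponential example: F(t; theta) = 1 - exp(-t / theta) is decreasing in theta.
exp_cdf = lambda t, theta: expon.cdf(t, scale=theta)
print(pivot_interval(exp_cdf, t=2.0, alpha1=0.025, alpha2=0.025))
# Expected output: roughly (0.542, 79.0), matching the closed form above.
```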

Proof

We prove only case (1). Although not completely identical, a similar theorem holds for discrete cumulative distribution functions2.


For each fixed $\theta_{0}$, consider the acceptance region
$$ \left\{ t : \alpha_{1} \le F_{T} \left( t ; \theta_{0} \right) \le 1 - \alpha_{2} \right\} $$
Since $F_{T}$ is continuous, $F_{T}(T;\theta_{0})$ is uniformly distributed on $(0,1)$ under $\theta_{0}$, so this region has probability $(1 - \alpha_{2}) - \alpha_{1} = 1 - \alpha$; it is therefore the acceptance region of a level-$\alpha$ test. Since $F_{T}$ is a stochastically decreasing function and $\alpha < 1$ implies $1 - \alpha_{2} > \alpha_{1}$, we have $\theta_{L}(t) < \theta_{U}(t)$, and both values are uniquely determined. Moreover,
$$ \begin{align*} F_{T} \left( t ; \theta \right) < \alpha_{1} \iff& \theta > \theta_{U}(t) \\ F_{T} \left( t ; \theta \right) > 1 - \alpha_{2} \iff& \theta < \theta_{L}(t) \end{align*} $$
so inverting the test gives, for each observed $t$, the confidence set
$$ \left\{ \theta : \alpha_{1} \le F_{T} \left( t ; \theta \right) \le 1 - \alpha_{2} \right\} = \left\{ \theta : \theta_{L}(t) \le \theta \le \theta_{U}(t) \right\} $$
which therefore has confidence coefficient $1 - \alpha$.
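As a sanity check rather than part of the proof, the coverage of the closed-form exponential interval derived above can be verified by simulation; the script below assumes NumPy and the $\alpha_{1} = \alpha_{2} = \alpha/2$ split.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true, alpha, n_rep = 3.0, 0.05, 100_000

# Draw T ~ exp(theta_true) and pivot each draw into an interval for theta.
t = rng.exponential(scale=theta_true, size=n_rep)
theta_L = t / np.log(2 / alpha)
theta_U = t / (-np.log(1 - alpha / 2))

# Empirical coverage should be close to 1 - alpha = 0.95.
coverage = np.mean((theta_L <= theta_true) & (theta_true <= theta_U))
print(coverage)
```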


  1. Casella. (2001). Statistical Inference (2nd Edition): p432. ↩︎

  2. Casella. (2001). Statistical Inference (2nd Edition): p434. ↩︎