Stochastic Increment and Decrement Functions and Confidence Intervals 📂Mathematical Statistics

Theorem 1

Definition of Stochastic Monotone Functions

If the cumulative distribution function $F \left( t ; \theta \right)$ is a monotone (increasing or decreasing) function in $\theta$, it is called a Stochastic Increasing (Decreasing) Function.

Pivoting a Continuous Cumulative Distribution Function

Suppose a statistic $T$ has a continuous cumulative distribution function $F_{T} \left( t ; \theta \right)$. For a fixed $\alpha \in (0,1)$, let $\alpha_{1} + \alpha_{2} = \alpha$, and for each $t$ in the support $\mathcal{T}$ of $T$, define $\theta_{L} (t)$ and $\theta_{U} (t)$ as follows.

  • (1): If $F_{T} \left( t ; \theta \right)$ is a stochastic decreasing function, $$ F_{T} \left( t ; \theta_{U}(t) \right) = \alpha_{1} \quad \& \quad F_{T} \left( t ; \theta_{L}(t) \right) = 1 - \alpha_{2} $$
  • (2): If $F_{T} \left( t ; \theta \right)$ is a stochastic increasing function, $$ F_{T} \left( t ; \theta_{U}(t) \right) = 1 - \alpha_{2} \quad \& \quad F_{T} \left( t ; \theta_{L}(t) \right) = \alpha_{1} $$

In this case, the random interval $\left[ \theta_{L} (t) , \theta_{U} (t) \right]$ is a $1 - \alpha$ confidence interval for $\theta$.
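The defining equations above can be solved numerically for any CDF that is monotone in $\theta$. As a minimal sketch (not from the text), the following uses bisection to invert a stochastically decreasing $F_{T}(t;\theta)$ in $\theta$, with the exponential CDF from the explanation below as the test case; the function name `pivot_interval` and the search bounds are illustrative assumptions.

```python
import math

def pivot_interval(cdf, t, alpha1, alpha2, lo=1e-8, hi=1e8, tol=1e-10):
    """For a CDF decreasing in theta (case (1) of the theorem), solve
    F(t; theta_U) = alpha1 and F(t; theta_L) = 1 - alpha2 by bisection."""
    def solve(target):
        a, b = lo, hi                     # invariant: cdf(t, a) > target > cdf(t, b)
        while b - a > tol * max(1.0, a):
            m = 0.5 * (a + b)
            if cdf(t, m) > target:        # value still above target: root is to the right
                a = m
            else:
                b = m
        return 0.5 * (a + b)
    theta_U = solve(alpha1)               # small CDF value corresponds to large theta
    theta_L = solve(1.0 - alpha2)         # large CDF value corresponds to small theta
    return theta_L, theta_U

# Exponential(theta) CDF: decreasing in theta for every fixed t > 0
F = lambda t, theta: 1.0 - math.exp(-t / theta)

theta_L, theta_U = pivot_interval(F, t=2.0, alpha1=0.05, alpha2=0.05)
```

With $\alpha_{1} = \alpha_{2} = 0.05$ this produces a $90\%$ interval; the same routine works for any continuous CDF decreasing in $\theta$, since bisection only uses monotonicity.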

Explanation

For example, if $T \sim \exp (\theta)$, i.e., follows an exponential distribution, its cumulative distribution function $F (t; \theta) = 1 - e^{-t / \theta}$ is a stochastic decreasing function for every $t$, since its value decreases as $\theta$ increases. This satisfies condition (1) of the theorem, making it easy to obtain the $1-\alpha$ confidence interval.
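For this exponential case the defining equations of case (1) can even be solved in closed form: $1 - e^{-t/\theta_{U}} = \alpha_{1}$ gives $\theta_{U} = t / \left( -\ln (1 - \alpha_{1}) \right)$, and $1 - e^{-t/\theta_{L}} = 1 - \alpha_{2}$ gives $\theta_{L} = t / \left( -\ln \alpha_{2} \right)$. A short sketch (the helper name `exp_pivot_ci` is an assumption, not from the text):

```python
import math

def exp_pivot_ci(t, alpha1, alpha2):
    """Closed-form pivoted interval for T ~ Exp(theta), case (1):
    F(t; theta_U) = alpha1      =>  theta_U = t / (-log(1 - alpha1))
    F(t; theta_L) = 1 - alpha2  =>  theta_L = t / (-log(alpha2))"""
    theta_L = t / (-math.log(alpha2))
    theta_U = t / (-math.log(1.0 - alpha1))
    return theta_L, theta_U

theta_L, theta_U = exp_pivot_ci(t=2.0, alpha1=0.05, alpha2=0.05)
```

Note how much wider the upper end is than the lower: a single exponential observation carries little information about large $\theta$.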

The term pivoting in the name of the theorem originates from the term pivot.

Proof

We only prove case (1). Although not completely identical, a similar theorem holds for discrete cumulative distribution functions2.


Suppose the $1-\alpha$ acceptance region is given as
$$ \left\{ t : \alpha_{1} \le F_{T} \left( t ; \theta_{0} \right) \le 1 - \alpha_{2} \right\} $$
Since $F_{T}$ is a stochastic decreasing function and $\alpha_{1} + \alpha_{2} = \alpha < 1$ implies $\alpha_{1} < 1 - \alpha_{2}$, we have $\theta_{L}(t) < \theta_{U}(t)$, and both values are uniquely determined. Moreover,
$$ \begin{align*} F_{T} \left( t ; \theta \right) < \alpha_{1} \iff& \theta > \theta_{U}(t) \\ F_{T} \left( t ; \theta \right) > 1 - \alpha_{2} \iff& \theta < \theta_{L}(t) \end{align*} $$
so inverting the acceptance region yields the following confidence set.
$$ \left\{ \theta : \alpha_{1} \le F_{T} \left( t ; \theta \right) \le 1 - \alpha_{2} \right\} = \left\{ \theta : \theta_{L}(t) \le \theta \le \theta_{U}(t) \right\} $$
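The coverage claim can be checked empirically: since $F_{T}(T;\theta) \sim \text{Uniform}(0,1)$ for a continuous CDF, the interval should cover the true $\theta$ with probability exactly $(1-\alpha_{2}) - \alpha_{1} = 1 - \alpha$. A Monte Carlo sketch for the exponential example (the true value $\theta = 3$ and the sample size are arbitrary assumptions):

```python
import math, random

random.seed(0)
theta_true = 3.0
alpha1 = alpha2 = 0.05                # alpha = 0.10, nominal coverage 0.90
n_reps, covered = 100_000, 0
for _ in range(n_reps):
    # draw T ~ Exp(theta_true) by inverse transform; 1 - U lies in (0, 1]
    t = -theta_true * math.log(1.0 - random.random())
    theta_L = t / (-math.log(alpha2))           # F(t; theta_L) = 1 - alpha2
    theta_U = t / (-math.log(1.0 - alpha1))     # F(t; theta_U) = alpha1
    covered += theta_L <= theta_true <= theta_U
coverage = covered / n_reps
print(coverage)   # close to 0.90
```

Because the CDF is continuous, the empirical coverage should match the nominal $1 - \alpha$ up to simulation noise, with no conservatism.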


  1. Casella. (2001). Statistical Inference (2nd Edition): p432. ↩︎

  2. Casella. (2001). Statistical Inference (2nd Edition): p434. ↩︎