Density and Cumulative Distribution Functions of Random Variables Defined by Measure Theory 📂Probability Theory

Definition 1

A probability space $( \Omega , \mathcal{F} , P)$ is given, and let $m$ be a measure.

  1. For an integrable $f \ge 0$, if the measure $P : \mathcal{F} \to \mathbb{R}$ has the form $$ A \mapsto P(A) = \int_{A} f dm $$ then $P$ is said to be absolutely continuous. In particular, such an $f$ is called the density of $P$ with respect to the measure $m$.
  2. The function $F$ defined below is called the (cumulative) distribution function corresponding to the density $f$. $$ F(y) := \int_{-\infty}^{y} f(x) dx $$
  3. The function $F_{X}$ defined below is called the (cumulative) distribution function of the random variable $X$. $$ F_{X} (y) := P_{X} \left( ( -\infty , y ] \right) $$
  4. A function $f_{X}$ satisfying the following for all $y \in \mathbb{R}$ is called the density of the random variable $X$. $$ F_{X} (y) = \int_{-\infty}^{y} f_{X} (x) dx $$

  • If you’re not yet acquainted with measure theory, you can ignore the term “probability space.”
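To make Definitions 1-4 concrete, here is a minimal numerical sketch in Python, assuming $X$ follows an exponential distribution with rate $\lambda = 2$ (a hypothetical choice, not part of the definitions). It checks that the density integrates to $1$, as required of a probability, and that the closed-form distribution function agrees with $\int_{-\infty}^{y} f_{X}(x) dx$.

```python
# Minimal sketch of Definitions 1-4; the Exp(2) distribution here is an
# assumed example, not something fixed by the definitions themselves.
import numpy as np
from scipy.integrate import quad

lam = 2.0

def f(x):
    """Density f_X of an Exp(lam) variable with respect to Lebesgue measure."""
    return lam * np.exp(-lam * x) if x >= 0 else 0.0

def F(y):
    """Closed-form distribution function F_X(y) = P_X((-inf, y])."""
    return 1.0 - np.exp(-lam * y) if y >= 0 else 0.0

# P(Omega) = 1: the density vanishes below 0, so integrate over [0, inf).
total, _ = quad(f, 0, np.inf)
print(f"integral of f over R: {total:.6f}")  # ~ 1.000000

# Definition 4: F_X(y) should equal the integral of f_X up to y.
for y in [0.1, 0.5, 1.0, 3.0]:
    num, _ = quad(f, 0, y)
    print(f"y={y}: quadrature={num:.6f}, closed form={F(y):.6f}")
```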

Explanation

Naturally, according to the definition of probability, $\displaystyle \int_{\Omega} f dm = P ( \Omega ) = 1$ holds.

Absolute Continuity

When $P$ is given by $\displaystyle P(A) = \int_{A} f dm$, its absolute continuity may seem hardly surprising after a moment's thought: adding a point $a$ of $m$-measure zero to a set $A$ cannot change the value of the integral, so $P(A \cup \left\{ a \right\})$ and $P(A)$ never differ. This comes from the properties of the measure $m$ itself, regardless of $P$ or $f$. Even if $f$ is a discontinuous function, a $P$ of this form is still absolutely continuous.

Theorem

Properties of the Distribution Function 2

The distribution function has the following properties:

  • [1] Non-decreasing: $$y_{1} \le y_{2} \implies F_{X} (y_{1}) \le F_{X} ( y_{2} )$$
  • [2] Limits at the extremes: $$ \begin{align*} \lim_{y \to \infty} F_{X} (y) =& 1 \\ \lim_{y \to -\infty} F_{X} (y) =& 0 \end{align*} $$
  • [3] Right-continuous: For $y \ge y_{0}$ $$y \to y_{0} \implies F_{X} (y) \to F_{X} (y_{0} )$$
  • [4] With the definition of density, one obtains another useful expression for the expected value of a non-negative random variable $X$. $$ E(X) = \int_{0}^{\infty} P(X>t) dt $$
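As a quick sanity check on properties [1]-[3], the sketch below uses a hypothetical mixed distribution, an atom of mass $0.3$ at $0$ combined with an $\text{Exp}(2)$ part, chosen only because its jump at $0$ makes right-continuity visible.

```python
# Checking [1]-[3] for an assumed mixed distribution: X = 0 with
# probability 0.3, otherwise X ~ Exp(2), so F_X jumps at 0.
import numpy as np

def F(y):
    """Distribution function with an atom of mass 0.3 at 0."""
    return 0.0 if y < 0 else 0.3 + 0.7 * (1.0 - np.exp(-2.0 * y))

# [1] Non-decreasing on a grid.
ys = np.linspace(-5, 5, 1001)
vals = np.array([F(y) for y in ys])
print("non-decreasing:", bool(np.all(np.diff(vals) >= 0)))

# [2] Limits at the extremes.
print("F(-1e6) =", F(-1e6), " F(1e6) =", F(1e6))  # -> 0.0 and 1.0

# [3] Right-continuity at the jump y0 = 0: F(0 + eps) -> F(0) = 0.3,
# while the left limit is 0, so F is right- but not left-continuous there.
for eps in [1e-1, 1e-3, 1e-6]:
    print(f"eps={eps}: F(0+eps)={F(eps):.6f}, F(0-eps)={F(-eps):.6f}")
```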

Proof

[4]

Since $X \ge 0$, the density $f_{X}$ vanishes on $( -\infty , 0 )$, and because $f_{X} \ge 0$, Fubini's theorem allows swapping the order of integration over the region $0 \le t < y < \infty$: $$ \begin{align*} \int_{0}^{\infty} P(X>t) dt =& \int_{0}^{\infty} P_{X} \left( (t, \infty) \right) dt \\ =& \int_{0}^{\infty} \int_{t}^{\infty} f_{X} (y) dy dt \\ =& \int_{0}^{\infty} \int_{0}^{y} f_{X} (y) dt dy \\ =& \int_{0}^{\infty} \left( \int_{0}^{y} dt \right) f_{X} (y) dy \\ =& \int_{0}^{\infty} y f_{X} (y) dy \\ =& E(X) \end{align*} $$
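The identity in [4] can also be checked numerically; the sketch below again assumes $X \sim \text{Exp}(2)$, for which $E(X) = 1/2$, and compares the two ends of the chain of integrals above.

```python
# Numerical check of [4] under the assumption X ~ Exp(2), so E(X) = 0.5.
import numpy as np
from scipy.integrate import quad

lam = 2.0
survival = lambda t: np.exp(-lam * t)       # P(X > t) for t >= 0
density = lambda y: lam * np.exp(-lam * y)  # f_X(y) for y >= 0

lhs, _ = quad(survival, 0, np.inf)                  # int_0^inf P(X > t) dt
rhs, _ = quad(lambda y: y * density(y), 0, np.inf)  # int_0^inf y f_X(y) dy
print(f"int P(X>t) dt = {lhs:.6f}, int y f_X(y) dy = {rhs:.6f}")  # both ~ 0.5
```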


  1. Capinski. (1999). Measure, Integral and Probability: p106~109. ↩︎

  2. Capinski. (1999). Measure, Integral and Probability: p110. ↩︎