Density and Cumulative Distribution Functions of Random Variables Defined by Measure Theory
Definition
Let a probability space $(\Omega, \mathcal{F}, P)$ be given, and let $m$ be a measure.
- For an integrable $f \ge 0$, if the measure $P : \mathcal{F} \to \mathbb{R}$ has the form
$$ A \mapsto P(A) = \int_{A} f \, dm $$
then $P$ is said to be absolutely continuous (with respect to $m$). In particular, such an $f$ is called the density of $P$ with respect to the measure $m$.
- The function $F$ defined as follows is called the (cumulative) distribution function corresponding to the density $f$.
$$ F(y) := \int_{-\infty}^{y} f(x) \, dx $$
- The function $F_X$ defined as follows is called the (cumulative) distribution function of the random variable $X$. Here $P_X$ is the distribution of $X$, that is, $P_X(B) = P(X \in B)$.
$$ F_X(y) := P_X \left( (-\infty, y] \right) $$
- A function $f_X$ satisfying the following for all $y \in \mathbb{R}$ is called the density of the random variable $X$ (a concrete numerical sketch follows this list).
$$ F_X(y) = \int_{-\infty}^{y} f_X(x) \, dx $$
- If you’re not yet acquainted with measure theory, you can ignore the term “probability space.”
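To make these definitions concrete, here is a minimal numerical sketch. It assumes, purely for illustration, that $m$ is the Lebesgue measure on $\mathbb{R}$ and that $f$ is the standard exponential density $f(x) = e^{-x}$ for $x \ge 0$; integrating $f$ up to $y$ reproduces the known distribution function $F(y) = 1 - e^{-y}$.

```python
import numpy as np
from scipy import integrate

# Illustrative assumptions: m is Lebesgue measure on R and f is the standard
# exponential density, f(x) = e^{-x} for x >= 0 and f(x) = 0 otherwise.
def f(x):
    return np.exp(-x) if x >= 0 else 0.0

def F(y):
    """Distribution function F(y) = integral of f over (-infty, y]."""
    if y <= 0:
        return 0.0          # f vanishes on the negative half-line
    value, _ = integrate.quad(f, 0.0, y)
    return value

for y in [0.5, 1.0, 3.0]:
    # Closed form for the exponential distribution: F(y) = 1 - e^{-y}.
    print(f"F({y}) = {F(y):.6f}, closed form = {1 - np.exp(-y):.6f}")
```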
Explanation
Naturally, according to the definition of a probability measure, $\int_{\Omega} f \, dm = P(\Omega) = 1$.
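As a quick numerical sanity check of this normalization, assuming for illustration that $m$ is the Lebesgue measure and $f$ is the standard normal density:

```python
import numpy as np
from scipy import integrate

# Standard normal density; this particular choice of f is an illustrative assumption.
f = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

total, _ = integrate.quad(f, -np.inf, np.inf)
print(total)  # ~1.0, i.e. the integral of f over Omega equals P(Omega) = 1
```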
Absolute Continuity
When $P$ is given by $P(A) = \int_{A} f \, dm$, a little thought makes absolute continuity hardly surprising: adding a single point $a$ to a set $A$ changes $P$ by at most $P(\{a\}) = \int_{\{a\}} f \, dm$, which is zero whenever the single point is an $m$-null set, as it is for the Lebesgue measure. So the values $P(A \cup \{a\})$ and $P(A)$ cannot differ. This comes from the properties of the measure $m$, regardless of $P$ or $f$: even if $f$ is a discontinuous function, the $P$ defined this way behaves continuously in this sense.
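A small numerical illustration of this point, again assuming the Lebesgue measure and the standard normal density: a single point is an $m$-null set, so attaching it to $A$ leaves $P$ unchanged.

```python
import numpy as np
from scipy import integrate

# Illustrative assumptions: Lebesgue measure on R, standard normal density.
f = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

P_A, _ = integrate.quad(f, 0.0, 2.0)   # P(A) for A = [0, 2]
P_a, _ = integrate.quad(f, 5.0, 5.0)   # P({a}) for a = 5: a degenerate interval

print(P_a)               # 0.0 -- a single point carries no mass
print(P_A + P_a == P_A)  # True: P(A ∪ {a}) = P(A) for a outside A
```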
Theorem
Properties of the Distribution Function
The distribution function has the following properties:
- [1] Non-decreasing:
$$ y_1 \le y_2 \implies F_X(y_1) \le F_X(y_2) $$
- [2] Limits at the extremes:
$$ \lim_{y \to \infty} F_X(y) = 1, \qquad \lim_{y \to -\infty} F_X(y) = 0 $$
- [3] Right-continuous: for $y \ge y_0$,
$$ y \to y_0 \implies F_X(y) \to F_X(y_0) $$
- [4] From the definition of the density, one obtains another useful expression for the expected value of a nonnegative random variable $X \ge 0$ (a numerical check is sketched after this list):
$$ E(X) = \int_{0}^{\infty} P(X > t) \, dt $$
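Here is the numerical check referenced in [4]. It is a minimal sketch assuming $X$ follows the standard exponential distribution (so $E(X) = 1$), comparing $\int_{0}^{\infty} P(X > t) \, dt$ with the mean.

```python
import numpy as np
from scipy import integrate, stats

# Illustrative assumption: X ~ Exp(1), a nonnegative random variable with E(X) = 1.
X = stats.expon()

survival = lambda t: X.sf(t)                     # P(X > t) = 1 - F_X(t)
lhs, _ = integrate.quad(survival, 0.0, np.inf)   # integral of P(X > t) over [0, infinity)
rhs = X.mean()                                   # E(X)

print(lhs, rhs)  # both ~1.0
```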
Proof
[4]
Since the region of integration is $0 \le t \le y < \infty$, by Fubini's theorem the order of integration can be swapped:
$$
\begin{aligned}
\int_{0}^{\infty} P(X > t) \, dt
&= \int_{0}^{\infty} P_X \left( (t, \infty) \right) dt
\\ &= \int_{0}^{\infty} \left[ 1 - F_X(t) \right] dt
\\ &= \int_{0}^{\infty} \int_{t}^{\infty} f_X(y) \, dy \, dt
\\ &= \int_{0}^{\infty} \int_{0}^{y} f_X(y) \, dt \, dy
\\ &= \int_{0}^{\infty} \left( \int_{0}^{y} dt \right) f_X(y) \, dy
\\ &= \int_{0}^{\infty} y f_X(y) \, dy
\\ &= E(X)
\end{aligned}
$$
■