Joint and Marginal Distributions Defined by Measure Theory
Definition
Assume that a probability space $(\Omega, \mathcal{F}, P)$ is given.
- Joint Distribution: If two random variables $X$ and $Y$ are defined on $(\Omega, \mathcal{F}, P)$, the distribution of the random vector $(X,Y) : \Omega \to \mathbb{R}^2$ is defined, for a Borel set $B \in \mathcal{B}(\mathbb{R}^2)$, by
$$ P_{(X,Y)}(B) := P\left( (X,Y) \in B \right) = \int_B f_{(X,Y)}(x,y) \, dm_2(x,y) $$
and if an $f_{(X,Y)}$ satisfying the second equality exists, $X$ and $Y$ are said to have a joint density.
- Marginal Distribution: For a Borel set $A \subset \mathbb{R}$, the following are referred to as the marginal distributions:
$$ P_X(A) := P_{(X,Y)}(A \times \mathbb{R}) \qquad P_Y(A) := P_{(X,Y)}(\mathbb{R} \times A) $$
- If you haven’t encountered measure theory yet, you can ignore the term “probability space.”
We now introduce the formula for the density of the sum of two random variables $X + Y$. Since sums of random variables lead directly to the concept of the average, this formula is of considerable importance.
For $X$ and $Y$ with a joint density, the marginal densities are obtained as follows.
$$ f_X(x) = \int_{\mathbb{R}} f_{(X,Y)}(x,y) \, dy \qquad f_Y(y) = \int_{\mathbb{R}} f_{(X,Y)}(x,y) \, dx $$
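As a quick numerical sanity check of marginalization, the following sketch integrates a joint density over one coordinate. The joint density $f(x,y) = x + y$ on the unit square is a made-up example (it integrates to $1$ there), and the grid size is arbitrary.

```python
# Sketch: recovering a marginal density from a joint density by numerical
# integration. The joint density f(x, y) = x + y on [0, 1]^2 is an assumed
# example, not from the source text.

def f_joint(x, y):
    return x + y  # supported on [0, 1] x [0, 1]

def marginal_x(x, n=10_000):
    """Approximate f_X(x) = ∫ f(x, y) dy by a midpoint Riemann sum over [0, 1]."""
    h = 1.0 / n
    return sum(f_joint(x, (k + 0.5) * h) for k in range(n)) * h

# Analytically, f_X(x) = ∫_0^1 (x + y) dy = x + 1/2.
print(marginal_x(0.3))  # ≈ 0.8
```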
If $X$ and $Y$ have a joint density $f_{X,Y}$,
$$ f_{X+Y}(z) = \int_{\mathbb{R}} f_{X,Y}(x, z-x) \, dx $$
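The formula above can be evaluated numerically for a concrete case. The sketch below assumes $X$ and $Y$ are independent $\mathrm{Uniform}(0,1)$ variables (an added assumption, under which the joint density factors as $f_{X,Y}(x,y) = f(x)f(y)$); the sum then has the triangular density on $[0,2]$.

```python
# Sketch: evaluating f_{X+Y}(z) = ∫ f_{X,Y}(x, z - x) dx numerically for
# independent Uniform(0, 1) variables (assumed example).

def f_unif(t):
    """Density of Uniform(0, 1)."""
    return 1.0 if 0.0 <= t <= 1.0 else 0.0

def f_sum(z, n=10_000):
    """Midpoint Riemann sum for ∫_0^1 f(x) f(z - x) dx."""
    h = 1.0 / n
    return sum(f_unif((k + 0.5) * h) * f_unif(z - (k + 0.5) * h)
               for k in range(n)) * h

# The triangular density is z on [0, 1] and 2 - z on [1, 2].
print(f_sum(0.5))  # ≈ 0.5
print(f_sum(1.5))  # ≈ 0.5
```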
Derivation
Substituting $y' = x + y$ and applying Fubini's theorem,
$$ \begin{align*} F_{X+Y}(z) &= P(X+Y \le z) \\ &= P_{X,Y}\left( \{ (x,y) : x + y \le z \} \right) \\ &= \iint_{\{(x,y) : x+y \le z\}} f_{X,Y}(x,y) \, dx \, dy \\ &= \int_{\mathbb{R}} \int_{-\infty}^{z-x} f_{X,Y}(x,y) \, dy \, dx \\ &= \int_{-\infty}^{z} \int_{\mathbb{R}} f_{X,Y}(x, y'-x) \, dx \, dy' \end{align*} $$
Differentiating both sides with respect to $z$ gives $f_{X+Y}(z) = \int_{\mathbb{R}} f_{X,Y}(x, z-x) \, dx$.
■
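The derivation can be checked with a Monte Carlo simulation. The sketch below again assumes $X$ and $Y$ are independent $\mathrm{Uniform}(0,1)$ (an added assumption): the empirical value of $P(X+Y \le z)$ should match the integral of the triangular density up to $z$.

```python
# Sketch: Monte Carlo sanity check of P(X + Y <= z) against the CDF obtained
# by integrating the convolution density. X, Y ~ Uniform(0, 1), independent
# (assumed example).
import random

random.seed(0)

z = 1.25
N = 200_000
empirical = sum((random.random() + random.random()) <= z for _ in range(N)) / N

# CDF of the triangular density: for 1 <= z <= 2, F(z) = 1 - (2 - z)^2 / 2.
analytic = 1 - (2 - z) ** 2 / 2  # = 0.71875 for z = 1.25

print(empirical, analytic)
```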