Probability Theory
This category covers advanced topics in probability theory that typically require graduate-level background and draw on measure theory and topology. Intuitive probability theory that does not rely on these tools is filed under Mathematical Statistics. Difficulty is indicated by 🔥 marks: one mark means measure theory alone suffices, two or more call for deeper measure theory or topology, and extra marks are added for exceptionally involved proofs or derivations.
Mark | Difficulty Level
---|---
🔥 | Difficult
🔥🔥 | Insanely difficult
🔥🔥🔥 | Freaking insanely difficult
- Summary of Measure Theory and Probability Theory: Basic definitions and concepts are consolidated in this post.
$$ \begin{array}{lll} \text{Analysts’ Term} && \text{Probabilists’ Term} \\ \hline \text{Measure space } (X, \mathcal{E}, \mu) \text{ such that } \mu (X) = 1 && \text{Probability space } (\Omega, \mathcal{F}, P) \\ \text{Measure } \mu : \mathcal{E} \to \mathbb{R} \text{ such that } \mu (X) = 1 && \text{Probability } P : \mathcal{F} \to \mathbb{R} \\ (\sigma\text{-)algebra $\mathcal{E}$ on $X$} && (\sigma\text{-)field $\mathcal{F}$ on $\Omega$} \\ \text{Measurable set } E \in \mathcal{E} && \text{Event } E \in \mathcal{F} \\ \text{Measurable real-valued function } f : X \to \mathbb{R} && \text{Random variable } X : \Omega \to \mathbb{R} \\ \text{Integral of } f, {\displaystyle \int f d\mu} && \text{Expectation of } X, E(X) \\ f \text{ is } L^{p} && X \text{ has finite $p$th moment} \\ \text{Almost everywhere, a.e.} && \text{Almost surely, a.s.} \end{array} $$
Measure-theoretic Probability Theory
Rigorous Definitions
- Probability 🔥
- Random Variables and Probability Distributions 🔥
- Independence of Random Variables 🔥
- Density and Cumulative Distribution Functions of Random Variables 🔥
- Dirac Measure and Discrete Probability Distributions 🔥
- Expectation 🔥
- Characteristic Functions and Moment Generating Functions 🔥
- Joint and Marginal Distributions 🔥
Conditional Probability
- Conditional Expectation of Random Variables 🔥🔥
- Conditional Probability of Random Variables 🔥🔥
- Conditional Variance 🔥🔥
- Proof of Brook’s Lemma
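The smoothing (tower) property of conditional expectation, $E[E[X \mid Y]] = E[X]$, which underlies several of the posts above, can be checked numerically on a finite sample space. This is a minimal sketch; the joint pmf below is an illustrative assumption, not taken from any post.

```python
import numpy as np

# p[x, y]: an illustrative joint pmf of binary random variables X and Y
p = np.array([[0.1, 0.2],
              [0.3, 0.4]])
x = np.array([0, 1])

pY = p.sum(axis=0)                               # marginal pmf of Y
E_X_given_Y = (x[:, None] * p).sum(axis=0) / pY  # E[X | Y = y]
tower = (E_X_given_Y * pY).sum()                 # E[E[X | Y]]
E_X = (x[:, None] * p).sum()                     # E[X]
print(np.isclose(tower, E_X))  # True
```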
Stochastic Processes
- Convergence in Probability 🔥
- Lévy’s Theorem in Probability Theory 🔥🔥🔥
- Separating Class in Probability Theory 🔥🔥🔥
- Equality of Two Probability Measures 🔥🔥🔥
- Tight Probability Measures 🔥🔥
- Mixture Theorem in Probability Theory 🔥🔥🔥
- Convergence in Distribution 🔥🔥🔥
- Continuous Mapping Theorem 🔥🔥
- Lévy’s Continuity Theorem
Stochastic Process Theory
- Introduction to Stochastic Processes $X_{t}$
- Types of States in Stochastic Process Theory
- Transition Probabilities $p_{ij}$, $P(t)$
- Limit of Transition Probabilities $\pi_{j}$
- Kolmogorov Differential Equations $P’(t) = Q P(t)$
- Generalized Random Walk
- Gambler’s Ruin Problem
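The gambler’s ruin problem in the list above admits a quick numerical check. A minimal Monte Carlo sketch (function name and parameters are illustrative): for a fair game with stake $i$ out of a total $N$, the classical ruin probability is $1 - i/N$.

```python
import random

# Monte Carlo sketch of the gambler's ruin problem: a fair game starting
# with stake i out of a total of N, absorbing at 0 (ruin) and N (win).
def ruin_probability(i, N, trials=20000, seed=0):
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        x = i
        while 0 < x < N:
            x += 1 if rng.random() < 0.5 else -1
        ruined += (x == 0)
    return ruined / trials

est = ruin_probability(3, 10)  # theory predicts 1 - 3/10 = 0.7
print(est)
```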
Markov Chains
- Discrete Time Markov Chain DTMC
- Continuous Time Markov Chain CTMC
- Chapman-Kolmogorov Equation $P^{(n+m)} = P^{(n)} P^{(m)}$
- Hidden Markov Chain
- Poisson Process Defined through Exponential Distribution
- Poisson Process Defined through Infinitesimal Matrix
- Gillespie Stochastic Simulation Algorithm SSA
- Branching Process
- Galton-Watson Process
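The Chapman-Kolmogorov equation listed above is easy to verify numerically for a DTMC, since the $n$-step transition matrix is the matrix power $P^{(n)} = P^n$. The 3-state transition matrix below is an illustrative example, not taken from any post.

```python
import numpy as np

# Illustrative 3-state transition matrix (each row sums to 1)
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])

# Chapman-Kolmogorov: P^(n+m) = P^(n) P^(m)
n, m = 4, 3
lhs = np.linalg.matrix_power(P, n + m)
rhs = np.linalg.matrix_power(P, n) @ np.linalg.matrix_power(P, m)
print(np.allclose(lhs, rhs))  # True
```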
Brownian Motion
- Increments in Stochastic Process Theory
- Gaussian Process
- Wiener Process BM $W_{t}$, $B_{t}$
- Geometric Brownian Motion GBM
- Fractional Brownian Motion FBM
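The processes listed above can be simulated directly from their definitions. A minimal sketch, assuming a uniform grid on $[0, 1]$ (step count and parameters are illustrative): a Wiener process path is built from independent Gaussian increments with variance $dt$, and geometric Brownian motion follows from it in closed form.

```python
import numpy as np

# Wiener process W_t on [0, 1]: independent N(0, dt) increments, W_0 = 0
rng = np.random.default_rng(0)
n, T = 1000, 1.0
dt = T / n
increments = rng.normal(0.0, np.sqrt(dt), size=n)
W = np.concatenate(([0.0], np.cumsum(increments)))

# Geometric Brownian motion: S_t = S_0 exp((mu - sigma^2/2) t + sigma W_t)
mu, sigma, S0 = 0.1, 0.2, 1.0
t = np.linspace(0.0, T, n + 1)
S = S0 * np.exp((mu - sigma**2 / 2) * t + sigma * W)
print(W.shape, S[0])
```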
Martingales
- Definition of Martingales 🔥
- Stopping Times in Stochastic Process Theory 🔥
- Doob’s Maximal Inequality 🔥
- Upcrossings in Stochastic Process Theory 🔥
- Submartingale Convergence Theorem 🔥
- Regular and Closable Martingales 🔥🔥
- Uniformly Integrable Martingales 🔥🔥
- Convergence in L1 for Martingales 🔥🔥🔥
- Closable Martingales 🔥🔥🔥
Donsker’s Theorem
- Projection Mapping in Stochastic Process Theory 🔥🔥🔥
- Precompact Stochastic Processes 🔥🔥🔥
- Tight Stochastic Processes 🔥🔥
- Donsker’s Theorem 🔥
Stochastic Information Theory
Entropy
- Shannon Information: Information Defined through Probability 🔥
- Shannon Entropy: Entropy Defined for Random Variables 🔥
- Joint Entropy
- Conditional Entropy
- Cross Entropy 🔥
- Relative Entropy, Kullback-Leibler Divergence
- Gibbs’ Inequality
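For finite distributions, the quantities in this list reduce to simple sums. A minimal sketch (the distributions are illustrative): Shannon entropy in bits, and the Kullback-Leibler divergence, which Gibbs’ inequality guarantees is nonnegative.

```python
import math

# Shannon entropy H(p) in bits, for a finite pmf p
def entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# KL divergence D(p || q); Gibbs' inequality: D(p || q) >= 0,
# with equality iff p == q
def kl_divergence(p, q):
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
q = [1 / 3, 1 / 3, 1 / 3]
print(entropy(p))           # 1.5 bits
print(kl_divergence(p, q))  # nonnegative by Gibbs' inequality
```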
References
- Applebaum. (2008). Probability and Information (2nd Edition)
- Capinski. (1999). Measure, Integral and Probability
- Kimmel, Axelrod. (2006). Branching Processes in Biology
All posts
- Proof that if Two Events are Mutually Exclusive, They are Dependent
- If Two Events are Independent, Their Complements are Also Independent
- Probability Defined by Measure Theory
- Independence of Events and Conditional Probability
- What is a Stochastic Process?
- Discrete Markov Chains
- Derivation of the Chapman-Kolmogorov Equation
- Types of States in Stochastic Processes
- Limit of Transition Probabilities
- Generalized Random Walk
- The Gambler's Ruin Problem
- Hidden Markov Chain
- Definition of Poisson Process Through Exponential Distribution
- Increment of Stochastic Processes
- Wiener Process
- Random Variables and Probability Distributions Defined by Measure Theory
- Random Variables Defined by Measure Theory
- Density and Cumulative Distribution Functions of Random Variables Defined by Measure Theory
- Dirac Measure and Discrete Probability Distribution Defined by Measure Theory
- Expected Value Defined by Measure Theory
- Joint and Marginal Distributions Defined by Measure Theory
- Conditional Expectation of Random Variables Defined by Measure Theory
- Conditional Probability of Random Variables Defined by Measure Theory
- Properties of Conditional Expectation
- Proof of the Monotone Convergence Theorem for Conditional Cases
- Proof of the Dominated Convergence Theorem
- Properties of Conditional Probability
- Smoothing Properties of Conditional Expectation
- Conditional Variance Defined by Measure Theory
- Proof of Conditional Jensen's Inequality
- The Definition of Martingale
- Stopping Times in Stochastic Processes
- Properties of Stopping Times
- Proof of the Optional Sampling Theorem
- Inequalities of Martingales
- Proof of Doob's Inequality
- Upcrossings in Stochastic Processes
- Proof of the Submartingale Convergence Theorem
- Regular Martingales and Closable Martingales
- If It Is a Regular Martingale, It Is a Uniformly Integrable Martingale
- Convergence in Probability Defined by Measure Theory
- Uniformly Integrable Martingales are L1 Convergent Martingales
- If L1 Convergent, Then Martingale is Closable
- Proof of Lévy's Theorem in Probability Theory
- In Probability Theory: Separating Classes
- Conditions for Two Probability Measures to Coincide
- Tight Probability Measures
- Probability Measures Defined on Polish Spaces are Tight
- Projection Mapping in Stochastic Processes
- Proof of the Mixture Theorem in Probability Theory
- Convergence in Distribution Defined by Measure Theory
- Precompact Stochastic Process
- Tight Stochastic Processes
- Donsker's Theorem
- Proof of the Continuous Mapping Theorem
- Shannon Information: Information Defined by Probability Theory
- Shannon Entropy: Entropy Defined by Random Variables
- Joint Entropy
- Conditional Entropy
- Cross Entropy
- Relative Entropy, Kullback-Leibler Divergence
- Gibbs' Inequality
- Geometric Brownian Motion
- Gaussian Processes
- Self-similarity and the Hurst Index of Stochastic Processes
- Fractional Brownian Motion
- Branching Process
- Galton-Watson Process
- Transition Probabilities of Stochastic Processes
- Continuous-Time Markov Chain
- Kolmogorov Differential Equation Derivation
- Gillespie Stochastic Simulation Algorithm
- Definition of the Poisson Process through the Infinitesimal Matrix
- Summary of Measure Theory and Probability Theory
- Lévy's Continuity Theorem in Probability Theory
- Proof of Brook's Lemma
- Hellinger Distance of Probability Distributions