[5]: For all $\lambda > 0$,
$$ P\left( \max_{n \le N} |X_n| \ge \lambda \right) \le \frac{1}{\lambda^{p}} E |X_N|^{p} \quad \text{a.s.} $$
Explanation
Each of the inequalities introduced here can be turned into a probability bound by dividing both sides by $\lambda$. Notably, since a martingale is both a supermartingale and a submartingale, all of the above inequalities apply to martingales.
[3]: From the Markov inequality we already know $P\left( \max_{n \le N} X_n \ge \lambda \right) \le \frac{1}{\lambda} E \max_{n \le N} X_n$, but with the added information that $\{ X_n \}$ is a submartingale, the bound reduces to one that depends only on $X_N$, as in $P\left( \max_{n \le N} X_n \ge \lambda \right) \le \frac{1}{\lambda} E |X_N|$.
[4]: The submartingale condition here contrasts with the supermartingale condition in [1], so the two results can be viewed as a pair.
[5]: This formula, like the generalized Markov inequality $P\left( |X|^p \ge C^p \right) \le \frac{1}{C^p} E |X|^p$, gives a probability bound that shrinks as $p$ increases when $\lambda > 1$, but one must remember that the $p$-th moment of $X_N$ must exist beforehand.
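To see this effect concretely, here is a small Monte Carlo sketch (an illustration added here, not part of the original text; the standard normal $X$, the threshold $\lambda = 2$, and the sample size are all arbitrary choices):

```python
# Illustrative Monte Carlo: for lambda > 1, the generalized Markov bound
# E|X|^p / lambda^p shrinks as p grows. X ~ N(0,1) is an assumed example.
import random

random.seed(0)
lam = 2.0                                            # threshold lambda > 1
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]

p_true = sum(abs(x) >= lam for x in xs) / len(xs)    # empirical P(|X| >= lam)
bounds = {p: sum(abs(x) ** p for x in xs) / len(xs) / lam ** p for p in (1, 2, 4)}

print(f"P(|X| >= {lam}) ~ {p_true:.4f}")
for p, b in bounds.items():
    print(f"p = {p}: bound E|X|^p / lam^p ~ {b:.4f}")
```

With these choices the bound decreases monotonically in $p$ while always staying above the empirical probability, matching the remark above.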
Proof
Strategy for [1], [2]: since $N$ is bounded, we can start from the optional sampling theorem.
In the proof, $T$ is defined as the first moment at which $X_n$ exceeds $\lambda$; if that never happens by time $N$, we abandon the count and simply take $T = N$. Understanding this is the crux of the proof, and it is the basis for the trick of splitting the domain of integration.
[1]
Part 1. Definition of stopping time T
Define the random variable
$$ T := \begin{cases} \inf \{ n \le N : X_n \ge \lambda \} & , \text{if such } n \text{ exists} \\ N & , \text{otherwise} \end{cases} $$
Now, for every $k \in \mathbb{N}$, we check that $(T = k) \in \mathcal{F}_k$ to show that $T$ is a stopping time. For $k = n < N$,
$$ (T = n) = \left( X_1 < \lambda, \cdots, X_{n-1} < \lambda, X_n \ge \lambda \right) \in \mathcal{F}_n $$
and for $k = N$, it suffices to check whether $X_k < \lambda$ for $k$ up to $N - 1$, so
$$ (T = N) = \left( X_1 < \lambda, \cdots, X_{N-1} < \lambda \right) \in \mathcal{F}_{N-1} \subset \mathcal{F}_N $$
Part 2.
According to the optional sampling theorem, $E\left( X_T | \mathcal{F}_1 \right) \le X_1$ almost surely, hence
$$ \begin{align*} \int_\Omega X_1 \, dP &\ge \int_\Omega E\left( X_T | \mathcal{F}_1 \right) \, dP \\ &= \int_\Omega X_T \, dP \\ &= \int_{(\max_{n \le N} X_n \ge \lambda)} X_T \, dP + \int_{(\max_{n \le N} X_n < \lambda)} X_T \, dP \end{align*} \quad \text{a.s.} $$
The last line above splits the domain of integration into $\left( \max_{n \le N} X_n \ge \lambda \right)$ and $\left( \max_{n \le N} X_n < \lambda \right)$. By the definition of $T$, we have $X_T \ge \lambda$ on $\left( \max_{n \le N} X_n \ge \lambda \right)$ and $X_T = X_N$ on $\left( \max_{n \le N} X_n < \lambda \right)$, so
$$ \begin{align*} \int_\Omega X_1 \, dP &\ge \int_{(\max_{n \le N} X_n \ge \lambda)} X_T \, dP + \int_{(\max_{n \le N} X_n < \lambda)} X_T \, dP \\ &\ge \int_{(\max_{n \le N} X_n \ge \lambda)} \lambda \, dP + \int_{(\max_{n \le N} X_n < \lambda)} X_N \, dP \\ &= \lambda P\left( \max_{n \le N} X_n \ge \lambda \right) + \int_{(\max_{n \le N} X_n < \lambda)} X_N \, dP \end{align*} \quad \text{a.s.} $$
Part 3.
Rearranging the terms yields
$$ \begin{align*} \lambda P\left( \max_{n \le N} X_n \ge \lambda \right) &\le \int_\Omega X_1 \, dP - \int_{(\max_{n \le N} X_n < \lambda)} X_N \, dP \\ &= E X_1 - \int_{(\max_{n \le N} X_n < \lambda)} X_N^+ \, dP + \int_{(\max_{n \le N} X_n < \lambda)} X_N^- \, dP \end{align*} \quad \text{a.s.} $$
Since $X_N^+ \ge 0$, we may drop the term $- \int_{(\max_{n \le N} X_n < \lambda)} X_N^+ \, dP$, and since
$$ \int_{(\max_{n \le N} X_n < \lambda)} X_N^- \, dP \le \int_\Omega X_N^- \, dP = E X_N^- $$
thus
$$ \lambda P\left( \max_{n \le N} X_n \ge \lambda \right) \le E X_1 + E X_N^- \quad \text{a.s.} $$
■
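The finished inequality [1] can be sanity-checked numerically. The following is a minimal simulation sketch (added for illustration, not part of the original proof); the uniform step distribution with mean $-0.1$, the start $X_1 = 0$, $N = 20$, and $\lambda = 2$ are all arbitrary assumptions:

```python
# Monte Carlo check of [1]: for a supermartingale,
#   lambda * P(max_{n<=N} X_n >= lambda) <= E X_1 + E X_N^-.
import random

random.seed(1)
N, lam, trials = 20, 2.0, 50_000
count_max, sum_XN_neg = 0, 0.0

for _ in range(trials):
    x = 0.0                                # X_1 = 0, so E X_1 = 0
    running_max = x
    for _ in range(N - 1):                 # steps X_2, ..., X_N
        x += random.uniform(-1.1, 0.9)     # mean -0.1 => supermartingale
        running_max = max(running_max, x)
    count_max += running_max >= lam
    sum_XN_neg += max(-x, 0.0)             # X_N^-

lhs = lam * count_max / trials             # lambda * P(max X_n >= lambda)
rhs = 0.0 + sum_XN_neg / trials            # E X_1 + E X_N^-
print(f"lhs = {lhs:.4f} <= rhs = {rhs:.4f}")
```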
[2]
Part 1. Definition of stopping time T
Define the random variable
$$ T := \begin{cases} \inf \{ n \le N : X_n \le -\lambda \} & , \text{if such } n \text{ exists} \\ N & , \text{otherwise} \end{cases} $$
Now, for every $k \in \mathbb{N}$, we check that $(T = k) \in \mathcal{F}_k$ to show that $T$ is a stopping time. For $k = n < N$,
$$ (T = n) = \left( X_1 > -\lambda, \cdots, X_{n-1} > -\lambda, X_n \le -\lambda \right) \in \mathcal{F}_n $$
and for $k = N$, it suffices to check whether $X_k > -\lambda$ for $k$ up to $N - 1$, so
$$ (T = N) = \left( X_1 > -\lambda, \cdots, X_{N-1} > -\lambda \right) \in \mathcal{F}_{N-1} \subset \mathcal{F}_N $$
and $T \le N$ holds.
Part 2.
According to the optional sampling theorem, $E\left( X_N | \mathcal{F}_T \right) \le X_T$ almost surely, and taking the expectation $E$ on both sides as in Part 2. of proof [1],
$$ \begin{align*} E X_N &\le E X_T \\ &= \int_\Omega X_T \, dP \\ &= \int_{(\min_{n \le N} X_n \le -\lambda)} X_T \, dP + \int_{(\min_{n \le N} X_n > -\lambda)} X_T \, dP \\ &\le -\lambda P\left( \min_{n \le N} X_n \le -\lambda \right) + \int_{(\min_{n \le N} X_n > -\lambda)} X_N \, dP \end{align*} $$
But since $E X_N = \int_{(\min_{n \le N} X_n \le -\lambda)} X_N \, dP + \int_{(\min_{n \le N} X_n > -\lambda)} X_N \, dP$, we have
$$ \int_{(\min_{n \le N} X_n \le -\lambda)} X_N \, dP + \int_{(\min_{n \le N} X_n > -\lambda)} X_N \, dP \le -\lambda P\left( \min_{n \le N} X_n \le -\lambda \right) + \int_{(\min_{n \le N} X_n > -\lambda)} X_N \, dP $$
Eliminating $\int_{(\min_{n \le N} X_n > -\lambda)} X_N \, dP$ from both sides yields
$$ \int_{(\min_{n \le N} X_n \le -\lambda)} X_N \, dP \le -\lambda P\left( \min_{n \le N} X_n \le -\lambda \right) $$
Part 3.
Reversing the sign gives
$$ \begin{align*} \lambda P\left( \min_{n \le N} X_n \le -\lambda \right) &\le -\int_{(\min_{n \le N} X_n \le -\lambda)} X_N \, dP \\ &= -\int_{(\min_{n \le N} X_n \le -\lambda)} X_N^+ \, dP + \int_{(\min_{n \le N} X_n \le -\lambda)} X_N^- \, dP \end{align*} $$
As in Part 3. of proof [1], since $X_N^+ \ge 0$, we may drop $- \int_{(\min_{n \le N} X_n \le -\lambda)} X_N^+ \, dP$, and since
$$ \int_{(\min_{n \le N} X_n \le -\lambda)} X_N^- \, dP \le \int_\Omega X_N^- \, dP = E X_N^- $$
thus
$$ \lambda P\left( \min_{n \le N} X_n \le -\lambda \right) \le E X_N^- \quad \text{a.s.} $$
■
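Inequality [2] can be checked the same way. Below is an illustrative simulation (not part of the original proof), with the same assumed supermartingale as before: uniform steps of mean $-0.1$, $X_1 = 0$, $N = 20$, $\lambda = 2$:

```python
# Monte Carlo check of [2]: for a supermartingale,
#   lambda * P(min_{n<=N} X_n <= -lambda) <= E X_N^-.
import random

random.seed(2)
N, lam, trials = 20, 2.0, 50_000
count_min, sum_XN_neg = 0, 0.0

for _ in range(trials):
    x = 0.0                                # X_1 = 0
    running_min = x
    for _ in range(N - 1):                 # steps X_2, ..., X_N
        x += random.uniform(-1.1, 0.9)     # mean -0.1 => supermartingale
        running_min = min(running_min, x)
    count_min += running_min <= -lam
    sum_XN_neg += max(-x, 0.0)             # X_N^-

lhs = lam * count_min / trials             # lambda * P(min X_n <= -lambda)
rhs = sum_XN_neg / trials                  # E X_N^-
print(f"lhs = {lhs:.4f} <= rhs = {rhs:.4f}")
```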
[3]
$$ \lambda P\left( \max_{n \le N} X_n \ge \lambda \right) = \lambda P\left( -\max_{n \le N} X_n \le -\lambda \right) = \lambda P\left( \min_{n \le N} (-X_n) \le -\lambda \right) $$
Since $\{ (-X_n, \mathcal{F}_n) \}$ is a supermartingale, according to [2]
$$ \begin{align*} \lambda P\left( \min_{n \le N} (-X_n) \le -\lambda \right) &\le -\int_{(\min_{n \le N} (-X_n) \le -\lambda)} (-X_N) \, dP \\ &= \int_{(\max_{n \le N} X_n \ge \lambda)} X_N \, dP \\ &\le \int_{(X_N \ge 0)} X_N \, dP \\ &= E X_N^+ \\ &\le E |X_N| \end{align*} \quad \text{a.s.} $$
■
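As with [1] and [2], inequality [3] can be sanity-checked by simulation (an added illustration, not part of the original proof); here the assumed submartingale is a walk with uniform steps of mean $+0.1$, $X_1 = 0$, $N = 20$, and $\lambda = 4$:

```python
# Monte Carlo check of [3]: for a submartingale,
#   lambda * P(max_{n<=N} X_n >= lambda) <= E|X_N|.
import random

random.seed(3)
N, lam, trials = 20, 4.0, 50_000
count_max, sum_abs_XN = 0, 0.0

for _ in range(trials):
    x = 0.0                                # X_1 = 0
    running_max = x
    for _ in range(N - 1):                 # steps X_2, ..., X_N
        x += random.uniform(-0.9, 1.1)     # mean +0.1 => submartingale
        running_max = max(running_max, x)
    count_max += running_max >= lam
    sum_abs_XN += abs(x)                   # |X_N|

lhs = lam * count_max / trials             # lambda * P(max X_n >= lambda)
rhs = sum_abs_XN / trials                  # E|X_N|
print(f"lhs = {lhs:.4f} <= rhs = {rhs:.4f}")
```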
[4]
Since $\{ (-X_n, \mathcal{F}_n) \}$ is a supermartingale, according to [1]
$$ \begin{align*} \lambda P\left( \min_{n \le N} X_n \le -\lambda \right) &= \lambda P\left( \max_{n \le N} (-X_n) \ge \lambda \right) \\ &\le E(-X_1) + E(-X_N)^- \\ &= E(-X_N)^- - E X_1 \\ &= E\left( -X_N^+ + X_N^- \right)^- - E X_1 \\ &= E X_N^+ - E X_1 \end{align*} \quad \text{a.s.} $$
■
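Inequality [4] admits the same kind of numerical sanity check (added here as an illustration, not part of the original proof), with the same assumed submartingale: uniform steps of mean $+0.1$, $X_1 = 0$, $N = 20$, $\lambda = 2$:

```python
# Monte Carlo check of [4]: for a submartingale,
#   lambda * P(min_{n<=N} X_n <= -lambda) <= E X_N^+ - E X_1.
import random

random.seed(4)
N, lam, trials = 20, 2.0, 50_000
count_min, sum_XN_pos = 0, 0.0

for _ in range(trials):
    x = 0.0                                # X_1 = 0, so E X_1 = 0
    running_min = x
    for _ in range(N - 1):                 # steps X_2, ..., X_N
        x += random.uniform(-0.9, 1.1)     # mean +0.1 => submartingale
        running_min = min(running_min, x)
    count_min += running_min <= -lam
    sum_XN_pos += max(x, 0.0)              # X_N^+

lhs = lam * count_min / trials             # lambda * P(min X_n <= -lambda)
rhs = sum_XN_pos / trials - 0.0            # E X_N^+ - E X_1
print(f"lhs = {lhs:.4f} <= rhs = {rhs:.4f}")
```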
[5]
By a corollary of the conditional Jensen's inequality, $\{ ( |X_n|^p, \mathcal{F}_n ) \}$ is a submartingale, so according to [3],
$$ \begin{align*} \lambda^p P\left( \max_{n \le N} |X_n|^p \ge \lambda^p \right) &\le E |X_N|^p \quad \text{a.s.} \\ \iff \quad P\left( \max_{n \le N} |X_n|^p \ge \lambda^p \right) &\le \frac{1}{\lambda^p} E |X_N|^p \quad \text{a.s.} \\ \iff \quad P\left( \max_{n \le N} |X_n| \ge \lambda \right) &\le \frac{1}{\lambda^p} E |X_N|^p \quad \text{a.s.} \end{align*} $$
■
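Finally, [5] can be checked for a concrete martingale (an added illustration, not part of the original proof); the fair $\pm 1$ random walk, $p = 2$, $N = 20$, and $\lambda = 8$ are assumed choices:

```python
# Monte Carlo check of [5] with p = 2: for a martingale,
#   P(max_{n<=N} |X_n| >= lambda) <= E|X_N|^p / lambda^p.
import random

random.seed(5)
N, lam, p, trials = 20, 8.0, 2, 50_000
count_max, sum_moment = 0, 0.0

for _ in range(trials):
    x = 0.0                                # X_1 = 0
    running_max = abs(x)
    for _ in range(N - 1):                 # steps X_2, ..., X_N
        x += random.choice((-1.0, 1.0))    # fair +-1 steps => martingale
        running_max = max(running_max, abs(x))
    count_max += running_max >= lam
    sum_moment += abs(x) ** p              # |X_N|^p

p_hat = count_max / trials                 # P(max |X_n| >= lambda)
bound = sum_moment / trials / lam ** p     # E|X_N|^p / lambda^p
print(f"p_hat = {p_hat:.4f} <= bound = {bound:.4f}")
```

For $p = 2$ this is exactly Doob's $L^2$ form of the maximal inequality, which is why the second moment $E |X_N|^2$ appears in the bound.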