Martingales
Discrete time - definition
- Definition: A discrete-time stochastic process X_n on a probability space (Ω, F, P) is a martingale with respect to a filtration F_n if
  1. X_n is adapted to F_n.
  2. E[|X_n|] < ∞.
  3. E[X_{n+1} | F_n] = X_n.
- When the equality in 3 is replaced by ≤ we have a super-martingale.
- When it is replaced by ≥ we have a sub-martingale.
- For m < n, iterating the martingale property gives
  E[X_n | F_m] = E[E[X_n | F_{n-1}] | F_m] = E[X_{n-1} | F_m] = · · · = X_m.
- If X_n is a supermartingale then E[X_n | F_m] ≤ X_m for m < n.
Examples
1. Let X_i be i.i.d. with expectation µ. Then S_n = Σ_{i=1}^n X_i − nµ is a martingale.
2. Branching process: Consider a population. At time n the number of individuals is Z_n. Individual (n, i) gives birth to X_{n,i} individuals, where the X_{n,i} are i.i.d. with mean µ. Then
   E[Z_{n+1} | Z_0, · · ·, Z_n] = E[Σ_{i=1}^{Z_n} X_{n,i} | Z_0, · · ·, Z_n] = µ Z_n.
   So Z_n is a martingale if µ = 1; otherwise Z_n/µ^n is a martingale.
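As a quick sanity check, the branching-process martingale Z_n/µ^n can be simulated: its expectation should stay at Z_0 for every n. The offspring law (Poisson, so that the total offspring of Z_n individuals is Poisson(µZ_n)) and all parameters below are illustrative choices, not part of the lecture.

```python
import numpy as np

# Monte Carlo check that Z_n / mu^n has constant expectation Z_0.
# Assumption (not from the lecture): Poisson(mu) offspring, so the
# total offspring of Z_n individuals is Poisson(mu * Z_n).
rng = np.random.default_rng(0)
mu, n_gen, n_paths = 1.5, 8, 200_000

Z = np.ones(n_paths, dtype=np.int64)      # Z_0 = 1 for every path
for _ in range(n_gen):
    Z = rng.poisson(mu * Z)               # Z_{n+1} | Z_n ~ Poisson(mu * Z_n)

W = Z / mu**n_gen                         # the martingale Z_n / mu^n at time n_gen
print(W.mean())                           # should be close to Z_0 = 1
```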
3. Consider a casino game where at each round you win 1 with probability p and lose 1 with probability q = 1 − p. Let Y_i = 1 if you win the i-th game and Y_i = −1 if you lose it, and let S_n = S_0 + Σ_{i=1}^n Y_i. Consider X_n = z^{S_n}; then E[X_{n+1} | F_n] = X_n (pz + q/z). Taking z = q/p gives pz + q/z = q + p = 1, so X_n is a martingale.
4. Let X_i be i.i.d. with moment generating function φ(θ) = E[e^{θX_i}], and S_n = Σ_{i=1}^n X_i. Then M_n = e^{θS_n}/φ(θ)^n is a martingale.
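The exponential martingale of example 4 can be checked exactly by enumeration for a simple case: with X_i = ±1 equally likely, φ(θ) = cosh(θ), and E[M_n] must equal M_0 = 1. The values of θ and n below are illustrative.

```python
import math
import itertools

# Exact check (by enumerating all 2^n paths) that M_n = exp(theta*S_n)/phi(theta)^n
# has mean 1, for X_i = +/-1 with probability 1/2, where phi(theta) = cosh(theta).
theta, n = 0.7, 10
phi = math.cosh(theta)

total = 0.0
for path in itertools.product((-1, 1), repeat=n):   # all equally likely paths
    S = sum(path)
    total += math.exp(theta * S) / phi**n
mean_M = total / 2**n                               # = E[M_n]
print(mean_M)                                       # 1.0 up to rounding
```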
Martingales and convex functions
- Convex function: g(αx + (1 − α)y) ≤ αg(x) + (1 − α)g(y).
- Jensen's inequality: E[g(X) | G] ≥ g(E[X | G]) for convex g.
- Lemma: Let X_n be a martingale and g a convex function with E[|g(X_n)|] < ∞. Then g(X_n) is a sub-martingale.
- Proof: E[g(X_{n+1}) | F_n] ≥ g(E[X_{n+1} | F_n]) = g(X_n).
Martingale differences
- Let M_n be a martingale and let ξ_j = M_j − M_{j−1}. Then E[ξ_{n+1} | F_n] = 0. (HW)
- Assume that E[M_n^2] < ∞. Then
  1. E[M_j^2] < ∞ for j ≤ n.
  2. E[M_n^2] = E[M_0^2] + Σ_{j=1}^n E[ξ_j^2].
- Proof
  1. For k ≤ n, by (conditional) Jensen,
     E[M_n^2] = E[E[M_n^2 | F_k]] ≥ E[(E[M_n | F_k])^2] = E[M_k^2].
  2. E[(M_n − M_0)^2] = E[(Σ_{j=1}^n ξ_j)^2] = Σ_{j=1}^n E[ξ_j^2] + 2 Σ_{i<j} E[ξ_i ξ_j].
     For i < j,
     E[ξ_i ξ_j] = E[E[ξ_i ξ_j | F_i]] = E[ξ_i E[ξ_j | F_i]] = 0,
     since E[ξ_j | F_i] = E[M_j − M_{j−1} | F_i] = 0. Finally, E[(M_n − M_0)^2] = E[M_n^2] − E[M_0^2], because E[M_n M_0] = E[M_0 E[M_n | F_0]] = E[M_0^2].
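The orthogonality identity can be verified exactly for the simple symmetric random walk: there M_0 = 0 and ξ_j = ±1, so E[M_n^2] = Σ_{j=1}^n E[ξ_j^2] = n. The check below enumerates all paths (n = 12 is an arbitrary small choice).

```python
import itertools

# Enumeration check of E[M_n^2] = E[M_0^2] + sum_j E[xi_j^2] for the
# simple symmetric random walk: M_0 = 0 and xi_j = +/-1, so the sum is n.
n = 12
total_sq = 0
for path in itertools.product((-1, 1), repeat=n):
    total_sq += sum(path) ** 2      # M_n^2 for this equally likely path
second_moment = total_sq / 2**n     # E[M_n^2]
print(second_moment)                # equals n = 12 exactly
```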
- Let H_0, H_1, · · · be a bounded adapted process, and M_n a martingale. Then
  Z_n = Σ_{i=0}^{n−1} H_i (M_{i+1} − M_i)
  is a martingale.
- Proof: E[|Z_n|] ≤ Σ_{i=0}^{n−1} E[|H_i| (|M_{i+1}| + |M_i|)] < ∞ since H is bounded and each E[|M_i|] < ∞; Z_n is adapted, and
  E[Z_{n+1} | F_n] = Σ_{i=0}^{n−1} H_i (M_{i+1} − M_i) + E[H_n (M_{n+1} − M_n) | F_n]
                   = Z_n + H_n E[M_{n+1} − M_n | F_n] = Z_n.
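The martingale-transform lemma says no bounded adapted betting strategy can create drift. A concrete check: for the simple symmetric walk and the adapted strategy "bet only while ahead" (H_i = 1{M_i > 0}, an illustrative choice, not from the slides), enumeration confirms E[Z_n] = 0.

```python
import itertools

# Enumeration check that Z_n = sum_i H_i (M_{i+1} - M_i) has mean 0 for the
# simple symmetric walk and the adapted strategy H_i = 1{M_i > 0}.
n = 12
total = 0
for steps in itertools.product((-1, 1), repeat=n):
    M, Z = 0, 0
    for step in steps:
        H = 1 if M > 0 else 0       # H_i depends only on the path up to time i
        Z += H * step               # add H_i * (M_{i+1} - M_i)
        M += step
    total += Z
print(total / 2**n)                 # E[Z_n] = 0 exactly
```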
Stopping times
Definition: Let (Ω, F, P) be a probability space with filtration F_0 ⊆ F_1 ⊆ · · · ⊆ F. A non-negative integer-valued random variable τ is a stopping time if {τ ≤ n} ∈ F_n for every n.
Properties of stopping times: Let S and T be stopping times with respect to the probability space (Ω, F, P) and the filtration F_0 ⊆ F_1 ⊆ · · · ⊆ F. Then
1. S + T is a stopping time.
2. S ∧ T is a stopping time.
3. S ∨ T is a stopping time.
4. An integer constant is a stopping time.
Examples of Stopping times
- Consider a simple random walk, and consider the first time the process reaches 1.
- Consider a random walk as in example 3. Let τ be the first time that S_n ∉ (0, b), where b is an integer. Why is τ a stopping time? Because
  {τ > n} = {S_0 ∈ (0, b), S_1 ∈ (0, b), · · ·, S_n ∈ (0, b)} ∈ F_n.
  Is τ − 2 a stopping time? Is τ + 2 a stopping time?
The stopped process
Let T be a stopping time and X_n, n ≥ 1, a stochastic process. Then X_{n∧T} is called the stopped process.
Lemma: If X_n, n ≥ 1, is a martingale defined on (Ω, F, P) with respect to a filtration (F_n), and T is a stopping time with respect to the same filtration, then the stopped process X_{n∧T} is a martingale.
Proof: Write
  X_{n∧T} = X_0 + Σ_{j=0}^{n−1} H_j (X_{j+1} − X_j),   (1)
where
  H_j = 0 if T ≤ j, and H_j = 1 otherwise.
H is bounded and adapted (since {T ≤ j} ∈ F_j), so the previous lemma applies.
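Representation (1) can be verified path by path. The setup below is an illustrative choice, not from the slides: a ±1 walk started at x = 2, stopped on exiting (0, b) with b = 4, over all paths of length n = 10.

```python
import itertools

# Path-by-path check of representation (1):
#   X_{n ^ T} = X_0 + sum_j H_j (X_{j+1} - X_j)  with  H_j = 1{T > j}.
x, b, n = 2, 4, 10
mismatches = 0
for steps in itertools.product((-1, 1), repeat=n):
    path = [x]
    for s in steps:
        path.append(path[-1] + s)
    # T = first index j with S_j outside (0, b); n + 1 stands for "T > n"
    T = next((j for j, v in enumerate(path) if not 0 < v < b), n + 1)
    stopped = path[min(n, T)]                                   # X_{n ^ T}
    transform = x + sum(path[j + 1] - path[j] for j in range(n) if T > j)
    if stopped != transform:
        mismatches += 1
print(mismatches)   # 0: the two sides agree on every path
```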
Lemma: If X_n, n ≥ 1, is a martingale (respectively super- or sub-martingale) defined on (Ω, F, P) with respect to a filtration (F_n), and T is a stopping time with respect to the same filtration, then the stopped process X_{n∧T} is a martingale (respectively super- or sub-martingale).
Observation 1: E[X_{T∧n}] = E[E[X_{T∧n} | F_{n−1}]] = E[X_{T∧(n−1)}] = · · · = E[X_0].
Observation 2: If P(T < ∞) = 1 then, as n → ∞, X_{T∧n} → X_T with probability 1.
Question
- We saw that for martingales E[X_n] = E[E[X_n | F_{n−1}]] = E[X_{n−1}] = · · · = E[X_0].
- Is it always true that for a stopping time τ
  E[X_τ] = E[X_0]?   (2)
- Answer: not always. Example: consider a simple random walk with p = 1/2. In this case S_n is a recurrent Markov chain, and thus with probability 1, T_1 = inf{n : S_n = 1} < ∞. Yet E[S_{T_1∧n}] = 0 for every n, while E[S_{T_1}] = 1.
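The counterexample can be made concrete by enumerating all length-n paths of the symmetric walk from 0: E[S_{T_1∧n}] is exactly 0 (optional stopping at the bounded time T_1 ∧ n), although S_{T_1} = 1 whenever T_1 ≤ n, an event whose probability is already about 0.8 for n = 16.

```python
import itertools

# Enumerate all 2^n paths; T1 = first time the walk hits 1.
n = 16
total_stopped = 0   # sum over paths of S_{T1 ^ n}
hit = 0             # number of paths with T1 <= n
for steps in itertools.product((-1, 1), repeat=n):
    S = 0
    stopped_val = None
    for s in steps:
        S += s
        if S == 1:                  # stopped at value 1
            stopped_val = 1
            break
    if stopped_val is None:         # never hit 1: stopped value is S_n
        stopped_val = S
    else:
        hit += 1
    total_stopped += stopped_val
print(total_stopped, hit / 2**n)    # 0 and roughly 0.80
```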
Doob's Optional Stopping Theorem
Theorem: Let (Ω, F, P) be a probability space with filtration F_n. Let T be a stopping time with respect to F_n, and X a martingale adapted to F_n. Then
  E[X_T] = E[X_0]
in each of the following situations:
(i) T is bounded (for some N, T(ω) ≤ N for all ω).
(ii) X is bounded (for some K > 0, |X_n(ω)| ≤ K for all n and all ω) and T is finite with probability 1.
(ii') T is finite with probability 1 and |X_{n∧T}(ω)| ≤ K.
(iii) E[T] < ∞, and for some K > 0,
  |X_n(ω) − X_{n−1}(ω)| ≤ K
for all n ≥ 1 and all ω.
Proof
(i) E[X_T] = E[X_{T∧N}] = E[X_0].
(ii) Since P(T < ∞) = 1, X_{T∧n} → X_T, and the result follows from the bounded convergence theorem.
(ii') Since T is finite, X_{T∧n} → X_T and |X_T| ≤ K, so
  |E[X_0] − E[X_T]| = |E[X_{T∧n}] − E[X_T]| ≤ 2K P(T > n) → 0.
(iii) X_T = X_0 + Σ_{i=1}^T (X_i − X_{i−1}), so |X_T − X_0| ≤ Σ_{i=1}^T |X_i − X_{i−1}| and
  E[Σ_{i=1}^T |X_i − X_{i−1}|] ≤ K E[T] < ∞;
now use the dominated convergence theorem (DCT).
Example - Wald's identity
Example: Let X_i be i.i.d. random variables with E[|X_i|] < ∞, and let T be a stopping time with respect to the filtration F_n generated by (X_1, · · ·, X_n). Assume E[T] < ∞ and E[X_i] = µ. Then:
1. S_n = Σ_{i=1}^n X_i − nµ is a martingale.
2. E[Σ_{i=1}^T X_i] = E[T] µ.
Proof
- First consider non-negative random variables X_i.
- E[Σ_{i=1}^{T∧n} X_i] = E[T ∧ n] µ, and the result follows from the MCT (monotone convergence theorem) applied to both sides.
- For general X_i, since E[|X_i|] < ∞ we have |Σ_{i=1}^{T∧n} X_i| ≤ Σ_{i=1}^T |X_i|, which is integrable by the first part, and the result follows from the DCT.
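Wald's identity can be checked exactly for a small example. The setup is an illustrative choice, not from the slides: X_i uniform on {1, 2} (so µ = 3/2) and T = first n with X_1 + · · · + X_n ≥ 4, which forces T ≤ 4, so enumerating length-4 sequences suffices.

```python
from fractions import Fraction
from itertools import product

# Exact check of E[sum_{i<=T} X_i] = E[T] * mu using rational arithmetic.
half = Fraction(1, 2)
E_ST, E_T = Fraction(0), Fraction(0)
for path in product((1, 2), repeat=4):        # T <= 4, so length-4 paths suffice
    prob = half ** 4                          # each full path is equally likely
    running, T = 0, None
    for n, x in enumerate(path, start=1):
        running += x
        if running >= 4:                      # stopping rule: partial sum >= 4
            T = n
            break
    E_ST += prob * running                    # S_T on this path
    E_T += prob * T
print(E_ST, E_T, E_ST == Fraction(3, 2) * E_T)   # last entry: Wald holds exactly
```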
Example: Gambler's ruin
Example: A gambler wins 1 with probability p and loses 1 with probability q = 1 − p ≠ 1/2. She starts with fortune x and each time bets 1$. Let T be the first time the fortune is either 0 or b, T = inf{n : S_n ∉ (0, b)}, where S_n = S_0 + Σ_{i=1}^n Y_i and Y_i is 1 w.p. p and −1 w.p. 1 − p. What is the probability that she exits happy (i.e., at b)?
Solution
- Define M_n = (q/p)^{S_n}; then M_n is a martingale.
- T is finite with probability 1. (Clearly there is a positive probability ε of exiting within b steps from any state; thus the probability of exiting within nb steps is at least 1 − (1 − ε)^n → 1.)
- M_{T∧n} ≤ max((q/p)^b, 1); thus apply (ii').
- E[M_T] = (q/p)^x.
Gambler's ruin - cont.
- Let T_b be the first time S_n hits b, and T_0 the first time it hits 0.
  (q/p)^x = E[(q/p)^{S_T}] = P(T_b < T_0)(q/p)^b + (1 − P(T_b < T_0)),
so
  P(T_b < T_0) = ((q/p)^x − 1) / ((q/p)^b − 1).
In the symmetric case S_n itself is a martingale and |S_{n∧T}| ≤ b, so applying (ii'):
  E[S_T] = x = b P(T_b < T_0),
hence
  P(T_b < T_0) = x/b.
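The exit formula can be cross-checked against the first-step (one-step conditioning) equations h(x) = p h(x+1) + q h(x−1), h(0) = 0, h(b) = 1, solved as a linear system. The parameters p = 0.4 and b = 6 are illustrative.

```python
import numpy as np

# Solve the first-step equations for h(x) = P(T_b < T_0 | S_0 = x) and
# compare with the martingale formula ((q/p)^x - 1)/((q/p)^b - 1).
p, b = 0.4, 6
q = 1 - p

A = np.zeros((b - 1, b - 1))     # unknowns h(1), ..., h(b-1)
rhs = np.zeros(b - 1)
for i, x in enumerate(range(1, b)):
    A[i, i] = 1.0
    if x + 1 < b:
        A[i, i + 1] = -p
    else:
        rhs[i] = p               # boundary h(b) = 1 moves to the right-hand side
    if x - 1 > 0:
        A[i, i - 1] = -q         # boundary h(0) = 0 contributes nothing
h = np.linalg.solve(A, rhs)

r = q / p
formula = np.array([(r**x - 1) / (r**b - 1) for x in range(1, b)])
print(np.max(np.abs(h - formula)))   # numerically zero
```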
Expected exit time - symmetric random walk
Example: In the last example, for the symmetric random walk started at S_0 = 0, let us obtain the expected time to exit (−a, b).
Observations:
- Var(Y_i) = 1, and Var(S_n) = E[S_n^2] = n.
- S_{T∧n}^2 ≤ max(a^2, b^2). (Why?)
- S_n^2 − n is a martingale.
Assume that we can apply the optional stopping theorem. Then
  0 = E[S_T^2 − T] = E[S_T^2] − E[T],
thus, using P(S_T = b) = a/(a + b) and P(S_T = −a) = b/(a + b),
  E[T] = (a/(a + b)) b^2 + (b/(a + b)) a^2 = ab.
Justification:
  E[S_{T∧n}^2 − T ∧ n] = 0,
thus
  E[S_{T∧n}^2] = E[T ∧ n].
Apply the DCT to the left-hand side and the MCT to the right-hand side to get the result.
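The answer E[T] = ab can be checked independently by solving the first-step equations for the mean exit time, m(x) = 1 + (m(x+1) + m(x−1))/2 with m(−a) = m(b) = 0, and reading off m(0). The values a = 3, b = 5 are illustrative.

```python
import numpy as np

# Solve m(x) = 1 + (m(x+1) + m(x-1))/2 on the interior of (-a, b),
# with m(-a) = m(b) = 0, and check that m(0) = a*b.
a, b = 3, 5
states = list(range(-a + 1, b))            # interior states
idx = {x: i for i, x in enumerate(states)}
A = np.eye(len(states))
rhs = np.ones(len(states))
for x in states:
    for nb in (x - 1, x + 1):
        if nb in idx:
            A[idx[x], idx[nb]] = -0.5      # boundary neighbours contribute 0
m = np.linalg.solve(A, rhs)
print(m[idx[0]])                           # = a*b = 15
```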
One-sided exit time
- Consider again a random walk with p < 1/2.
- Let S_0 = x > 0.
- Consider T, the first time that the process hits 0.
- M_n = S_n − n(2p − 1) is a martingale, so E[S_{n∧T} − (n ∧ T)(2p − 1)] = x. Hence
  E[(n ∧ T)(1 − 2p)] = x − E[S_{n∧T}] ≤ x,
  E[n ∧ T] ≤ x/(1 − 2p),
and letting n → ∞ (MCT),
  E[T] ≤ x/(1 − 2p) < ∞.
Thus we can apply Wald's identity, which gives E[S_T] − x = E[T](2p − 1); since S_T = 0, E[T] = x/(1 − 2p).
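A Monte Carlo check of the conclusion E[T] = x/(1 − 2p); the parameters p = 0.3 and x = 2 (so E[T] = 5) are illustrative choices.

```python
import numpy as np

# Monte Carlo check that the mean hitting time of 0 is x/(1-2p) when p < 1/2.
rng = np.random.default_rng(1)
p, x, n_paths, cap = 0.3, 2, 100_000, 100_000

times = np.empty(n_paths)
for k in range(n_paths):
    S, t = x, 0
    while S > 0 and t < cap:       # cap is a safety bound, essentially never reached
        S += 1 if rng.random() < p else -1
        t += 1
    times[k] = t
print(times.mean())                # close to x/(1-2p) = 5
```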
Ruin Probability
Example
- Consider an insurance company.
- The probability of a claim in any time period is q = 1 − p; the claim size is 2; the premium is 1.
- Initial capital x.
- The net profit condition: 1 > 2(1 − p), i.e., p > 1/2.
- What is the ruin probability, P(T < ∞) for T = inf{n : S_n = 0}?
- The capital satisfies S_{n+1} = S_n + 1 w.p. p and S_{n+1} = S_n − 1 w.p. q, so (q/p)^{S_n} is a martingale. Thus
  (q/p)^x = E[(q/p)^{S_{T∧n}}] = P(T ≤ n) + E[(q/p)^{S_n} 1_{T>n}].
- P(T ≤ n) → P(T < ∞).
- E[(q/p)^{S_n} 1_{T>n}] ≤ q/p < 1, and since S_n → ∞ a.s. on {T = ∞} (positive drift), (q/p)^{S_n} 1_{T>n} → 0 a.s.; by bounded convergence, E[(q/p)^{S_n} 1_{T>n}] → 0.
- Thus P(T < ∞) = (q/p)^x.
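A Monte Carlo check of the ruin formula; the parameters p = 0.7 and x = 3 (so (q/p)^x = (3/7)^3 ≈ 0.0787) and the finite horizon are illustrative choices. With positive drift, ruin after the horizon is negligibly likely.

```python
import numpy as np

# Monte Carlo check of P(ruin) = (q/p)^x. Capital follows
# S_{n+1} = S_n + 1 w.p. p (no claim) and S_{n+1} = S_n - 1 w.p. q (claim of 2).
rng = np.random.default_rng(2)
p, x, n_paths, horizon = 0.7, 3, 40_000, 300

steps = np.where(rng.random((n_paths, horizon)) < p, 1, -1).astype(np.int16)
paths = x + np.cumsum(steps, axis=1)
ruined = (paths <= 0).any(axis=1)    # ruin within the horizon
print(ruined.mean())                 # close to (3/7)**3, about 0.079
```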
Continuous-time martingales
- Let F_t, t ∈ [0, ∞), be a family of sub-σ-algebras with the property that F_s ⊆ F_t whenever s ≤ t.
- X_t is a martingale if:
  1. X_t is F_t-measurable.
  2. E[|X_t|] < ∞.
  3. E[X_t | F_s] = X_s for s ≤ t.
Definition: τ is an F_t stopping time if {τ ≤ t} ∈ F_t for every t.
- Definition: F_t^+ = ∩_{ε>0} F_{t+ε}. The filtration is right-continuous if F_t = F_t^+.
- A problem with continuous-time martingales: inf{t : X_t ≥ u} is a stopping time, but inf{t : X_t > u} is not always a stopping time. However, when F_t is right-continuous, both are stopping times.
- In our case, when speaking about a continuous-time martingale (or stochastic process), we assume that:
  1. If A ⊆ B, B ∈ F, and P(B) = 0, then A ∈ F and P(A) = 0.
  2. F_0 contains all the P-null sets.
  3. F_t is right-continuous.
Under the above conditions the optional sampling theorem also holds for continuous-time martingales.