Let $X_n,\ n\geq 0$ be a Markov chain on the state space $\mathrm{X}=\{1,2\}$ having transition matrix $P$ with elements $P_{11}=1/3$, $P_{12}=2/3$, $P_{21}=P_{22}=1/2$. Let $f:\mathrm{X}\to\mathbb{R}$ be the function with $f(1)=1$ and $f(2)=4$. Find a function $g:\mathrm{X}\to\mathbb{R}$ such that
$$Y_n = f(X_n) - f(X_0) - \sum_{i=0}^{n-1} g(X_i),\qquad n\geq 1,$$
is a martingale relative to the filtration $\mathcal{F}_n^X$ generated by the process $X_n$.
Notice that since $f$ and $g$ are measurable functions, $Y_n$ is a linear combination of $\mathcal{F}_n$-measurable random variables and hence is $\mathcal{F}_n$-adapted. Furthermore, for any $n$, $Y_n$ takes only finitely many values (the state space is finite), so it is bounded and hence $L^1$.
Therefore, we only need to check the martingale property, i.e. we want to show $Y_n = E(Y_{n+1}\mid\mathcal{F}_n)$.
That is, we want
$$\begin{aligned}
f(X_n)-f(X_0)-\sum_{i=0}^{n-1}g(X_i) &= E\Big[f(X_{n+1})-f(X_0)-\sum_{i=0}^{n}g(X_i)\,\Big|\,\mathcal{F}_n\Big]\\
f(X_n) &= E[f(X_{n+1})\mid\mathcal{F}_n]-g(X_n),
\end{aligned}$$
where the second line follows because $f(X_0)$ and $\sum_{i=0}^{n}g(X_i)$ are $\mathcal{F}_n$-measurable, so they pull out of the conditional expectation and cancel.
Therefore, if $Y_n$ is to be a martingale, we must have $g(X_n)=E[f(X_{n+1})\mid\mathcal{F}_n]-f(X_n)$.
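By the Markov property, this conditional expectation depends only on the current state: $E[f(X_{n+1})\mid\mathcal{F}_n]=(Pf)(X_n)$. So $g$ is obtained by applying the transition matrix to $f$:
$$g(x)=\sum_{y\in\mathrm{X}}P_{xy}\,f(y)-f(x),\qquad x\in\mathrm{X},$$
i.e. $g=(P-I)f$, the discrete generator of the chain applied to $f$.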
Since $\mathrm{X}=\{1,2\}$, we can compute the right-hand side without too much work.
$$g(1)=E[f(X_{n+1})\mid X_n=1]-f(1)=(1\cdot 1/3+4\cdot 2/3)-1=2$$
$$g(2)=E[f(X_{n+1})\mid X_n=2]-f(2)=(1\cdot 1/2+4\cdot 1/2)-4=-\frac{3}{2}$$
This explicitly defines the function $g$ and verifies that $Y_n$ is a martingale.
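The computation above can be checked numerically. The sketch below (using NumPy; the variable names are my own) computes $g = Pf - f$ directly and then runs a Monte Carlo sanity check that the martingale increment $Y_{n+1}-Y_n = f(X_{n+1})-f(X_n)-g(X_n)$ averages to zero:

```python
import numpy as np

# States {1, 2} encoded as indices 0, 1.
P = np.array([[1/3, 2/3],
              [1/2, 1/2]])   # transition matrix from the problem
f = np.array([1.0, 4.0])     # f(1) = 1, f(2) = 4

# By the Markov property, E[f(X_{n+1}) | F_n] = (Pf)(X_n), so g = Pf - f.
g = P @ f - f
print(g)  # [ 2.  -1.5]

# Monte Carlo check: the increment f(X_{n+1}) - f(X_n) - g(X_n)
# should have mean zero regardless of the initial state.
rng = np.random.default_rng(0)
x0 = rng.integers(0, 2, size=200_000)       # arbitrary current states
u = rng.random(size=200_000)
x1 = np.where(u < P[x0, 0], 0, 1)           # one transition step per sample
mean_incr = np.mean(f[x1] - f[x0] - g[x0])
print(mean_incr)  # close to 0
```

The matrix form makes clear that the same recipe works for any finite-state chain, not just this two-state example.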
Let $\xi_n$ be independent identically distributed random variables with uniform distribution on $[0,1]$. For which values of $\alpha>0$ does the series
$$\sum_{n=1}^{\infty}\left(\xi_n+n^{-\alpha}\right)^{n^{\alpha+1}}$$
converge almost surely?
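Before attacking the exercise analytically, it can help to look at sample paths of the partial sums. The sketch below (a numerical illustration only, not a proof; the cutoff value of $\alpha$ is left to the exercise) evaluates the terms in log space to avoid overflow, since the exponent $n^{\alpha+1}$ grows quickly:

```python
import numpy as np

def partial_sum(alpha, n_terms=200, seed=0):
    """Partial sum of (xi_n + n^{-alpha})^(n^{alpha+1}) along one sample path."""
    rng = np.random.default_rng(seed)
    n = np.arange(1, n_terms + 1, dtype=float)
    xi = rng.random(n_terms)                       # i.i.d. Uniform[0,1]
    # Work in log space: log of the n-th term is n^{alpha+1} * log(xi_n + n^{-alpha}).
    log_terms = n ** (alpha + 1) * np.log(xi + n ** (-alpha))
    return float(np.sum(np.exp(log_terms)))

for alpha in (0.5, 2.0):
    print(alpha, partial_sum(alpha))
```

Note that a term exceeds 1 exactly when $\xi_n > 1 - n^{-\alpha}$, an event of probability roughly $n^{-\alpha}$, which suggests where a Borel–Cantelli argument should enter.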