# Chapter 3. Markov Chain: Introduction

*Whatever happened in the past, be it glory or misery, be Markov!*

## 3.1. Examples
**Example 3.1.** ⋆ (Coin Tossing.) Let $\xi_0 = 0$ and, for $i \ge 1$,

$$\xi_i = \begin{cases} 1 & \text{if the } i\text{-th toss is a Head (with probability } p) \\ 0 & \text{if the } i\text{-th toss is a Tail (with probability } 1-p). \end{cases}$$

Set $X_n = \sum_{i=0}^{n} \xi_i$, $n \ge 0$. $X_n$ is the random variable which counts the number of Heads up to the $n$-th toss. Then,

$$\begin{aligned}
& P(X_{n+1} = j \mid X_0 = 0, X_1 = i_1, \dots, X_{n-1} = i_{n-1}, X_n = i) \\
&= \begin{cases} P(\xi_{n+1} = 1 \mid X_0 = 0, X_1 = i_1, \dots, X_{n-1} = i_{n-1}, X_n = i) & \text{if } j = i+1 \\ P(\xi_{n+1} = 0 \mid X_0 = 0, X_1 = i_1, \dots, X_{n-1} = i_{n-1}, X_n = i) & \text{if } j = i \\ 0 & \text{otherwise} \end{cases} \\
&= \begin{cases} P(\xi_{n+1} = 1) & \text{if } j = i+1 \\ P(\xi_{n+1} = 0) & \text{if } j = i \\ 0 & \text{otherwise} \end{cases} \\
&= \begin{cases} p & \text{if } j = i+1 \\ 1 - p & \text{if } j = i \\ 0 & \text{otherwise} \end{cases} \\
&= \begin{cases} P(\xi_{n+1} = 1 \mid X_n = i) & \text{if } j = i+1 \\ P(\xi_{n+1} = 0 \mid X_n = i) & \text{if } j = i \\ 0 & \text{otherwise} \end{cases} \\
&= P(X_{n+1} = j \mid X_n = i).
\end{aligned}$$

This implies that once the present $X_n$ is fixed, the past history $X_0, \dots, X_{n-1}$ does not affect the conditional distribution of the future value $X_{n+1}$. Here we regard time $n$ as the present, times before $n$ as the past, and times beyond $n$ as the future.
The above statement is the same as saying that the future $\{X_k : k \ge n+1\}$ and the past $\{X_k : k \le n-1\}$ are conditionally independent given the present $X_n$ (taking any fixed value).
**Remark.** Not only $X_{n+1}$: the distribution of the entire future $X_{n+1}, X_{n+2}, \dots$ depends on the past and present only through the present.
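As a quick sanity check (not part of the original notes), the distribution of $X_n$ can be pushed forward one toss at a time through the transition rule above, and the result should match the Binomial$(n, p)$ distribution. A minimal Python sketch, with $p = 0.3$ and $n = 10$ chosen arbitrarily:

```python
from math import comb

def step(dist, p):
    """One transition of the chain: from the distribution of X_n over
    {0, ..., n}, produce the distribution of X_{n+1} = X_n + xi_{n+1}."""
    new = [0.0] * (len(dist) + 1)
    for i, mass in enumerate(dist):
        new[i] += mass * (1 - p)      # next toss is a Tail: X stays at i
        new[i + 1] += mass * p        # next toss is a Head: X moves to i + 1
    return new

p, n = 0.3, 10                        # arbitrary illustrative values
dist = [1.0]                          # X_0 = 0 with probability 1
for _ in range(n):
    dist = step(dist, p)

# The chain's distribution after n steps matches Binomial(n, p).
binom = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
assert all(abs(a - b) < 1e-12 for a, b in zip(dist, binom))
```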
**Example 3.2.** ⋆ (Mickey in Maze) Mickey Mouse travels in a $3 \times 3$ maze with nine cells. The cells are numbered $0, 1, \dots, 8$ from left to right and top to bottom. At each step, Mickey moves from its current cell to one of the adjacent connected cells, each with equal probability.

0         1         2

3         4         5

6         7         8

Let $X_n$ denote the cell number of Mickey at step $n$ (with $X_0 = 4$). Then,

$$P(X_{n+1} = j \mid X_0 = 4, X_1 = i_1, \dots, X_{n-1} = i_{n-1}, X_n = i) = P(X_{n+1} = j \mid X_n = i).$$

Suppose, for example, that Mickey is currently in cell 5. Then the future movement, or path, of Mickey is independent of its past movement: how Mickey got to cell 5 has nothing to do with how Mickey will move around in the future. The process $\{X_n : n = 0, 1, 2, \dots\}$ is a Markov chain.
Example for fun ("First Blood.") John Rambo only obeys orders from Colonel Samuel Trautman, who supposedly only obeys orders from the Pentagon. Then Pentagon → Trautman → Rambo forms a MC.

## 3.2. Definitions/Descriptions

1. Stochastic process: a family of random variables $\{X_t\}$ indexed by $t$.
2. State space: the set of values of the stochastic process, which in general need not be real numbers.
3. Markov process: a stochastic process $\{X_t\}$ indexed by time $t$ such that, at each time $t$, the future of the process $\{X_s : s > t\}$ is conditionally independent of the past of the process $\{X_s : s < t\}$ given the present of the process $X_t$ (taking any fixed value). Another interpretation: at each time $t$, the future of the process $\{X_s : s > t\}$ depends on the past of the process $\{X_s : s < t\}$ only through the present $X_t$.
   Caution: $\{X_s : s > t\}$ is in general not (unconditionally) independent of $\{X_s : s < t\}$.
4. Markov chain (MC): a Markov process with a discrete state space. A discrete state space is usually denoted by the numbers $0, 1, 2, \dots$.
5. Discrete/continuous time Markov chain. A discrete time MC has time domain $\{0, 1, 2, \dots\}$; a continuous time MC has time domain $[0, \infty)$.
Examples 3.1 and 3.2 are discrete time Markov chains. The Poisson process is a continuous time MC. Brownian motion is a continuous time Markov process (but not a MC, since its state space is not discrete).
For a discrete time MC $\{X_n : n = 0, 1, \dots\}$, the defining equation is: for any $n \ge 0$ and any states $i_0, i_1, \dots, i_{n-1}, i, j$,

$$P(X_{n+1} = j \mid X_0 = i_0, X_1 = i_1, \dots, X_{n-1} = i_{n-1}, X_n = i) = P(X_{n+1} = j \mid X_n = i).$$
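The defining equation can be verified by brute force on a small chain. The sketch below (illustrative only; the two-state matrix and initial distribution are made up) enumerates all length-2 paths and checks that $P(X_2 = j \mid X_1 = i, X_0 = k)$ equals $P(X_2 = j \mid X_1 = i)$ for every choice of $k$:

```python
from itertools import product

# A hypothetical 2-state transition matrix (each row sums to 1).
P = [[0.7, 0.3],
     [0.4, 0.6]]
init = [0.5, 0.5]                     # distribution of X_0

def path_prob(x0, x1, x2):
    """Joint probability of the path (x0, x1, x2)."""
    return init[x0] * P[x0][x1] * P[x1][x2]

# P(X2 = j | X1 = i, X0 = k) should equal P[i][j] for every past value k.
for k, i, j in product(range(2), repeat=3):
    joint = path_prob(k, i, j)
    marg = sum(path_prob(k, i, j2) for j2 in range(2))
    assert abs(joint / marg - P[i][j]) < 1e-12
```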

## 3.3. Transition probabilities.

One-step transition probability: $P(X_{n+1} = j \mid X_n = i)$.
If the one-step transition probability does not depend on $n$, i.e., it is the same for all $n$, we call the MC $\{X_t\}$ a MC with stationary transition probabilities. Throughout the course, we only consider MCs with stationary transition probabilities.
Let $P_{ij} = P(X_{n+1} = j \mid X_n = i)$. The one-step transition probability matrix, or transition matrix in brief, is
$$P = \begin{pmatrix} P_{00} & P_{01} & P_{02} & \cdots \\ P_{10} & P_{11} & P_{12} & \cdots \\ P_{20} & P_{21} & P_{22} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix},$$

with rows and columns indexed by the states $0, 1, 2, \dots$.

For the Mickey in Maze example (Example 3.2), with rows and columns indexed by the cells $0, 1, \dots, 8$:

$$P = \begin{pmatrix}
0 & 1/2 & 0 & 1/2 & 0 & 0 & 0 & 0 & 0 \\
1/3 & 0 & 1/3 & 0 & 1/3 & 0 & 0 & 0 & 0 \\
0 & 1/2 & 0 & 0 & 0 & 1/2 & 0 & 0 & 0 \\
1/3 & 0 & 0 & 0 & 1/3 & 0 & 1/3 & 0 & 0 \\
0 & 1/4 & 0 & 1/4 & 0 & 1/4 & 0 & 1/4 & 0 \\
0 & 0 & 1/3 & 0 & 1/3 & 0 & 0 & 0 & 1/3 \\
0 & 0 & 0 & 1/2 & 0 & 0 & 0 & 1/2 & 0 \\
0 & 0 & 0 & 0 & 1/3 & 0 & 1/3 & 0 & 1/3 \\
0 & 0 & 0 & 0 & 0 & 1/2 & 0 & 1/2 & 0
\end{pmatrix}.$$

Denote the $k$-step transition probability by $P^{(k)}_{ij} = P(X_{n+k} = j \mid X_n = i)$ and the $k$-step transition probability matrix by

$$P^{(k)} = \left( P^{(k)}_{ij} \right).$$

For notational convenience, we always let

$$P^{(1)}_{ij} = P_{ij}, \qquad P^{(0)}_{ij} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \ne j. \end{cases}$$
Then,
**Theorem 3.1.** For all $0 \le m \le n$,

$$P^{(n)}_{ij} = \sum_{l=0}^{\infty} P^{(m)}_{il} P^{(n-m)}_{lj} \qquad \text{and} \qquad P^{(n)} = P^{n}.$$

*Proof.* Write

$$\begin{aligned}
P^{(n)}_{ij} &= P(X_n = j \mid X_0 = i) = \frac{P(X_n = j, X_0 = i)}{P(X_0 = i)} \\
&= \sum_{l=0}^{\infty} \frac{P(X_n = j, X_m = l, X_0 = i)}{P(X_0 = i)} \\
&= \sum_{l=0}^{\infty} P(X_n = j \mid X_m = l, X_0 = i)\,\frac{P(X_m = l, X_0 = i)}{P(X_0 = i)} \\
&= \sum_{l=0}^{\infty} P(X_n = j \mid X_m = l)\,P(X_m = l \mid X_0 = i) \qquad \text{(Markov property)} \\
&= \sum_{l=0}^{\infty} P^{(n-m)}_{lj}\,P^{(m)}_{il}.
\end{aligned}$$

Observe that $P^{(n)}_{ij}$ is the $(i+1, j+1)$-th entry of the matrix $P^{(n)}$, that $P^{(m)}_{il}$, $l = 0, 1, 2, \dots$, form the $(i+1)$-th row of the matrix $P^{(m)}$, and that $P^{(n-m)}_{lj}$, $l = 0, 1, 2, \dots$, form the $(j+1)$-th column of the matrix $P^{(n-m)}$. Hence,

$$P^{(n)} = P^{(m)} P^{(n-m)}$$

for all $0 \le m \le n$. Since $P^{(1)} = P$, we have

$$P^{(2)} = PP = P^2, \qquad P^{(3)} = PP^{(2)} = P^3, \qquad \dots, \qquad P^{(n)} = P^n$$

by induction. ∎
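The identity $P^{(n)} = P^n$ can also be checked numerically: for a small chain, the $n$-step probability obtained by summing over all intermediate paths must agree with the $n$-th matrix power. A sketch with an arbitrary 3-state matrix (not from the notes):

```python
from itertools import product

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][l] * B[l][j] for l in range(n)) for j in range(n)]
            for i in range(n)]

# A hypothetical 3-state transition matrix (each row sums to 1).
P = [[0.2, 0.5, 0.3],
     [0.1, 0.1, 0.8],
     [0.6, 0.2, 0.2]]

n = 4
Pn = P                                # P^(n) by repeated multiplication
for _ in range(n - 1):
    Pn = matmul(Pn, P)

def n_step(i, j, n):
    """n-step probability by summing the probabilities of all paths."""
    total = 0.0
    for mid in product(range(3), repeat=n - 1):
        path = (i,) + mid + (j,)
        prob = 1.0
        for a, b in zip(path, path[1:]):
            prob *= P[a][b]
        total += prob
    return total

for i in range(3):
    for j in range(3):
        assert abs(Pn[i][j] - n_step(i, j, n)) < 1e-12
```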

**Example 3.3.** ⋆ (Mickey in Maze, Example 3.2 continued.) Compute $P^{(3)}_{48}$ and $P^{(3)}_{18}$.
*Solution.* Write

$$P^{(3)}_{48} = \sum_{k=0}^{\infty} P_{4k} P^{(2)}_{k8} = \sum_{k=1,3,5,7} P_{4k} P^{(2)}_{k8} = \frac{1}{4} \sum_{k=1,3,5,7} P^{(2)}_{k8} = 0,$$

since $P^{(2)}_{k8} = 0$ for each $k = 1, 3, 5, 7$ (none of these cells can reach cell 8 in two steps). Next, summing over the three possible three-step paths $1 \to 2 \to 5 \to 8$, $1 \to 4 \to 5 \to 8$ and $1 \to 4 \to 7 \to 8$,

$$P^{(3)}_{18} = \frac{1}{3} \cdot \frac{1}{2} \cdot \frac{1}{3} + \frac{1}{3} \cdot \frac{1}{4} \cdot \frac{1}{3} + \frac{1}{3} \cdot \frac{1}{4} \cdot \frac{1}{3} = \frac{1}{9}.$$
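These three-step probabilities can be recomputed in a few lines. The sketch below (not from the notes) rebuilds the maze transition matrix from the grid adjacency and takes the third matrix power, using exact fractions to avoid rounding:

```python
from fractions import Fraction

def neighbours(k):
    """Grid neighbours of cell k, which sits at row k // 3, column k % 3."""
    r, c = divmod(k, 3)
    cand = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return [3 * rr + cc for rr, cc in cand if 0 <= rr < 3 and 0 <= cc < 3]

# Mickey moves to each neighbouring cell with equal probability.
P = [[Fraction(0)] * 9 for _ in range(9)]
for k in range(9):
    nb = neighbours(k)
    for j in nb:
        P[k][j] = Fraction(1, len(nb))

def matmul(A, B):
    return [[sum(A[i][l] * B[l][j] for l in range(9)) for j in range(9)]
            for i in range(9)]

P3 = matmul(matmul(P, P), P)
assert P3[4][8] == 0                  # no 3-step path from cell 4 to cell 8
assert P3[1][8] == Fraction(1, 9)     # 1->2->5->8, 1->4->5->8, 1->4->7->8
```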

## 3.4. More Examples.

**Example 3.4.** ⋆⋆ (An Inventory Model) Let $X_n$ be the number of TV sets at a store at the end of day $n$, with $X_0 = 2$. Let $\xi_n$ be the number of TVs sold on day $n$. Assume $\xi_1, \xi_2, \dots$ are iid (independent, identically distributed) such that

$$P(\xi_n = i) = \begin{cases} 0.5 & i = 0 \\ 0.4 & i = 1 \\ 0.1 & i = 2. \end{cases}$$
At the end of any day $n$, if $X_n = 0$ or $-1$, two TVs are sent to the store overnight; moreover, in the case $X_n = -1$, another TV is sent directly to the customer's house. If $X_n = 1$ or $2$, nothing happens. With this inventory policy,

$$X_{n+1} = \begin{cases} X_n - \xi_{n+1} & \text{if } X_n = 1, 2 \\ 2 - \xi_{n+1} & \text{if } X_n = -1, 0. \end{cases}$$

Then $X_0, X_1, X_2, \dots$ is a MC with state space $\{-1, 0, 1, 2\}$ and with one-step transition probability matrix (rows and columns indexed by the states $-1, 0, 1, 2$ in that order)

$$P = \begin{pmatrix} 0 & 0.1 & 0.4 & 0.5 \\ 0 & 0.1 & 0.4 & 0.5 \\ 0.1 & 0.4 & 0.5 & 0 \\ 0 & 0.1 & 0.4 & 0.5 \end{pmatrix}.$$
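The transition matrix can be derived mechanically from the restocking rule and the demand distribution. A short sketch (not from the notes) that reproduces the rows above:

```python
# Derive the inventory chain's transition matrix from the restocking rule.
states = [-1, 0, 1, 2]
demand = {0: 0.5, 1: 0.4, 2: 0.1}    # distribution of daily sales xi_n

def next_state(x, xi):
    """Apply the policy: restock to 2 overnight whenever x is 0 or -1."""
    return x - xi if x in (1, 2) else 2 - xi

P = {i: {j: 0.0 for j in states} for i in states}
for i in states:
    for xi, prob in demand.items():
        P[i][next_state(i, xi)] += prob

# Matches the matrix in the notes, e.g. the row for state 1 is (0.1, 0.4, 0.5, 0).
assert [P[1][j] for j in states] == [0.1, 0.4, 0.5, 0.0]
assert [P[-1][j] for j in states] == [0.0, 0.1, 0.4, 0.5]
```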

**Example 3.5.** ⋆⋆ (The Ehrenfest Model) There are $2N$ particles in a jar, separated by a membrane into two chambers A and B. Let $Y_n$ be the number of particles in chamber A after $n$ crossings, where each crossing moves one particle from A to B or from B to A. Assume that each crossing is made by one of the $2N$ particles chosen uniformly at random, i.e., each particle is equally likely to cross, with probability $1/(2N)$.

Then $\{Y_n : n \ge 0\}$ is a MC with state space $\{0, 1, 2, \dots, 2N\}$ and with one-step transition probability

$$P_{ij} = P(Y_{n+1} = j \mid Y_n = i) = \begin{cases} i/(2N) & \text{if } j = i - 1 \\ 1 - i/(2N) & \text{if } j = i + 1 \\ 0 & \text{otherwise} \end{cases}$$

for $0 \le i, j \le 2N$.
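A short sketch (not from the notes) encodes these transition probabilities, taking $N = 3$ as an arbitrary example, and checks that every row sums to 1 and that the boundary states behave as expected:

```python
from fractions import Fraction

N = 3                                  # arbitrary example: 2N = 6 particles

def p(i, j):
    """Ehrenfest transition probability: from state i (particles in A),
    the crossing particle is one of the i in A with probability i/(2N)."""
    if j == i - 1:
        return Fraction(i, 2 * N)
    if j == i + 1:
        return 1 - Fraction(i, 2 * N)
    return Fraction(0)

# Each row of the transition matrix sums to 1.
for i in range(2 * N + 1):
    assert sum(p(i, j) for j in range(2 * N + 1)) == 1

assert p(0, 1) == 1                    # chamber A empty: next crossing enters A
assert p(2 * N, 2 * N - 1) == 1        # chamber A full: next crossing leaves A
```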

## More DIY Exercises

**Exercise 3.1.** ⋆⋆ Bunny rabbit has three dens, A, B and C, and likes A better than B and C. If Bunny is in B or C on any night, then for the following night it goes to A with probability 0.9 and to the other den with probability 0.1. Once it reaches A, it stays there for two nights, and on the third night it moves to B or C with equal probability 1/2. Let $X_n$ be the den where Bunny stays on night $n$. What is the state space of $\{X_n\}$? Is $\{X_n\}$ a MC?

Den C

Den A           Den B

**Exercise 3.2.** ⋆⋆ For three events A, B and C, show that the following three statements are equivalent: (i) $P(A \cap B \mid C) = P(A \mid C) P(B \mid C)$; (ii) $P(A \mid B \cap C) = P(A \mid C)$; (iii) $P(B \mid A \cap C) = P(B \mid C)$. (Assume all quantities here are well defined.) Notice that statement (i) says that A and B are conditionally independent given C.
**Exercise 3.3.** ⋆⋆⋆ Suppose $X_n$, $n = 0, 1, 2, \dots$ is a discrete time MC. Let $0 \le n_0 < n_1 < n_2 < \cdots$ be a subsequence of the nonnegative integers and $Y_k = X_{n_k}$. Is $\{Y_k : k = 0, 1, 2, \dots\}$ a MC?
**Exercise 3.4.** ⋆⋆⋆ Suppose $\{X_n : n = \dots, -2, -1, 0, 1, 2, \dots\}$ is a discrete time MC, with time ranging over all integers from $-\infty$ to $\infty$. Let $Y_n = X_{-n}$ for all integers $n$. Is $\{Y_n : n = \dots, -2, -1, 0, 1, 2, \dots\}$ a MC?
