Lesson 11: Markov Chains

leingang 7,436 views 50 slides Oct 20, 2007

About This Presentation

Markov chains are a very common model for systems that change probabilistically over time. We show a few fun examples, define the objects, state the main theorems, and show how to find the steady-state vector.


Slide Content

Lesson 11
Markov Chains
Math 20
October 15, 2007

Announcements
- Review Session (ML), 10/16, 7:30–9:30, Hall E
- Problem Set 4 is on the course web site. Due
- Midterm I: 10/18, Hall A, 7–8:30pm
- OH: Mondays 1–2, Tuesdays 3–4, Wednesdays 1–3 (SC 323)
- Old exams and solutions on website

The Markov Dance
Divide the class into three groups A, B, and C.
Upon my signal:
- 1/3 of group A goes to group B, and 1/3 of group A goes to group C.
- 1/4 of group B goes to group A, and 1/4 of group B goes to group C.
- 1/2 of group C goes to group B.


[Whiteboard: Math 20 - October 15, 2007.GWB, page 2 of 9]

Another Example
Suppose on any given class day you wake up and decide whether to come to class. If you went to class the time before, you're 70% likely to go today, and if you skipped the last class, you're 80% likely to go today.
Some questions you might ask are:
- If I go to class on Monday, how likely am I to go to class on Friday?
- Assuming the class is infinitely long (the horror!), approximately what portion of class will I attend?


Many times we are interested in the transition of something between certain "states" over discrete time steps. Examples are
- movement of people between regions
- states of the weather
- movement between positions on a Monopoly board
- your score in blackjack

Definition
A Markov chain or Markov process is a process in which the probability of the system being in a particular state at a given observation period depends only on its state at the immediately preceding observation period.


Common questions about a Markov chain are:
- What is the probability of transitions from state to state over multiple observations?
- Are there any "equilibria" in the process?
- Is there a long-term stability to the process?

Definition
Suppose the system has n possible states. For each i and j, let t_ij be the probability of switching from state j to state i. The matrix T whose ij-th entry is t_ij is called the transition matrix.

Example
The transition matrix for the skipping-class example is

T = \begin{pmatrix} 0.7 & 0.8 \\ 0.3 & 0.2 \end{pmatrix}

[Whiteboard: Math 20 - October 15, 2007.GWB, pages 3–4 of 9]

The big idea about the transition matrix reflects an important fact about probabilities:
- All entries are nonnegative.
- The columns add up to one.
Such a matrix is called a stochastic matrix.
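A quick way to see both facts at once: build the skipping-class matrix and check it. A minimal sketch in Python with NumPy (not part of the original slides; variable names are mine):

```python
import numpy as np

# Columns index the *current* state, rows the *next* state:
# column j lists the probabilities of moving from state j to each state i.
T = np.array([[0.7, 0.8],   # P(go next   | went), P(go next   | skipped)
              [0.3, 0.2]])  # P(skip next | went), P(skip next | skipped)

assert np.all(T >= 0)                   # all entries nonnegative
assert np.allclose(T.sum(axis=0), 1.0)  # each column sums to one
```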

Definition
The state vector of a Markov process with n states at time step k is the vector

x^{(k)} = \begin{pmatrix} p_1^{(k)} \\ p_2^{(k)} \\ \vdots \\ p_n^{(k)} \end{pmatrix}

where p_j^{(k)} is the probability that the system is in state j at time step k.

Example
Suppose we start out with 20 students in group A and 10 students in each of groups B and C. Then the initial state vector is

x^{(0)} = \begin{pmatrix} 0.5 \\ 0.25 \\ 0.25 \end{pmatrix}
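That initial state vector is just the head counts normalized to sum to one; a one-line check (same NumPy setup as above, my own names):

```python
import numpy as np

counts = np.array([20, 10, 10])   # students in groups A, B, C
x0 = counts / counts.sum()        # normalize to a probability vector
print(x0)                         # [0.5  0.25 0.25]
```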

[Whiteboard: Math 20 - October 15, 2007.GWB, page 5 of 9]


[Whiteboard: Math 20 - October 15, 2007.GWB, page 6 of 9]

Example
Suppose after three weeks of class I am equally likely to come to class or skip. Then my state vector would be

x^{(10)} = \begin{pmatrix} 0.5 \\ 0.5 \end{pmatrix}

The big idea about state vectors reflects an important fact about probabilities:
- All entries are nonnegative.
- The entries add up to one.
Such a vector is called a probability vector.


Lemma
Let T be an n × n stochastic matrix and x an n × 1 probability vector. Then Tx is a probability vector.

Proof.
We need to show that the entries of Tx add up to one. We have

\sum_{i=1}^n (Tx)_i = \sum_{i=1}^n \left( \sum_{j=1}^n t_{ij} x_j \right) = \sum_{j=1}^n \left( \sum_{i=1}^n t_{ij} \right) x_j = \sum_{j=1}^n 1 \cdot x_j = 1

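A numeric spot-check of the lemma (a sketch; any probability vector x works):

```python
import numpy as np

T = np.array([[0.7, 0.8],
              [0.3, 0.2]])
x = np.array([0.4, 0.6])    # an arbitrary probability vector

y = T @ x                   # y = Tx
assert np.all(y >= 0) and np.isclose(y.sum(), 1.0)
print(y)                    # [0.76 0.24], again a probability vector
```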

Theorem
If T is the transition matrix of a Markov process, then the state vector x^{(k+1)} at the (k+1)th observation period can be determined from the state vector x^{(k)} at the kth observation period, as

x^{(k+1)} = T x^{(k)}

This comes from an important idea in conditional probability:

P(state i at t = k+1) = \sum_{j=1}^n P(move from state j to state i) \cdot P(state j at t = k)

That is, for each i,

p_i^{(k+1)} = \sum_{j=1}^n t_{ij} p_j^{(k)}

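In code, the theorem says one observation period is a single matrix-vector product (a sketch; the function name is mine):

```python
import numpy as np

def step(T, x):
    """One observation period of the chain: returns x^(k+1) = T x^(k)."""
    return T @ x

T = np.array([[0.7, 0.8],
              [0.3, 0.2]])
x = np.array([1.0, 0.0])   # went to class at step 0
x = step(T, x)             # step 1: [0.7, 0.3]
```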

Illustration
Example
How does the probability of going to class on Wednesday depend on the probabilities of going to class on Monday?

[Tree diagram: Monday's probabilities p_1^{(k)} (go) and p_2^{(k)} (skip) branch to go/skip on Wednesday, with transition probabilities t_11, t_21, t_12, t_22 on the edges.]

p_1^{(k+1)} = t_{11} p_1^{(k)} + t_{12} p_2^{(k)}
p_2^{(k+1)} = t_{21} p_1^{(k)} + t_{22} p_2^{(k)}

Example
If I go to class on Monday, what's the probability I'll go to class on Friday?

Solution
We have x^{(0)} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}. We want to know x^{(2)}. We have

x^{(2)} = T x^{(1)} = T(T x^{(0)}) = T^2 x^{(0)} = \begin{pmatrix} 0.7 & 0.8 \\ 0.3 & 0.2 \end{pmatrix}^2 \begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 0.7 & 0.8 \\ 0.3 & 0.2 \end{pmatrix} \begin{pmatrix} 0.7 \\ 0.3 \end{pmatrix} = \begin{pmatrix} 0.73 \\ 0.27 \end{pmatrix}
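The same two-step calculation in code, using NumPy's matrix_power for T^2 (a sketch):

```python
import numpy as np

T = np.array([[0.7, 0.8],
              [0.3, 0.2]])
x0 = np.array([1.0, 0.0])               # went to class on Monday

x2 = np.linalg.matrix_power(T, 2) @ x0  # two class days later (Friday)
print(x2)                               # [0.73 0.27]
```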

Let's look at successive powers of the probability matrix. Do they
converge? To what?

Let's look at successive powers of the transition matrix in the Markov Dance.

T = \begin{pmatrix} 0.333333 & 0.25 & 0 \\ 0.333333 & 0.5 & 0.5 \\ 0.333333 & 0.25 & 0.5 \end{pmatrix}

T^2 = \begin{pmatrix} 0.194444 & 0.208333 & 0.125 \\ 0.444444 & 0.458333 & 0.5 \\ 0.361111 & 0.333333 & 0.375 \end{pmatrix} \qquad T^3 = \begin{pmatrix} 0.175926 & 0.184028 & 0.166667 \\ 0.467593 & 0.465278 & 0.479167 \\ 0.356481 & 0.350694 & 0.354167 \end{pmatrix}


T^4 = \begin{pmatrix} 0.17554 & 0.177662 & 0.175347 \\ 0.470679 & 0.469329 & 0.472222 \\ 0.353781 & 0.353009 & 0.352431 \end{pmatrix} \qquad T^5 = \begin{pmatrix} 0.176183 & 0.176553 & 0.176505 \\ 0.470743 & 0.47039 & 0.470775 \\ 0.353074 & 0.353057 & 0.35272 \end{pmatrix} \qquad T^6 = \begin{pmatrix} 0.176414 & 0.176448 & 0.176529 \\ 0.470636 & 0.470575 & 0.470583 \\ 0.35295 & 0.352977 & 0.352889 \end{pmatrix}

Do they converge? To what?

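These tables are easy to regenerate (a sketch; the fractions 1/3, 1/4, 1/2 come from the Markov Dance rules):

```python
import numpy as np

T = np.array([[1/3, 1/4, 0.0],
              [1/3, 1/2, 0.5],
              [1/3, 1/4, 0.5]])

P = np.eye(3)
for n in range(1, 7):
    P = P @ T                 # P now equals T^n
    print(f"T^{n}:\n{P}\n")
# All three columns approach the same vector, about (0.176, 0.471, 0.353).
```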

A transition matrix (or corresponding Markov process) is called regular if some power of the matrix has all nonzero entries. Equivalently, there is a positive probability of eventually moving from every state to every state.

Theorem 2.5
If T is the transition matrix of a regular Markov process, then
(a) as n → ∞, T^n approaches a matrix

A = \begin{pmatrix} u_1 & u_1 & \cdots & u_1 \\ u_2 & u_2 & \cdots & u_2 \\ \vdots & \vdots & & \vdots \\ u_n & u_n & \cdots & u_n \end{pmatrix},

all of whose columns are identical;
(b) u is a probability vector all of whose components are positive.

Theorem 2.6
If T is a regular transition matrix and A and u are as above, then
(a) for any probability vector x, T^n x → u as n → ∞, so that u is a steady-state vector;
(b) u is the unique probability vector satisfying the matrix equation Tu = u.

Finding the steady-state vector
We know the steady-state vector is unique. So we use the equation it satisfies to find it: Tu = u.
This becomes a homogeneous matrix equation if you put it in the form

(T - I)u = 0

[Whiteboard: Math 20 - October 15, 2007.GWB, page 9 of 9]

Example (Skipping class)
If the transition matrix is T = \begin{pmatrix} 0.7 & 0.8 \\ 0.3 & 0.2 \end{pmatrix}, what is the steady-state vector?

Solution
We can combine the equations (T - I)u = 0 and u_1 + u_2 = 1 into a single linear system with augmented matrix

\left(\begin{array}{cc|c} -3/10 & 8/10 & 0 \\ 3/10 & -8/10 & 0 \\ 1 & 1 & 1 \end{array}\right) \longrightarrow \left(\begin{array}{cc|c} 1 & 0 & 8/11 \\ 0 & 1 & 3/11 \\ 0 & 0 & 0 \end{array}\right)

So the steady-state vector is u = \begin{pmatrix} 8/11 \\ 3/11 \end{pmatrix}. You'll go to class about 72% of the time.

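The augmented-matrix computation translates directly to code: stack (T - I) with a row of ones and solve. A sketch using least squares (the system is consistent, so the answer is exact up to rounding; names are mine):

```python
import numpy as np

T = np.array([[0.7, 0.8],
              [0.3, 0.2]])
n = T.shape[0]

A = np.vstack([T - np.eye(n), np.ones(n)])  # (T - I)u = 0 plus sum(u) = 1
b = np.append(np.zeros(n), 1.0)

u, *_ = np.linalg.lstsq(A, b, rcond=None)
print(u)                                    # [0.7273 0.2727], i.e. (8/11, 3/11)
```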

Example (The Markov Dance)
If the transition matrix is T = \begin{pmatrix} 1/3 & 1/4 & 0 \\ 1/3 & 1/2 & 1/2 \\ 1/3 & 1/4 & 1/2 \end{pmatrix}, what is the steady-state vector?

Solution
We have

\left(\begin{array}{ccc|c} -2/3 & 1/4 & 0 & 0 \\ 1/3 & -1/2 & 1/2 & 0 \\ 1/3 & 1/4 & -1/2 & 0 \\ 1 & 1 & 1 & 1 \end{array}\right) \longrightarrow \left(\begin{array}{ccc|c} 1 & 0 & 0 & 3/17 \\ 0 & 1 & 0 & 8/17 \\ 0 & 0 & 1 & 6/17 \\ 0 & 0 & 0 & 0 \end{array}\right)

So the steady-state vector is u = \begin{pmatrix} 3/17 \\ 8/17 \\ 6/17 \end{pmatrix}, which matches the limit of the powers T^n computed above.

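And the corresponding numeric check for the Markov Dance (same sketch as before, now 3 × 3):

```python
import numpy as np

T = np.array([[1/3, 1/4, 0.0],
              [1/3, 1/2, 0.5],
              [1/3, 1/4, 0.5]])
n = T.shape[0]

A = np.vstack([T - np.eye(n), np.ones(n)])  # (T - I)u = 0 plus sum(u) = 1
b = np.append(np.zeros(n), 1.0)
u, *_ = np.linalg.lstsq(A, b, rcond=None)

print(u)                        # about [0.1765 0.4706 0.3529]
print(np.allclose(T @ u, u))    # True: Tu = u
# 3/17 = 0.1765..., 8/17 = 0.4706..., 6/17 = 0.3529..., matching lim T^n.
```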