
Assignment 3

• Chapter 3: Problems 7, 11, 14
• Chapter 4: Problems 5, 6, 14
• Due date: Monday, March 15, 2004

Example

Inventory System: Inventory at a store is reviewed daily. If inventory drops below 3 units, an order is placed with the supplier; it is delivered the next day. The order size should bring the inventory position up to 6 units. Daily demand D is i.i.d. with distribution P(D = 0) = 1/3, P(D = 1) = 1/3, P(D = 2) = 1/3.

Let Xn denote the inventory level on the nth day. Is the process {Xn} a Markov chain? Assume we start with 6 units.
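A minimal simulation sketch of these dynamics (assuming, since the slide leaves the timing open, that Xn is the inventory level at the start of day n, after any overnight delivery; the function and parameter names are illustrative):

```python
import random

def simulate_inventory(days, start=6, reorder_point=3, order_up_to=6, seed=0):
    """Simulate daily inventory under the stated review policy.

    Assumption (not fixed by the slide): Xn is the inventory at the start
    of day n, after any order that was delivered overnight.
    """
    random.seed(seed)
    x = start
    levels = [x]
    for _ in range(days):
        demand = random.choice([0, 1, 2])   # D = 0, 1, 2, each with probability 1/3
        end_of_day = max(x - demand, 0)     # cannot sell more than is on hand
        if end_of_day < reorder_point:      # review: order up to 6 units
            x = order_up_to                 # arrives before the next day starts
        else:
            x = end_of_day
        levels.append(x)
    return levels

print(simulate_inventory(10))
```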

Markov Chains

• {Xn: n = 0, 1, 2, ...} is a discrete-time stochastic process.

• If Xn = i, the process is said to be in state i at time n.

• {i: i = 0, 1, 2, ...} is the state space.

• If P(Xn+1 = j | Xn = i, Xn-1 = in-1, ..., X0 = i0) = P(Xn+1 = j | Xn = i) = Pij, the process is said to be a Discrete Time Markov Chain (DTMC).

• Pij is the transition probability from state i to state j.
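To make the defining property concrete, here is a small Python sketch (the helper name and example matrix are illustrative, not from the slides) that samples a path of a DTMC; the next state is drawn using only the row of P for the current state, which is exactly the Markov property:

```python
import random

def sample_path(P, start, n_steps, seed=0):
    """Sample X0, X1, ..., Xn for a DTMC with transition matrix P (list of rows)."""
    random.seed(seed)
    path = [start]
    for _ in range(n_steps):
        i = path[-1]
        # the next state depends only on the current state i, via row P[i]
        j = random.choices(range(len(P)), weights=P[i])[0]
        path.append(j)
    return path

# illustrative two-state chain (state 0 = rain, state 1 = no rain)
P = [[0.7, 0.3],
     [0.4, 0.6]]
print(sample_path(P, start=0, n_steps=10))
```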

P: transition matrix

$$
\mathbf{P} = \begin{pmatrix}
P_{00} & P_{01} & P_{02} & \cdots \\
P_{10} & P_{11} & P_{12} & \cdots \\
\vdots & \vdots & \vdots & \\
P_{i0} & P_{i1} & P_{i2} & \cdots \\
\vdots & \vdots & \vdots &
\end{pmatrix},
\qquad P_{ij} \ge 0, \quad \sum_{j=0}^{\infty} P_{ij} = 1, \quad i = 0, 1, \ldots
$$
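A quick sanity check that a candidate P is a valid transition matrix (nonnegative entries, rows summing to 1) can be sketched as follows; the function name and tolerance are arbitrary choices:

```python
def is_stochastic(P, tol=1e-9):
    """Return True if every entry of P is nonnegative and every row sums to 1."""
    for row in P:
        if any(p < 0 for p in row):
            return False
        if abs(sum(row) - 1.0) > tol:
            return False
    return True

print(is_stochastic([[0.7, 0.3], [0.4, 0.6]]))   # True
print(is_stochastic([[0.9, 0.2], [0.5, 0.5]]))   # False: first row sums to 1.1
```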

Example 1: The probability that it will rain tomorrow depends only on whether it rains today or not:

P(rain tomorrow | rain today) = α
P(rain tomorrow | no rain today) = β

State 0 = rain, State 1 = no rain

$$
\mathbf{P} = \begin{pmatrix} \alpha & 1 - \alpha \\ \beta & 1 - \beta \end{pmatrix}
$$

Example 4: A gambler wins $1 with probability p and loses $1 with probability 1 - p. She starts with $N and quits if she reaches either $M or $0. Xn is the amount of money the gambler has after playing n rounds.

For i ≠ 0, M:

P(Xn = i+1 | Xn-1 = i, Xn-2 = in-2, ..., X0 = N) = P(Xn = i+1 | Xn-1 = i) = p
P(Xn = i-1 | Xn-1 = i, Xn-2 = in-2, ..., X0 = N) = P(Xn = i-1 | Xn-1 = i) = 1 - p

So, writing Pi,i+1 = P(Xn = i+1 | Xn-1 = i) and Pi,i-1 = P(Xn = i-1 | Xn-1 = i):

Pi,i+1 = p and Pi,i-1 = 1 - p for i ≠ 0, M
P0,0 = 1 and PM,M = 1 (0 and M are called absorbing states)
Pi,j = 0 otherwise
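A sketch of how the corresponding transition matrix could be assembled (the helper name and the values M = 4, p = 0.4 are illustrative):

```python
def gamblers_ruin_matrix(M, p):
    """Transition matrix for the gambler's ruin chain on states 0, 1, ..., M."""
    P = [[0.0] * (M + 1) for _ in range(M + 1)]
    P[0][0] = 1.0          # $0 is absorbing
    P[M][M] = 1.0          # $M is absorbing
    for i in range(1, M):
        P[i][i + 1] = p        # win $1
        P[i][i - 1] = 1 - p    # lose $1
    return P

for row in gamblers_ruin_matrix(M=4, p=0.4):
    print(row)
```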

Random walk: A Markov chain whose state space is the set of integers 0, ±1, ±2, ..., and for which Pi,i+1 = p = 1 - Pi,i-1 for all i, where 0 < p < 1, is said to be a random walk.
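A minimal simulation sketch (the values of p and the number of steps are arbitrary):

```python
import random

def random_walk(n_steps, p=0.5, seed=0):
    """Simulate a random walk started at 0: step +1 w.p. p, -1 w.p. 1 - p."""
    random.seed(seed)
    x, path = 0, [0]
    for _ in range(n_steps):
        x += 1 if random.random() < p else -1
        path.append(x)
    return path

print(random_walk(20))
```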

Chapman-Kolmogorov Equations

$$
P_{ij}^{n} = P\{X_{n+m} = j \mid X_m = i\}, \qquad n \ge 0, \; i, j \ge 0
$$

$$
P_{ij}^{1} = P_{ij}
$$

$$
P_{ij}^{n+m} = \sum_{k=0}^{\infty} P_{ik}^{n} P_{kj}^{m} \qquad \text{for all } n, m \ge 0 \text{ and all } i, j \ge 0 \qquad \text{(Chapman-Kolmogorov equations)}
$$

Proof:

$$
\begin{aligned}
P_{ij}^{n+m} &= P\{X_{n+m} = j \mid X_0 = i\} \\
&= \sum_{k=0}^{\infty} P\{X_{n+m} = j, X_n = k \mid X_0 = i\} \\
&= \sum_{k=0}^{\infty} P\{X_{n+m} = j \mid X_n = k, X_0 = i\}\, P\{X_n = k \mid X_0 = i\} \\
&= \sum_{k=0}^{\infty} P\{X_{n+m} = j \mid X_n = k\}\, P\{X_n = k \mid X_0 = i\} \\
&= \sum_{k=0}^{\infty} P_{kj}^{m} P_{ik}^{n} = \sum_{k=0}^{\infty} P_{ik}^{n} P_{kj}^{m}
\end{aligned}
$$
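A quick numerical illustration of the matrix form of these equations (my own check with NumPy, using an illustrative two-state matrix):

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

n, m = 2, 3
lhs = np.linalg.matrix_power(P, n + m)                              # P^(n+m)
rhs = np.linalg.matrix_power(P, n) @ np.linalg.matrix_power(P, m)   # P^(n) P^(m)
print(np.allclose(lhs, rhs))   # True: Chapman-Kolmogorov in matrix form
```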

$$
\mathbf{P}^{(n)} = \left[ P_{ij}^{n} \right] : \text{the matrix of } n\text{-step transition probabilities}
$$

$$
\mathbf{P}^{(n+m)} = \mathbf{P}^{(n)} \times \mathbf{P}^{(m)}
$$

(Note: if $\mathbf{A} = [a_{ij}]$ and $\mathbf{B} = [b_{ij}]$, then $\mathbf{A} \times \mathbf{B} = \left[ \sum_{k=1}^{M} a_{ik} b_{kj} \right]$.)

Example 1 (continued): The probability that it will rain tomorrow depends only on whether it rains today or not:

P(rain tomorrow | rain today) = α
P(rain tomorrow | no rain today) = β

What is the probability that it will rain four days from today, given that it is raining today? Let α = 0.7 and β = 0.4.

State 0 = rain, State 1 = no rain

What is P^4_00?

$$
\mathbf{P} = \begin{pmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{pmatrix}
$$

$$
\mathbf{P}^{(2)} = \begin{pmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{pmatrix} \times \begin{pmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{pmatrix} = \begin{pmatrix} 0.61 & 0.39 \\ 0.52 & 0.48 \end{pmatrix}
$$

$$
\mathbf{P}^{(4)} = \mathbf{P}^{(2)} \times \mathbf{P}^{(2)} = \begin{pmatrix} 0.61 & 0.39 \\ 0.52 & 0.48 \end{pmatrix} \times \begin{pmatrix} 0.61 & 0.39 \\ 0.52 & 0.48 \end{pmatrix} = \begin{pmatrix} 0.5749 & 0.4251 \\ 0.5668 & 0.4332 \end{pmatrix}
$$

So P^4_00 = 0.5749.
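The same computation can be sketched in NumPy using matrix powers:

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

P4 = np.linalg.matrix_power(P, 4)   # 4-step transition matrix
print(P4)
print(P4[0, 0])                     # P^4_00 = 0.5749
```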

Unconditional probabilities

How do we calculate P(Xn = j)?

Let αi = P(X0 = i), i ≥ 0.

$$
P(X_n = j) = \sum_{i=0}^{\infty} P(X_n = j \mid X_0 = i)\, P(X_0 = i) = \sum_{i=0}^{\infty} P_{ij}^{n}\, \alpha_i
$$
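A sketch of this computation with NumPy; the initial distribution alpha below is an arbitrary illustrative choice:

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
alpha = np.array([0.5, 0.5])   # assumed initial distribution alpha_i = P(X0 = i)

n = 4
dist_n = alpha @ np.linalg.matrix_power(P, n)   # P(Xn = j) = sum_i alpha_i P^n_ij
print(dist_n)                                   # unconditional distribution of X4
```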

Classification of States

State j is accessible from state i if P^n_ij > 0 for some n ≥ 0.

Two states that are accessible to each other are said to communicate (written i ↔ j).

Any state communicates with itself, since P^0_ii = 1.

Communicating states

If state i communicates with state j, then state j communicates with state i.

If state i communicates with state j, and state j communicates with state k, then state i communicates with state k.

Proof

If i communicates with j and j communicates with k, then there exist some n and m for which P^n_ij > 0 and P^m_jk > 0. Then

$$
P_{ik}^{n+m} = \sum_{r=0}^{\infty} P_{ir}^{n} P_{rk}^{m} \ge P_{ij}^{n} P_{jk}^{m} > 0.
$$

Classification of States (continued)

Two states that communicate are said to belong to the same class.

Two classes are either identical or disjoint (they have no communicating states).

A Markov chain is said to be irreducible if it has only one class (all states communicate with each other).

$$
\mathbf{P} = \begin{pmatrix} 1/2 & 1/2 & 0 \\ 1/2 & 1/4 & 1/4 \\ 0 & 1/3 & 2/3 \end{pmatrix}
$$

The Markov chain with transition probability matrix P is irreducible.

$$
\mathbf{P} = \begin{pmatrix} 1/2 & 1/2 & 0 & 0 \\ 1/2 & 1/2 & 0 & 0 \\ 1/4 & 1/4 & 1/4 & 1/4 \\ 0 & 0 & 0 & 1 \end{pmatrix}
$$

The classes of this Markov chain are {0, 1}, {2}, and {3}.
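A sketch of how the communicating classes could be computed programmatically (reachability through positive entries; the helpers are my own illustrations, not from the slides). Applied to the matrix above, it reproduces the classes {0, 1}, {2}, {3}:

```python
def reachable(P, i):
    """Set of states j accessible from i (P^n_ij > 0 for some n >= 0)."""
    seen, stack = {i}, [i]
    while stack:
        s = stack.pop()
        for j, p in enumerate(P[s]):
            if p > 0 and j not in seen:
                seen.add(j)
                stack.append(j)
    return seen

def classes(P):
    """Communicating classes: i and j share a class iff each is accessible from the other."""
    reach = [reachable(P, i) for i in range(len(P))]
    found = []
    for i in range(len(P)):
        cls = {j for j in range(len(P)) if j in reach[i] and i in reach[j]}
        if cls not in found:
            found.append(cls)
    return found

P = [[1/2, 1/2, 0,   0],
     [1/2, 1/2, 0,   0],
     [1/4, 1/4, 1/4, 1/4],
     [0,   0,   0,   1]]
print(classes(P))   # [{0, 1}, {2}, {3}]
```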

Recurrent and transient states

• fi: the probability that, starting in state i, the process will eventually re-enter state i.

• State i is recurrent if fi = 1.

• State i is transient if fi < 1.

• Starting in state i, the probability that the process will be in state i for exactly n periods is fi^(n-1) (1 - fi), n ≥ 1.

$$
\text{State } i \text{ is recurrent if } \sum_{n=1}^{\infty} P_{ii}^{n} = \infty \quad \text{and transient if} \quad \sum_{n=1}^{\infty} P_{ii}^{n} < \infty.
$$
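A numerical illustration of this criterion (a sketch using the gambler's ruin chain from Example 4, with the illustrative values M = 4 and p = 0.4): the partial sums of P^n_ii keep growing for the absorbing, recurrent state 0, but stay bounded for the transient interior state 2.

```python
import numpy as np

# gambler's ruin chain from Example 4, with illustrative values M = 4, p = 0.4
p, M = 0.4, 4
P = np.zeros((M + 1, M + 1))
P[0, 0] = P[M, M] = 1.0
for i in range(1, M):
    P[i, i + 1] = p
    P[i, i - 1] = 1 - p

for i in (0, 2):                # state 0 is absorbing (recurrent), state 2 is transient
    total, Pn = 0.0, np.eye(M + 1)
    for n in range(1, 201):     # partial sum of P^n_ii for n = 1, ..., 200
        Pn = Pn @ P
        total += Pn[i, i]
    print(i, round(total, 4))   # grows without bound for state 0, converges for state 2
```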

Proof

$$
I_n = \begin{cases} 1, & \text{if } X_n = i \\ 0, & \text{if } X_n \ne i \end{cases}
$$

$\sum_{n=0}^{\infty} I_n$ is the number of periods the process is in state i, given that it starts in i.

$$
E\left[ \sum_{n=0}^{\infty} I_n \,\Big|\, X_0 = i \right] = \sum_{n=0}^{\infty} E[I_n \mid X_0 = i] = \sum_{n=0}^{\infty} P\{X_n = i \mid X_0 = i\} = \sum_{n=0}^{\infty} P_{ii}^{n}
$$

So the sum is the expected number of periods spent in state i. If i is recurrent, the process re-enters i indefinitely, so this expectation is infinite; if i is transient, the number of visits is geometric with finite mean 1/(1 - fi).


• Not all states can be transient.

• If state i is recurrent, and state i communicates with state j, then state j is recurrent.

Proof

Since i ↔ j, there exist k and m for which P^k_ij > 0 and P^m_ji > 0. For any n,

$$
P_{jj}^{m+n+k} \ge P_{ji}^{m} P_{ii}^{n} P_{ij}^{k},
$$

so

$$
\sum_{n=1}^{\infty} P_{jj}^{m+n+k} \ge P_{ji}^{m} P_{ij}^{k} \sum_{n=1}^{\infty} P_{ii}^{n} = \infty.
$$

Hence the sum of P^n_jj over n is also infinite, and state j is recurrent.

• If state i is recurrent, and state i communicates with state j, then state j is recurrent: recurrence is a class property.

• Not all states can be transient.

• If state i is transient, and state i communicates with state j, then state j is transient: transience is also a class property.

• All states in an irreducible Markov chain are recurrent.

$$
\mathbf{P} = \begin{pmatrix} 0 & 0 & 1/2 & 1/2 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix}
$$

All states communicate; therefore all states are recurrent.

$$
\mathbf{P} = \begin{pmatrix}
1/2 & 1/2 & 0 & 0 & 0 \\
1/2 & 1/2 & 0 & 0 & 0 \\
0 & 0 & 1/2 & 1/2 & 0 \\
0 & 0 & 1/2 & 1/2 & 0 \\
1/4 & 1/4 & 0 & 0 & 1/2
\end{pmatrix}
$$

There are three classes: {0, 1}, {2, 3}, and {4}. The first two are recurrent and the third is transient.
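For a finite-state chain, a communicating class is recurrent exactly when it is closed (no positive-probability transition leaves the class) and transient otherwise. A self-contained sketch (the helpers are my own illustrations) that classifies the classes of the matrix above:

```python
def reachable(P, i):
    """States j with P^n_ij > 0 for some n >= 0 (same helper as in the earlier sketch)."""
    seen, stack = {i}, [i]
    while stack:
        s = stack.pop()
        for j, p in enumerate(P[s]):
            if p > 0 and j not in seen:
                seen.add(j)
                stack.append(j)
    return seen

def classify(P):
    """Split states into communicating classes and mark each class recurrent
    (closed) or transient (it can be left); valid for finite chains."""
    n = len(P)
    reach = [reachable(P, i) for i in range(n)]
    out, done = [], set()
    for i in range(n):
        if i in done:
            continue
        cls = {j for j in range(n) if j in reach[i] and i in reach[j]}
        done |= cls
        closed = all(reach[j] <= cls for j in cls)   # no way to leave the class
        out.append((sorted(cls), "recurrent" if closed else "transient"))
    return out

P = [[1/2, 1/2, 0,   0,   0],
     [1/2, 1/2, 0,   0,   0],
     [0,   0,   1/2, 1/2, 0],
     [0,   0,   1/2, 1/2, 0],
     [1/4, 1/4, 0,   0,   1/2]]
print(classify(P))   # [([0, 1], 'recurrent'), ([2, 3], 'recurrent'), ([4], 'transient')]
```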