Markov chain intro

Introduction to Markov chains

Examples of Markov chains:
- Random walk on a line
- Random walk on a graph
- Random walk on a hypercube
- Markov chain on graph colorings
- Markov chain on matchings of a graph
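To make the random-walk examples concrete, here is a minimal sketch (not from the notes; the graph is an arbitrary 4-cycle) of a random walk on an undirected graph, where each step moves to a uniformly random neighbor of the current vertex.

# Sketch: simulate a random walk on an undirected graph.
import random

def random_walk(adj, start, steps):
    """adj: dict mapping each vertex to the list of its neighbors."""
    x = start
    path = [x]
    for _ in range(steps):
        x = random.choice(adj[x])   # move to a uniformly random neighbor
        path.append(x)
    return path

# Illustrative graph: a 4-cycle
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(random_walk(adj, start=0, steps=10))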

Definition of Markov chains
- State space Ω
- Transition matrix P, of size |Ω| × |Ω|
- P(x,y) is the probability of getting from state x to state y
- For every state x, ∑_{y∈Ω} P(x,y) = 1

This course: the state space is finite.
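As an illustration of the definition (a sketch, not part of the notes; the 3-state matrix is an arbitrary example), a finite chain can be stored as a |Ω| × |Ω| row-stochastic matrix and simulated by sampling each next state from the row of the current state.

# Sketch: a transition matrix P over states {0, 1, 2}; row x is the
# distribution of the next state when the chain is at state x.
import random

P = [
    [0.5, 0.5, 0.0],
    [0.1, 0.0, 0.9],
    [0.3, 0.3, 0.4],
]

assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)  # each row sums to 1

def step(x):
    # sample the next state y with probability P[x][y]
    return random.choices(range(len(P)), weights=P[x])[0]

x = 0
for _ in range(5):
    x = step(x)
    print(x)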

Example: frog on lily pads

Stationary distribution π (on states in Ω): a distribution π with π P = π, i.e., π(y) = ∑_{x∈Ω} π(x) P(x,y) for every y ∈ Ω.

Example: frog with unfair coins
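The frog examples are not spelled out in these notes. As a hypothetical version of the unfair-coin chain, suppose the frog sits on one of two pads and jumps from pad 0 with probability p and from pad 1 with probability q each morning; then the stationary distribution is π = (q/(p+q), p/(p+q)), which the sketch below checks numerically (p and q are illustrative values).

# Sketch: assumed two-pad frog chain with unfair coins.
p, q = 0.3, 0.8   # jump probabilities from pad 0 and pad 1 (illustrative)
P = [[1 - p, p],
     [q, 1 - q]]

# Candidate stationary distribution pi = (q/(p+q), p/(p+q))
pi = [q / (p + q), p / (p + q)]

# Check pi P = pi, i.e. sum_x pi[x] * P[x][y] == pi[y] for each state y
for y in range(2):
    total = sum(pi[x] * P[x][y] for x in range(2))
    print(y, total, pi[y])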

Other examples:

Definitions:
- aperiodic: for every state x, the gcd of { t ≥ 1 : P^t(x,x) > 0 } is 1
- irreducible: every state can be reached from every other state in some number of steps
- ergodic: a MC that is both aperiodic and irreducible

Thm (3.6 in Jerrum): An ergodic MC has a unique stationary distribution π. Moreover, the distribution of the chain converges to π as time goes to ∞.
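To see the theorem in action, here is a minimal sketch (not from the notes; the 3-state chain is an arbitrary ergodic example) of power iteration: starting from any distribution and repeatedly applying P approaches the same, unique stationary distribution.

# Sketch: power iteration on an ergodic 3-state chain.
def step_distribution(mu, P):
    """One step: the distribution mu becomes mu P."""
    n = len(P)
    return [sum(mu[x] * P[x][y] for x in range(n)) for y in range(n)]

P = [
    [0.5, 0.5, 0.0],   # self-loop at state 0 makes the chain aperiodic
    [0.1, 0.0, 0.9],
    [0.3, 0.3, 0.4],
]

mu = [1.0, 0.0, 0.0]          # start deterministically in state 0
for t in range(200):
    mu = step_distribution(mu, P)
print(mu)                      # approximately the unique stationary distribution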

Gambler’s Ruin
A gambler has k dollars initially and bets on (fair) coin flips. If she guesses the coin flip correctly, she gains 1 dollar; if not, she loses 1 dollar. She stops when she has either 0 or n dollars.

Is it a Markov chain?

How many bets until she stops playing? What are the probabilities of ending with 0 vs. n dollars?

Stationary distribution?

Starting with k dollars, what is the probability of reaching n?

What is the probability of reaching 0?

How many bets until 0 or n?
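For a fair coin, the standard answers are that the gambler reaches n before 0 with probability k/n, and the expected number of bets is k(n - k). The Monte Carlo sketch below is not from the notes (k, n, and the trial count are illustrative); it just estimates both quantities by simulation.

# Sketch: Monte Carlo estimates for Gambler's Ruin with a fair coin.
import random

def play(k, n):
    """Return (reached_n, number_of_bets) for one game starting with k dollars."""
    x, bets = k, 0
    while 0 < x < n:
        x += 1 if random.random() < 0.5 else -1
        bets += 1
    return x == n, bets

k, n, trials = 3, 10, 20000
results = [play(k, n) for _ in range(trials)]
win_prob = sum(w for w, _ in results) / trials
avg_bets = sum(b for _, b in results) / trials
print(win_prob, "vs k/n =", k / n)            # about 0.3
print(avg_bets, "vs k*(n-k) =", k * (n - k))  # about 21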

Coupon Collecting
- n coupons
- each time, get a coupon at random, chosen uniformly from 1,…,n (i.e., some coupons will likely be received multiple times)
- how long to get all n coupons?

Is this a Markov chain?

How long to get all n coupons?

Proposition: Let τ be the random variable for the time needed to collect all n coupons. Then

Pr(τ > ⌈n log n + cn⌉) ≤ e^(-c).
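A minimal simulation sketch (not from the notes; n and the number of trials are illustrative): the empirical mean of τ is close to n·H_n ≈ n log n, and the empirical tail agrees with the bound in the proposition.

# Sketch: simulate coupon collecting and compare with n*log(n).
import math
import random

def collect(n):
    """Number of draws needed to see all n coupon types."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        draws += 1
    return draws

n, trials = 50, 2000
times = [collect(n) for _ in range(trials)]
print(sum(times) / trials, "vs n log n =", n * math.log(n))

# Empirical check of the tail bound Pr(tau > ceil(n log n + c n)) <= e^(-c)
c = 1.0
threshold = math.ceil(n * math.log(n) + c * n)
tail = sum(t > threshold for t in times) / trials
print(tail, "<=", math.exp(-c))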

Random Walk on a Hypercube
How many expected steps until we get to a random state? (I.e., what is the mixing time?)
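Sketch under an assumption not stated in the notes: take the "coordinate refresh" version of the walk on {0,1}^n, which at each step picks a uniformly random coordinate and resamples it as a fresh uniform bit (equivalent to the lazy walk that flips the chosen coordinate with probability 1/2). Once every coordinate has been refreshed at least once, the state is exactly uniform, so by the coupon-collector bound above roughly n log n steps suffice.

# Sketch: "random coordinate refresh" walk on the hypercube {0,1}^n.
import random

def steps_until_uniform(n):
    """Steps until every coordinate has been resampled at least once."""
    state = [0] * n
    refreshed = set()
    steps = 0
    while len(refreshed) < n:
        i = random.randrange(n)
        state[i] = random.randrange(2)   # resample coordinate i uniformly
        refreshed.add(i)
        steps += 1
    return steps

n, trials = 20, 2000
print(sum(steps_until_uniform(n) for _ in range(trials)) / trials)  # about n*log(n)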
