Guided by: Malaram Sir
Prepared by: Chinmay Patel [09BCE038]

Markov Chain Model


Page 1: Markov Chain Model

8/3/2019 Markov Chain Model

http://slidepdf.com/reader/full/markov-chain-model 1/15

Page 2: Markov Chain Model

A Markov chain, named after Andrey Markov, is a mathematical system that undergoes transitions from one state to another among a finite or countable number of possible states.

It is a random process characterized as memoryless: the next state depends only on the current state, not on the sequence of events that preceded it. This specific kind of "memorylessness" is called the Markov property.

Markov chains have many applications as statistical models of real-world processes.

Page 3: Markov Chain Model

The state of the system at time t+1 depends only on the state of the system at time t. Formally, P(X(t+1) = x | X(1), ..., X(t)) = P(X(t+1) = x | X(t)).

X1 -> X2 -> X3 -> X4 -> X5

(Diagram: a chain of states, each depending only on its predecessor.)

Since the system changes randomly, it is generally impossible to predict with certainty the state of a Markov chain at a given point in the future. However, the statistical properties of the system's future can be predicted. In many applications, it is these statistical properties that are important.
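This point can be sketched in a short simulation. The two-state chain below is a hypothetical example invented for illustration (the states "A"/"B" and the 0.9 persistence probability are not from the slides): any single step is unpredictable, but the long-run visit frequencies settle near a fixed distribution.

```python
import random

def simulate_chain(transitions, start, steps, rng):
    """Run one trajectory of a finite Markov chain.

    `transitions` maps each state to a list of (next_state, probability)
    pairs; at every step the next state is drawn from the current
    state's row only (the Markov property)."""
    state = start
    counts = {s: 0 for s in transitions}
    for _ in range(steps):
        r = rng.random()
        cumulative = 0.0
        for nxt, p in transitions[state]:
            cumulative += p
            if r < cumulative:
                state = nxt
                break
        counts[state] += 1
    return counts

# Hypothetical symmetric two-state chain: each state persists with
# probability 0.9 and switches with probability 0.1.
transitions = {"A": [("A", 0.9), ("B", 0.1)],
               "B": [("B", 0.9), ("A", 0.1)]}

rng = random.Random(0)
counts = simulate_chain(transitions, "A", 100_000, rng)
# Any single step is uncertain, but the long-run fraction of time
# spent in each state settles near 1/2 for this symmetric chain.
frac_a = counts["A"] / 100_000
```

The exact state at step 100,000 cannot be predicted, yet `frac_a` reliably lands close to 0.5.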

Page 4: Markov Chain Model

There are two ways of describing Markov chains: through state transition diagrams or as simple graphical models.

The changes of state of the system are called transitions, and the probabilities associated with the various state changes are called transition probabilities.

A transition diagram is a directed graph over the possible states, where the arcs between states specify all allowed transitions (those occurring with non-zero probability).

One can also represent a Markov chain as a transition matrix.

Page 5: Markov Chain Model

Weather example:

If raining today:      40% rain tomorrow, 60% no rain tomorrow
If not raining today:  20% rain tomorrow, 80% no rain tomorrow

A simple two-state Markov chain, represented by a transition diagram over the states {rain, no rain} with arcs 0.4 (rain -> rain), 0.6 (rain -> no rain), 0.2 (no rain -> rain) and 0.8 (no rain -> no rain), or equivalently by the transition matrix

    P = | 0.4  0.6 |
        | 0.2  0.8 |
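Multi-day forecasts follow by chaining one-day transitions, i.e. taking powers of the transition matrix. A minimal sketch with plain nested lists (no external libraries assumed), using the rain/no-rain probabilities stated on this slide:

```python
# Transition matrix from the slide, rows/columns ordered (rain, no rain).
P = [[0.4, 0.6],
     [0.2, 0.8]]

def mat_mul(a, b):
    """Multiply two square matrices stored as nested lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Chaining two one-day transitions gives the two-day forecast: P squared.
P2 = mat_mul(P, P)
# P2[0][0] is the chance of rain two days from now given rain today:
# 0.4*0.4 + 0.6*0.2 = 0.28.
```

Each row of P2 still sums to 1, as any valid transition matrix must.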

Page 6: Markov Chain Model

In graphical models, on the other hand, one focuses on explicating variables and their dependencies.

At each time point the random walk is in a particular state X(t). This is a random variable; its value is affected only by the random variable X(t-1), which specifies the state of the random walk at the previous time point.

Graphically, we can therefore draw a sequence of random variables, where arcs specify how the values of the variables are influenced by (depend on) others:

X(t-1) -> X(t) -> X(t+1)

Page 7: Markov Chain Model

A game of snakes and ladders, or any other game whose moves are determined entirely by dice, is a Markov chain. In such dice games, the only thing that matters is the current state of the board: the next state depends on the current state and the next roll of the dice, not on how things got to their current state.

But in a game such as blackjack, a player can gain an advantage by remembering which cards have already been shown (and hence which cards are no longer in the deck), so the next state (or hand) of the game is not independent of the past states.

Page 8: Markov Chain Model

A famous Markov chain is the so-called "drunkard's walk", a random walk on the number line where, at each step, the position may change by +1 or -1 with equal probability.

For example, the transition probabilities from 5 to 4 and from 5 to 6 are both 0.5, and all other transition probabilities from 5 are 0. These probabilities are independent of whether the system was previously in 4 or 6.
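The drunkard's walk takes only a few lines to simulate; the step count and random seed below are arbitrary illustrative choices:

```python
import random

def drunkards_walk(steps, rng):
    """Random walk on the number line: each step is +1 or -1 with
    equal probability, chosen independently of all earlier steps."""
    position = 0
    for _ in range(steps):
        position += rng.choice((-1, 1))
    return position

rng = random.Random(42)
final_position = drunkards_walk(1000, rng)
```

After an even number of +/-1 steps the position is always an even integer, whatever the random outcomes.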

Page 9: Markov Chain Model

Discrete-time Markov chain: one in which the system evolves through discrete time steps, so changes to the system can only happen at one of those discrete time values. E.g. snakes and ladders.

Continuous-time Markov chain: one in which changes to the system can happen at any time along a continuous interval. An example is the number of cars that have visited a drive-through at a local fast-food restaurant during the day: a car can arrive at any time t rather than only at discrete time intervals.
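The drive-through example can be sketched as a Poisson arrival process, the standard continuous-time model for such counts; the arrival rate and seed below are invented for illustration:

```python
import random

def car_arrival_times(rate_per_hour, hours, rng):
    """Sample arrival times of cars at a drive-through as a Poisson
    process: the gaps between cars are exponentially distributed, so
    an arrival can happen at any instant, not only at discrete ticks."""
    times = []
    t = rng.expovariate(rate_per_hour)
    while t < hours:
        times.append(t)
        t += rng.expovariate(rate_per_hour)
    return times

rng = random.Random(1)
# Hypothetical shop: about 10 cars per hour over an 8-hour day.
arrivals = car_arrival_times(rate_per_hour=10, hours=8, rng=rng)
```

Each arrival time is a real number in [0, 8), underlining the contrast with the discrete-time chain, whose state only changes at integer steps.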

Page 10: Markov Chain Model

Ergodic (or irreducible) Markov chain: a Markov chain with the property that the complete set of states S is itself irreducible; equivalently, one can go from any state in S to any other state in S in a finite number of steps.

Absorbing Markov chain: a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that, once entered, cannot be left.
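A standard quantity for absorbing chains (not covered on the slide) is the expected number of steps until absorption. As a hedged sketch, the snippet below uses an invented example, a +1/-1 random walk on the states 0..4 with both endpoints absorbing, and solves the resulting linear system by simple iteration:

```python
def expected_absorption_times(n, sweeps=10_000):
    """Expected number of steps until absorption for a +1/-1 random
    walk on states 0..n, where 0 and n are absorbing.  Solves the
    linear system t(i) = 1 + 0.5*t(i-1) + 0.5*t(i+1), t(0) = t(n) = 0,
    by repeated in-place sweeps (Gauss-Seidel iteration)."""
    t = [0.0] * (n + 1)
    for _ in range(sweeps):
        for i in range(1, n):
            t[i] = 1.0 + 0.5 * (t[i - 1] + t[i + 1])
    return t

# For this walk the known closed form is t(i) = i * (n - i),
# so with n = 4 we expect [0, 3, 4, 3, 0].
times = expected_absorption_times(4)
```

The iterative result matches the closed form, which gives a quick sanity check on the solver.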

Page 11: Markov Chain Model

Markov chains are applied in a number of ways to many different fields. Often they are used as a mathematical model of some random physical process.

Markovian systems appear extensively in thermodynamics and statistical mechanics, whenever probabilities are used to represent unknown or unmodelled details of the system.

Markov chain methods have also become very important for generating sequences of random numbers that accurately reflect very complicated desired probability distributions, via a process called Markov chain Monte Carlo (MCMC).
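As a minimal MCMC illustration, here is a sketch of a Metropolis sampler over three states whose target weights (1 : 2 : 3) are invented for the example; real MCMC applications target far more complicated distributions, but the mechanism is the same:

```python
import random

def metropolis_frequencies(weights, steps, rng):
    """Metropolis sampler over states 0..len(weights)-1 whose target
    distribution is proportional to `weights`.  Proposes a uniformly
    random state and accepts it with probability min(1, w_new/w_old);
    the chain's long-run visit frequencies approximate the target."""
    n = len(weights)
    state = 0
    counts = [0] * n
    for _ in range(steps):
        proposal = rng.randrange(n)
        if rng.random() < min(1.0, weights[proposal] / weights[state]):
            state = proposal
        counts[state] += 1
    return [c / steps for c in counts]

rng = random.Random(0)
# Invented target: probabilities proportional to 1 : 2 : 3,
# i.e. roughly [1/6, 2/6, 3/6].
freqs = metropolis_frequencies([1.0, 2.0, 3.0], 200_000, rng)
```

Note that only ratios of weights are needed, which is exactly why MCMC works when the target distribution is known only up to a normalizing constant.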

Page 12: Markov Chain Model

Markov chains are the basis for the analytical treatment of queues (queueing theory). This makes them critical for optimizing the performance of telecommunications networks, where messages must often compete for limited resources (such as bandwidth).

Markov chains are employed in algorithmic music composition, particularly in software such as Csound, Max or SuperCollider.

Markov chains are used in finance and economics to model a variety of different phenomena, including asset prices and market crashes.

Page 13: Markov Chain Model

Markov chains can be used to project populations in smaller geopolitical areas.

They can be used to forecast election results from current conditions.

The ranking of webpages generated by Google is defined via a 'random surfer' algorithm (a Markov process).

Markov models have also been used to analyze the web navigation behavior of users. A user's web link transitions on a particular website can be modeled using Markov models, and can be used to make predictions about future navigation and to personalize the web page for an individual user.
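The 'random surfer' idea can be sketched as power iteration on a tiny link graph. The three pages and the 0.85 damping value below are illustrative assumptions; this is a toy sketch of the PageRank-style computation, not Google's actual implementation:

```python
def pagerank(links, damping=0.85, iterations=100):
    """Toy 'random surfer' ranking by power iteration: with probability
    `damping` the surfer follows a random outgoing link of the current
    page, otherwise jumps to a page chosen uniformly at random.
    `links` maps each page to the pages it links to (assumed non-empty)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: both "a" and "b" link to "c",
# so "c" should come out ranked highest.
ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
```

The ranks form the stationary distribution of the surfer's Markov chain: the long-run fraction of time the surfer spends on each page.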

Page 14: Markov Chain Model

Markov chains: models, algorithms and applications, by Wai Ki Ching and Michael K. Ng

en.wikipedia

ocw.mit

math.ucf

math.colgate

math.stackexchange
