Markov Process Model
By: Kanta Anuraga Sahoo (220301120178)
CONTENTS
- Markov process model
- Markov chains
- Hidden Markov models
- Limit states of a Markov process model
- Examples
MARKOV PROCESS MODEL
A Markov model is a probabilistic approach to randomly changing systems that satisfy the Markov property. The Markov property states that, at any point in time, the next state depends only on the present state and is independent of all previous states. Two commonly applied types of Markov model are Markov chains and hidden Markov models.
MARKOV PROCESS
A Markov process is a stochastic process in which:
- the set of possible outcomes is finite;
- the probability of the next outcome depends only on the current outcome;
- the transition probabilities are constant over time.
MARKOV CHAINS
These are the simplest type of Markov model and are used to represent systems in which all states are observable. A Markov chain specifies all possible states, the transitions between them, and the transition rates, i.e. the probability of moving from one state to another per unit of time. The defining characteristic of a Markov chain is that, no matter how the process arrived at its present state, the possible future states and their probabilities are fixed. A game of snakes and ladders, blackjack, and other card games are examples of Markov chains.
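As a concrete illustration (not part of the original slides), here is a minimal Python sketch of a two-state Markov chain. The states "Sunny"/"Rainy" and the transition probabilities in P are hypothetical values chosen only for the example.

```python
import numpy as np

# Hypothetical states and transition matrix (illustrative values).
# Row i gives the probabilities of moving from state i to each state
# in one step; each row sums to 1.
states = ["Sunny", "Rainy"]
P = np.array([
    [0.8, 0.2],   # Sunny -> Sunny, Sunny -> Rainy
    [0.4, 0.6],   # Rainy -> Sunny, Rainy -> Rainy
])

def simulate(start, steps, rng=np.random.default_rng(0)):
    """Simulate the chain: the next state is drawn using only the
    current state's row of P (the Markov property)."""
    path = [start]
    state = start
    for _ in range(steps):
        state = rng.choice(len(states), p=P[state])
        path.append(state)
    return [states[s] for s in path]

print(simulate(start=0, steps=10))
```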
HIDDEN MARKOV MODELS
These serve as representations for systems with states that are not directly observable. In addition to states and transition rates, a hidden Markov model specifies the possible observations and the observation likelihoods for each state. An example of a hidden Markov model is predicting the weather (the hidden variable) from the type of clothes that someone wears (the observation).
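A hedged sketch of the weather/clothing example as a hidden Markov model, scored with the forward algorithm. All probabilities (start, trans, emit) are illustrative assumptions, not values given in the slides.

```python
import numpy as np

# Hidden states: weather; observations: clothing.
hidden = ["Sunny", "Rainy"]
obs_names = ["T-shirt", "Raincoat"]

start = np.array([0.6, 0.4])           # initial weather probabilities
trans = np.array([[0.7, 0.3],           # weather -> weather transitions
                  [0.4, 0.6]])
emit = np.array([[0.9, 0.1],            # P(clothing | weather)
                 [0.2, 0.8]])

def forward(observations):
    """Forward algorithm: probability of an observation sequence,
    summing over all possible hidden weather sequences."""
    alpha = start * emit[:, observations[0]]
    for o in observations[1:]:
        alpha = (alpha @ trans) * emit[:, o]
    return alpha.sum()

# Probability of observing T-shirt, T-shirt, Raincoat
print(forward([0, 0, 1]))
```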
LIMIT STATES OF A MARKOV PROCESS MODEL
The limiting behaviour of a Markov chain is described by a stationary distribution. If $\pi = [\pi_1, \pi_2, \dots]$ is a limiting distribution for a Markov chain with transition matrix $P$ and initial distribution $\pi^{(0)}$, then
$$\pi = \lim_{n \to \infty} \pi^{(n)} = \lim_{n \to \infty} \pi^{(0)} P^{n}.$$
Similarly,
$$\pi = \lim_{n \to \infty} \pi^{(n+1)} = \lim_{n \to \infty} \pi^{(0)} P^{n+1} = \Big[\lim_{n \to \infty} \pi^{(0)} P^{n}\Big] P = \pi P,$$
so the limiting distribution satisfies the stationarity equation $\pi = \pi P$.
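The limiting distribution can be approximated numerically by repeatedly multiplying a starting distribution by P (power iteration), as in the following sketch; the matrix values are again assumed for illustration.

```python
import numpy as np

# Illustrative transition matrix (assumed values).
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

# Power iteration: pi^(n) = pi^(0) P^n converges to the limiting
# distribution for a regular chain, regardless of pi^(0).
pi = np.array([1.0, 0.0])   # arbitrary starting distribution pi^(0)
for _ in range(1000):
    pi = pi @ P

print(pi)        # limiting distribution, here approximately [2/3, 1/3]
print(pi @ P)    # stationarity check: pi P equals pi
```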
EXAMPLES OF MARKOV MODELS
The "drunkard's walk" is an example of a Markov model: a random walk on the number line in which, at each step, the position changes by +1 or -1 with equal probability. The "gambler's ruin" problem is another example, and can be stated as follows: a gambler playing a game with negative expected value will eventually go broke, regardless of their betting system.
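A short simulation sketch of both examples; the parameters (number of steps, win probability, bankroll and goal) are hypothetical choices for illustration.

```python
import random

def drunkards_walk(steps, seed=0):
    """Random walk on the integers: each step is +1 or -1 with equal
    probability, independent of how the walker got to its position."""
    random.seed(seed)
    position = 0
    for _ in range(steps):
        position += random.choice([+1, -1])
    return position

def gamblers_ruin(bankroll, goal, p_win=0.48, seed=0):
    """Bet 1 unit per round until the gambler is broke or reaches the
    goal; p_win < 0.5 models a game with negative expected value."""
    random.seed(seed)
    while 0 < bankroll < goal:
        bankroll += 1 if random.random() < p_win else -1
    return bankroll   # 0 (ruin) or goal

print(drunkards_walk(1000))
print(gamblers_ruin(bankroll=10, goal=50))
```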