## Markov Chains

The probability model for a sequence of trials in which the probability of ending any trial in a given state depends only on the outcome of the preceding trial is called a Markov chain. Formally:

A Markov chain is a random process $\{X_n : n = 0, 1, 2, \dots\}$ where the random variables $X_n$ are integer valued. For a simple two-state process with states 0 and 1, the transition probabilities defining the process are given by

$$P(X_{n+1} = 1 \mid X_n = 0) = p, \qquad P(X_{n+1} = 0 \mid X_n = 1) = q,$$

so that $P(X_{n+1} = 0 \mid X_n = 0) = 1 - p$ and $P(X_{n+1} = 1 \mid X_n = 1) = 1 - q$.

The transition probabilities may be written in matrix form:

$$P = \begin{pmatrix} 1 - p & p \\ q & 1 - q \end{pmatrix}$$

$P$ is called the transition matrix. In this matrix the rows indicate the current state $X_n$ and the columns indicate the next state $X_{n+1}$.

Over any period of time any realization of a Markov chain will exhibit a number of 0s and 1s. In the long term the proportion of 0s and 1s will be $q/(p+q)$ and $p/(p+q)$ respectively. If $\pi_0 = q/(p+q)$ and $\pi_1 = p/(p+q)$, the chain is said to be stationary since

$$(\pi_0, \pi_1) \begin{pmatrix} 1 - p & p \\ q & 1 - q \end{pmatrix} = (\pi_0, \pi_1).$$

Notice the transition matrix acts on row vectors.
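This can be checked numerically with a short sketch; the values p = 0.3 and q = 0.2 below are illustrative assumptions, not from the text.

```python
# Two-state chain with assumed illustrative values p = 0.3, q = 0.2.
p, q = 0.3, 0.2

# Transition matrix: rows are the current state, columns the next state.
P = [[1 - p, p],
     [q, 1 - q]]

# Candidate stationary distribution (q/(p+q), p/(p+q)) as a row vector.
pi = [q / (p + q), p / (p + q)]

def step(row, matrix):
    """Multiply a row vector by a matrix (the matrix acts on row vectors)."""
    return [sum(row[i] * matrix[i][j] for i in range(len(row)))
            for j in range(len(matrix[0]))]

# A stationary distribution satisfies pi P = pi.
pi_next = step(pi, P)
print(all(abs(a - b) < 1e-12 for a, b in zip(pi, pi_next)))  # → True
```

Here $\pi = (0.4, 0.6)$, and multiplying by $P$ returns the same row vector, confirming stationarity for these values.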

### Examples

A famous Markov chain is the drunk's walk, a random walk on the number line where, at each step, the position may change by +1 or −1 with equal probability. From any position there are two possible transitions, to the next or previous integer. The transition probabilities depend only on the current position, not on the way the position was reached. For example, the transition probabilities from 5 to 4 and 5 to 6 are both 0.5, and all other transition probabilities from 5 are 0. These probabilities are independent of whether the system was previously in 4 or 6.
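The walk described above can be simulated in a few lines; the step count and seed here are arbitrary choices for illustration.

```python
import random

def drunkards_walk(steps, seed=None):
    """Simulate a random walk on the integers starting at 0.

    At each step the position moves +1 or -1 with equal probability;
    the next position depends only on the current one, not on the path
    taken to reach it.
    """
    rng = random.Random(seed)
    position = 0
    path = [position]
    for _ in range(steps):
        position += rng.choice([-1, 1])
        path.append(position)
    return path

path = drunkards_walk(10, seed=42)
print(path)
```

Every consecutive pair of positions in the returned path differs by exactly 1, reflecting the two possible transitions from any state.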

Another example is the dietary habits of a creature that eats only grapes, cheese, or lettuce, and whose dietary habits conform to the following rules: it eats exactly once a day. If it ate cheese yesterday, it will not eat cheese today, and it will eat lettuce or grapes with equal probability. If it ate grapes yesterday, it will eat grapes today with probability 1/10, cheese with probability 4/10, and lettuce with probability 5/10. Finally, if it ate lettuce yesterday, it won't eat lettuce again today but will eat grapes with probability 4/10 or cheese with probability 6/10. This creature's eating habits can be modeled with a Markov chain since its choice today depends on what it ate yesterday, not additionally on what it ate 2 or 3 (or 4, etc.) days ago. One statistical property one could calculate is the expected percentage of the time the creature will eat grapes over a long period.
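Assuming the standard version of this example (after cheese: lettuce or grapes with probability 1/2 each; after grapes: grapes 1/10, cheese 4/10, lettuce 5/10; after lettuce: grapes 4/10, cheese 6/10), the long-run fraction of grape days can be estimated by repeatedly applying the transition matrix to a row vector:

```python
# States: 0 = grapes, 1 = cheese, 2 = lettuce.
# Transition matrix (rows: what was eaten yesterday, columns: today),
# using the probabilities assumed in the lead-in above.
P = [[0.1, 0.4, 0.5],   # after grapes
     [0.5, 0.0, 0.5],   # after cheese
     [0.4, 0.6, 0.0]]   # after lettuce

def step(row, matrix):
    """Row vector times matrix."""
    return [sum(row[i] * matrix[i][j] for i in range(len(row)))
            for j in range(len(matrix))]

# Power iteration: start from any distribution and apply P until it settles.
pi = [1.0, 0.0, 0.0]
for _ in range(1000):
    pi = step(pi, P)

print(f"long-run fraction of grape days: {pi[0]:.4f}")  # → 0.3333
```

With these particular numbers each column of the matrix also sums to 1 (the matrix is doubly stochastic), so the stationary distribution is uniform and the creature eats grapes about a third of the time.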