Definition

Let $T$ be a subset of $[0, \infty)$. A family of random variables $\{X_t\}$ indexed by $t \in T$ is a stochastic or random process. When $T = \{0, 1, 2, \ldots\}$ or another countable set it is a discrete time process, and when $T = [0, \infty)$ it is a continuous time process.

When $T$ consists of a single point the process is a single random variable, and when $T$ is finite the process is a random vector $(x_1, \ldots, x_n)$; the entry $x_i$ in the $i$th position indicates the $i$th outcome or state, corresponding to time $t_i$. The vector represents the evolution of the system as time passes. When $T$ is finite or countable the changes occur discretely, and when $T = [0, \infty)$ the changes may occur at any instant.

For the simple birth and death process with population size $x$, the probability of a birth in a small interval of length $\delta t$ is $\lambda x\,\delta t$ and the probability of a death is $\mu x\,\delta t$.

This means that, considering only the instants at which the population changes, we have a simple random walk with $p = \frac{\lambda}{\lambda + \mu}$ and $q = \frac{\mu}{\lambda + \mu}$.

If $x$ ever becomes zero, it stays zero permanently and the population is extinct. The problem is now a gambler's ruin problem. Using results from the gambler's ruin problem we can say that if the initial population is $x_0$ then the probability of extinction is $\left(\frac{\mu}{\lambda}\right)^{x_0}$ if $\lambda > \mu$, and 1 if $\lambda \le \mu$.

The expected number of points (births and deaths) before the population goes extinct is $\frac{x_0(\lambda + \mu)}{\mu - \lambda}$ if $\mu > \lambda$. The expected time to extinction is infinite if $\lambda \ge \mu$.
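The extinction probability can be checked by simulating the embedded random walk directly. The Python sketch below uses illustrative rates ($\lambda = 3$, $\mu = 2$, $x_0 = 2$); the step cap is an assumption to keep every run finite:

```python
import random

def extinction_probability(lam, mu, x0, n_runs=10000, max_steps=1000, seed=1):
    """Estimate the extinction probability of the simple birth and death
    process by simulating its embedded random walk: each change in the
    population is a birth with probability lam/(lam + mu), else a death."""
    rng = random.Random(seed)
    p_birth = lam / (lam + mu)
    extinct = 0
    for _ in range(n_runs):
        x = x0
        for _ in range(max_steps):
            x += 1 if rng.random() < p_birth else -1
            if x == 0:              # state 0 is absorbing: the population is extinct
                extinct += 1
                break
    return extinct / n_runs

# With lam = 3, mu = 2 and x0 = 2, theory predicts (mu/lam)**x0 = 4/9
print(extinction_probability(3.0, 2.0, 2))
```

Walks that drift upward for the full step cap are counted as surviving; since extinction, when it happens, almost always happens quickly here, the truncation bias is negligible.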

If a type of event occurs at the rate $\lambda$, then in a small interval of length $\delta t$ we have $P(\text{exactly one event}) = \lambda\,\delta t + o(\delta t)$ and $P(\text{more than one event}) = o(\delta t)$, so that $P(\text{no events}) = 1 - \lambda\,\delta t + o(\delta t)$.

Suppose events occur as described above, and let $N(t)$ be the number of events in $[0, t]$, with $p_n(t) = P(N(t) = n)$; then $N(t)$ has a Poisson distribution with parameter $\lambda t$.

Let $p_0(t)$ denote the probability of no events in the time interval $[0, t]$; then
$$p_0(t + \delta t) = p_0(t)(1 - \lambda\,\delta t)$$

Therefore
$$\frac{p_0(t + \delta t) - p_0(t)}{\delta t} = -\lambda\, p_0(t)$$

Let $\delta t \to 0$ to give
$$\frac{dp_0}{dt} = -\lambda\, p_0(t)$$

Integration gives
$$\ln p_0(t) = -\lambda t + C$$

so that, using $p_0(0) = 1$,
$$p_0(t) = e^{-\lambda t}$$

In general
$$p_n(t + \delta t) = p_n(t)(1 - \lambda\,\delta t) + p_{n-1}(t)\,\lambda\,\delta t$$

So
$$\frac{p_n(t + \delta t) - p_n(t)}{\delta t} = -\lambda\, p_n(t) + \lambda\, p_{n-1}(t)$$

As $\delta t \to 0$ we obtain
$$\frac{dp_n}{dt} = -\lambda\, p_n(t) + \lambda\, p_{n-1}(t)$$

Now prove by induction that $p_n(t) = \frac{(\lambda t)^n e^{-\lambda t}}{n!}$.

Assume $p_{n-1}(t) = \frac{(\lambda t)^{n-1} e^{-\lambda t}}{(n-1)!}$ and use from above
$$\frac{dp_n}{dt} + \lambda\, p_n(t) = \lambda\, p_{n-1}(t) = \frac{\lambda^n t^{n-1} e^{-\lambda t}}{(n-1)!}$$

Now multiply by the integrating factor $e^{\lambda t}$ to give
$$\frac{d}{dt}\left(e^{\lambda t} p_n(t)\right) = \frac{\lambda^n t^{n-1}}{(n-1)!}$$

Integration now gives (the constant of integration is zero since $p_n(0) = 0$ for $n \ge 1$)
$$e^{\lambda t} p_n(t) = \frac{(\lambda t)^n}{n!}, \qquad\text{so}\qquad p_n(t) = \frac{(\lambda t)^n e^{-\lambda t}}{n!}$$
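The derived distribution can be checked numerically by building the process from its exponential waiting times. This Python sketch (parameter values are illustrative) compares the empirical distribution of $N(t)$ with $p_n(t) = \frac{(\lambda t)^n e^{-\lambda t}}{n!}$:

```python
import math
import random

def poisson_pmf(n, lam, t):
    """p_n(t) = (lam t)^n e^{-lam t} / n!, the distribution derived above."""
    return (lam * t) ** n * math.exp(-lam * t) / math.factorial(n)

def simulate_counts(lam, t, n_runs=20000, seed=2):
    """Count events in [0, t], building the process from exponential waiting times."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_runs):
        clock, n = 0.0, 0
        while True:
            clock += rng.expovariate(lam)   # waiting time until the next event
            if clock > t:
                break
            n += 1
        counts.append(n)
    return counts

lam, t = 2.0, 1.5
counts = simulate_counts(lam, t)
for n in range(4):
    # theoretical p_n(t) next to the simulated frequency of n events
    print(n, round(poisson_pmf(n, lam, t), 4), round(counts.count(n) / len(counts), 4))
```

The connection between exponential waiting times and Poisson counts used here is exactly the renewal-process relationship discussed later in this text.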

Suppose we want to randomly generate numbers from the interval [0, 1]. The generated numbers must have the following properties:

Each number in the interval is as likely to be chosen as any other number. In practice this means that over a long period of time the number of random numbers in each subinterval [a, b] of [0, 1] should be proportional to the length of the interval b − a. This is not a sufficient requirement, because if [0, 1] were divided into the intervals [0, 0.25] and (0.25, 1], then the sequence 0, 0.5, 0.5, 0.5, 0, 0.5, 0.5, 0.5, 0, 0.5, 0.5, 0.5, ... would satisfy this requirement while clearly not being random.

To try to remedy this deficiency we might require that ordered pairs of random numbers (x, y) are distributed uniformly over the unit square [0, 1] × [0, 1]: the proportion of pairs falling in any part of the square will be proportional to the area of that part. In fact we can make this requirement for the higher dimensional unit cubes [0, 1]^n. The highest dimension n for which the random number generator produces uniformly distributed points in [0, 1]^n is called the order of the random number generator.

Random number generators will start to repeat the same sequence after a while. The number of calls it takes for a generator to start repeating is called the period of the generator. Most generators use a hidden variable called the random seed, which stores the last output and is used as an input to the generator the next time it is called. Using the same seed twice will result in the same number being generated, so in practice the number of distinct random numbers is limited by the number of possible seeds.
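A linear congruential generator is the classic example of such a seeded generator. The sketch below uses the well-known Park–Miller parameters for the main generator; the tiny second generator is contrived purely to make the period small enough to observe:

```python
def make_lcg(modulus=2**31 - 1, a=48271, c=0, seed=1):
    """A minimal linear congruential generator (Park-Miller parameters).
    The hidden state plays the role of the seed described above."""
    state = seed
    def rand():
        nonlocal state
        state = (a * state + c) % modulus
        return state / modulus          # scale the integer state into [0, 1)
    return rand

def period(modulus, a, c, seed):
    """Count calls until the generator's state first returns to its start."""
    state = start = seed
    steps = 0
    while True:
        state = (a * state + c) % modulus
        steps += 1
        if state == start:
            return steps

rand = make_lcg()
print([round(rand(), 6) for _ in range(3)])   # same seed -> same sequence every run
print(period(17, 5, 3, seed=1))               # a tiny LCG mod 17 repeats after 16 calls
```

Rebuilding the generator with the same seed reproduces the sequence exactly, which is the deterministic behaviour described above.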

For a two state system, let $\alpha$ be the probability of moving from state 0 to state 1 in one time step, and $\beta$ the probability of moving from state 1 to state 0. The transition probabilities can be laid out in a table:

                | Future State 0 | Future State 1
Present State 0 |   $1-\alpha$   |    $\alpha$
Present State 1 |    $\beta$     |   $1-\beta$

From this we may write down the transition matrix
$$P = \begin{pmatrix} 1-\alpha & \alpha \\ \beta & 1-\beta \end{pmatrix}$$

If the initial state of the system is represented by the row vector $p^{(0)}$, then the state at time 1 is represented by $p^{(1)} = p^{(0)} P$.

The state at time 2 is represented by $p^{(2)} = p^{(1)} P = p^{(0)} P^2$.

In general the state at time $n$ is represented by $p^{(n)} = p^{(0)} P^n$.

We can find the proportion of time in which the system is in state 0 or state 1 by putting $p^{(n+1)} = p^{(n)} = \pi = (\pi_0, \pi_1)$. We obtain $\pi P = \pi$.

Hence $\pi_0 = \pi_0 (1-\alpha) + \pi_1 \beta$ and $\pi_1 = \pi_0 \alpha + \pi_1 (1-\beta)$.

From both of these equations $\pi_0 \alpha = \pi_1 \beta$.

Also $\pi_0 + \pi_1 = 1$ (since probabilities add to 1), hence $\pi_0 = \frac{\beta}{\alpha + \beta}$

and similarly $\pi_1 = \frac{\alpha}{\alpha + \beta}$.
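The long-run proportions β/(α+β) and α/(α+β) can be checked by repeatedly multiplying a starting distribution by the transition matrix. A Python sketch with illustrative values α = 0.3, β = 0.1:

```python
def step(p, P):
    """One time step: multiply the row vector p by the transition matrix P."""
    return [sum(p[i] * P[i][j] for i in range(len(p))) for j in range(len(P[0]))]

alpha, beta = 0.3, 0.1          # illustrative transition probabilities
P = [[1 - alpha, alpha],
     [beta, 1 - beta]]

p = [1.0, 0.0]                  # start in state 0 with certainty
for _ in range(200):
    p = step(p, P)              # p converges to the stationary distribution

print(p)
print([beta / (alpha + beta), alpha / (alpha + beta)])   # (0.25, 0.75) here
```

Whatever the starting vector, the iterates settle on the same limiting distribution, which is the stationarity property derived above.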

A Markov chain is a random process $\{X_n : n = 0, 1, 2, \ldots\}$ where the random variables $X_n$ are integer valued. For a simple two state process the transition probabilities defining the process are given by
$$P(X_{n+1} = 1 \mid X_n = 0) = \alpha, \qquad P(X_{n+1} = 0 \mid X_n = 1) = \beta$$

The transition probabilities may be written in matrix form:
$$P = \begin{pmatrix} 1-\alpha & \alpha \\ \beta & 1-\beta \end{pmatrix}$$
$P$ is called the transition matrix. In this matrix the rows indicate the current state $X_n$ and the columns indicate the next state $X_{n+1}$.

Over any period of time any realization of a Markov chain will exhibit a number of 0s and 1s. In the long term the proportions of 0s and 1s will be $\frac{\beta}{\alpha+\beta}$ and $\frac{\alpha}{\alpha+\beta}$ respectively. If $p_0 = \frac{\beta}{\alpha+\beta}$ and $p_1 = \frac{\alpha}{\alpha+\beta}$ the chain is said to be stationary, since $(p_0, p_1) P = (p_0, p_1)$.

Notice the transition matrix acts on row vectors.

Examples

A famous Markov chain is the drunk's walk, a random walk on the number line where, at each step, the position may change by +1 or −1 with equal probability. From any position there are two possible transitions, to the next or previous integer. The transition probabilities depend only on the current position, not on the way the position was reached. For example, the transition probabilities from 5 to 4 and 5 to 6 are both 0.5, and all other transition probabilities from 5 are 0. These probabilities are independent of whether the system was previously in 4 or 6.

Another example is the dietary habits of a creature who eats only grapes, cheese or lettuce, and whose dietary habits conform to the following rules: it eats exactly once a day. If it ate cheese yesterday, it will not eat cheese today, and it will eat lettuce or grapes with equal probability. If it ate grapes yesterday, it will eat grapes today with probability 1/10, cheese with probability 4/10 and lettuce with probability 5/10. Finally, if it ate lettuce yesterday, it won't eat lettuce again today but will eat grapes with probability 4/10 or cheese with probability 6/10. This creature's eating habits can be modelled with a Markov chain since its choice today depends only on what it ate yesterday, not additionally on what it ate 2 or 3 (or 4, etc.) days ago. One statistical property one could calculate is the expected percentage of the time the creature will eat grapes over a long period.
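The long-run fraction of grape days can be estimated by simulating the chain. The sketch below assumes the transition probabilities grapes → (grapes 1/10, cheese 4/10, lettuce 5/10), cheese → (grapes 1/2, lettuce 1/2) and lettuce → (grapes 4/10, cheese 6/10):

```python
import random

STATES = ["grapes", "cheese", "lettuce"]
P = [[0.1, 0.4, 0.5],   # yesterday grapes
     [0.5, 0.0, 0.5],   # yesterday cheese
     [0.4, 0.6, 0.0]]   # yesterday lettuce

def simulate(days, seed=3):
    """Run the chain for `days` steps, returning the fraction of days for each food."""
    rng = random.Random(seed)
    counts = [0, 0, 0]
    state = 0               # start on a grape day
    for _ in range(days):
        state = rng.choices([0, 1, 2], weights=P[state])[0]
        counts[state] += 1
    return [c / days for c in counts]

print(dict(zip(STATES, simulate(100000))))
# With these values each column of P sums to 1 (the matrix is doubly
# stochastic), so the long-run fraction for each food is exactly 1/3.
```

Changing any transition probability breaks the doubly stochastic structure and shifts the long-run fractions away from uniform.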

Let $p_n(t) = P(X(t) = n)$; then the general Kolmogorov (forward) equations are
$$\frac{dp_n}{dt} = \lambda_{n-1} p_{n-1}(t) - (\lambda_n + \mu_n) p_n(t) + \mu_{n+1} p_{n+1}(t)$$

where $\lambda_n$ is the arrival rate and $\mu_n$ the service rate when the system is in state $n$. For the simple queue $\lambda_n = \lambda$ for all $n \ge 0$ and $\mu_n = \mu$ for all $n \ge 1$. This gives

$$\frac{dp_n}{dt} = \lambda p_{n-1}(t) - (\lambda + \mu) p_n(t) + \mu p_{n+1}(t) \quad \text{for } n \ge 1, \qquad \frac{dp_0}{dt} = -\lambda p_0(t) + \mu p_1(t)$$

This is a difficult problem, and since the long term behaviour of the queue is of more interest, we analyse the steady state solution, in which $\frac{dp_n}{dt} = 0$ and all the parameters are constant. The Kolmogorov equations become
$$0 = \lambda p_{n-1} - (\lambda + \mu) p_n + \mu p_{n+1} \quad \text{for } n \ge 1, \qquad 0 = -\lambda p_0 + \mu p_1$$

The last equation gives $p_1 = \frac{\lambda}{\mu} p_0$, and substituting back inductively gives $p_{n+1} = \frac{\lambda}{\mu} p_n$ for all $n \ge 0$,

hence $p_n = \rho^n p_0$ for $n \ge 0$, with $\rho = \frac{\lambda}{\mu}$.

The sum of any probability distribution is 1 by definition, hence $\sum_{n=0}^{\infty} p_n = \frac{p_0}{1 - \rho} = 1$ (provided $\rho < 1$), so that $p_0 = 1 - \rho$.

Then $p_n = (1-\rho)\rho^n$ for $n = 0, 1, 2, \ldots$
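The steady-state formula is easy to evaluate directly. A minimal Python sketch (illustrative rates $\lambda = 1$, $\mu = 2$) computes $p_n$ and the implied mean number in the system, $\sum_n n\,p_n = \frac{\rho}{1-\rho}$:

```python
def mm1_steady_state(lam, mu, n_max=50):
    """Steady-state probabilities p_n = (1 - rho) * rho**n of the simple queue,
    truncated at n_max terms; requires rho = lam/mu < 1 for stability."""
    rho = lam / mu
    assert rho < 1, "queue is unstable when lam >= mu"
    return [(1 - rho) * rho ** n for n in range(n_max + 1)]

lam, mu = 1.0, 2.0
p = mm1_steady_state(lam, mu)
rho = lam / mu
print(p[0])                                   # p_0 = 1 - rho = 0.5
print(sum(n * pn for n, pn in enumerate(p)))  # mean number in system, rho/(1-rho) = 1
```

With $\rho = 0.5$ the geometric tail beyond 50 terms is negligible, so the truncated sums match the closed forms to many decimal places.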

The queueing time is then the sum of $x+1$ independent exponentially distributed random variables (the $x$ people in the queue plus the one being served) and so has a Gamma $\Gamma(x+1, \mu)$ distribution.

For the queue in a state of equilibrium let $q(t)$ be the density of the queueing time and let $p_x = (1-\rho)\rho^x$ be the probability of a queue of size $x$. Then

$$q(t \mid x) = \frac{\mu^{x+1} t^x e^{-\mu t}}{x!} \quad \text{for } t \ge 0$$

and

$$q(t) = \sum_{x=0}^{\infty} p_x\, q(t \mid x) = (1-\rho)\,\mu\, e^{-\mu t} \sum_{x=0}^{\infty} \frac{(\lambda t)^x}{x!}$$

so

$$q(t) = (1-\rho)\,\mu\, e^{-\mu t}\, e^{\lambda t} = (\mu - \lambda)\, e^{-(\mu - \lambda)t}$$

The queueing time for an individual customer is therefore exponential with parameter $\mu - \lambda$.
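This result can be checked by direct simulation using Lindley's recursion for the waiting times of successive customers. A Python sketch with illustrative rates (λ = 1, μ = 2, so the theoretical mean time in the system is 1/(μ − λ) = 1):

```python
import random

def mean_sojourn_time(lam, mu, n_customers=100000, seed=4):
    """Simulate the simple queue with Lindley's recursion and return the mean
    time a customer spends in the system (waiting time plus own service)."""
    rng = random.Random(seed)
    wait = 0.0          # waiting time of the current customer before service starts
    total = 0.0
    for _ in range(n_customers):
        service = rng.expovariate(mu)
        total += wait + service                 # this customer's time in the system
        gap = rng.expovariate(lam)              # time until the next arrival
        wait = max(0.0, wait + service - gap)   # Lindley recursion for the next wait
    return total / n_customers

print(mean_sojourn_time(1.0, 2.0))   # theory: exponential with rate mu - lam, mean 1
```

The recursion tracks only one number per customer, which makes it a convenient way to sanity-check queueing formulas without simulating a full event calendar.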

The first part denotes the process of arrivals at the start of the queue. Usually the process of arrival of customers is random, or 'Markovian', and is denoted 'M'.

The second part denotes the process by which elements of the queue exit the queue. If elements of the queue exit in a predetermined (deterministic) manner, this is labelled 'D', and if elements exit the queue according to a uniform distribution this is labelled 'U'. Other models of exit are possible.

The third part denotes the number of exit routes (servers). These are all identical and act to release members of the queue; each exit takes some time to process a customer.

The overall process models a queue in a bank or similar setting: there is a single queue, which people may join according to various models, and which they leave to be served at any one of a number of tellers.

Suppose elements arrive at the queue in a random manner; this is labelled 'M' for 'Markovian'. If, after leaving the queue, elements are served in a fixed time T, this is labelled 'D'. If elements leave the queue to be served by any one of five tellers, then the queue is described by the model M/D/5.

Suppose again that elements arrive at the queue in a random manner, labelled 'M'. If the time taken to serve a customer follows a uniform distribution between 10 and 20 minutes, this is labelled 'U'. If elements leave the queue to be served by either one of two tellers, then the queue is described by the model M/U/2.

If the arrival and exit processes are both random and there is only one teller, then the queue model is M/M/1. This is called the simple queue.

If the number of renewals in each time period is a Poisson process, with renewals occurring at rate $\lambda$, then the length of each lifetime is the waiting time between renewals. The length of a lifetime obeys an exponential distribution, so that if renewals occur at the rate $\lambda$ then
$$f(t) = \lambda e^{-\lambda t}$$

This means that the time until the $n$th renewal has a $\Gamma(n, \lambda)$ distribution.

We can make the renewal process discrete by considering fixed time units, with the end of each interval of time being given the time label 1, 2, 3, .... Each lifetime is then a positive integer. The simplest model gives a constant probability of renewal in each time unit. Suppose the probability of renewal in each time period is $p$; then the probability of an individual surviving until the $n$th interval, and being renewed there, is $(1-p)^{n-1} p$ for $n = 1, 2, 3, \ldots$ (a geometric distribution).

The expected lifetime is then $\sum_{n=1}^{\infty} n (1-p)^{n-1} p = \frac{1}{p}$.

The expected time to the $n$th renewal is $\frac{n}{p}$.
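Both expectations are easy to confirm by simulation. The Python sketch below (illustrative values p = 0.25, n = 4) simulates the discrete renewal process directly:

```python
import random

def time_to_nth_renewal(p, n, seed=5):
    """Simulate discrete renewals where each time unit brings a renewal with
    probability p; return the time step at which the nth renewal occurs."""
    rng = random.Random(seed)
    t = renewals = 0
    while renewals < n:
        t += 1
        if rng.random() < p:
            renewals += 1
    return t

p, n = 0.25, 4
trials = [time_to_nth_renewal(p, n, seed=s) for s in range(20000)]
print(sum(trials) / len(trials))   # close to the expected value n/p = 16
```

This is the discrete analogue of the Gamma waiting-time result above: the time to the $n$th renewal is a sum of $n$ independent geometric lifetimes.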
