A stochastic (or random) process, as opposed to a deterministic process, includes the possibility that a system may evolve in different ways, each of which can only be predicted with some probability. For example, a differential equation can be solved and its solutions determine the future state of the system completely, whereas in a stochastic process there is some indeterminacy in the future evolution, described by probability distributions. This means that even if the initial condition (or starting point) is known, there are many paths the process might follow, some more probable and others less so.
Deterministic dynamical processes are typically formulated as a set of rules which allow the state of the system at time $t + \Delta t$ to be found from the state of the system at time $t$. For stochastic systems, we can only specify the probability of finding the system in a given state. If this probability depends only on the state of the system at the previous time step, and not on the states before that, the stochastic process is said to be Markovian. Many stochastic processes are Markovian to a very good approximation.
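The Markovian update rule above can be sketched in code. This is a minimal illustration using a hypothetical two-state chain (the states and transition probabilities are invented for the example): the function that advances the system takes only the current state as input, so by construction no earlier history influences the next step.

```python
import random

# Hypothetical two-state chain (e.g. 0 = "sunny", 1 = "rainy").
# Row i gives the probabilities of moving from state i to states 0 and 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """Draw the next state from the current state only (Markov property):
    no earlier history enters this function."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(n_steps, start=0, seed=0):
    """Generate a sample path of the chain, one step at a time."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

path = simulate(10)
```

A deterministic rule would map each state to a unique successor; here the successor is drawn from a distribution, so repeated runs with different seeds produce different paths.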
The mathematical definition of a Markov process follows from the definition of the hierarchy of pdfs for a given process. This involves the joint pdf $p(x_1, t_1; x_2, t_2; \dots; x_n, t_n)$, which is the probability that the system is in state $x_1$ at time $t_1$, state $x_2$ at time $t_2$, ..., and state $x_n$ at time $t_n$, and also the conditional pdf $p(x_n, t_n \mid x_{n-1}, t_{n-1}; \dots; x_1, t_1)$, which is the probability that the system is in state $x_n$ at time $t_n$ given that it was in state $x_{n-1}$ at time $t_{n-1}$, ..., and state $x_1$ at time $t_1$. These pdfs are all non-negative and normalisable, and relations exist between them due to symmetry and reduction (integration over some of the state variables). For a Markov process the history of the system, apart from the immediate past, is forgotten, and so

$$p(x_n, t_n \mid x_{n-1}, t_{n-1}; \dots; x_1, t_1) = p(x_n, t_n \mid x_{n-1}, t_{n-1}).$$
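The Markov condition can be checked exactly on a small discrete example. The sketch below assumes a hypothetical two-state chain with an invented initial distribution `p0` and transition matrix `P`; it builds the three-time joint pdf and verifies that conditioning on the earlier state $x_0$ does not change the conditional distribution of $x_2$ given $x_1$.

```python
# Exact check of the Markov condition on a two-state chain.
# p0 and P are illustrative numbers, not taken from the text.
p0 = [0.3, 0.7]                  # hypothetical initial distribution
P = [[0.9, 0.1], [0.5, 0.5]]     # hypothetical transition probabilities

# Three-time joint pdf p(x0, x1, x2), built by chaining transitions.
joint = {(a, b, c): p0[a] * P[a][b] * P[b][c]
         for a in (0, 1) for b in (0, 1) for c in (0, 1)}

for a in (0, 1):
    for b in (0, 1):
        # Reduction: sum over x2 gives the two-time joint p(x0=a, x1=b).
        pair = sum(joint[(a, b, c)] for c in (0, 1))
        for c in (0, 1):
            cond = joint[(a, b, c)] / pair      # p(x2=c | x1=b, x0=a)
            # History is forgotten: this equals p(x2=c | x1=b).
            assert abs(cond - P[b][c]) < 1e-12
```

The summation over `c` is the discrete analogue of the reduction (integration over state variables) mentioned above.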
A direct consequence of this is that the whole hierarchy of pdfs can be determined from only two of them: $p(x_1, t_1)$ and $p(x_2, t_2 \mid x_1, t_1)$. The hierarchy of defining equations then collapses to only two:

$$p(x_2, t_2) = \int p(x_2, t_2 \mid x_1, t_1)\, p(x_1, t_1)\, dx_1, \qquad (1)$$

$$p(x_3, t_3 \mid x_1, t_1) = \int p(x_3, t_3 \mid x_2, t_2)\, p(x_2, t_2 \mid x_1, t_1)\, dx_2. \qquad (2)$$
The pdf $p(x_2, t_2 \mid x_1, t_1)$ is referred to as the transition probability and (2) as the Chapman-Kolmogorov equation. While the pdfs for a Markov process must obey (1) and (2), the converse also holds: any two non-negative functions $p(x_1, t_1)$ and $p(x_2, t_2 \mid x_1, t_1)$ which satisfy (1) and (2) uniquely define a Markov process.
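For a chain with finitely many states, the Chapman-Kolmogorov equation (2) reduces to matrix multiplication: the integral over the intermediate state $x_2$ becomes a sum, so the two-step transition matrix is the product of the one-step matrices. A minimal numerical check, using an invented two-state transition matrix:

```python
# Chapman-Kolmogorov for a discrete chain: p(x3 | x1) is obtained by
# summing over the intermediate state x2, i.e. by matrix multiplication.
# The transition matrix below is illustrative.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def matmul(A, B):
    """Plain matrix product; entry (i, j) sums over the intermediate state."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)   # two-step transition probabilities p(x3 | x1)
```

Each row of `P2` is still a probability distribution (non-negative, summing to one), as required of a transition probability.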