## Marginal Distributions

Given a joint probability distribution of two or more random variables, the marginal distribution of one of the variables is the probability distribution of that variable considered by itself. It is called "marginal" because, for a discrete distribution of two variables presented in a table, it may be found by summing the values along rows or columns and writing the sums in the margins of the table. For a continuous distribution, the marginal is found by integrating the joint density over the other variables.
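The continuous case can be illustrated symbolically. A minimal sketch, assuming a hypothetical joint density f(x, y) = x + y on the unit square, where integrating out y yields the marginal density of X:

```python
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)

# Hypothetical joint density on the unit square: f(x, y) = x + y
f = x + y

# Marginal density of X: integrate the joint density over y
f_x = sp.integrate(f, (y, 0, 1))
print(f_x)  # x + 1/2
```

The same call with `(x, 0, 1)` would instead give the marginal density of Y.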

The context here is that the theoretical studies being undertaken, or the data analysis being done, involves a wider set of random variables but that attention is being limited to a reduced number of those variables. In many applications an analysis may start with a given collection of random variables, then first extend the set by defining new ones (such as the sum of the original random variables) and finally reduce the number by placing interest in the marginal distribution of a subset (such as the sum). Several different analyses may be done, each treating a different subset of variables as the marginal variables.
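The extend-then-marginalize workflow described above can be sketched in code. This is an illustrative example, not from the source: the joint distribution below is a made-up uniform pmf over pairs, the set is extended with a new variable S = X + Y, and the marginal distribution of S is then computed:

```python
from itertools import product

# Hypothetical joint pmf: P(X = x, Y = y) = 1/9 for x, y in {0, 1, 2}
joint = {(x, y): 1 / 9 for x, y in product(range(3), range(3))}

# Extend the collection with a new variable S = X + Y,
# then reduce attention to the marginal distribution of S
marginal_s = {}
for (x, y), p in joint.items():
    marginal_s[x + y] = marginal_s.get(x + y, 0.0) + p

for s in sorted(marginal_s):
    print(s, round(marginal_s[s], 4))
```

Each value of S accumulates probability from every (x, y) pair that produces it, which is exactly the summation that defines a marginal.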

Example: Given the joint probability distribution below, the marginal distribution of Y is displayed in the right-hand margin by summing the entries in each row, and the marginal distribution of X is displayed along the bottom by summing the entries in each column.

| Y \ X    | 0    | 1    | 2    | P(Y = y) |
|----------|------|------|------|----------|
| 0        | 0.05 | 0.05 | 0.00 | 0.10     |
| 1        | 0.20 | 0.05 | 0.15 | 0.40     |
| 2        | 0.10 | 0.06 | 0.04 | 0.20     |
| 3        | 0.10 | 0.08 | 0.12 | 0.30     |
| P(X = x) | 0.45 | 0.24 | 0.31 | 1.00     |
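The row and column sums in the example can be reproduced directly. A short sketch using NumPy, with the joint pmf entered as a 4×3 array (rows are values of Y, columns are values of X):

```python
import numpy as np

# Joint pmf from the example: rows are Y in {0, 1, 2, 3}, columns are X in {0, 1, 2}
joint = np.array([
    [0.05, 0.05, 0.00],
    [0.20, 0.05, 0.15],
    [0.10, 0.06, 0.04],
    [0.10, 0.08, 0.12],
])

p_x = joint.sum(axis=0)  # marginal of X: sum down each column
p_y = joint.sum(axis=1)  # marginal of Y: sum across each row

print(p_x)  # [0.45 0.24 0.31]
print(p_y)  # [0.1 0.4 0.2 0.3]
```

Both marginals sum to 1, as the total probability in the table must.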
