ELEC 321

Tutorial 12

Updated 2019-04-09

Suppose we have the Markov chain $\{X_n,\ n \ge 0\}$. The Markov property is

$$P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i).$$

For a homogeneous MC, the transition probabilities do not depend on $n$, i.e. $P(X_{n+1} = j \mid X_n = i) = p_{ij}$. The initial probability distribution is $\pi_j(0) = P(X_0 = j)$, and the distribution at time $n$ is $\pi(n) = \pi(0)P^n$.

Chapman-Kolmogorov Equation

$$p_{ij}^{(m+n)} = \sum_{k} p_{ik}^{(m)}\, p_{kj}^{(n)}, \qquad \text{equivalently } P^{(m+n)} = P^{(m)} P^{(n)}.$$
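As a quick numerical sanity check, here is a small numpy sketch of the Chapman-Kolmogorov equation; the 2-state matrix and the choice of $m$ and $n$ are made up purely for illustration.

```python
import numpy as np

# Hypothetical 2-state transition matrix (rows sum to 1); the entries are
# made up purely for illustration.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
m, n = 2, 3

# Chapman-Kolmogorov: the (m+n)-step matrix is the product of the m-step
# and n-step matrices.
lhs = np.linalg.matrix_power(P, m + n)
rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n)
print(np.allclose(lhs, rhs))   # True
```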

Example

An urn contains 2 white balls and 2 black balls. At each time step, one ball is drawn; its color is flipped with probability $p$ and left unchanged with probability $1-p$. The ball is then placed back into the urn.

Let $X_n$ be the number of black balls in the urn at time $n$.

a. Is $\{X_n\}$ a Markov chain?

There are five possible values for $X_n$: $\{0, 1, 2, 3, 4\}$. Let $X_n = i$, so $i$ is the current number of black balls. Then

$$P(X_{n+1} = i-1 \mid X_n = i, \ldots, X_0) = \tfrac{i}{4}p, \quad P(X_{n+1} = i+1 \mid X_n = i, \ldots, X_0) = \tfrac{4-i}{4}p, \quad P(X_{n+1} = i \mid X_n = i, \ldots, X_0) = 1-p.$$

Since we have shown that $X_{n+1}$ depends only on the last value $X_n$, $P(X_{n+1} = j \mid X_n, \ldots, X_0) = P(X_{n+1} = j \mid X_n)$. Then $\{X_n\}$ is a Markov chain.

b. Find the transition matrix

Let the rows index the number of black balls the transition is ‘from’ and the columns the number of black balls the transition is going ‘to’.

If there are 0 black balls to begin with, then there are two cases: the drawn (white) ball is flipped with probability $p$, giving 1 black ball, or it is left unchanged with probability $1-p$, leaving 0 black balls.

If there is 1 black ball in the urn, there are these outcomes: the black ball is drawn and flipped with probability $\tfrac{1}{4}p$ (0 black balls), a white ball is drawn and flipped with probability $\tfrac{3}{4}p$ (2 black balls), and with probability $1-p$ no color changes (still 1 black ball).

Using the same arguments for the other states, we come to the final matrix:

$$P = \begin{pmatrix} 1-p & p & 0 & 0 & 0 \\ \tfrac{1}{4}p & 1-p & \tfrac{3}{4}p & 0 & 0 \\ 0 & \tfrac{1}{2}p & 1-p & \tfrac{1}{2}p & 0 \\ 0 & 0 & \tfrac{3}{4}p & 1-p & \tfrac{1}{4}p \\ 0 & 0 & 0 & p & 1-p \end{pmatrix}$$
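As a sketch only, the Python snippet below builds this matrix from the argument above (the helper name `urn_transition_matrix` and the value $p = 0.3$ are arbitrary choices for illustration) and checks that every row sums to 1.

```python
import numpy as np

def urn_transition_matrix(p, n_balls=4):
    """Transition matrix for the urn example: state i = number of black balls.

    From state i, a black ball is drawn and flipped with probability
    (i / n_balls) * p, a white ball is drawn and flipped with probability
    ((n_balls - i) / n_balls) * p, and with probability 1 - p no color changes.
    """
    P = np.zeros((n_balls + 1, n_balls + 1))
    for i in range(n_balls + 1):
        if i > 0:
            P[i, i - 1] = (i / n_balls) * p               # black drawn and flipped
        if i < n_balls:
            P[i, i + 1] = ((n_balls - i) / n_balls) * p   # white drawn and flipped
        P[i, i] = 1 - p                                   # no flip
    return P

P = urn_transition_matrix(p=0.3)
print(P)
print(P.sum(axis=1))   # each row should sum to 1
```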

Classes of State

Accessibility: State $j$ is accessible from state $i$ if and only if $p_{ij}^{(n)} > 0$ for some $n \ge 0$.

Communicating States: States $i$ and $j$ communicate if and only if $j$ is accessible from $i$ and $i$ is accessible from $j$.

Communicating Class: A maximal set of states that all communicate with one another.

Irreducible: A Markov chain is irreducible if and only if all states in the state space belong to a single communicating class.

Recurrence: The state $i$ is recurrent if and only if, starting from $i$, the probability of returning to state $i$ is 1.

To check, consider the series $\sum_{n=1}^{\infty} p_{ii}^{(n)}$: state $i$ is recurrent if and only if this sum diverges ($= \infty$).

Transient: The state $i$ is transient if and only if, starting from $i$, the probability of returning to state $i$ is less than 1.

To check, consider the same series: state $i$ is transient if and only if $\sum_{n=1}^{\infty} p_{ii}^{(n)} < \infty$.

Properties

  1. If a finite Markov chain is irreducible, then all of its states are recurrent
  2. All states in a communicating class are either transient or recurrent (if one state in a communicating class is transient, then all states in that class are transient)

Example

Given transition matrix

Let’s draw the state diagram.

Since 1 is accessible from 0 and 0 is accessible from 1, $0 \leftrightarrow 1$. Since 2 is accessible from any of the states and any state is accessible from 2 after some number of steps, we see that $2 \leftrightarrow 0$ and $2 \leftrightarrow 1$ also.

Thus all states belong to a single communicating class $\{0, 1, 2\}$, and the chain is irreducible.

Example

Given transition matrix

We see that 0 does not communicate with any state but itself, and that 1 and 2 communicate with each other. Thus we have two communicating classes: $\{0\}$ and $\{1, 2\}$.

We see that both communicating classes are recurrent.
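As an aside, communicating classes can also be found mechanically from the reachability structure of the transition matrix. The sketch below uses a placeholder matrix, since the original entries are not in these notes; only the class structure described above ($\{0\}$ and $\{1, 2\}$, both recurrent) is reproduced.

```python
import numpy as np

# Made-up 3-state matrix consistent with the structure above: state 0 is
# absorbing (its own class), while states 1 and 2 communicate.
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.4, 0.6],
              [0.0, 0.7, 0.3]])

def communicating_classes(P):
    """Group states that are mutually accessible.

    Accessibility is read off the boolean reachability matrix
    (I + A)^(k-1) > 0, where A marks the one-step transitions.
    """
    k = P.shape[0]
    A = (P > 0).astype(int)
    reach = np.linalg.matrix_power(np.eye(k, dtype=int) + A, k - 1) > 0
    classes = []
    seen = set()
    for i in range(k):
        if i in seen:
            continue
        cls = {j for j in range(k) if reach[i, j] and reach[j, i]}
        classes.append(sorted(cls))
        seen |= cls
    return classes

print(communicating_classes(P))   # [[0], [1, 2]]
```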

Example

Given transition matrix

We see that none of the states communicate except with themselves, so each state forms its own communicating class: $\{0\}$, $\{1\}$, $\{2\}$.

To check whether state 0 is transient, we check $\sum_{n=1}^{\infty} p_{00}^{(n)}$:

If we take the sum, it converges to a finite value. Therefore state 0 is transient.

Similarly, we can argue that state 2 is also transient.

For state 1, once we reach state 1, the only possible next state is state 1 (it is absorbing). Therefore state 1 is recurrent. We can check this by taking the summation: $\sum_{n=1}^{\infty} p_{11}^{(n)} = \sum_{n=1}^{\infty} 1 = \infty$.
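The convergence/divergence test can be illustrated numerically with partial sums of $p_{ii}^{(n)}$. The matrix below is a placeholder chosen to match the structure described above (state 1 absorbing, states 0 and 2 transient); the partial sums only suggest the behavior, they do not prove it.

```python
import numpy as np

# Made-up matrix consistent with the structure above: state 1 is absorbing,
# states 0 and 2 each form their own (transient) class.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.5, 0.5]])

def partial_return_sum(P, i, N):
    """Partial sum of p_ii^(n) for n = 1..N (a numerical heuristic only)."""
    Pn = np.eye(P.shape[0])
    total = 0.0
    for _ in range(N):
        Pn = Pn @ P
        total += Pn[i, i]
    return total

for i in range(3):
    print(i, partial_return_sum(P, i, 100))
# states 0 and 2: the sum levels off near 1 (transient);
# state 1: the sum grows without bound, here 100 (recurrent)
```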

Periodicity of a State

Given a Markov chain, the period of a state $i$ is defined as

$$d(i) = \gcd\{\, n \ge 1 : p_{ii}^{(n)} > 0 \,\}.$$

Properties

  1. All states in a communicating class have the same period
  2. A state is aperiodic if its period is 1. From property 1, if one state in a communicating class has period 1, then every other state in the same class also has period 1, so the whole class is aperiodic.
  3. Regular: a Markov chain is regular if and only if it is irreducible and aperiodic
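Here is a small sketch of computing the period directly from the definition, truncated at a finite horizon for practicality (the truncation is a shortcut, not part of the definition); the deterministic 3-cycle matrix is a made-up example where every state has period 3.

```python
import numpy as np
from math import gcd
from functools import reduce

def period(P, i, max_n=50):
    """Period of state i: gcd of the n <= max_n with p_ii^(n) > 0."""
    returns = []
    Pn = np.eye(P.shape[0])
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        if Pn[i, i] > 1e-12:
            returns.append(n)
    return reduce(gcd, returns) if returns else 0

# Made-up example: a deterministic 3-cycle, so every state has period 3.
cycle = np.array([[0, 1, 0],
                  [0, 0, 1],
                  [1, 0, 0]], dtype=float)
print(period(cycle, 0))   # 3
```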

Example

(State diagram omitted.)

All states belong to the same communicating class, and we only need to find the period of one state to determine the period of the entire class.

Observe the possible return times from state 0 back to state 0. The period of state 0 is $d(0) = \gcd\{\, n \ge 1 : p_{00}^{(n)} > 0 \,\}$, and this is also the period of the entire class.

Example

Given transition matrix

We see that this is not regular because its one and only communicating class is periodic.

Long-Run Behavior

This only applies to regular Markov chains.

It doesn’t matter what the starting state is: after a long time, the distribution approaches a fixed probability distribution, the steady-state probability.

Suppose we start the chain at state $i$; then

$$\lim_{n \to \infty} p_{ij}^{(n)} = \pi_j \quad \text{for every starting state } i.$$

The steady-state distribution $\pi$ can be obtained as a solution of

$$\pi = \pi P, \qquad \sum_j \pi_j = 1.$$
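One way to solve these equations numerically is to replace one of the (redundant) balance equations with the normalization constraint and solve the resulting linear system; a minimal sketch is below, with a placeholder 2-state matrix for illustration.

```python
import numpy as np

def steady_state(P):
    """Solve pi = pi P together with sum(pi) = 1.

    One balance equation is redundant, so it is replaced by the
    normalization constraint before solving the linear system.
    """
    k = P.shape[0]
    A = P.T - np.eye(k)    # (P^T - I) pi^T = 0
    A[-1, :] = 1.0         # replace the last row with sum(pi) = 1
    b = np.zeros(k)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

# Hypothetical 2-state regular chain (the values are illustrative only).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = steady_state(P)
print(pi)                                   # approx [0.667, 0.333]
print(pi @ np.linalg.matrix_power(P, 50))   # unchanged: pi is stationary
```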

Example

Given a 2-state MC with the transition matrix

We first check that this MC is regular, and indeed it is.

To compute the steady-state probability $\pi = (\pi_0, \pi_1)$:

We obtain the equations $\pi = \pi P$ and $\pi_0 + \pi_1 = 1$, and solving them gives $\pi_0$ and $\pi_1$.

Suppose that the initial probabilities $P(X_0 = 0)$ and $P(X_0 = 1)$ are given. Compute the distribution at time $n$.

Because the chain is homogeneous, we have $\pi(n) = \pi(0)P^n$.

Multiplying these out using your preferred method gives the distribution at time $n$.
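A minimal sketch of this computation, with placeholder numbers since the original matrix and initial probabilities are not reproduced in these notes:

```python
import numpy as np

# Placeholder values: the numbers below only illustrate the method.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi0 = np.array([0.5, 0.5])   # (P(X_0 = 0), P(X_0 = 1))

for n in (1, 5, 50):
    print(n, pi0 @ np.linalg.matrix_power(P, n))   # pi(n) = pi(0) P^n
# as n grows, pi(n) approaches the steady-state distribution
```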

Example

Given a buffer

At $n = 0$, the buffer contains 3 packets. Assume no more packets arrive and that, at each time step, one packet in the buffer is transmitted; a transmission is successful with probability $p$.

Let $X_n$ denote the number of packets in the buffer at time $n$.

a. Is $\{X_n\}$ a MC?

There are four possible values for $X_n$: $\{0, 1, 2, 3\}$.

Let $X_n = i$ with $i \ge 1$; we see that $P(X_{n+1} = i-1 \mid X_n = i, \ldots, X_0) = p$ and $P(X_{n+1} = i \mid X_n = i, \ldots, X_0) = 1-p$ (and state 0 stays at 0), thus $X_{n+1}$ only depends on $X_n$ and therefore $\{X_n\}$ is a Markov chain.

The transition matrix is

$$P = \begin{pmatrix} 1 & 0 & 0 & 0 \\ p & 1-p & 0 & 0 \\ 0 & p & 1-p & 0 \\ 0 & 0 & p & 1-p \end{pmatrix}$$

It is not regular because the states do not all belong to a single communicating class: state 0 is absorbing, while states 1, 2, and 3 are transient.
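A short simulation sketch of the buffer chain, assuming $p = 0.6$ for the success probability (an arbitrary value, since the notes leave the probability blank); it builds the matrix above and drains the buffer from 3 packets until it is empty.

```python
import numpy as np

rng = np.random.default_rng(0)

p = 0.6          # assumed success probability (not specified in the notes)
n_packets = 3

# Transition matrix for the buffer example: state i = packets left in the
# buffer, state 0 is absorbing.
P = np.zeros((n_packets + 1, n_packets + 1))
P[0, 0] = 1.0
for i in range(1, n_packets + 1):
    P[i, i - 1] = p        # successful transmission removes one packet
    P[i, i] = 1 - p        # failed transmission leaves the buffer unchanged

# Simulate one drain of the buffer starting from 3 packets.
state = n_packets
path = [state]
while state > 0:
    state = int(rng.choice(n_packets + 1, p=P[state]))
    path.append(state)
print(path)   # a non-increasing path from 3 down to 0, e.g. [3, 2, 2, 1, 0]
```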