A Markov chain contains various kinds of states and transitions that need to be understood. We shall discuss some of the most important kinds of states:
The first kind of state is the recurrent state. A state is recurrent when the probability of returning to it is 1: starting from that state, the process is certain to come back to it after some number of transitions.
The next type is the transient state. Here the process may or may not return to its starting position; the probability of return is strictly less than 1, so unlike a recurrent state it may never recur.
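To make the distinction concrete, here is a minimal Python sketch; the transition matrix and the helper function are illustrative assumptions, not taken from the original text. For a finite chain, a state is recurrent exactly when every state reachable from it can also reach it back; otherwise it is transient.

```python
import numpy as np

# Hypothetical transition matrix: states 0 and 1 form a closed class (recurrent),
# state 2 can leak into that class and never return (transient).
P = np.array([
    [0.5, 0.5, 0.0],
    [0.3, 0.7, 0.0],
    [0.2, 0.3, 0.5],
])

def reachable(P, i):
    """Return the set of states reachable from state i (including i itself)."""
    seen, stack = {i}, [i]
    while stack:
        s = stack.pop()
        for t in np.nonzero(P[s] > 0)[0]:
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

for i in range(len(P)):
    # i is recurrent iff i can be reached back from every state reachable from i
    recurrent = all(i in reachable(P, j) for j in reachable(P, i))
    print(f"state {i}: {'recurrent' if recurrent else 'transient'}")
```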
The next type is the aperiodic state. A state is aperiodic if there is some number L such that the process can return to it in L, L + 1, L + 2, … transitions, that is, in every sufficiently large number of steps; equivalently, the greatest common divisor of its possible return times is 1.
The reverse of an aperiodic state is a periodic state, whose possible return times share a common divisor greater than 1 (for example, a state that can only be revisited after an even number of steps has period 2).
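The period of a state can be computed as the greatest common divisor of all step counts at which a return is possible. The sketch below (using an assumed two-state "flip" chain and an assumed practical step bound) scans matrix powers to estimate it.

```python
from math import gcd
import numpy as np

# Hypothetical chain that flips deterministically between two states:
# each state can only be revisited after an even number of steps, so period 2.
P = np.array([
    [0.0, 1.0],
    [1.0, 0.0],
])

def period(P, i, max_steps=50):
    """gcd of the return times of state i, scanning n = 1..max_steps (a practical bound)."""
    d = 0
    Pn = np.eye(len(P))
    for n in range(1, max_steps + 1):
        Pn = Pn @ P                # Pn now equals the matrix power P^n
        if Pn[i, i] > 0:           # the chain can return to i in exactly n steps
            d = gcd(d, n)
    return d

for i in range(len(P)):
    print(f"state {i}: period {period(P, i)}")   # both states have period 2
```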
Some properties of a Markov chain depend on the above types of states:
A chain is called ergodic when every state can be reached from every other state in some number of steps, irrespective of the initial position. A closely related notion is the regular chain, in which some power of the transition matrix is strictly positive, so every state can be reached from every other state in the same fixed number of steps. A regular chain is always ergodic, but the inverse is not true.
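The following sketch contrasts the two properties; the matrix, bounds, and helper names are assumptions for illustration. Ergodicity is checked as reachability of every state from every other, while regularity is checked by looking for a strictly positive power of the transition matrix. The two-state "flip" chain shows that an ergodic chain need not be regular.

```python
import numpy as np

def is_ergodic(P):
    """Every state reaches every other state: (I + A)^(n-1) has no zero entries,
    where A is the adjacency pattern of P."""
    n = len(P)
    R = np.linalg.matrix_power(np.eye(n) + (P > 0), n - 1)
    return bool(np.all(R > 0))

def is_regular(P, max_power=None):
    """Some power P^k is strictly positive; (n-1)^2 + 1 powers suffice to decide."""
    n = len(P)
    max_power = max_power or (n - 1) ** 2 + 1
    Pk = np.eye(n)
    for _ in range(max_power):
        Pk = Pk @ P
        if np.all(Pk > 0):
            return True
    return False

# The two-state "flip" chain is ergodic (each state reaches the other)
# but not regular: every power of P keeps zero entries somewhere.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(is_ergodic(P), is_regular(P))   # True False
```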