A Markov process is a type of stochastic process that must satisfy the Markovian property:
P{X_{t+1} = j | X_0 = k_0, X_1 = k_1, …, X_{t-1} = k_{t-1}, X_t = i}
= P{X_{t+1} = j | X_t = i} for t = 0, 1, …
This equation states that the state of the process at the next time step depends only on the present state, not on the earlier history of the process.
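The Markovian property can be seen in a short simulation. The two-state chain and its transition matrix below are hypothetical, chosen only for illustration; the point is that each step is drawn using the current state alone.

```python
import random

# Hypothetical two-state chain: state 0 = "dry", state 1 = "rain".
# Row i gives P{X_{t+1} = j | X_t = i}; the next state is drawn from
# the current state's row only, never from earlier history.
P = [[0.8, 0.2],
     [0.4, 0.6]]

def step(state, rng=random):
    """Draw the next state using only the current state's row of P."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(start, n, seed=0):
    """Generate a path X_0, X_1, ..., X_n of the chain."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate(0, 10))
```

Note that `step` never sees the path, only the latest state, which is exactly what the Markovian equation requires.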
There are two types of transition probabilities: one-step and n-step.
A one-step transition probability may be stationary or non-stationary. The stationary one-step transition probability satisfies:
P{X_{t+1} = j | X_t = i} = P{X_1 = j | X_0 = i} for all t = 0, 1, …
where the transition probabilities remain fixed over time. This probability is denoted by p_ij. The same stationarity extends to n steps:
P{X_{t+n} = j | X_t = i} = P{X_n = j | X_0 = i} for all t = 0, 1, …
and is denoted by p_ij(n) for all i and j.
Both equations satisfy two important properties: p_ij(n) ≥ 0 for all i, j, and n, and Σ_j p_ij(n) = 1 for all i and n.
Matrix form of transition probabilities:
The transition probabilities can also be represented in matrix form: P = [p_ij] is the matrix of one-step transition probabilities, and P(n) = [p_ij(n)] is the matrix of n-step transition probabilities. Row i of each matrix lists the probabilities of moving from state i to every state j.
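In matrix form, the n-step probabilities are obtained by raising P to the nth power, P(n) = P^n, and the two properties above can be checked numerically. A minimal sketch, assuming a hypothetical 2×2 one-step matrix:

```python
# Multiply two matrices given as nested lists.
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Raise P to the nth power starting from the identity matrix.
def mat_pow(P, n):
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        result = mat_mul(result, P)
    return result

# Hypothetical one-step transition matrix (not from the text).
P = [[0.8, 0.2],
     [0.4, 0.6]]

P3 = mat_pow(P, 3)          # three-step transition matrix P(3)
for row in P3:
    assert all(p >= 0 for p in row)        # p_ij(n) >= 0
    assert abs(sum(row) - 1.0) < 1e-12     # rows of P(n) sum to 1
print(P3)
```

Each entry of P3 is p_ij(3), the probability of moving from state i to state j in exactly three steps, and every row of P3 is still a probability distribution.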
Finite state Markov chain:
A stochastic process {X_t} (t = 0, 1, 2, …) is a finite-state Markov chain if it has: (1) a finite number of states, (2) the Markovian property, (3) stationary transition probabilities, and (4) a set of initial probabilities P{X_0 = i} for all states i.
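Given the initial probabilities P{X_0 = i} and the transition matrix, the distribution of the chain at any later step follows by repeated multiplication. A sketch, assuming a hypothetical three-state chain with illustrative numbers:

```python
# Hypothetical three-state transition matrix (illustrative only).
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

# Initial probabilities P{X_0 = i}: start in state 0 with certainty.
pi = [1.0, 0.0, 0.0]

# After each multiplication, pi[j] = P{X_t = j}; four steps gives
# P{X_4 = j} = sum over i of P{X_0 = i} * p_ij(4).
for _ in range(4):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

assert abs(sum(pi) - 1.0) < 1e-12   # still a probability distribution
print([round(p, 4) for p in pi])
```

This is the practical use of the definition above: the initial probabilities and the stationary transition matrix together determine the distribution of the chain at every step.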