Markov Chains
A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. Equivalently, it is a model that tells us about the probabilities of sequences of random variables, called states, each of which can take on values from some set.
Formally, a Markov chain is specified by the following components:
- A set of N states: Q = q_1, q_2, …, q_N
- A transition probability matrix A = a_11, a_12, …, a_1N, …, a_NN, where each a_ij represents the probability of moving from state i to state j. Each row must sum to one: ∀i: Σ_{j=1}^{N} a_ij = 1
- An initial probability distribution over states: π = π_1, π_2, …, π_N, where π_i is the probability that the Markov chain will start in state i. Some states j may have π_j = 0, meaning that they cannot be initial states. The distribution must sum to one: Σ_{i=1}^{N} π_i = 1
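The three components above can be sketched in code. The following is a minimal illustration, not a canonical implementation: the state labels, the initial distribution pi, and the transition matrix A are made-up example values (a hypothetical three-state weather chain), chosen only so the row-sum and start-distribution constraints hold.

```python
import random

def sample_markov_chain(pi, A, states, steps, rng=None):
    """Sample a sequence of `steps` states from a Markov chain.

    pi:     initial probability distribution, pi[i] = P(start in state i)
    A:      transition matrix, A[i][j] = P(next state j | current state i)
    states: labels for the N states
    """
    rng = rng or random.Random()
    # Draw the first state from the initial distribution pi.
    i = rng.choices(range(len(states)), weights=pi, k=1)[0]
    seq = [states[i]]
    # Each subsequent state depends only on the current state (Markov property):
    # we draw from row i of A, the distribution over successors of state i.
    for _ in range(steps - 1):
        i = rng.choices(range(len(states)), weights=A[i], k=1)[0]
        seq.append(states[i])
    return seq

# Hypothetical example values (N = 3): each row of A sums to 1, as does pi.
states = ["HOT", "COLD", "WARM"]
pi = [0.5, 0.3, 0.2]
A = [
    [0.6, 0.1, 0.3],   # transitions out of HOT
    [0.1, 0.8, 0.1],   # transitions out of COLD
    [0.3, 0.1, 0.6],   # transitions out of WARM
]
print(sample_markov_chain(pi, A, states, steps=10, rng=random.Random(0)))
```

Note that sampling the next state uses only row i of A, never the earlier history of the sequence; that restriction is exactly the Markov assumption stated above.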