Deriving Explicit Formulas from Markov Chains
Once we formulate the Markov model correctly, we can obtain a generating function for each entry of the powers of its transition matrix, and in some cases that generating function yields an explicit formula. Let's take a look at one such problem:
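The reason this works is the matrix geometric series: for a transition matrix :math:`A`, as a formal power series in :math:`x`,

.. math::

   (I - xA)^{-1} \;=\; \sum_{n \ge 0} x^{n} A^{n},
   \qquad\text{so}\qquad
   \left[(I - xA)^{-1}\right]_{ij} \;=\; \sum_{n \ge 0} \left(A^{n}\right)_{ij}\, x^{n},

i.e. each entry of :math:`(I - xA)^{-1}` is the ordinary generating function of the corresponding entries of the powers of :math:`A`.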
An unbiased six-sided die is rolled :math:`n` times. What is the probability that all six faces appear in the sequence?
Setting up a Markov chain is easy:
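One way to write the chain down, with states :math:`0, 1, \dots, 6` counting the distinct faces seen so far and state :math:`6` absorbing, is the transition matrix

.. math::

   A \;=\;
   \begin{pmatrix}
   0 & 1 & 0 & 0 & 0 & 0 & 0 \\
   0 & \tfrac{1}{6} & \tfrac{5}{6} & 0 & 0 & 0 & 0 \\
   0 & 0 & \tfrac{2}{6} & \tfrac{4}{6} & 0 & 0 & 0 \\
   0 & 0 & 0 & \tfrac{3}{6} & \tfrac{3}{6} & 0 & 0 \\
   0 & 0 & 0 & 0 & \tfrac{4}{6} & \tfrac{2}{6} & 0 \\
   0 & 0 & 0 & 0 & 0 & \tfrac{5}{6} & \tfrac{1}{6} \\
   0 & 0 & 0 & 0 & 0 & 0 & 1
   \end{pmatrix}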
The states indicate the number of distinct faces seen so far. For example, the row above the last row says that once 5 faces have been seen, the chain stays in the same state with probability :math:`5/6` and moves to the final state with probability :math:`1/6`.
So :math:`A^n[0,6]` gives the required answer. But we can also find the generating function for that entry from :math:`(I - xA)^{-1}`.
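Carrying out the inversion is straightforward here because there is only one path from state :math:`0` to state :math:`6`: each transient state :math:`k` contributes a factor :math:`\frac{(6-k)x}{6}\big/\bigl(1 - \frac{kx}{6}\bigr)` and the absorbing state contributes :math:`\frac{1}{1-x}`, so the :math:`(0,6)` entry works out to

.. math::

   G(x) \;=\; \left[(I - xA)^{-1}\right]_{0,6}
        \;=\; \frac{6!\,x^{6}}{6^{6}\,(1-x)\,\prod_{k=1}^{5}\left(1 - \frac{kx}{6}\right)}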
A partial fraction decomposition puts it into a form whose coefficients can be read off directly.
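Expanding over the simple poles (a constant :math:`1` appears because the numerator and denominator have equal degree) gives

.. math::

   G(x) \;=\; 1 \;+\; \sum_{k=0}^{5} (-1)^{k} \binom{6}{k}\, \frac{1}{1 - \frac{(6-k)x}{6}}
        \;=\; 1 + \frac{1}{1-x} - \frac{6}{1-\frac{5x}{6}} + \frac{15}{1-\frac{4x}{6}}
          - \frac{20}{1-\frac{3x}{6}} + \frac{15}{1-\frac{2x}{6}} - \frac{6}{1-\frac{x}{6}}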
The probability can then be written down by extracting :math:`[x^n]G(x)`.
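Each geometric term above contributes its own exponential, giving the familiar inclusion-exclusion answer

.. math::

   [x^{n}]\,G(x) \;=\; \sum_{k=0}^{6} (-1)^{k} \binom{6}{k} \left(\frac{6-k}{6}\right)^{\!n},
   \qquad n \ge 1,

which for :math:`n = 10` evaluates to about :math:`0.2718`.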
which can be verified by a simulation in J::

   n =: 10                          NB. number of rolls per trial
   sim =: 3 : '6 = +/ ~: ? n # 6'   NB. one trial (argument ignored): 1 if all 6 faces appear in n rolls
   (+/ % #) (sim"0) 1e5 # 0         NB. mean over 1e5 trials, about 0.27
and compared against the exact value from the formula.
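As a sketch of that comparison (standard J primitives only; :math:`n = 10` to match the simulation above), the closed-form sum can be evaluated directly::

   k =: i. 7
   +/ (_1 ^ k) * (k ! 6) * ((6 - k) % 6) ^ 10   NB. 0.271812, matching the simulation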