Chea K.

asked • 06/16/20

Markov chain question

A Markov chain has the transition matrix shown below:

P = [ 0.3  0.2  0.5 ]
    [ 0.7  0    0.3 ]
    [ 1    0    0   ]


If, on the first observation, the system is in state 2, what is the probability that on the next four observations it successively occupies states 3, 1, 2, and 1 (in that order)?
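The probability of observing a specific path is the product of the successive one-step transition probabilities along it, reading each entry from the transition matrix (assuming the usual convention that row i gives the probabilities of moving out of state i). A minimal sketch of that computation:

```python
# Transition matrix from the question; P[i][j] is the probability of
# moving from state i+1 to state j+1 (code is 0-indexed, states are 1-indexed).
P = [
    [0.3, 0.2, 0.5],
    [0.7, 0.0, 0.3],
    [1.0, 0.0, 0.0],
]

# Start in state 2, then successively visit states 3, 1, 2, 1.
path = [2, 3, 1, 2, 1]

prob = 1.0
for a, b in zip(path, path[1:]):
    prob *= P[a - 1][b - 1]  # multiply the one-step probabilities

print(round(prob, 6))  # 0.3 * 1 * 0.2 * 0.7 = 0.042
```

So the required probability is 0.3 × 1 × 0.2 × 0.7 = 0.042.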


