Katie C. asked • 12/17/15

If, on the first observation, the system is in state 1, what is the probability that it is in state 1 on the third observation?

A Markov chain has the transition matrix shown below:
P = [0.2  0.4]
    [0.8  0.6]
 
1. If, on the first observation, the system is in state 1, what is the probability that it is in state 1 on the third observation?
 
2. If, on the first observation, the system is in state 2, what state is the system most likely to occupy on the third observation? (If more than one state is equally likely, give the first such state.)
 
3. If, on the first observation, the system is in state 2, what is the probability that it alternates between states 1 and 2 for the first four observations (i.e., it occupies state 2, then state 1, then state 2, and finally state 1 again)?
 

1 Expert Answer

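The three parts can be worked out directly from the matrix. A caveat on conventions: the problem does not say whether rows or columns of P hold the outgoing probabilities, but since each *column* of the given matrix sums to 1, the sketch below assumes the column-stochastic reading, i.e. P[i, j] = Pr(next state is i+1 | current state is j+1). Under that assumption, the distribution on the third observation is P² applied to the starting state, and the probability of a specific path is the product of the one-step probabilities along it:

```python
import numpy as np

# Transition matrix from the question, read column-stochastically
# (columns sum to 1): P[i, j] = Pr(next = state i+1 | current = state j+1).
P = np.array([[0.2, 0.4],
              [0.8, 0.6]])

P2 = np.linalg.matrix_power(P, 2)  # two steps: observation 1 -> observation 3

# Q1: start in state 1; distribution on the third observation is P^2 @ e1.
e1 = np.array([1.0, 0.0])
dist3_from1 = P2 @ e1
print(round(dist3_from1[0], 4))   # probability of state 1 on observation 3

# Q2: start in state 2; pick the most likely state on the third observation.
e2 = np.array([0.0, 1.0])
dist3_from2 = P2 @ e2
print(1 + int(np.argmax(dist3_from2)))  # argmax breaks ties toward the first state

# Q3: probability of the specific path 2 -> 1 -> 2 -> 1 is the product
# of the one-step transition probabilities along the path.
p_path = P[0, 1] * P[1, 0] * P[0, 1]    # 0.4 * 0.8 * 0.4
print(round(p_path, 4))
```

Under this convention the answers come out to 0.36 for part 1, state 2 (probability 0.68 vs. 0.32) for part 2, and 0.128 for part 3. If the matrix were instead meant row-stochastically, the same computation would use P transposed.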
