# Question

3) Given a Markov chain on states {0, 1} with one-step transition matrix

```
         0     1
P =  0 [ 0.4   0.6 ]
     1 [ 0.2   0.8 ]
```

and initial probability distribution P(X0 = i) as follows:

```
i          0     1
P(X0 = i)  2/3   1/3
```

Compute:

a. P(X4 = 0 | X2 = 1) (10 points)

b. P(X5 = 0 | X0 = 1, X1 = 1, X2 = 0, X3 = 0, X4 = 1) (10 points)

Please be clear and show all the steps.
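The two parts can be checked numerically with a short sketch (plain Python, no external libraries; the `mat_mult` helper is just for illustration). Part (a) is the (1, 0) entry of the two-step matrix P²; part (b) collapses to a one-step probability by the Markov property:

```python
# Two-state Markov chain: states 0 and 1.
P = [[0.4, 0.6],
     [0.2, 0.8]]

def mat_mult(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# (a) P(X4 = 0 | X2 = 1) is the two-step transition probability
#     from state 1 to state 0, i.e. entry (1, 0) of P^2:
#     0.2*0.4 + 0.8*0.2 = 0.24
P2 = mat_mult(P, P)
part_a = P2[1][0]

# (b) By the Markov property, conditioning on the whole history
#     X0,...,X4 reduces to conditioning on X4 alone:
#     P(X5 = 0 | X0=1, X1=1, X2=0, X3=0, X4=1) = P(X5 = 0 | X4 = 1) = p_10 = 0.2
part_b = P[1][0]

print(part_a, part_b)
```

Note that the initial distribution (2/3, 1/3) is not needed for either part, since both are conditional on a known state.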