Transition probability matrix for a Markov chain

How can I fix a transition probability matrix for a Markov chain when the entries in each row do not sum to one?

Accepted Answer

Ameer Hamza on 8 Oct 2020
If you just want to make each row sum to one, you can normalize every row by its own sum:
M                        % original matrix
M_new = M ./ sum(M, 2)   % divide each row by its row sum (implicit expansion, R2016b or later)
I am not sure whether this is the theoretically correct way to solve the problem.
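As a quick sanity check, here is a minimal sketch with a made-up 3x3 matrix (the values are purely illustrative, not from the original question); after normalization, every row sums to 1:
M = [0.2 0.3 0.4;
     0.1 0.6 0.2;
     0.5 0.3 0.3];           % hypothetical matrix whose rows do not sum to one
M_new = M ./ sum(M, 2)       % normalize each row by its row sum
sum(M_new, 2)                % every entry should now equal 1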
