Transition probability matrix for a Markov chain
How do I fix a Markov chain transition probability matrix when the entries of each row do not sum to one?
Accepted Answer
Ameer Hamza
on 8 Oct 2020
If you just want to make each row sum to one, you can divide each row by its row sum:
M                      % original matrix
M_new = M./sum(M,2)    % divide each row by its row sum (uses implicit expansion, R2016b or later)
I am not sure if this is the theoretically correct way to solve this problem.
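For readers outside MATLAB, the same row normalization can be sketched in Python with NumPy; the matrix below is a made-up example, not from the question:

```python
import numpy as np

# Hypothetical matrix whose rows do not sum to one.
M = np.array([[0.2, 0.3],
              [0.5, 1.0]])

# Divide each row by its row sum; keepdims=True keeps the
# result as a column so broadcasting divides row-wise.
M_new = M / M.sum(axis=1, keepdims=True)

print(M_new)
print(M_new.sum(axis=1))  # each row now sums to 1
```

As in the MATLAB one-liner, this assumes every row sum is nonzero; a zero row would produce a division-by-zero warning and NaNs.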