Transition probability matrix for a Markov chain
How can I fix a Markov chain transition probability matrix whose rows do not sum to one?
Accepted Answer
Ameer Hamza
2020-10-8
If you just want to make each row sum to one, you can normalize every row by its row sum:
M                        % original matrix with nonnegative entries
M_new = M ./ sum(M, 2)   % divide each row by its row sum so each row of M_new sums to 1
I am not sure if this is the theoretically correct way to solve this problem.
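For example, with a hypothetical 3-by-3 matrix (the values below are made up for illustration), the normalization looks like this:
% Example matrix whose rows do not sum to one
M = [0.2 0.3 0.4;
     0.1 0.1 0.1;
     0.5 0.2 0.2];
% Normalize each row by its row sum (implicit expansion, R2016b or later)
M_new = M ./ sum(M, 2);
% Check: each row of M_new now sums to 1 (up to floating-point error)
disp(sum(M_new, 2))
Note that this assumes every row has a nonzero sum; a row of all zeros would produce NaN entries after the division.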