Kullback-Leibler Divergence for NMF in Matlab
I am trying to write the Kullback-Leibler divergence (KLD) equation in MATLAB by looking at how the Euclidean distance was written.
The Euclidean distance for matrix factorization has the following structure:
![11.png](https://www.mathworks.com/matlabcentral/answers/uploaded_files/200142/11.png)
which reduces to this MATLAB code:
f = norm(X - W * H,'fro')^2
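(For context, the Frobenius norm already hides the double summation over matrix entries: `norm(A,'fro')^2` is the same as summing `A(i,j)^2` over all i and j. A small sketch with toy matrices, just for illustration:)

```matlab
% Toy matrices for illustration only
X = magic(4);
W = rand(4, 2);
H = rand(2, 4);
E = X - W * H;

f1 = norm(E, 'fro')^2;   % Frobenius-norm form
f2 = sum(sum(E.^2));     % explicit double summation over i and j
% f1 and f2 agree up to floating-point rounding
```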
Now I have the Kullback-Leibler divergence, with the structure below:
![22.png](https://www.mathworks.com/matlabcentral/answers/uploaded_files/200143/22.png)
I wish to write this in MATLAB, but I am confused about how to deal with the summation; in the Euclidean distance, for example, the summation suddenly became the function norm.
Could someone help me write a decent code for this expression? Thanks.
Accepted Answer
Matt Tearle
2019-1-16
If X and X_hat are just matrices, then I think you should be able to compute all the terms element-wise and sum the result (unless I misunderstand the formula).
div = X .* log(X ./ X_hat) - X + X_hat;
KLD = sum(div,'all'); % in R2018b onward
KLD = sum(div(:)); % in any version
I'm interpreting "log" in the formula in the math sense (natural log) rather than engineering (base 10). If it's base 10, then use the log10 function instead.
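(Putting the pieces together, here is a minimal end-to-end sketch. It assumes X, W, and H are nonnegative, as is standard for NMF; the toy matrices are illustrative only, and note that any zero entries in X or in W*H would need special handling, since the formula would produce 0*log(0) or log of zero there.)

```matlab
% Toy nonnegative matrices for illustration only (NMF assumes nonnegativity)
X = abs(rand(5, 4));
W = abs(rand(5, 2));
H = abs(rand(2, 4));
X_hat = W * H;

% Element-wise KL divergence terms, then sum everything
div = X .* log(X ./ X_hat) - X + X_hat;
KLD = sum(div(:));   % works in any MATLAB version

% Sanity check: the divergence of X from itself is zero
selfDiv = X .* log(X ./ X) - X + X;
% sum(selfDiv(:)) should be 0 (up to floating-point rounding)
```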
More Answers (0)