Am I computing cross entropy incorrectly?
I am working on a neural network and would like to use cross entropy as my error function. I noticed from a previous question that MATLAB added this functionality starting with R2013b. I decided to test the crossentropy function by running the simple example provided in the documentation. The code is reprinted below for convenience:
[x,t] = iris_dataset;
net = patternnet(10);
net = train(net,x,t);
y = net(x);
perf = crossentropy(net,t,y)
When I run this code, I get perf = 0.0367. To verify this result, I ran the code:
ce = -mean(sum(t.*log(y)+(1-t).*log(1-y)))
which resulted in ce = 0.1100. Why are perf and ce unequal? Do I have an error in my calculation?
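The discrepancy between the two numbers can be reproduced outside MATLAB. Below is a sketch in Python/NumPy (with made-up toy values, not the actual iris run) that computes both quantities: the manual two-term binary cross entropy averaged over samples, and the per-element mean of `-t.*log(y)` alone, which is what the answers in this thread attribute to `crossentropy`:

```python
import numpy as np

# Hypothetical toy data: 3 classes x 4 samples, one-hot targets t
# and softmax-style predictions y whose columns sum to 1.
t = np.array([[1, 0, 0, 1],
              [0, 1, 0, 0],
              [0, 0, 1, 0]], dtype=float)
y = np.array([[0.8, 0.1, 0.2, 0.7],
              [0.1, 0.7, 0.1, 0.2],
              [0.1, 0.2, 0.7, 0.1]])

# Manual formula from the question: full two-term binary cross entropy,
# summed over classes, averaged over samples.
ce = -np.mean(np.sum(t * np.log(y) + (1 - t) * np.log(1 - y), axis=0))

# The single-term form: -t.*log(y) averaged over ALL elements
# (numel, not the number of samples).
perf = np.sum(-t * np.log(y)) / t.size

print(ce, perf)  # the two values differ
```

The two formulas measure different things, so it is expected that `perf` and `ce` do not match even on the same targets and outputs.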
Accepted Answer
More Answers (3)
Greg Heath
2014-8-21
You are using the cross-entropy (Xent) form for outputs and targets that do not have to sum to 1. The corresponding output transfer function is logsig.
For targets that are constrained to sum to 1, use softmax and only the first term of the sum.
For extensive discussions search in comp.ai.neural-nets using
greg cross entropy
Hope this helps.
Thank you for formally accepting my answer
Greg
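The distinction can be sketched as follows (Python/NumPy, with toy numbers that are not from this thread): logsig (sigmoid) outputs are independent probabilities, so each output contributes both terms of the sum; softmax outputs sum to 1 with one-hot targets, and the categorical likelihood keeps only the `-t.*log(y)` term.

```python
import numpy as np

z = np.array([2.0, -1.0, 0.5])   # hypothetical pre-activation outputs

# logsig (sigmoid): outputs in (0,1), no sum-to-1 constraint.
# Each output is an independent Bernoulli, so both terms appear.
sig = 1.0 / (1.0 + np.exp(-z))
t_ind = np.array([1.0, 0.0, 1.0])            # independent binary targets
xent_two_term = -np.sum(t_ind * np.log(sig)
                        + (1 - t_ind) * np.log(1 - sig))

# softmax: outputs sum to 1 and targets are one-hot.
# The categorical likelihood keeps only the first term of the sum.
soft = np.exp(z) / np.exp(z).sum()
t_hot = np.array([1.0, 0.0, 0.0])            # one-hot target
xent_one_term = -np.sum(t_hot * np.log(soft))
```

With a one-hot target, the one-term form reduces to minus the log of the probability assigned to the true class.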
2 Comments
Matthew Eicholtz
2014-8-21
Edited: Matthew Eicholtz
2014-8-21
Greg Heath
2014-8-21
You are welcome; glad it answered your question.
Next time, make sure you initialize the RNG before you train so that you can duplicate your calculation.
Or Shamir
2017-9-23
ce = -t .* log(y);
perf = sum(ce(:))/numel(ce);
1 Comment
Greg Heath
2017-9-26
Isn't that the same as
perf = mean(ce(:)); % ?
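Greg's point is easy to check directly; a quick NumPy sketch with arbitrary values shows that dividing the sum of all elements by the element count is, by definition, the mean:

```python
import numpy as np

ce = np.array([[0.2, 0.7],
               [0.1, 0.4]])      # arbitrary per-element cross-entropy values

perf_sum = ce.sum() / ce.size    # sum(ce(:)) / numel(ce) in MATLAB terms
perf_mean = ce.mean()            # mean(ce(:))

# Equal up to floating-point rounding.
assert np.isclose(perf_sum, perf_mean)
```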
Tian Li
2017-10-13
ce = -t .* log(y);
perf = sum(ce(:))/numel(ce);
This is the right answer for the multi-class classification error problem.
1 Comment
Greg Heath
2017-10-15
Why do you think that is different from the last two answers?