low weighted cross entropy values
I am building a network for semantic segmentation with a weighted cross-entropy loss. The crossentropy() function makes it possible to pass weights for my 8 classes (inverse-frequency, normalized weights for each class). My issue is that the loss values calculated during training seem lower than what I should expect: they fall between 0 and 1, whereas I would have expected values between 2 and 3.
My class weights vector is
norm_weights = [0.0011 0.4426 0.0023 0.0037 0.0212 0.0022 0.0065 1.0000];
And this is how I implement my loss function:
lossFcn = @(Y,T) crossentropy(Y,T,norm_weights,WeightsFormat="UC",...
NormalizationFactor="all-elements",ClassificationMode="multilabel")*n_class_labels;
[netTrained2, info] = trainnet(augmented_ds,net2,lossFcn,options);
If anyone would have a clue about the issue, that would be helpful!
Accepted Answer
Matt J
2025-8-19
There are a few possible reasons for the discrepancy that I can think of:
(1) Your norm_weights do not sum to 1 (they sum to roughly 1.48), so the weighted term is not a true weighted average.
(2) You have selected NormalizationFactor="all-elements" in crossentropy(). According to the documentation, though, trainnet does not normalize over all elements; it ignores the channel dimension.
(3) There may be other hidden normalization factors buried in the black box that is trainnet(). I don't know whether it is possible or worthwhile to dig them out.
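As a quick check on reason (1), one option is to rescale the weights so they sum to 1 before building the loss. This is a sketch, not a guaranteed fix: the NormalizationFactor="batch-size" choice below is an assumption meant to avoid dividing by every spatial element, and you should verify against the crossentropy and trainnet documentation how the returned loss is actually normalized.

```matlab
% Class weights from the question; rescale so they sum to 1
% (as given, they sum to about 1.48).
norm_weights = [0.0011 0.4426 0.0023 0.0037 0.0212 0.0022 0.0065 1.0000];
norm_weights = norm_weights / sum(norm_weights);

% Weighted cross-entropy loss. NormalizationFactor="batch-size" is an
% assumption here -- compare the resulting loss magnitude against the
% unweighted crossentropy(Y,T) on the same batch to see the scaling effect.
lossFcn = @(Y,T) crossentropy(Y,T,norm_weights,WeightsFormat="UC", ...
    NormalizationFactor="batch-size");
```

Comparing lossFcn(Y,T) against plain crossentropy(Y,T) on one mini-batch should make it clear how much of the small loss value comes from the weights versus the normalization factor.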