Alternative to softmax function for Neural Network predicting fractions of a whole
5 views (last 30 days)
Hi, I created a feedforward regression neural network to predict variables that are fractions of a whole (i.e. they sum to 1). To make the network fulfil this constraint exactly, I am using the softmax transfer function. Unfortunately, the network predicts the smaller fractions very poorly, and I think this is because softmax re-normalizes my target fractions: it exponentiates each fraction minus the largest one and divides by the sum of those exponentials (exp(n - nmax)/sum(exp(n - nmax))), which maps very small fractions onto much larger values. It wouldn't have to do that, since my fractions are already between 0 and 1. Can I change this behaviour in the softmax transfer function, or is there an alternative to it that doesn't do this normalization?
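For illustration, a quick numerical check of this effect (plain MATLAB; the example values are made up):

n = [0.01 0.04 0.95];                        % target fractions, already summing to 1
s = exp(n - max(n)) / sum(exp(n - max(n)));  % softmax output: approx [0.218 0.224 0.558]
p = n / sum(n);                              % plain sum normalization: unchanged
% Softmax pulls the small fractions (0.01, 0.04) up to roughly 0.22,
% while simple sum normalization preserves the original ratios.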
6 comments
Ajay Pattassery
2019-8-29
Did you try creating a custom layer that forces your outputs to sum to one, like the one I mentioned above, instead of using a softmax layer?
Please refer to the following link for creating custom layers.
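A minimal sketch of such a layer, assuming the Deep Learning Toolbox custom-layer API (R2018a or later) and nonnegative incoming activations (e.g. preceded by a ReLU or sigmoid layer); the class name and the eps guard are illustrative, not part of any answer on this page:

% Hypothetical custom layer: divides each output by the sum of all
% outputs, so predictions sum to 1 without softmax's exponentiation.
classdef sumNormalizationLayer < nnet.layer.Layer
    methods
        function layer = sumNormalizationLayer(name)
            layer.Name = name;
            layer.Description = "Normalize outputs to sum to 1";
        end

        function Z = predict(layer, X)
            % X is assumed to be C-by-N (channels by observations);
            % adjust the summation dimension to match your data layout.
            % eps guards against division by zero.
            Z = X ./ (sum(X, 1) + eps);
        end
    end
end

Because this layer only rescales, it preserves the ratios between small and large outputs, unlike softmax.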
Answers (0)