How to avoid getting negative values when training a neural network?
Is there any way to constrain the network outputs when training a feedforward neural network in MATLAB?
I am trying to train a supervised feedforward neural network on 100,000 observations. I have 5 continuous variables and 3 continuous responses (labels). All my values are positive (both labels and variables). However, when I train the network, it sometimes predicts negative results no matter what architecture I use. Negative results have no physical meaning and should not appear. Is there any way to constrain the network? I also tried a ReLU activation function for the last layer, but then the network does not generalize well.
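A minimal sketch of the setup described above, assuming Deep Learning Toolbox and illustrative variable names (`X` is 5-by-100000, `T` is 3-by-100000). Note that `feedforwardnet` normalizes targets to [-1, 1] by default, so a ReLU (`'poslin'`) output layer only guarantees nonnegative predictions if that output processing is disabled:

```matlab
% Hypothetical data: X (5 x N inputs), T (3 x N positive targets)
net = feedforwardnet(10);                 % one hidden layer, 10 neurons
net.layers{end}.transferFcn = 'poslin';   % ReLU on the output layer
net.outputs{end}.processFcns = {};        % disable default [-1,1] target scaling,
                                          % otherwise the reverse mapping can
                                          % still produce negative values
net = train(net, X, T);
Y = net(X);                               % predictions are >= 0 by construction
```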
Thanks
Accepted Answer
More Answers (1)
Greg Heath
2020-1-18
0 votes
Use a sigmoid for the output layer.
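A sketch of this suggestion (variable names are illustrative): scale the positive targets into [0, 1] with `mapminmax` so they match the range of a `'logsig'` (sigmoid) output layer, train on the scaled targets, then map the predictions back. Since the sigmoid output lies in (0, 1), the reversed predictions stay within the original positive range:

```matlab
% Hypothetical data: X (5 x N inputs), T (3 x N positive targets)
[Tn, ts] = mapminmax(T, 0, 1);            % scale targets to [0,1]; ts stores settings
net = feedforwardnet(10);
net.layers{end}.transferFcn = 'logsig';   % sigmoid output: values in (0,1)
net.outputs{end}.processFcns = {};        % targets are already scaled; skip the
                                          % default [-1,1] normalization
net = train(net, X, Tn);
Yn = net(X);
Y = mapminmax('reverse', Yn, ts);         % back to the original positive range
```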
Hope this helps
THANK YOU FOR FORMALLY ACCEPTING MY ANSWER
GREG