Why does a neural network give negative outputs?
I have a dataset of 15,000 samples with 6 inputs and 12 outputs. Using a feedforward net, I get training, validation, test, and overall regression above 95%.
But when I run the trained net on new inputs, I get negative values in the outputs.
(There are no negative values in the dataset.)
What is the reason for this?
What could be wrong?
What should I do to overcome this issue?
0 comments
Accepted Answer
Greg Heath
2019-4-1
How different is the new data from the training data (e.g., in Mahalanobis distance)?
If you know the true outputs for the new data, how do its error rates compare with the test-set error rates?
If you want strictly positive outputs, use a sigmoid transfer function in the output layer.
Hope this helps.
*Thank you for formally accepting my answer*
Greg
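A minimal sketch of the sigmoid-output suggestion, assuming inputs `X` (6-by-N) and nonnegative targets `T` (12-by-N) are already in the workspace; the hidden-layer size of 10 is just an example value:

```matlab
% Assumed workspace variables: X (6xN inputs), T (12xN nonnegative targets).
net = feedforwardnet(10);                 % 10 hidden neurons (example size)
net.layers{end}.transferFcn = 'logsig';   % sigmoid output: raw activations in (0,1)
net = train(net, X, T);                   % default mapminmax processing rescales
Y  = net(X);                              % outputs back toward the target range
```

Because `logsig` is bounded, the network can no longer extrapolate below the range implied by the training targets, which is what produces the negative values with the default linear (`purelin`) output layer.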
4 comments
Greg Heath
2019-4-4
It is not uncommon for new data to lie outside the bounds of the training data.
Consider whether negative output values have any meaning for your problem.
If not, use sigmoids in the output layer.
Greg
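To check Greg's point about new data lying outside the training bounds, a quick per-feature range check can be done as below; `Xtrain` and `Xnew` (both 6-by-N matrices, samples in columns) are assumed variable names:

```matlab
% Assumed variables: Xtrain (6xNtrain) and Xnew (6xNnew), samples in columns.
lo = min(Xtrain, [], 2);                       % per-feature training minimum
hi = max(Xtrain, [], 2);                       % per-feature training maximum
outOfRange = any(Xnew < lo | Xnew > hi, 1);    % 1xNnew logical flags
fprintf('%d of %d new samples fall outside the training range\n', ...
        sum(outOfRange), numel(outOfRange));
```

With the Statistics and Machine Learning Toolbox, `mahal(Xnew', Xtrain')` gives per-sample Mahalanobis distances for a finer-grained comparison than this box check.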
More Answers (0)