How to display the weight distribution in the hidden layers of a neural network?
I have 8 inputs in the input layer. Now I want to display the weight distribution of these 8 inputs in the hidden layer to observe the importance of the features. To make it clearer, an example is shown in this figure ( https://pasteboard.co/GKCpA6Q.png ). I used the `plotwb` function of MATLAB, but it didn't display the weights of every input.
To be specific, I want to look at the weights connecting the inputs to the first hidden layer. The larger the weight, the more important the input.
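For reference, the weights I am after can be read directly from the trained network object. A minimal sketch, assuming a fitting network created with `fitnet`, an 8-by-N input matrix `x`, and 1-by-N targets `t` (variable names are illustrative; as the answer below points out, raw weight magnitudes are not a reliable importance measure):

```matlab
net = fitnet(10);              % one hidden layer with 10 neurons
net = train(net, x, t);        % train the network

IW = net.IW{1,1};              % input-to-hidden weights, size [10 x 8]
figure;
bar(sum(abs(IW), 1));          % total absolute weight carried by each input
xlabel('Input index');
ylabel('Sum of |weights| into hidden layer');
```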
Answers (1)
Greg Heath
2017-9-17
That will not work. It does not account for the correlations between inputs.
The best way to rank correlated inputs is:
1. Use NO HIDDEN LAYERS!
2. For each candidate input, run 10 or more trials (each with different random initial weights) using
a. that single input alone, and
b. all inputs except the one in a.
(A rough sketch of this procedure is given below.)
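A minimal sketch of the idea, assuming a numeric 8-by-N input matrix `x` and a 1-by-N target vector `t` (names are illustrative). It uses an ordinary linear least-squares fit, i.e. no hidden layers; since that fit is deterministic, one fit per subset suffices here. If you instead train a single-layer network iteratively, repeat each fit 10+ times with different random initial weights and average the errors:

```matlab
numIn = size(x, 1);
N     = size(x, 2);
mseSingle   = zeros(numIn, 1);   % error when using ONLY input i
mseLeaveOut = zeros(numIn, 1);   % error when using all inputs EXCEPT i

for i = 1:numIn
    % (a) single input i, with a bias column
    A = [x(i, :)' ones(N, 1)];
    mseSingle(i) = mean((t' - A * (A \ t')).^2);

    % (b) all inputs except i
    keep = setdiff(1:numIn, i);
    B = [x(keep, :)' ones(N, 1)];
    mseLeaveOut(i) = mean((t' - B * (B \ t')).^2);
end

% An input is more important if it predicts well on its own (low mseSingle)
% and hurts the model when removed (high mseLeaveOut).
[~, rankBySingle]   = sort(mseSingle);               % best single inputs first
[~, rankByLeaveOut] = sort(mseLeaveOut, 'descend');  % most costly removals first
```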
Hope this helps.
Thank you for formally accepting my answer
Greg