Why does net.IW{1,1} return negative and very small values like -0.0090, 0.2447, and so on?
I was building a multilayer neural network using the newff function.

size of input = 3 x 150 (3 input variables, 150 samples)
size of target = 1 x 150

net = newff([minmax(Inputs)],[10,1],{'logsig' 'logsig'});

With 10 neurons in the hidden layer and 1 neuron in the output layer, the size of the input weight matrix of the hidden layer should be 3 x 10, and I expected weight values of 50 and above, but I get

a = net.IW{1,1};
Why does a = net.IW{1,1}; return negative and very small values?
Accepted Answer
Greg Heath
2015-6-23
Note first that size(net.IW{1,1}) is [ 10 3 ], not [ 3 10 ]: each row corresponds to one hidden neuron and each column to one input.
Do not worry about the size or sign of a single weight in a single design. Typically, it is almost impossible to understand the purpose of each single weight.
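For example, the small mixed-sign values can be seen directly at initialization. This is a sketch with hypothetical data (the variable names and ranges are illustrative); newff initializes weights randomly, so the exact numbers change on every run:

```matlab
% Hypothetical input data: 3 variables, 150 samples, values 50 and above
Inputs = 50 + 100*rand(3,150);

% Same architecture as in the question
net = newff(minmax(Inputs),[10 1],{'logsig' 'logsig'});

size(net.IW{1,1})   % [10 3]: one row per hidden neuron, one column per input
net.IW{1,1}         % small positive AND negative initial values -- this is normal
```

The initial weights are chosen to spread the hidden-node active regions over the input range, not to match the magnitude of the input data.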
The best approach for designing a real-world net is to make multiple designs to
1. Mitigate the uncertainty of using random initial weights
2. Reduce the time and effort needed to find a sufficiently low local minimum in a mountainous weight space.
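A minimal sketch of such a multiple-design loop, assuming Inputs and Targets already exist in the workspace (the trial count and variable names are illustrative; each newff call draws fresh random initial weights):

```matlab
Ntrials = 10;                 % number of independent designs
bestMSE = Inf;
bestNet = [];
for k = 1:Ntrials
    % New random initial weights on every call
    net = newff(minmax(Inputs),[10 1],{'logsig' 'logsig'});
    net = train(net, Inputs, Targets);
    y   = sim(net, Inputs);
    e   = mse(Targets - y);   % mean squared error of this design
    if e < bestMSE
        bestMSE = e;
        bestNet = net;
    end
end
```

Keeping the best of several designs (ideally judged on validation data rather than training error) is much more reliable than trusting a single random initialization.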
Using too many weights is called overfitting. If an overfit net is also trained too long (overtraining), the training error can be driven very low; however, such designs very often do not work well on nontraining data.
I try to minimize the number of weights by minimizing the number of hidden nodes, subject to the constraint that the variance of the error (error = output - target) is smaller than the average variance of the target components by a factor of ~100 to 200.
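That constraint can be expressed as a training goal. A sketch, assuming targets are stored row-wise in Targets (the factor 100 is the lenient end of the ~100-200 range):

```matlab
% Average variance of the target components (biased estimate, N divisor)
MSE00 = mean(var(Targets',1));

% Stop training when the error variance is ~100x smaller than MSE00
net.trainParam.goal = MSE00/100;
```

For a constant-output model that always predicts the target mean, the MSE equals MSE00, so this goal demands a model roughly 100 times better than that naive reference.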
Hope this helps.
Thank you for formally accepting my answer
Greg
2 Comments
Greg Heath
2019-12-9
My advice is to normalize real-valued data to [ -1 , 1 ] and use TANSIG for hidden node functions.
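A sketch of that preprocessing (mapminmax maps each row of a matrix to [-1, 1]; the purelin output layer is a common regression choice, not something prescribed above):

```matlab
% Normalize each input row to [-1, 1]; settings allow applying the
% same transform to new data later with mapminmax('apply', ...)
[xn, xsettings] = mapminmax(Inputs);
[tn, tsettings] = mapminmax(Targets);

% tansig hidden nodes as recommended; linear output for regression
net = newff(minmax(xn),[10 1],{'tansig' 'purelin'});
```

tansig is symmetric about zero, which matches [-1, 1]-normalized inputs better than logsig, whose output lives in [0, 1].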
Greg