- Design (training + validation), test, and new data should all have the same summary statistics BEFORE NORMALIZATION. This may require mixing all of the data together before creating the train/val/test subsets.
- I prefer zero-mean, unit-variance normalization. It is very helpful for spotting outliers.
- One hidden layer of tanh (aka tansig) nodes is sufficient. However, occasionally, problem specifics make two or more hidden layers appropriate.
- Use as few hidden nodes as possible to reinforce stability. I start with 1 and use an outer loop to increase the number until the training MSE is less than 0.01 times the training target variance.
- An inner loop is used to obtain 10 designs that differ only in their random weight initializations. Meeting that MSE goal corresponds to a training R-squared of at least 0.99.
- For each candidate number of hidden nodes, design 10 or more nets that differ only in their initial random weights (see the sketch after this list).
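A minimal MATLAB sketch of this double-loop design, assuming x is an I-by-N input matrix and t is an O-by-N target matrix (samples in columns); the variable names, the Hmax bound, and the use of fitnet are illustrative choices, not a fixed recipe:

% Outer loop over hidden nodes, inner loop over random weight initializations.
MSE00   = mean(var(t', 1));                % average target variance (reference MSE)
MSEgoal = 0.01 * MSE00;                    % training R^2 >= 0.99 when this goal is met
Ntrials = 10;                              % designs per candidate number of hidden nodes
Hmax    = 20;                              % assumed upper bound on hidden nodes
rng(0)                                     % reproducible initializations
bestR2  = -Inf;
for h = 1:Hmax                             % outer loop: number of hidden nodes
    for trial = 1:Ntrials                  % inner loop: random weight initializations
        net = fitnet(h);                   % one tansig hidden layer, purelin output
        net.inputs{1}.processFcns  = {'removeconstantrows','mapstd'};  % zero-mean, unit-variance
        net.outputs{2}.processFcns = {'removeconstantrows','mapstd'};
        net.trainParam.goal = MSEgoal;
        net.trainParam.showWindow = false;
        [net, tr] = train(net, x, t);
        e  = t(:, tr.trainInd) - net(x(:, tr.trainInd));
        R2 = 1 - mean(e(:).^2) / MSE00;    % training R-squared
        if R2 > bestR2
            bestR2  = R2;
            bestnet = net;
        end
    end
    if bestR2 >= 0.99                      % keep the smallest h that meets the goal
        break
    end
end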
Why is my network not giving the desired output?
I'm trying to design a neural network using nntool in MATLAB R2015a, with an input layer of 27 neurons, an output layer of 2 neurons, and one hidden layer of 10 neurons. I have scaled the input and output data to (0,1) for the logsig activation function in the hidden layer, with purelin in the output layer. For the tansig activation function in the hidden layer, I have scaled the data to (-0.5,0.5). I have trained the network with 1155 training patterns. My MSE and R are very good, but the network does not give the expected result when tested with new data. I have tried almost all of the combinations possible from the user interface (nntool) and am thoroughly confused. It would be very helpful if this were answered.
Thank you.
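For reference, a minimal sketch of the setup described above, with illustrative variable names (x: 27-by-1155 inputs, t: 2-by-1155 targets, xnew: hypothetical new inputs). A common cause of good training MSE/R but poor results on new data is scaling the new inputs with different settings than the training data, so the sketch reuses the training scaling settings:

% Described configuration: 27 inputs, 10 logsig hidden nodes, 2 purelin outputs.
net = feedforwardnet(10);
net.layers{1}.transferFcn = 'logsig';
net.layers{2}.transferFcn = 'purelin';
[xn, xps] = mapminmax(x, 0, 1);            % scale inputs to (0,1) as described
[tn, tps] = mapminmax(t, 0, 1);            % scale targets to (0,1)
[net, tr] = train(net, xn, tn);
% New data must be transformed with the SAME settings obtained from the training data:
ynewn = net(mapminmax('apply', xnew, xps));
ynew  = mapminmax('reverse', ynewn, tps);  % undo the target scaling

Note that feedforwardnet/train also applies its own mapminmax preprocessing by default, so the manual scaling here mainly mirrors the steps described in the question.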
Accepted Answer: Greg Heath, 18 Feb 2019