Why did the network performance decrease?

1 view (last 30 days)
Hi all,
I am using NARXNET to predict a time series. The problem is that when I use a for loop to find the optimum number of hidden nodes (HN) and then train a new network with that selected HN, the performance (R value) is lower than what the loop reported, e.g. it drops from 0.9833 to 0.9663. Why?
Thank you for your help.

Accepted Answer

Greg Heath 2016-4-26
For a given number of hidden nodes, different random weight initializations AND different random trn/val/tst data divisions will yield a spread of results. The difference you describe is typical.
To keep things manageable, I typically do not train more than 100 nets at a time: numH = numel(Hmin:dH:Hmax) = 10 candidate H values and Ntrials = 10 nets for each H value. I display the 100 NMSE or Rsq = 1 - NMSE results in an Ntrials x numH matrix. Then I display the min, median, mean, std and max of Rsq in a 5 x numH summary matrix.
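A minimal sketch of that double loop, assuming an open-loop narxnet with 1:2 input/feedback delays and the simplenarx_dataset example data (the Hmin, dH, Hmax, Ntrials values and the rng(i) seeding are illustrative choices, not a fixed recipe):

% Loop over candidate hidden layer sizes, train Ntrials nets per size,
% and tabulate Rsq = 1 - NMSE for each trained net.
[X, T]  = simplenarx_dataset;            % example open-loop NARX data (assumed)
ID = 1:2;  FD = 1:2;                     % input and feedback delays (assumed)
Hmin = 1; dH = 1; Hmax = 10;             % candidate hidden node counts
Hvec    = Hmin:dH:Hmax;
numH    = numel(Hvec);
Ntrials = 10;
Rsq     = zeros(Ntrials, numH);          % Ntrials x numH results matrix
for j = 1:numH
    for i = 1:Ntrials
        rng(i)                           % different random state per trial
        net = narxnet(ID, FD, Hvec(j));
        net.trainParam.showWindow = false;   % suppress 100 training GUIs
        [Xs, Xi, Ai, Ts] = preparets(net, X, {}, T);
        net = train(net, Xs, Ts, Xi, Ai);
        Y   = net(Xs, Xi, Ai);
        t   = cell2mat(Ts);  y = cell2mat(Y);
        NMSE      = mean((t - y).^2) / var(t, 1);  % normalized MSE
        Rsq(i, j) = 1 - NMSE;
    end
end
% 5 x numH summary: min, median, mean, std and max of Rsq for each H value
summary = [min(Rsq); median(Rsq); mean(Rsq); std(Rsq); max(Rsq)]

Because each trial's rng state is recorded (rng(i)), the best (H, trial) combination can be retrained reproducibly instead of starting from a fresh random initialization, which is why a single rerun can score lower than the best loop result.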
You would be surprised how disparate some results can be.
Searching the NEWSGROUP and ANSWERS using
greg Ntrials
should bring up enough examples.
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (0)
