This question is closed. Reopen it to edit or answer.
In backpropagation, why is the error at test time very large, even when training produces a small error?
Hi, everyone
I'm new to MATLAB and neural networks. I'm using them to forecast transformer load. I have just trained a neural network with 3 input variables and 1 output/target. The training sample size is 30 and the testing sample size is 18. The network has 1 hidden layer containing 6 nodes, and I used the traingdx training function. My questions are:
- Why does it take so long to train the network to reach the goal, even though my goal is just 0.0001? I have been training for 5 days and it still has not reached the goal.
- When I stopped the training, the network had only reached 0.1111. When I first simulated it, the MSE was 1.754. Then I tested it with new input containing 18 samples, and the resulting error was large. How can I fix that?
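A minimal sketch of the setup described above, assuming the Neural Network Toolbox `feedforwardnet` interface; the variable names `X` (3-by-30 inputs) and `T` (1-by-30 targets) are placeholders, not from the original post:

```matlab
% Sketch of the described network: 1 hidden layer, 6 nodes, traingdx.
% X (3-by-30) and T (1-by-30) are assumed placeholder names for the data.
net = feedforwardnet(6, 'traingdx');
net.trainParam.goal   = 1e-4;    % the 0.0001 goal mentioned above
net.trainParam.epochs = 10000;   % cap iterations rather than training for days
net = train(net, X, T);

Y = net(X);                      % simulate on the training inputs
trainMSE = perform(net, T, Y);   % mean squared error on the training set
```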
Thanks in advance for any and all help
3 comments
Walter Roberson
2018-1-22
"Training sample size is 30 and testing sample size is 18"
That is not enough data to do a good job of discriminating classes if they are not well separated.
Suppose, for example, that I used rand(30,3) and told you to forecast based upon it. Any patterns that might be detected with that data would be accidental and would result in large error when used with the test data.
Data that looks like it must have some pattern might even have good patterns -- but those patterns might not be determinable with only 30 samples. For example, electricity load requirements might plausibly be related to solar cycles, which have been extensively studied but continue to be surprising; see https://en.wikipedia.org/wiki/Solar_cycle for some of the hypothesized cycle lengths (including one of over 6000 years).
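The comment above can be illustrated with a small experiment: fit the same kind of network to pure noise with the same sample sizes (30 training, 18 test). Any pattern the network finds is accidental, so the test error is typically far larger than the training error. This is a sketch under that assumption, using `feedforwardnet` and `perform`:

```matlab
% Fit a 6-node network to random data (the rand(30,3) example above,
% transposed to MATLAB's samples-as-columns convention).
rng(0);                                 % reproducible noise
Xtrain = rand(3, 30);  Ttrain = rand(1, 30);
Xtest  = rand(3, 18);  Ttest  = rand(1, 18);

net = feedforwardnet(6, 'traingdx');
net.divideFcn = 'dividetrain';          % use all 30 samples for training
net.trainParam.showWindow = false;
net = train(net, Xtrain, Ttrain);

trainMSE = perform(net, Ttrain, net(Xtrain));
testMSE  = perform(net, Ttest,  net(Xtest));   % usually much larger than trainMSE
```

With enough hidden units the network can memorize 30 noisy points and drive the training MSE down, yet this says nothing about the 18 unseen points.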
Answers (0)