Couldn't fit the data using NEURAL NETWORKS IN MATLAB (fitnet function)

Hi,
I have been trying to fit data to a nonlinear model using neural networks in MATLAB. I have several sets of data, and my code works fine for some data sets but not for all of them.
For some data sets I get a fit with a good regression coefficient; for other sets the network returns an almost constant output (i.e., a regression coefficient of almost 0).
This is the architecture of my neural network: a feedforward network trained with backpropagation, one hidden layer, and a number of hidden neurons that I vary from run to run to see its effect.
Can anyone point out what is going wrong in my code, please?
clear;
%% Load the data from an Excel file
filename = 'dD.xlsx';
x = xlsread(filename);
p = x(:,2:12);               % 11 input variables (columns 2-12)
t = x(:,1);                  % 1 target variable (column 1)
inputs  = p';                % the toolbox expects variables in rows, samples in columns
targets = t';
% rng(200,'v4');
rng(0)                       % fix the RNG state so results can be reproduced
%% Create a fitting network
hiddenLayerSize = 5;
net = fitnet(hiddenLayerSize);
%% Set up division of data for training, validation, testing
% net.divideFcn = 'dividetrain'; % No validation or test data
net.divideParam.trainRatio = 100/100;
net.divideParam.valRatio   = 0/100;
net.divideParam.testRatio  = 0/100;
net.trainFcn = 'trainbr';    % Bayesian regularization backpropagation
% net = configure(net,ptrans,tn);
net.layers{1}.transferFcn = 'logsig';   % hidden layer
net.layers{2}.transferFcn = 'purelin';  % output layer
%% Train the network
[net,tr] = train(net,inputs,targets);
best_epoch = tr.best_epoch;
effective_param = tr.gamk;                          % effective number of parameters per epoch
effective_no_of_parameters = effective_param(end);  % value at the final epoch
wt_IL   = net.IW{1,1};       % input-to-hidden weights
wt_HL   = net.LW{2,1};       % hidden-to-output weights
bias_IL = net.b{1};          % hidden-layer biases
bias_HL = net.b{2};          % output-layer biases
%% Test the network
outputs = net(inputs);
errors = gsubtract(outputs,targets);
performance = perform(net,targets,outputs)

Accepted Answer

Greg Heath 2015-3-12
1. The best way for us to help is for you to
a. Apply your code to the MATLAB data set that best reproduces your problem:
help nndatasets
doc nndatasets
b. Since you have 11 inputs and 1 output, consider one of the following:
abalone_dataset    8 -> 1, 4177 samples  (abalone shell rings)
bodyfat_dataset   13 -> 1,  252 samples  (body fat percentage)
chemical_dataset   8 -> 1,  498 samples  (chemical sensor)
house_dataset     13 -> 1,  506 samples  (house value)
2. Record the initial RNG state so that results can be duplicated.
3. Obtain the normalized mean-square error and tabulate the corresponding Rsquare (see Wikipedia) from a double-loop search over the number of hidden nodes (outer loop h = Hmin:dH:Hmax) and the initial random weights (inner loop i = 1:Ntrials); a sketch of this double loop is given after the list.
MSE00 = mean(var(target',1)) % Reference MSE (optimal for constant output)
NMSE = mse(target-output)/MSE00
Rsquare = 1 - NMSE
4. Typically, my goal is to
i) use numel(Hmin:dH:Hmax) = 10 and Ntrials = 10, and
ii) minimize the number of hidden nodes subject to the constraint that the net models at least 99% of the mean target variance, i.e., Rsquare > 0.99.
5. You wrote that you have varied h and made multiple runs. However, you did not
a. State the size of your data set
b. Tabulate your results (Rsquare is preferred, but NMSE is ok)
6. I have posted scores of examples in both NEWSGROUP and ANSWERS. A useful search combination is
greg fitnet Ntrials
7. The no-overfitting condition Hmax << Hub in my posts is not necessary when regularization (trainbr and/or msereg) is used.
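Here is a minimal sketch of the double loop in items 3 and 4 (not code from this thread), assuming bodyfat_dataset from nndatasets as a stand-in for the 11-input data and the arbitrary choices Hmin = 1, dH = 2, Hmax = 19, Ntrials = 10:
% Sketch of the Hmin:dH:Hmax / Ntrials double loop (assumed example)
[x, t] = bodyfat_dataset;            % 13 inputs -> 1 output, 252 samples
rng(0)                               % record/fix the initial RNG state (item 2)
MSE00 = mean(var(t',1));             % reference MSE (optimal for a constant output)
Hvec = 1:2:19;                       % candidate numbers of hidden nodes (Hmin:dH:Hmax)
Ntrials = 10;                        % random weight initializations per candidate
Rsquare = zeros(numel(Hvec), Ntrials);
for j = 1:numel(Hvec)
    for i = 1:Ntrials
        net = fitnet(Hvec(j), 'trainbr');   % Bayesian regularization
        net.divideParam.trainRatio = 1;     % no validation/test split with trainbr
        net.divideParam.valRatio   = 0;
        net.divideParam.testRatio  = 0;
        net = train(net, x, t);
        y = net(x);
        NMSE = mse(t - y)/MSE00;            % normalized mean-square error
        Rsquare(j,i) = 1 - NMSE;            % fraction of target variance modeled
    end
end
% Tabulate the best Rsquare for each number of hidden nodes
table(Hvec', max(Rsquare,[],2), 'VariableNames', {'H','bestRsquare'})
The resulting table makes it easy to pick the smallest H whose best Rsquare exceeds 0.99 (item 4.ii).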
Hope this helps.
Thank you for formally accepting my answer
Greg
11 Comments
Greg Heath 2015-3-17
This change stops training when the training error variance is less than 1% of the original target variance.
You can make that 1% threshold smaller (e.g., by a factor of 2 or 10) if you wish.
TRAINBR doesn't have a different transfer function.
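For reference, a minimal sketch of what such a stopping goal could look like (the exact line being discussed is not quoted in this thread), to be placed before the train call in the original script so that net and targets already exist:
MSE00 = mean(var(targets',1));        % reference MSE = mean target variance
net.trainParam.goal = 0.01*MSE00;     % stop when the training MSE drops below 1% of it
% Use a smaller factor (e.g., 0.005 or 0.001) for a tighter fit.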
