Simple Feedforward NNet questions

Good day! I am new to neural networks and to the MATLAB nnet toolbox.
Basically, I want to re-implement a backpropagation feedforward neural network described in a journal/research paper. The info/parameters that I currently have are the following:
1) It consists of 2 layers: 1 hidden, 1 output.
2) 6 input neurons, 6 output neurons, 5 neurons in the hidden layer.
3) The input values are all floating-point numbers; the output values are just binary numbers.
4) The activation functions are tan-sigmoid for the first layer and log-sigmoid for the second.
5) I need to get a least mean square (LMS) error below 0.05.
6) No info is provided about the learning rate or the momentum.
7) I can generate as many training samples as needed (5000 were used).
and this is my trial code so far:
inputs = rand([6, 6000]);
outputs = randi([0,1], 6, 6000);
net = feedforwardnet(5);
net = configure(net, inputs, outputs);
net.layers{2}.transferFcn = 'logsig';
net.trainParam.epochs = 6000;
net.trainParam.goal = 0.01;
net = train(net, inputs, outputs);
I'm not using my own data yet since I'm still thinking about how to normalize my inputs. My problem is that when I try to use the network, the output is not a binary array but some floating-point values. What should I do about it?
Any further thoughts/advice? I would really appreciate them. Thank you in advance.

Accepted Answer

Greg Heath
Greg Heath 2012-9-26
Binary numbers in the output generally indicate a classifier. In which case I advise using patternnet, unit vector column targets (help ind2vec), trainscg training function, tansig hidden and softmax output activation functions (posterior probability estimates sum to unity). When the trained net is used, the index of the assigned class is obtained from vec2ind.
If it is not a classifier, use logsig output functions. To get binary outputs you can replace logsig with hardlim AFTER training OR convert the output to unipolar {0,1} binary using round.
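The classifier route described above might look like the following sketch (not from the thread; `labels`, a 1 x N row vector of class indices 1..K, and the 6 x N matrix `inputs` are assumed to exist):

```matlab
% Sketch of the patternnet/ind2vec/vec2ind workflow Greg describes.
t = full(ind2vec(labels));              % unit-vector (one-hot) column targets
net = patternnet(5);                    % 5 hidden neurons; trainscg is the default
net.layers{2}.transferFcn = 'softmax';  % posterior probability estimates sum to unity
net = train(net, inputs, t);
y = net(inputs);                        % columns of class posteriors
assignedClass = vec2ind(y);             % index of the assigned class per sample
```

After training, each column of `y` sums to one, and `vec2ind` simply picks the row with the largest posterior.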
For an I-H-O = 6-5-6 node topology, the number of unknown weights to estimate is Nw = (I+1)*H + (H+1)*O = 35+36 = 71.
If the default data division is used, the number of training cases is Ntrn = 0.7*N and the number of training equations is Neq = Ntrn*O = 4.2*N. The rule of thumb Neq/Nw >> 1 yields N >> 71/4.2 = 17. The guideline N > 30*17 = 510 should be sufficient. N ~ 5000 may be over the top.
MSEgoal = 0.1*(Neq-Nw)*mean(var(t,0,2))/Neq
should be sufficient, as should the defaults for the remaining parameters.
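The arithmetic above can be checked directly (a sketch; `t` stands for the 6 x N target matrix, which is not defined in this thread):

```matlab
% Worked numbers for the Neq/Nw rules of thumb.
I = 6; H = 5; O = 6;                 % the paper's 6-5-6 topology
Nw   = (I+1)*H + (H+1)*O;            % 71 unknown weights
N    = 5000;                         % samples available
Ntrn = 0.7*N;                        % default dividerand training share
Neq  = Ntrn*O;                       % 21000 training equations; want Neq/Nw >> 1
% With t the 6 x N target matrix:
% MSEgoal = 0.1*(Neq - Nw)*mean(var(t,0,2))/Neq
```

With N = 5000 the ratio Neq/Nw is roughly 296, comfortably above the Neq/Nw >> 1 guideline.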
Hope this helps.
Thank you for formally accepting my answer.
Greg

10 comments

The nnet creation functions automatically scale to [-1,1]. Regardless, I usually check data for outliers using zscore before creating the net.
renz
renz 2012-9-27
Edited: renz 2012-9-27
Thanks for the thorough answer. :)
I'm not sure if I understand this correctly.
"If it is not a classifier, use logsig output functions. To get binary outputs you can replace logsig with hardlim AFTER training OR convert the output to unipolar {0,1} binary using round."
I am training the network using BINARY values as output. Is that okay with logsig? Then I'll just round the output of the network when I use it?
BTW, any advice on how to extract the weights so that I can use the network on other platforms (e.g. Java)?
Thank you again in advance!
Correct.
Type net into the command line without the ending semicolon. All of the net properties are displayed.
IW = net.IW{1,1}; LW = net.LW{2,1};   % input and layer weight matrices
b1 = net.b{1}; b2 = net.b{2};         % net.b is a cell array, one bias vector per layer
h = tansig(b1 + IW*x);
y = round(logsig(b2 + LW*h));
Hope this helps.
Greg
I'm sorry for the trouble, but I still have problems using it. My code so far:
inputs = xlsread('data.xls',1);
inputs = inputs'; % 6 x 1000
outputs = xlsread('data.xls',2);
outputs = outputs'; % 6 x 1000
net = feedforwardnet(5);
net = configure(net,inputs, outputs);
net.layers{2}.transferFcn = 'logsig';
net.trainParam.goal = 0.001;
net = train(net,inputs,outputs);
x = inputs(:,221);
IW = net.IW{:}; LW = net.LW{:}; b = net.b{:};
h = tansig(b(1)+IW*x);
y = round(logsig( b(2)+ LW*h));
I can't multiply by LW; my guess is that I'm not getting the weight matrices.
Besides that, I think I'm not training it correctly. Training finishes after just a few epochs (10-20). It seems that my network is ignoring the 0 binary values in the output, doesn't it?
And what does this code do:
x = net(inputs)
My expected values in the output neurons should be binary {0,1} but I think I'm still far at getting it :(
renz
renz 2012-9-28
Edited: renz 2012-9-28
Actually, one of the output neurons (the last) is like a CLASSIFIER (as in patternnet): 1 if success, 0 if fail. But when its value is 1, I will STILL NEED the binary values of the other 5 output neurons. So what's the best way to deal with it?
I still have no clue how to use the network outside of MATLAB. For example, how is the normalization of the input done? I think I will need to normalize my input every time I use the network, right?
Thank you so much!
1. 'targets' is a more appropriate label than 'outputs'; y will be the output
2. [ net tr ] = ... PRESERVES TRAINING INFO
3. [ net tr Ytrn Etrn ]= ... MORE INFORMATIVE
4. x = inputs(:,221); % VERY INCORRECT !
5. h = tansig(b(1)+IW*x);% WHOOPS, HAVE TO NORMALIZE x
6. y = round(logsig( b(2)+ LW*h));
OR
y = round( sim( net, inputs) ) % unnormalized inputs
OR
y = round(net(inputs)) % unnormalized inputs
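Point 5 above (having to normalize x by hand) might be handled like this sketch, which replicates `net(x)` outside of `sim` by applying the same pre/post-processing the toolbox stored at configure time; the ordering of `processSettings` follows `processFcns`, and `x` is one unnormalized 6 x 1 input column:

```matlab
% Apply the stored input processing (e.g. mapminmax) exactly as training did.
xn = x;
for k = 1:numel(net.inputs{1}.processFcns)
    xn = feval(net.inputs{1}.processFcns{k}, 'apply', xn, ...
               net.inputs{1}.processSettings{k});
end
h  = tansig(net.b{1} + net.IW{1,1}*xn);
yn = logsig(net.b{2} + net.LW{2,1}*h);
% Undo the stored output processing before rounding to binary.
for k = numel(net.outputs{2}.processFcns):-1:1
    yn = feval(net.outputs{2}.processFcns{k}, 'reverse', yn, ...
               net.outputs{2}.processSettings{k});
end
y = round(yn);    % should agree with round(net(x))
```

The same `processFcns`/`processSettings` pairs are what would need to be ported (along with IW, LW, and the biases) to reproduce the network in Java or any other platform.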
Thanks!
But I still don't know why I keep getting all 1's in my outputs. What am I doing wrong with the training? Obviously there is something, since the training always stops after a few epochs. Please help; I don't really understand.
Nevermind the previous comment.
My code so far:
inputs = xlsread('data.xls',1);
inputs = inputs'; % 6 x 1000
outputs = xlsread('data.xls',2);
outputs = outputs'; % 6 x 1000
outputs = (outputs)*2 - 1;
net = feedforwardnet(15); %OR patternnet
net = configure(net,inputs, outputs);
net.layers{2}.transferFcn = 'logsig';
net.trainParam.lr =0.00;
net.biasConnect(1) = 0;
net.biasConnect(2) = 0;
net.trainParam.goal = 0.001;
%[net, tr] = train(net,inputs,outputs);
[ net tr Ytrn Etrn ] = train(net,inputs,outputs)
y = round(sim(net,inputs))
And the majority of the y values differ in 2 or more positions from the actual targets, e.g. y(:,35) = [0 0 1 1 1 1] while the actual output should be [1 1 0 0 1 1].
(1) How can I further (much further) improve this? (2) Is it always overfitting (if my understanding of overfitting is correct)? (3) Am I still doing something wrong with the training? Any rules of thumb or advice?
Thank you very much for everything, and for being patient to me, Greg!
P.S. (4) I still have no idea how to use the network outside of Matlab. I would definitely love to know how. :)
I set the max validation fails to 100,
net.trainParam.max_fail = 100;
but the network performance hardly drops below 0.5 :(
(5) Any thoughts? Is there any future in this 6-in -> 6-out (binary) network configuration? Thank you!
The nature of my inputs and targets:
Input ranges (doubles): [30,1070], [40,650], [30,1070], [40,650], [10,750], [0,180]
Outputs: the first five binaries are for classification into groups 0-31; the last binary is for classification of success or fail.
I really want to re-implement the 6-input/6-target topology. But it would be nice to have a Plan B. Thank you.
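The 6-output scheme described above could be decoded like this sketch (not from the thread; it assumes the first five rows are the group bits MSB-first and row 6 is the success flag):

```matlab
% Decode rounded network outputs into a group index 0..31 plus a success flag.
yb = round(net(inputs));              % 6 x N binary matrix
ok = yb(6,:) == 1;                    % success (1) / fail (0) flag per sample
group = [16 8 4 2 1] * yb(1:5,:);     % five binary rows -> integer 0..31
group(~ok) = NaN;                     % group bits are "don't care" on failure
```

If the paper uses the opposite bit order, the weight vector would be [1 2 4 8 16] instead.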


More Answers (1)

My code so far:
inputs = xlsread('data.xls',1);
inputs = inputs'; % 6 x 1000
outputs = xlsread('data.xls',2);
1. RENAME 'OUTPUTS' AS 'TARGETS'
outputs = outputs'; % 6 x 1000
outputs = (outputs)*2 - 1;
2. WHY DID YOU DO THIS?
a. TO USE THE WEIGHTS OUTSIDE OF MATLAB?
OR
b. TO CAPITALIZE ON THE TANSIG HIDDEN NODES?
net = feedforwardnet(15); % OR patternnet
net = configure(net,inputs, outputs);
net.layers{2}.transferFcn = 'logsig';
3. LOGSIG IS NOT COMPATIBLE WITH [-1, 1]
...COULD USE TANSIG. THEN
y = round( (1+net(inputs)) / 2)
4. NO ...BETTER YET:
a. KEEP ORIGINAL {0,1} AND LOGSIG OUT
b. DISABLE INTERNAL OUTPUT DEFAULT NORMALIZATION {-1,1} VIA MAPMINMAX
net.trainParam.lr =0.00;
net.biasConnect(1) = 0;
net.biasConnect(2) = 0;
5. DELETE ABOVE 3 STATEMENTS
net.trainParam.goal = 0.001;
TRY = 0.01*mean(var(outputs'))
%[net, tr] = train(net,inputs,outputs);
[ net tr Ytrn Etrn ] = train(net,inputs,outputs)
y = round(sim(net,inputs))
6. YOU DIDN'T UNNORMALIZE THE PRETRAINING NORMALIZATION
WHAT ARE MIN AND MAX OF net(inputs) ?
And the majority of the y values differ in 2 or more positions from the actual targets, e.g. y(:,35) = [0 0 1 1 1 1] while the actual output should be [1 1 0 0 1 1].
7. SEE 6
(1) How can I further (much further) improve this? (2) Is it always overfitting (if my understanding of overfitting is correct)?
UNDERSTANDING IS INCORRECT. SEE THE COMP.AI.NEURAL-NETS FAQ.
(3) Am I still doing something wrong with the training? Any rules of thumb or advice?
8. TO FIND SOME OF MY SAMPLE CODE, SEARCH THE NEWSGROUP AND ANSWERS USING
NEQ NW NTRIALS

1 comment

Thanks, as always.
My code:
inputs = xlsread('data.xls',1);
inputs = inputs'; % 6 x 1000
targets = xlsread('data.xls',2);
targets = targets'; % 6 x 1000
H = 6;
TF = {'tansig','logsig'};
BTF = 'trainlm';
BLF = 'learngdm';
PF = 'mse';
IPF = {'fixunknowns','removeconstantrows','mapminmax'};
OPF = {'removeconstantrows'};
DDF = 'dividerand';
net = newff(inputs,targets,H,TF,BTF,BLF,PF,IPF,OPF,DDF);
MSE = mean(var(targets'))/100 ;
net.trainParam.goal = MSE;
net.trainParam.max_fail=40;
stream = RandStream.getGlobalStream;
%s1= stream.State;
%stream.State=s1;
s1= stream.State;
net = init(net);
[net, tr] = train(net,inputs,targets);
y=round(sim(net,inputs))
Performance is around 0.1 - 0.15 now, and I'm trying different H, each for several initializations and trainings, hoping to further improve the performance.
I don't get this: "6. YOU DIDN'T UNNORMALIZE THE PRETRAINING NORMALIZATION. WHAT ARE MIN AND MAX OF net(inputs)?"
Target 6 is a success-or-fail identifier; if it is 0 (fail), the other 5 targets are supposed to be "don't care" values. Are there any ideal values for these when the 6th target is 0 to improve the training/learning of the network? I'm currently setting them all to 0, e.g. 0 0 0 0 0 if target 6 is 0. Or is anything just fine?
Any further advice/thoughts? :)

