Simple Feedforward NNet questions
Good day! I am new to neural networks and to the MATLAB nnet (Neural Network) toolbox.
Basically, I want to re-implement a backpropagation feedforward neural network described in a journal/research paper. The info/parameters I currently have are the following:
1) It consists of 2 layers: 1 hidden, 1 output.
2) 6 input neurons, 6 output neurons, 5 neurons in the hidden layer.
3) The input values are all floating-point numbers; the outputs are just binary numbers.
4) The activation functions are tan-sigmoid (tansig) for the first layer and log-sigmoid (logsig) for the second.
5) I need to get a least mean square (LMS) error below 0.05.
6) No info is provided about the learning rate or the momentum.
7) I can generate as many training samples as needed (5000 were used).
and these are my trial codes so far:
inputs = rand([6, 6000]);
outputs = randi([0,1], 6, 6000);
net = feedforwardnet(5);
net = configure(net, inputs, outputs);
net.layers{2}.transferFcn = 'logsig';
net.trainParam.epochs = 6000;
net.trainParam.goal = 0.01;
net = train(net, inputs, outputs);
I'm not yet using my own data since I'm still thinking about how to normalize my inputs. My problem is that when I try to use the network, the output is not a binary array but floating-point values. What should I do about that?
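Since logsig output neurons produce continuous values in the open interval (0, 1), one common way to get binary predictions is to threshold at 0.5; this is a sketch of that idea, not something prescribed by the paper:

```matlab
% Sketch: thresholding continuous logsig outputs to binary predictions.
% Assumes `net` has been trained as in the code above.
yraw = net(inputs);            % continuous values in (0,1) from logsig
ybin = round(yraw);            % threshold at 0.5 -> 0/1 array
% equivalently: ybin = double(yraw >= 0.5);
```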
Any further thoughts/advice would be really appreciated. Thank you in advance.
Accepted Answer
More Answers (1)
Greg Heath
2012-10-2
My code so far:
inputs = xlsread('data.xls',1);
inputs = inputs'; % 6 x 1000
outputs = xlsread('data.xls',2);
1. RENAME 'OUTPUTS' AS 'TARGETS'
outputs = outputs'; % 6 x 1000
outputs = (outputs)*2 - 1;
2. WHY DID YOU DO THIS?
a. TO USE THE WEIGHTS OUTSIDE OF MATLAB?
OR
b. TO CAPITALIZE ON THE TANSIG HIDDEN NODES?
net = feedforwardnet(15); % OR patternnet
net = configure(net,inputs, outputs);
net.layers{2}.transferFcn = 'logsig';
3. LOGSIG IS NOT COMPATIBLE WITH [-1, 1]
...COULD USE TANSIG. THEN
y = round( (1+net(inputs)) / 2)
4. NO ...BETTER YET:
a. KEEP ORIGINAL {0,1} AND LOGSIG OUT
b. DISABLE INTERNAL OUTPUT DEFAULT NORMALIZATION {-1,1} VIA MAPMINMAX
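Point 4b can be done by clearing the output processing functions, so the original {0,1} targets reach the logsig output layer unscaled. A sketch of one way to do this (the `processFcns` property is the relevant setting; by default it includes mapminmax):

```matlab
% Sketch: keep original {0,1} targets with a logsig output layer by
% disabling the default mapminmax output normalization to [-1,1].
net = feedforwardnet(5);
net = configure(net, inputs, outputs);   % outputs kept as {0,1}
net.layers{2}.transferFcn = 'logsig';
net.outputs{2}.processFcns = {};         % no mapminmax on the targets
```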
net.trainParam.lr = 0.00;
net.biasConnect(1) = 0;
net.biasConnect(2) = 0;
5. DELETE ABOVE 3 STATEMENTS
net.trainParam.goal = 0.001;
TRY = 0.01*mean(var(outputs'))
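The rationale for the goal suggested above (my reading, stated as a sketch): the mean target variance is the MSE of a trivial constant model, so setting the training goal to 1% of it asks the net to account for about 99% of the target variance.

```matlab
% Sketch: MSE goal as 1% of the mean target variance.
% Assumes `outputs` is the 6 x N target matrix loaded earlier.
MSE00 = mean(var(outputs', 1));      % reference MSE of a constant model
net.trainParam.goal = 0.01 * MSE00;  % ~99% of target variance explained
```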
%[net, tr] = train(net,inputs,outputs);
[net, tr, Ytrn, Etrn] = train(net, inputs, outputs)
y = round(sim(net,inputs))
6. YOU DIDN'T UNNORMALIZE THE PRETRAINING NORMALIZATION
WHAT ARE MIN AND MAX OF net(inputs) ?
% And the majority of the y values differ by 2 or more elements from the actual targets, e.g. y(:,35) = [0 0 1 1 1 1] while the actual output should be [1 1 0 0 1 1].
7. SEE 6
% (1) How can I further (much further) improve this? (2) Is it always overfitting (if my understanding of overfitting is correct)?
UNDERSTANDING IS INCORRECT. SEE THE COMP.AI.NEURAL-NETS FAQ.
% (3) Am I still doing something wrong with the training? Any rules of thumb or advice?
8. TO FIND SOME OF MY SAMPLE CODE, SEARCH THE NEWSGROUP AND ANSWERS USING
NEQ NW NTRIALS
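For context on those keywords (my reading of the Neq/Nw convention used in these posts, not part of the answer itself): Neq is the number of training equations and Nw the number of unknown weights, and you generally want Neq to be much larger than Nw to avoid overfitting. A sketch for the network in this thread:

```matlab
% Sketch (assumed notation): sizing check for an I-H-O = 6-5-6 network.
I = 6; H = 5; O = 6;        % inputs, hidden nodes, outputs
Ntrn = 5000;                % training samples (as used in the paper)
Neq  = Ntrn * O;            % number of training equations = 30000
Nw   = (I+1)*H + (H+1)*O;   % number of weights incl. biases = 71
% Neq/Nw ~ 422 >> 1, so overfitting is unlikely with 5000 samples.
```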
Categories
Find more on Define Shallow Neural Network Architectures in Help Center and File Exchange