Simple linear neural network: weights from training are not compatible with training results, can't understand why… MATLAB 2011a

I have a very strange problem: the weights that I get from training, when applied directly to the input, return different results! I'll show it with a very simple example. Say we have an input vector t1 = 0:0.01:1 and a target vector x = t1.^2 (I know it would be better to use a nonlinear network). After training a 2-layer linear network with one neuron in each layer, we get:
sim(net,0.95) = 0.7850 (some error from training is fine and expected). The weights from net.IW, net.LW, net.b:

IW =
    0.4547
LW =
    2.1993
b =
    0.3328
   -1.0620

If I use the weights directly, Out = purelin(purelin(0.95*IW + b(1))*LW + b(2)) = 0.6200, which is a different result from sim! How can that be? What's wrong? After several hours of exploring, I'm desperate :-(
The code:

% Main_TestWeights
close all; clear all; clc

t1 = 0:0.01:1;   % input
x  = t1.^2;      % target

hiddenSizes = 1;
net = feedforwardnet(hiddenSizes);
[Xs,Xi,Ai,Ts,EWs,shift] = preparets(net,con2seq(t1),con2seq(x));
net.layers{1,1}.transferFcn = 'purelin';
[net,tr,Y,E,Pf,Af] = train(net,Xs,Ts,Xi,Ai);
view(net);

IW = cat(2,net.IW{1});
LW = cat(2,net.LW{2,1});
b  = cat(2,[net.b{1,1},net.b{2,1}]);

% Result from sim
t2 = 0.95;
Yk = sim(net,t2)

% Result from weights
x1 = IW*t2' + b(1)
x1out = purelin(x1)
x2 = purelin(x1out*LW + b(2))

Accepted Answer

R 2012-8-5
Thank you very much, that really helped. I used mapminmax to get the real values of the data (on both input and output).

More Answers (1)

Greg Heath 2012-8-4
Feedforwardnet automatically uses mapminmax to transform inputs and targets to [-1,1].
help feedforwardnet
doc feedforwardnet
Therefore, the weights should be applied to the normalized input and the normalized output should be unnormalized.
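A minimal sketch of that in MATLAB, assuming the variables IW, LW, b, t1, x from the question's code (this re-derives the mapminmax settings from the training data for illustration; the trained net also stores them, e.g. in net.inputs{1}.processSettings):

```matlab
% Recover the mapminmax settings used during training.
[~, inPS]  = mapminmax(t1);   % settings that map the input range to [-1,1]
[~, outPS] = mapminmax(x);    % settings that map the target range to [-1,1]

t2  = 0.95;
t2n = mapminmax('apply', t2, inPS);               % normalize the raw input
yn  = purelin(purelin(IW*t2n + b(1))*LW + b(2));  % weights act in normalized space
y   = mapminmax('reverse', yn, outPS)             % un-normalize: matches sim(net,t2)
```

With the numbers in the question, t1 spans [0,1], so t2n = 2*0.95 - 1 = 0.9; pushing 0.9 through the listed weights gives 0.5699 in normalized space, and reversing the output mapping gives (0.5699 + 1)/2 ≈ 0.7850, the same value sim reported.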
Hope this helps.
Greg
