MIMO Neural Network Performance

4 views (last 30 days)
Ji Hyeon Cho on 23 Jun 2021
Hi, I am developing a MIMO neural network for integrated model-predictive control, and I have run into two problems while building the model.

First, I cannot get every output of the MIMO network to train well. Although I normalized all inputs and targets, the performance on each individual output does not improve. I even initialized all layer weights, input weights, and biases identically, and I set the network performance normalization parameter to 'standard'.

Second, I tried copying the weights and biases from two trained MISO networks into the MIMO network, but the copied network shows a different performance on each target than the MISO networks do (I did not retrain the MIMO network after copying).

Can anyone help me solve these problems?
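For context: in the code below only the inputs are min-max scaled by hand, while the targets stay on their raw scales (t2 is about 100x larger than t1). A sketch of normalizing the targets too with mapminmax, and inverting afterwards (variable names here are illustrative, not part of my actual code):

% Sketch: scale each target to [-1,1] before training, then map
% predictions back to raw units after simulation.
[t1n, ps1] = mapminmax(t1);   % ps1 stores the scaling settings
[t2n, ps2] = mapminmax(t2);
% ... train with t1n/t2n in place of t1/t2 ...
% y1 = mapminmax('reverse', y1n, ps1);   % back to the original units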
This is sample code 1:
rng(0,'twister')
x1 = [linspace(1,10,100); linspace(11,20,100)];
x2 = [linspace(11,20,100); linspace(21,30,100)];
t1 = linspace(101,200,100);
t2 = linspace(201,300,100)*100;
% Min-max normalize each input row to [0,1]; the targets stay on their raw scales.
for i = 1:2
    x1(i,:) = (x1(i,:)-min(x1(i,:)))/(max(x1(i,:))-min(x1(i,:)));
    x2(i,:) = (x2(i,:)-min(x2(i,:)))/(max(x2(i,:))-min(x2(i,:)));
end
q = linspace(0.001,0.01,10);  % candidate learning rates
r = linspace(0.001,0.1,10);   % candidate momentum constants
first_error1 = zeros(10,10);  % preallocate the error grids
first_error2 = zeros(10,10);
for i = 1:10
    for s = 1:10
        % Build a 2-input, 2-output network from fitnet: layers 1-2 map
        % input 1 to output 1, layers 3-4 map both inputs to output 2.
        net = fitnet(10,'trainscg');
        net.performParam.normalization = 'standard';
        net.numLayers = 4;
        net.layerConnect(4,3) = 1;
        net.layers{3}.transferFcn = 'tansig';
        net.layers{3}.initFcn = 'initnw';
        net.layers{4}.initFcn = 'initnw';
        net.layers{3}.size = 10;
        net.biasConnect(3) = 1;
        net.biasConnect(4) = 1;
        net.numInputs = 2;
        net.outputConnect(4) = 1;
        net.inputConnect(3,2) = 1;
        net.inputConnect(3,1) = 1;
        net.divideFcn = '';
        net.inputs{1}.processFcns = {};
        net.inputs{2}.processFcns = {};
        net.outputs{2}.processFcns = {};
        net.outputs{4}.processFcns = {};
        % Note: learnFcn/learnParam (lr, mc) are used by adapt and by
        % learning-rule training functions, not by train with 'trainscg',
        % so the settings below should not change what train() does here.
        net.layerWeights{4,3}.learnFcn = 'learngdm';
        net.inputWeights{3,2}.learnFcn = 'learngdm';
        net.inputWeights{3,1}.learnFcn = 'learngdm';
        net.inputWeights{1,1}.learnParam.lr = q(i);
        net.inputWeights{1,1}.learnParam.mc = r(s);
        net.inputWeights{3,2}.learnParam.lr = q(i);
        net.inputWeights{3,2}.learnParam.mc = r(s);
        net.layerWeights{4,3}.learnParam.lr = q(i);
        net.layerWeights{4,3}.learnParam.mc = r(s);
        net.layerWeights{2,1}.learnParam.lr = q(i);
        net.layerWeights{2,1}.learnParam.mc = r(s);
        net.biases{3}.learnFcn = 'learngdm';
        net.biases{4}.learnFcn = 'learngdm';
        net.biases{1}.learnParam.lr = q(i);
        net.biases{1}.learnParam.mc = r(s);
        net.biases{2}.learnParam.lr = q(i);
        net.biases{2}.learnParam.mc = r(s);
        net.biases{3}.learnParam.lr = q(i);
        net.biases{4}.learnParam.lr = q(i);
        net.biases{3}.learnParam.mc = r(s);
        net.biases{4}.learnParam.mc = r(s);
        net = train(net,{x1;x2},{t1;t2});
        p = net({x1;x2});
        % Percent RMSE of each output relative to its own target mean.
        first_error1(i,s) = sqrt(mean((t1-p{1,:}).^2))/mean(t1)*100;
        first_error2(i,s) = sqrt(mean((t2-p{2,:}).^2))/mean(t2)*100;
        disp(first_error1(i,s))
        disp(first_error2(i,s))
    end
end
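Since t2 is roughly 100x larger than t1, its squared error dominates the combined performance, which may be why the individual outputs do not all improve. One way to balance them is the error-weight argument of train (a sketch only; the weight values are illustrative and I have not verified this on this exact network):

% Sketch: weight each output's error roughly inversely to its scale so
% that t2 does not dominate the combined MSE.
EW = {1; 1e-4};   % one error weight per output (illustrative values)
net = train(net,{x1;x2},{t1;t2},{},{},EW);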
Sample code 2:
rng(0,'twister')
x1 = [linspace(1,10,100);linspace(11,20,100)];
x2 = [linspace(11,20,100);linspace(21,30,100)];
t1 = linspace(101,200,100);
t2 = linspace(201,300,100)*100;
for i = 1:2
x1(i,:) = (x1(i,:)-min(x1(i,:)))/(max(x1(i,:))-min(x1(i,:)));
x2(i,:) = (x2(i,:)-min(x2(i,:)))/(max(x2(i,:))-min(x2(i,:)));
end
% MISO network 1: maps x1 (2 features) to t1, no data division or preprocessing.
net1 = fitnet(10,'trainscg');
net1.divideFcn = '';
net1.inputs{1}.processFcns = {};
net1.outputs{2}.processFcns = {};
% MISO network 2: maps both x1 and x2 (4 features total) to t2.
net2 = fitnet(10,'trainscg');
net2.numInputs = 2;
net2.inputConnect(1,1) = 1;
net2.inputConnect(1,2) = 1;
net2.divideFcn = '';
net2.inputs{1}.processFcns = {};
net2.inputs{2}.processFcns = {};
net2.outputs{2}.processFcns = {};
% Train each MISO network and record its percent RMSE.
net1 = train(net1,x1,t1);
net2 = train(net2,{x1;x2},t2);
p = net1(x1);
net1_error1 = sqrt(mean((t1-p).^2))/mean(t1)*100;
p = net2({x1;x2});
net2_error2 = sqrt(mean((t2-cell2mat(p)).^2))/mean(t2)*100;
% Rebuild the 2-input, 2-output MIMO network (same architecture as sample code 1).
net = fitnet(10,'trainscg');
net.trainParam.showWindow = false;
net.performParam.normalization = 'standard';
net.numLayers = 4;
net.layerConnect(4,3) = 1;
net.layers{3}.transferFcn = 'tansig';
net.layers{3}.initFcn = 'initnw';
net.layers{4}.initFcn = 'initnw';
net.layers{3}.size = 10;
net.biasConnect(3) = 1;
net.biasConnect(4) = 1;
net.numInputs = 2;
net.outputConnect(4) = 1;
net.inputConnect(3,1) = 1;
net.inputConnect(3,2) = 1;
net.divideFcn = '';
net.inputs{1}.processFcns = {};
net.inputs{2}.processFcns = {};
net.outputs{2}.processFcns = {};
net.outputs{4}.processFcns = {};
net.layerWeights{4,3}.learnFcn = 'learngdm';
net.inputWeights{3,2}.learnFcn = 'learngdm';
net.inputWeights{3,2}.learnParam.lr = 0.01;
net.inputWeights{3,2}.learnParam.mc = 0.09;
net.layerWeights{4,3}.learnParam.lr = 0.01;
net.layerWeights{4,3}.learnParam.mc = 0.09;
net.biases{3}.learnFcn = 'learngdm';
net.biases{4}.learnFcn = 'learngdm';
net.biases{1}.learnParam.lr = 0.01;
net.biases{1}.learnParam.mc = 0.09;
net.biases{2}.learnParam.lr = 0.01;
net.biases{2}.learnParam.mc = 0.09;
net.biases{3}.learnParam.lr = 0.01;
net.biases{4}.learnParam.lr = 0.01;
net.biases{3}.learnParam.mc = 0.09;
net.biases{4}.learnParam.mc = 0.09;
% Note: after connecting the second output, configure orders the targets
% as {t2; t1}, so the output order differs from sample code 1.
net = configure(net,{x1;x2},{t2;t1});
% Copy the trained MISO parameters: layers 1-2 take net1's weights and
% biases, layers 3-4 take net2's.
net.IW{1,1} = net1.IW{1,1};
net.IW{3,1} = net2.IW{1,1};
net.IW{3,2} = net2.IW{1,2};
net.LW{2,1} = net1.LW{2,1};
net.LW{4,3} = net2.LW{2,1};
net.b(1) = net1.b(1);
net.b(2) = net1.b(2);
net.b(3) = net2.b(1);
net.b(4) = net2.b(2);
p = net({x1;x2});
% Percent RMSE of each output relative to its own target mean.
net_error1 = sqrt(mean((t1-p{2,:}).^2))/mean(t1)*100;
net_error2 = sqrt(mean((t2-p{1,:}).^2))/mean(t2)*100;
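Since simulation does not depend on the performance settings, a copied network with matching wiring should reproduce the MISO outputs exactly; checking both output rows shows which output index each copied path actually feeds. A quick sanity-check sketch (assuming the layer-2 path carries net1's weights and the layer-4 path carries net2's, as in the copy step above):

% Sketch: any nonzero difference points to a mismatched weight/bias
% assignment or to the output ordering imposed by configure.
p  = net({x1;x2});
y1 = net1(x1);                  % MISO prediction of t1
y2 = cell2mat(net2({x1;x2}));   % MISO prediction of t2
disp(max(abs(y1 - p{1,:})))     % layer-2 path (weights copied from net1)
disp(max(abs(y2 - p{2,:})))     % layer-4 path (weights copied from net2)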

Answers (0)

Release: R2021a
