How can I increase the performance and accuracy of a deep neural network?

2 views (last 30 days)
I am a beginner in the world of neural networks. I want to build a seismic impedance inversion network; the code I am using is given below. Its accuracy and performance are very low. What should I do to improve my network? Any help will be greatly appreciated.
clc
clear
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created 14-May-2019 07:54:35
%
% This script assumes these variables are defined:
%
% x - input data.
% t - target data.
% Ricker wavelet
f = 25;
fs = 150;
xval = -0.05:1/fs:0.05;
psi = (1-(2*(pi^2)*(f^2)*(xval.^2))).*(exp(-(pi^2)*(f^2).*(xval.^2)));
% % For visualise
% figure;
% plot(xval,psi)
% view([90 90])
% title('Mexican Hat Wavelet');
% grid on
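As a sanity check on the wavelet construction above: the Ricker (Mexican hat) wavelet psi(t) = (1 - 2*pi^2*f^2*t^2) * exp(-pi^2*f^2*t^2) is symmetric about t = 0 and its continuous peak is 1 at t = 0, although the 150 Hz grid used here does not sample t = 0 exactly, so the sampled maximum is slightly below 1. A quick NumPy sketch of the same 16-sample wavelet (Python is used here purely for illustration; the MATLAB line above is equivalent):

```python
import numpy as np

f = 25.0   # dominant frequency (Hz), as in the MATLAB script
fs = 150.0 # sampling frequency (Hz)
# 16 samples from -0.05 s to 0.05 s, matching MATLAB's -0.05:1/fs:0.05
t = np.linspace(-0.05, 0.05, 16)

# Ricker (Mexican hat) wavelet: (1 - 2*pi^2*f^2*t^2) * exp(-pi^2*f^2*t^2)
psi = (1 - 2*(np.pi**2)*(f**2)*(t**2)) * np.exp(-(np.pi**2)*(f**2)*(t**2))

print(len(psi))   # 16 samples
print(psi.max())  # sampled peak, just under the continuous peak of 1
```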
%reflectivity
R1 = linspace(0,0.5,239580);
% For shuffling
[m1,l1] = size(R1) ;
idx1 = randperm(l1) ;
b1 = R1 ;
b1(1,idx1) = R1(1,:);
R2 = linspace(0,-1,239580);
[m2,l2] = size(R2) ;
idx2 = randperm(l2) ;
b2 = R2 ;
b2(1,idx2) = R2(1,:);
R3 = linspace(0.5,1,239580);
[m3,l3] = size(R3) ;
idx3 = randperm(l3) ;
b3 = R3 ;
b3(1,idx3) = R3(1,:);
RR = [b1;b2;b3]; % stack the shuffled reflectivity series
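The `b(1,idx) = R(1,:)` idiom above scatters the values of `R` into the permuted positions `idx`, which shuffles the vector: every element is kept, only the order changes. A small NumPy sketch of the same trick (illustrative only; the variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
r = np.linspace(0, 0.5, 10)    # small stand-in for the 239580-sample series
idx = rng.permutation(len(r))  # MATLAB: idx = randperm(l)

b = np.empty_like(r)
b[idx] = r                     # MATLAB: b(1,idx) = R(1,:)

# b is a permutation of r: same multiset of values, different order
print(np.allclose(np.sort(b), np.sort(r)))
```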
% Place each reflectivity series at random row (depth) positions
RR1 = zeros(30,239580);
for j = 1:30
T = randi(30,1,3); % three random row indices in 1..30
for k = 1:3
n1 = T(k); % randi already returns integers, so no rounding is needed
RR1(n1,((j-1)*7986)+1:j*7986) = RR(k,((j-1)*7986)+1:j*7986);
end
end
% convolution
RR1000 = zeros(150,239580);
RR1000(30:59,:) = RR1; % place the 30 reflectivity rows at depth samples 30-59
ST = zeros(165,239580); % conv output length = 150 + 16 - 1 = 165 samples
for i = 1:239580
ST(:,i) = conv(RR1000(:,i),psi);
end
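MATLAB's `conv` returns the full convolution, of length N + M - 1, which is why `ST` is preallocated with 165 rows: a 150-sample reflectivity column convolved with the 16-sample wavelet gives 150 + 16 - 1 = 165 samples. A NumPy check of that length rule (illustrative stand-in values, not the script's data):

```python
import numpy as np

trace = np.zeros(150)   # one reflectivity column (150 depth samples)
trace[29] = 0.4         # a single reflector, as placed by the MATLAB loop
wavelet = np.ones(16)   # stand-in for the 16-sample Ricker wavelet

# np.convolve defaults to 'full' mode, like MATLAB's conv
st = np.convolve(trace, wavelet)
print(len(st))  # 150 + 16 - 1 = 165
```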
% To visualise the traces
% ST1 = ST(:,66200);
% figure;
% wiggle(ST1)
% grid on
x = ST(10:end,:);
t = RR1000;
% Choose a Training Function
% For a list of all training functions type: help nntrain
trainFcn = 'traingd'; % plain gradient descent; use 'traingdm' for gradient descent with momentum
% Create a Fitting Network
hiddenLayerSize = 200;
hiddenLayerSize1 = 100;
hiddenLayerSize2 = 50;
net = feedforwardnet([hiddenLayerSize hiddenLayerSize1...
hiddenLayerSize2],trainFcn);
% maximum epoch as default is 1000
% performance goal is 0
% Minimum performance gradient is 1e-5
% Show training GUI is true
% Maximum time to train in seconds is inf
net.layers{1}.transferFcn = 'poslin'; % ReLU ('poslin') activation for the first hidden layer
net.trainParam.max_fail = 6; % maximum number of consecutive validation failures
net.trainParam.lr = 0.05; % learning rate
net.performParam.regularization = 0.001; % regularisation parameter
net.trainParam.mc = 0.1; % momentum constant (used by 'traingdm', ignored by 'traingd')
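For reference, gradient descent with momentum (what `'traingdm'` implements, and what `mc` controls) updates each weight by dw(t) = mc*dw(t-1) - lr*gradient, so plain `'traingd'` is the mc = 0 case. A minimal sketch of that update rule on a 1-D quadratic error surface, using the script's lr and mc values (illustrative only, not the toolbox's code):

```python
# Gradient descent with momentum on E(w) = 0.5*w^2, whose gradient is w.
lr, mc = 0.05, 0.1  # learning rate and momentum constant from the script
w, dw = 5.0, 0.0    # initial weight and previous update step

for _ in range(200):
    grad = w                  # dE/dw for E = 0.5*w^2
    dw = mc * dw - lr * grad  # momentum update rule
    w += dw

print(abs(w))  # close to the minimum at w = 0
```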
% Training stops when any of these conditions occurs:
%
% The maximum number of epochs (repetitions) is reached.
% The maximum amount of time is exceeded.
% Performance is minimized to the goal.
% The performance gradient falls below min_grad.
% Validation performance has increased more than max_fail times since the
% last time it decreased (when using validation).
% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Train the Network
[net,tr] = train(net,x,t);
% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
%figure, plotregression(t,y)
%figure, plotfit(net,x,t)
%For testing data
y = net(ST(10:end,216200));
TT = 1:150;
TT = TT';
figure;
subplot(1,3,3)
stem(TT,y,'r')
view([90 90])
title('Computed reflectivity')
grid on
subplot(1,3,2)
stem(TT,RR1000(:,216200),'r')
view([90 90])
title('Actual reflectivity')
grid on
subplot(1,3,1)
ST1 = ST(10:end,216200);
plot(ST1)
view([90 90])
title('Trace')
grid on

Answers (0)
