How can I create an evolutionary neural network?
1 view (last 30 days)
I need to create an evolutionary neural network. I used the function
net = patternnet(hn);
and tuned the weights manually, but what I need is the output value of the neural network when the input passes through the layers to the output node, without any backpropagation of the errors or any adjustment of the weights. How can I do that? Can I define my own training function as an input, as in patternnet(hn,@my_fun) or newff(...,@my_fun)? Is that possible? My second idea is to create the neural network with patternnet (or any other function) but stop it after its first iteration, i.e. not let the network adjust the weights by itself.
1 comment
Walter Roberson
2012-9-19
Stopping after one iteration was discussed in your (still-active) question http://www.mathworks.co.uk/matlabcentral/answers/47034-how-can-i-stop-the-neural-network-after-just-one-iteration
Accepted Answer
Greg Heath
2012-9-19
Since this is a pattern recognition algorithm, for c classes or categories, your targets should be columns of the c-dimensional unit matrix. For an I-H-O (O=c) node topology, use the layer equations
h = tansig(IW*x+b1); % Output of hidden layer
y = logsig(LW*h+b2); % Output of output layer
e = t-y; % error
NMSE = mse(e)/var(t,1,2) % Normalized mean-square error
NMSEgoal = 0.005
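The layer equations above can be evaluated directly, with no call to train. A minimal sketch, where the sizes I, H, O, N and all weight values are illustrative assumptions (tansig, logsig, and mse need the Neural Network Toolbox; tanh, 1./(1+exp(-z)), and mean((...).^2) are plain-MATLAB equivalents):

```matlab
% Illustrative sizes -- assumptions, not from the question
I = 4; H = 10; O = 3; N = 100;

x = randn(I, N);                  % standardized inputs, one column per sample
t = eye(O);                       % unit-matrix columns as class targets
t = t(:, randi(O, 1, N));         % pick a random class for each sample

% One candidate set of weights (random here; the genetic search supplies these)
IW = randn(H, I);  b1 = randn(H, 1);    % input-to-hidden
LW = randn(O, H);  b2 = randn(O, 1);    % hidden-to-output

h = tansig(IW*x + b1);            % output of hidden layer (b1 expands across columns)
y = logsig(LW*h + b2);            % output of output layer
e = t - y;                        % error
NMSE = mse(e) / mean(var(t, 1, 2))      % normalized mean-square error
```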
Genetic Search: Consult a reference because I'm making this up.
[ I N ] = size(x)
[ O N ] = size(t)
Neq = N*O % Total number of scalar training equations
1. Standardize x
2. Initialize the random number generator
3. Choose the number of hidden nodes, H
This will yield Nw = (I+1)*H + (H+1)*O unknown weights to be estimated from Neq equations. Require Neq > Nw, but desire Neq >> Nw. So choose
H << Hub = (Neq-O)/(I+O+1)
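Plugging in illustrative numbers (mine, not from the thread) shows how the bound works:

```matlab
I = 4; O = 3; N = 100;            % illustrative sizes
Neq = N*O                         % 300 scalar training equations
Hub = (Neq - O)/(I + O + 1)       % 297/8 = 37.125, upper bound for H
H = 10;                           % chosen well below Hub
Nw = (I+1)*H + (H+1)*O            % 50 + 33 = 83 unknowns, so Neq >> Nw
```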
4. Generate M random sets of Nw weights
5. Use the weights in the equations and choose the best B nets.
6. Randomly mutate m% of the B weight sets
7. Generate (M-B) new random sets of Nw weights
8. Repeat 4-7 until the NMSEgoal is reached or the number of repetitions has exceeded a specified limit.
9. If NMSEgoal is reached, STOP. Otherwise go to 3 and increase H.
Again, see a genetic algorithm reference.
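Steps 1-9 might be sketched like this (my own fleshing-out of the outline above; the population size M, elite count B, mutation scale, weight-vector packing, and the simple non-elitist replacement scheme are all assumptions):

```matlab
% Illustrative data: standardized inputs, unit-matrix columns as targets
I = 4; H = 10; O = 3; N = 100;
rng(0);                                  % 2. initialize the random number generator
x = randn(I, N);
t = eye(O);  t = t(:, randi(O, 1, N));

Nw = (I+1)*H + (H+1)*O;                  % weights per candidate net
M = 50;  B = 5;  maxreps = 200;  NMSEgoal = 0.005;
vart = mean(var(t, 1, 2));

pop = randn(M, Nw);                      % 4. M random weight sets
for rep = 1:maxreps
    nmse = zeros(M, 1);
    for k = 1:M                          % 5. score every candidate net
        w  = pop(k, :);
        IW = reshape(w(1:H*I), H, I);              % unpack the weight vector
        b1 = w(H*I+1 : H*I+H).';
        LW = reshape(w(H*I+H+1 : end-O), O, H);
        b2 = w(end-O+1 : end).';
        y  = logsig(LW*tansig(IW*x + b1) + b2);
        nmse(k) = mse(t - y) / vart;
    end
    [nmse, ix] = sort(nmse);
    bestw = pop(ix(1), :);               % best weight vector so far
    if nmse(1) <= NMSEgoal, break, end   % 8./9. goal reached?
    elite = pop(ix(1:B), :);             % keep the best B nets
    pop = [elite + 0.1*randn(B, Nw); ... % 6. mutate the elite
           randn(M-B, Nw)];              % 7. refill with fresh random sets
end
```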
Thank you for officially accepting my answer.
Greg
2 comments
Greg Heath
2012-10-6
After you have obtained the weights via a genetic algorithm, you can load them into a net and use sim or net to obtain the output as explained in my 2nd answer.
As explained in that answer, avoid using any of the NNTB functions until the genetic algorithm has converged to a final set of weights.
More Answers (2)
Greg Heath
2012-10-6
Edited: Greg Heath
2012-10-6
My combined equations for y and h are equivalent to the sim and net functions when calculating the output
y = sim(net,x);
or
y = net(x);
If you follow my directions above, there is no reason to use the Toolbox functions patternnet, configure, train or sim.
However, once you have converged to a final set of weights using the genetic code above, you can load them into a net using
net = patternnet(H);
net.IW{1,1} = IW;
net.LW{2,1} = LW;
net.b{1} = b1;
net.b{2} = b2;
Otherwise, you will have to load weights into the net at EACH step for EACH set of candidate weights. Obviously, you might be retired before the design converges.
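As a check that the loaded net reproduces the manual layer equations, a sketch along these lines could be used. The configure call, the processFcns and transferFcn overrides, and all sizes are my assumptions: patternnet applies mapminmax input/output processing by default (and its output transfer function may differ from logsig), so those defaults are removed here to make net(x) match the bare equations:

```matlab
% Illustrative sizes, data, and final weights (assumptions)
I = 4; H = 10; O = 3; N = 100;
x = randn(I, N);  t = eye(O);  t = t(:, randi(O, 1, N));
IW = randn(H, I);  b1 = randn(H, 1);
LW = randn(O, H);  b2 = randn(O, 1);

net = patternnet(H);
net = configure(net, x, t);              % size the layers to the data
net.inputs{1}.processFcns  = {};         % drop default mapminmax preprocessing
net.outputs{2}.processFcns = {};         % ...so net(x) matches the raw equations
net.layers{2}.transferFcn  = 'logsig';   % match the logsig output layer above
net.IW{1,1} = IW;
net.LW{2,1} = LW;
net.b{1}    = b1;
net.b{2}    = b2;

y1 = net(x);                             % Toolbox forward pass
y2 = logsig(LW*tansig(IW*x + b1) + b2);  % manual layer equations
max(abs(y1(:) - y2(:)))                  % should be ~0
```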
Hope this helps.
2 comments
Greg Heath
2020-1-31
Definitely not. Genetic solutions of ANY problem typically take much, much longer.
Greg