How can I extract the values of weights and biases after each training epoch?
I need to extract the values of weights and biases after each training epoch. I can easily extract these values after the training is finished, but not during the training.
1 Comment
Image Analyst
2015-7-19
That depends on where the data is: in a file, coming in from an instrument over a serial line, or somewhere else. Where is it, and in what format?
Answers (3)
Nick Hobbs
2015-7-21
I am going to assume you are referring to the Neural Network Toolbox, given your mention of weights, biases, and epochs. One way to see the weights after every epoch is to set the network to train only one epoch at a time and then use the getwb command. The following code shows how to do this with a sample dataset and a feedforward network.
[x,t] = simplefit_dataset;          % sample dataset
net = feedforwardnet(20);           % feedforward network with 20 hidden neurons
net.trainParam.epochs = 1;          % train only one epoch per call to train
weights = []
for i = 1:10
    net = train(net,x,t);
    weights = [weights getwb(net)]  % append and display the weight/bias vector after this epoch
end
You can also save this matrix instead of printing it out. Do note, however, that with this method the performance plot (and the other training plots) will only show the last epoch, so you may need to write your own function to monitor the validation and test data, and decide for yourself when to stop training (a rough sketch of this follows below).
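For reference, here is one way such manual monitoring might look. This is only a sketch, not part of the original answer: it assumes a random 70/15/15 split via dividerand and a simple max_fail-style early-stopping rule, and the epoch limit and maxFail value are placeholders.
% Sketch: manual validation monitoring and early stopping when training
% one epoch per call to train; split ratios, epoch cap and maxFail are
% placeholder values.
[x,t] = simplefit_dataset;
[trainInd,valInd,testInd] = dividerand(numel(t));        % default 70/15/15 split
net = feedforwardnet(20);
net.divideFcn = 'divideind';                             % keep the same split on every call
net.divideParam.trainInd = trainInd;
net.divideParam.valInd   = valInd;
net.divideParam.testInd  = testInd;
net.trainParam.epochs = 1;
weights   = [];
bestVperf = inf;
failCount = 0;
maxFail   = 6;                                           % analogous to net.trainParam.max_fail
for epoch = 1:1000                                       % arbitrary upper limit
    net = train(net,x,t);
    weights = [weights getwb(net)];                      % record weights after this epoch
    vperf = perform(net, t(valInd), net(x(:,valInd)));   % validation performance (MSE)
    if vperf < bestVperf
        bestVperf = vperf;
        failCount = 0;
    else
        failCount = failCount + 1;
        if failCount >= maxFail
            break                                        % stop once validation stops improving
        end
    end
end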
2 Comments
Greg Heath
2015-7-23
Another problem is that every time train is called, certain internal parameters (mu?) are reinitialized. Therefore, you will not get the same final set of weights as if you just used one call of train.
I recall saving the aforementioned parameters at the end of every epoch and using them to reinitialize train so that the training was equivalent to not stopping every epoch. Unfortunately I don't remember the name or date of the post.
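For what it's worth, one way to approximate this today (a sketch only, not the post Greg refers to) is to carry the Levenberg-Marquardt mu value over between single-epoch calls via the training record tr. This assumes trainlm is the training function and that its training record includes a mu field.
% Sketch: reuse the final mu of each single-epoch call as the initial mu
% of the next call, so repeated calls behave more like one uninterrupted run.
[x,t] = simplefit_dataset;
net = feedforwardnet(20);
net.trainFcn = 'trainlm';
net.trainParam.epochs = 1;
for i = 1:10
    [net,tr] = train(net,x,t);
    net.trainParam.mu = tr.mu(end);   % assumes tr records mu per epoch (trainlm typically does)
end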
Salman Habib
2017-4-7
Hi Greg, I am trying to train a neural network using a for loop, one epoch at a time, and I want MATLAB to continue training with the weights and biases from the previous training. Is there any way to save the weights during the current iteration of the loop and use them to initialize the network's weights and biases in the next iteration? That is, I want the number of epochs to equal the number of iterations of the for loop (say N), and to call the training function N times.
Thank you in advance.
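A sketch of how this is commonly handled (not a reply from the thread): train returns the updated network, so the weights already carry over from one loop iteration to the next, and getwb/setwb let you record and restore them explicitly. The epoch count N below is a placeholder.
[x,t] = simplefit_dataset;
net = feedforwardnet(20);
net = configure(net,x,t);            % size and initialize the network for this data
net.trainParam.epochs = 1;
N = 50;                              % one training epoch per loop iteration
wbHistory = zeros(numel(getwb(net)), N);
for k = 1:N
    net = train(net,x,t);            % continues from the current weights and biases
    wbHistory(:,k) = getwb(net);     % save them after this epoch
end
% To restore a saved state later:
% net = setwb(net, wbHistory(:,k));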
Mark Hudson Beale
2015-7-24
Greg is right, the function to get weights outside of a training function is getwb.
Within a training function it is slightly different. In recent versions of the Neural Network Toolbox, each training function has a trainingIteration helper function (type "edit trainlm" to see an example). Within a training function the most reliable way to get the weights is calcLib.getwb(calcNet); see the trainlm code for an example of this being used.
So you might insert the following code snippet at the end of trainingIteration in trainlm to get and save a record of the weights in a workspace variable "weightRecord".
try
    wr = evalin('base','weightRecord');   % append to an existing record if one exists
catch
    wr = {};                              % otherwise start a new record
end
wr{end+1} = calcLib.getwb(calcNet);       % current weights and biases
assignin('base','weightRecord',wr);       % push the record back to the base workspace
If you then train with TRAINLM you can get the weight record in the base workspace:
>> [x,t] = house_dataset;
>> net = feedforwardnet(10,'trainlm'); % Has 151 weights and biases
>> net = train(net,x,t);
>> weightRecord
weightRecord =
Columns 1 through 4
[151x1 double] [151x1 double] [151x1 double] ...
1 Comment
Zheng Chai
2017-12-18
Hi Mark, I have the same question: how to record the weights during training. I saw calcLib.getwb but could not find trainingIteration in trainlm. I am using MATLAB R2013b. Could you please help me with where to paste your code? Many thanks.
Salma Hassan
2018-1-23
Edited: Salma Hassan
2018-1-23
The 'CheckpointPath' option of the trainingOptions function saves the network's parameter values after each epoch (a sketch of its use follows below).
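For context, a minimal sketch of how 'CheckpointPath' is typically set in the Deep Learning Toolbox trainingOptions/trainNetwork workflow; the folder, solver, and the commented-out data and layer names are placeholders.
checkpointDir = fullfile(tempdir,'checkpoints');        % placeholder folder
if ~exist(checkpointDir,'dir'), mkdir(checkpointDir); end
options = trainingOptions('sgdm', ...
    'MaxEpochs', 20, ...
    'CheckpointPath', checkpointDir);                   % a .mat checkpoint is saved each epoch
% net = trainNetwork(XTrain, YTrain, layers, options);  % XTrain/YTrain/layers not shown here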
1 Comment
Andrea Daou
2020-8-18
Hello,
After using 'CheckpointPath', many .mat files are saved. How can I visualize the parameters?
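A rough sketch of one way to inspect those checkpoints, assuming they were written by trainingOptions with 'CheckpointPath' and each file contains a variable named net; the layer index and property names depend on your architecture.
files = dir(fullfile(checkpointDir,'net_checkpoint__*.mat'));
for k = 1:numel(files)
    data = load(fullfile(checkpointDir, files(k).name), 'net');
    w = data.net.Layers(2).Weights;   % e.g. a convolution or fully connected layer
    b = data.net.Layers(2).Bias;
    fprintf('%s: mean |W| = %g\n', files(k).name, mean(abs(w(:))));
end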