Convolutional neural network toolbox

Luca G 2017-10-27
Edited: Greg Heath 2017-11-20
Hi, I am using the convolutional neural network toolbox. This is my code:
% Random initial weights and biases for the two convolution layers
% and the three fully connected layers
network1WB(1).Weights = randn([5 5 1 1]) * 0.01;
network1WB(1).Bias = randn([1 1 1]) * 0.01;
network1WB(2).Weights = randn([5 5 1 20]) * 0.01;
network1WB(2).Bias = randn([1 1 20]) * 0.01;
network1WB(3).Weights = randn([40 320]) * 0.01;
network1WB(3).Bias = randn([40 1]) * 0.01;
network1WB(4).Weights = randn([150 40]) * 0.01;
network1WB(4).Bias = randn([150 1]) * 0.01;
network1WB(5).Weights = randn([10 150]) * 0.01;
network1WB(5).Bias = randn([10 1]) * 0.01;
layers = [imageInputLayer([28 28 1])
          convolution2dLayer(5,1,'Stride',1)
          reluLayer
          maxPooling2dLayer(2,'Stride',2)
          convolution2dLayer(5,20,'Stride',1)
          reluLayer
          maxPooling2dLayer(2,'Stride',2)
          fullyConnectedLayer(40)
          fullyConnectedLayer(150)
          fullyConnectedLayer(10)
          softmaxLayer
          classificationLayer()];
% Assign the pre-generated weights and biases to the learnable layers
% (layers 2 and 5 are the convolutions, layers 8-10 the fully connected layers)
layers(2).Bias=network1WB(1).Bias;
layers(2).Weights=network1WB(1).Weights;
layers(5).Bias=network1WB(2).Bias;
layers(5).Weights=network1WB(2).Weights;
layers(8).Bias=network1WB(3).Bias;
layers(8).Weights=network1WB(3).Weights;
layers(9).Bias=network1WB(4).Bias;
layers(9).Weights=network1WB(4).Weights;
layers(10).Bias=network1WB(5).Bias;
layers(10).Weights=network1WB(5).Weights;
options = trainingOptions('sgdm','ExecutionEnvironment','gpu',...
'Shuffle','never',...
'CheckpointPath','.\Model1',...
'L2Regularization',reg,...
'InitialLearnRate',0.01,...
'LearnRateSchedule','piecewise',...
'LearnRateDropFactor',0.9993,...
'LearnRateDropPeriod',1,...
'MaxEpochs',epoch, ...
'Momentum',momentum,...
'MiniBatchSize',minibatch);
[convnet,traininfo] = trainNetwork(imtr,categorical(labelstra),layers,options);
where imtr is the training set (a 4-D array of images) and labelstra contains the corresponding labels. If I run this code twice with the same initial weights and the same training set, the convolutional neural network obtains different results. Is that possible, or is something wrong?

3 Answers

Steven Lord 2017-10-27
1 Comment
Luca G 2017-10-27
Thank you for the reply! Let me explain better. In the first run, I execute all the code I posted and evaluate the model, so the trained weights are in my workspace. In the second run, I execute the same code but comment out these lines from the post:
network1WB(1).Weights = randn([5 5 1 1]) * 0.01;
network1WB(1).Bias = randn([1 1 1]) * 0.01;
network1WB(2).Weights = randn([5 5 1 20]) * 0.01;
network1WB(2).Bias = randn([1 1 20]) * 0.01;
network1WB(3).Weights = randn([40 320]) * 0.01;
network1WB(3).Bias = randn([40 1]) * 0.01;
network1WB(4).Weights = randn([150 40]) * 0.01;
network1WB(4).Bias = randn([150 1]) * 0.01;
network1WB(5).Weights = randn([10 150]) * 0.01;
network1WB(5).Bias = randn([10 1]) * 0.01;
I think it amounts to the same thing. Sorry if I explained it badly.



Javier Pinzón 2017-11-16
Hello Luca,
As far as I know, and from some tests I have run before, two trainings that start from the same initial weights may not behave identically; however, they should converge in a similar way.
On the other hand, when I tested two such trained networks on a validation dataset, one gave its best result at epoch 120 and the other at epoch 210, yet the training accuracy behaved very similarly in both runs.
This can happen because at any point the network may start to learn slightly different small features.
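As an illustration only, here is a minimal sketch of that kind of comparison; the names convnet1, convnet2, imval and labelsval are assumptions, not from the original post:
% Hypothetical comparison of two separately trained networks on one
% validation set (convnet1, convnet2, imval and labelsval are assumed
% to already exist in the workspace).
pred1 = classify(convnet1, imval);
pred2 = classify(convnet2, imval);
acc1 = mean(pred1 == categorical(labelsval));
acc2 = mean(pred2 == categorical(labelsval));
fprintf('Validation accuracy: run 1 = %.4f, run 2 = %.4f\n', acc1, acc2)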
I hope this small explanation helps.
Regards,
Javier

Greg Heath 2017-11-16
Edited: Greg Heath 2017-11-20
As alluded to above:
You will only get duplicate results if the RNG is initialized to the same initial state!
In particular, to repeat a result you have to reset the random number generator to the same initial state before each run.
For details, run the following at the MATLAB command line:
help rng
doc rng
or read the documentation on the website:
https://www.mathworks.com/help/matlab/ref/rng.html
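For instance, a minimal sketch of fixing the generator state before the weight initialization in the posted code (the seed 0 is an arbitrary choice):
% Fix the RNG state so the randn-based initialization below produces
% the same weights on every run.
rng(0)
network1WB(1).Weights = randn([5 5 1 1]) * 0.01;
network1WB(1).Bias = randn([1 1 1]) * 0.01;
% ... initialize the remaining layers, build the layer array and call
% trainNetwork exactly as in the question.
With the same seed, both runs start from identical initial weights.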
Hope this helps.
Thank you for formally accepting my answer
Greg
2 Comments
Salma Hassan 2017-11-20
Please, sir, I have the same problem. Can you explain this in simpler terms? Thanks.
Steven Lord 2017-11-20
Call rng before calling rand, randn, randi, or another random number function to initialize the weights.

