Neural Network in Bayes Optimization

4 views (last 30 days)
Hey, I am reading the article "Deep Learning Using Bayesian Optimization".
I understand that we want to optimize the hyperparameters of a neural network, such as the section depth. But where is the neural network itself defined? Is it defined in the objective function?

Answers (1)

Srivardhan Gadila on 2020-11-23
The network is defined in the nested "valErrorFun" function inside the "makeObjFcn" function. Refer to the "Objective Function for Optimization" section of the example for more information.
function ObjFcn = makeObjFcn(XTrain,YTrain,XValidation,YValidation)
ObjFcn = @valErrorFun;
    function [valError,cons,fileName] = valErrorFun(optVars)
        imageSize = [32 32 3];
        numClasses = numel(unique(YTrain));
        numF = round(16/sqrt(optVars.SectionDepth));
        layers = [
            imageInputLayer(imageSize)

            % The spatial input and output sizes of these convolutional
            % layers are 32-by-32, and the following max pooling layer
            % reduces this to 16-by-16.
            convBlock(3,numF,optVars.SectionDepth)
            maxPooling2dLayer(3,'Stride',2,'Padding','same')

            % The spatial input and output sizes of these convolutional
            % layers are 16-by-16, and the following max pooling layer
            % reduces this to 8-by-8.
            convBlock(3,2*numF,optVars.SectionDepth)
            maxPooling2dLayer(3,'Stride',2,'Padding','same')

            % The spatial input and output sizes of these convolutional
            % layers are 8-by-8. The global average pooling layer averages
            % over the 8-by-8 inputs, giving an output of size
            % 1-by-1-by-4*initialNumFilters. With a global average
            % pooling layer, the final classification output is only
            % sensitive to the total amount of each feature present in the
            % input image, but insensitive to the spatial positions of the
            % features.
            convBlock(3,4*numF,optVars.SectionDepth)
            averagePooling2dLayer(8)

            % Add the fully connected layer and the final softmax and
            % classification layers.
            fullyConnectedLayer(numClasses)
            softmaxLayer
            classificationLayer];

        miniBatchSize = 256;
        validationFrequency = floor(numel(YTrain)/miniBatchSize);
        options = trainingOptions('sgdm', ...
            'InitialLearnRate',optVars.InitialLearnRate, ...
            'Momentum',optVars.Momentum, ...
            'MaxEpochs',60, ...
            'LearnRateSchedule','piecewise', ...
            'LearnRateDropPeriod',40, ...
            'LearnRateDropFactor',0.1, ...
            'MiniBatchSize',miniBatchSize, ...
            'L2Regularization',optVars.L2Regularization, ...
            'Shuffle','every-epoch', ...
            'Verbose',false, ...
            'Plots','training-progress', ...
            'ValidationData',{XValidation,YValidation}, ...
            'ValidationFrequency',validationFrequency);

        pixelRange = [-4 4];
        imageAugmenter = imageDataAugmenter( ...
            'RandXReflection',true, ...
            'RandXTranslation',pixelRange, ...
            'RandYTranslation',pixelRange);
        datasource = augmentedImageDatastore(imageSize,XTrain,YTrain, ...
            'DataAugmentation',imageAugmenter);

        trainedNet = trainNetwork(datasource,layers,options);
        close(findall(groot,'Tag','NNET_CNN_TRAININGPLOT_UIFIGURE'))

        YPredicted = classify(trainedNet,XValidation);
        valError = 1 - mean(YPredicted == YValidation);

        fileName = num2str(valError) + ".mat";
        save(fileName,'trainedNet','valError','options')
        cons = [];
    end
end
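For context, makeObjFcn only builds the objective function; the Bayesian optimization itself is driven by bayesopt. A sketch of how the example wires them together (the variable ranges shown here are assumptions for illustration; check the example for the exact bounds it uses):

```matlab
% Define the hyperparameters to optimize; 'Transform','log' searches
% learning-rate-like variables on a log scale.
optimVars = [
    optimizableVariable('SectionDepth',[1 3],'Type','integer')
    optimizableVariable('InitialLearnRate',[1e-2 1],'Transform','log')
    optimizableVariable('Momentum',[0.8 0.98])
    optimizableVariable('L2Regularization',[1e-10 1e-2],'Transform','log')];

% Build the objective function, then let bayesopt call it repeatedly,
% each time with a different candidate set of hyperparameters (optVars).
ObjFcn = makeObjFcn(XTrain,YTrain,XValidation,YValidation);
BayesObject = bayesopt(ObjFcn,optimVars, ...
    'MaxObjectiveEvaluations',30, ...
    'IsObjectiveDeterministic',false, ...
    'UseParallel',false);
```

Each call to the objective trains a full network and returns its validation error, which bayesopt minimizes; that is why the network definition lives inside valErrorFun.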
2 Comments
Jyoti Nautiyal on 2021-7-10
Why is the number of filters
numF = round(16/sqrt(optVars.SectionDepth)); ?
Also, why does the number of filters double at every convolution block?

