Sparse Autoencoder with Adam optimization
6 views (last 30 days)
Hello!
I have a dataset for NSL-KDD that contains 4 parts: 1) Train Attribute (121x125973 double), 2) Train Label (1x125973 double), 3) Test Attribute (121x22544 double), 4) Test Label (1x22544 double), and it is ready for the algorithm.
I applied a sparse autoencoder and it works without any problem:
options.Method = 'lbfgs';
options.maxIter = maxIter;
options.useMex = 0;

[opttheta, cost] = minFunc(@(p) sparseAutoencoderCost(p, inputSize, ...
    hs, l1, sp, beta, trainAttr), theta, options);

trainFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, ...
    trainAttr);
testFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, ...
    testAttr);
But when I try to optimize the result using the Adam optimizer, I get this error: "Unrecognized property 'GRADIENTDECAYFACTOR' for class 'nnet.cnn.TrainingOptionsADAM'."
This is my code:
options = trainingOptions('adam', ...
    'InitialLearnRate', 3e-4, ...
    'SquaredGradientDecayFactor', 0.99, ...
    'MaxEpochs', 20, ...
    'MiniBatchSize', 64, ...
    'Plots', 'training-progress');

[opttheta, cost] = minFunc(@(p) sparseAutoencoderCost(p, inputSize, ...
    hs, l1, sp, beta, trainAttr), theta, options);

trainFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, ...
    trainAttr);
testFeatures = feedForwardAutoencoder(opttheta, hs, inputSize, ...
    testAttr);
I wonder how I can apply a sparse autoencoder with Adam optimization?
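[Editor's note] The object returned by trainingOptions is only consumed by trainNetwork (Deep Learning Toolbox); minFunc expects its own options struct, so the two cannot be mixed, which is why the property error appears. One possible approach is to keep the sparseAutoencoderCost function from the code above and drive it with a hand-written Adam update loop. The following is a minimal sketch, assuming sparseAutoencoderCost returns [cost, grad] as in the UFLDL starter code; all hyperparameter values here are illustrative, not tuned:

```matlab
% Hand-written Adam update loop over the sparse autoencoder cost.
% Assumption: sparseAutoencoderCost(theta, ...) returns [cost, grad],
% with grad the same size as theta (as in the UFLDL starter code).
alpha   = 3e-4;   % learning rate
beta1   = 0.9;    % gradient decay factor
beta2   = 0.99;   % squared-gradient decay factor
epsAdam = 1e-8;   % numerical stability term
numIter = 400;    % illustrative iteration count

m = zeros(size(theta));   % first-moment (mean) estimate
v = zeros(size(theta));   % second-moment (uncentered variance) estimate
for t = 1:numIter
    [cost, grad] = sparseAutoencoderCost(theta, inputSize, ...
        hs, l1, sp, beta, trainAttr);
    m = beta1 * m + (1 - beta1) * grad;
    v = beta2 * v + (1 - beta2) * (grad .^ 2);
    mhat = m / (1 - beta1 ^ t);          % bias-corrected first moment
    vhat = v / (1 - beta2 ^ t);          % bias-corrected second moment
    theta = theta - alpha * mhat ./ (sqrt(vhat) + epsAdam);
end
opttheta = theta;   % use with feedForwardAutoencoder as before
```

The resulting opttheta can then be passed to feedForwardAutoencoder exactly as in the L-BFGS version. This full-batch variant evaluates the cost on all of trainAttr each iteration; a mini-batch version would slice columns of trainAttr inside the loop instead.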
0 comments
Answers (0)