Insufficient memory capacity when using autoencoder

2 views (last 30 days)
When I use 6000 images of size 64x64 for training, the following error message appears. Is there any way to resolve this other than training on smaller images?

Answers (1)

Sai Pavan 2023-10-5
Hi ChiaWei,
I understand that you are trying to resolve an out-of-memory error raised while training an autoencoder in MATLAB.
  • The standard way to resolve this error is to reduce the size of the training images. Since you are looking for other options, one approach is to decrease the “hiddenSize” parameter: a smaller hidden layer means fewer learnable parameters and therefore a lower memory requirement (a minimal call is sketched after this list).
  • You can also try decreasing the “MaxEpochs” parameter.
  • Also, try increasing the GPU resources allocated for training the model.
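For example, a minimal sketch of such a call (the variable name “imgs”, holding the 6000 training images as a cell array of 64x64 images, is an assumption, not from the original post):

    hiddenSize = 25;                         % try reducing this value if memory runs out
    autoenc = trainAutoencoder(imgs, hiddenSize, ...
        'MaxEpochs', 200, ...                % fewer training epochs than the default of 1000
        'UseGPU', true);                     % train on the GPU if one is available

Note that reducing “hiddenSize” also shrinks the learned representation, so balance the memory savings against reconstruction quality.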
Please refer to the documentation below to learn more about the different parameters that can be tuned in the autoencoder model: https://www.mathworks.com/help/deeplearning/ref/trainautoencoder.html
Hope it helps.
Regards,
Sai Pavan
