How to reduce memory cost for transfer learning with semantic segmentation

I am trying to use transfer learning with semantic segmentation to classify 960x720x3 images. I am using MATLAB on my Surface with 8 GB RAM and a built-in GPU. When I try to run the code, I get a message that the memory is not sufficient. To resolve the issue I thought about applying PCA to my pixel data, but the only comparable example I found used parallel computing.
Therefore I would like to ask whether there are any other solutions to reduce the memory cost of my code, or whether there is any way to get access to more memory via MATLAB (perhaps something like cloud computing)?

Answers (1)

Srivardhan Gadila, 2021-4-15
Try reducing the mini-batch size using the 'MiniBatchSize' option of trainingOptions.
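For example, a minimal sketch; the 'sgdm' solver and the batch size of 4 are illustrative choices, not values from the original answer:

% Reduce the mini-batch size so fewer images are held in memory per iteration
options = trainingOptions('sgdm', ...
    'MiniBatchSize', 4, ...        % try 4, 2, or even 1 if memory is tight
    'MaxEpochs', 30, ...
    'InitialLearnRate', 1e-3, ...
    'Shuffle', 'every-epoch');

Halving the mini-batch size roughly halves the activation memory per iteration, at the cost of noisier gradient estimates and longer training.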
If reducing the mini-batch size does not work, then try using a smaller network, reducing the number of layers, or reducing the number of parameters or filters in the layers.
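As an illustration, assuming the Computer Vision Toolbox is available, a segmentation network can be built on a lighter backbone such as ResNet-18 instead of a larger one; the class count below is a placeholder:

% Build a DeepLab v3+ network on a smaller backbone to cut the parameter count
imageSize = [720 960 3];   % [height width channels] of the input images
numClasses = 2;            % placeholder: set to the actual number of classes
lgraph = deeplabv3plusLayers(imageSize, numClasses, 'resnet18');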
If the GPU memory is still not sufficient, then you can train the network on the CPU by using the 'ExecutionEnvironment' option of trainingOptions.
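A sketch of falling back to CPU training; it is slower, but system RAM is typically larger than GPU memory. The datastore name dsTrain is a hypothetical stand-in for your training data:

% Train on the CPU when GPU memory is exhausted
options = trainingOptions('sgdm', ...
    'MiniBatchSize', 4, ...
    'ExecutionEnvironment', 'cpu');
% net = trainNetwork(dsTrain, lgraph, options);  % dsTrain: your training datastore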
You can refer to Deep Learning in Parallel and in the Cloud & Deep Learning in the Cloud for information related to cloud computing.
