GPU out of memory issue appears with trainNetwork.
I have a Tesla P100 with 16 GB of RAM. Yesterday I ran trainNetwork() with several different layer architectures and a few different input data sets, and it worked. Then I tried a larger input data set, but got this out-of-memory error:
Error using trainNetwork
GPU out of memory. Try reducing 'MiniBatchSize' using the trainingOptions function.
Error in A1_B1_C1a_D2 (line 152)
[net,netinfo] = trainNetwork(trainInput,trainTarget,Layers,options);
Caused by:
Error using gpuArray/hTimesTranspose
Out of memory on device. To view more detail about available memory on the GPU, use 'gpuDevice()'. If the problem persists, reset the GPU by calling 'gpuDevice(1)'.
I tried what was suggested, but it doesn't help. I have tried many less memory-intensive approaches, rebooted, and even gone back to the scripts that used to work fine.
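For reference, this is roughly what I ran to check and reset the device (a minimal sketch; the index 1 is just my single P100):
% Check how much memory is free on the GPU
g = gpuDevice();
fprintf('Available: %.2f GB of %.2f GB\n', g.AvailableMemory/1e9, g.TotalMemory/1e9);
% Reset the device to clear stale allocations (this also clears all gpuArray variables)
gpuDevice(1);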
Now nothing works.
Any suggestions for troubleshooting a hardware fault or some protective state somewhere?
0 Comments
Accepted Answer
Matt J
2023-5-3
Edited: Matt J
2023-5-3
Then I tried a larger input data set, but got this out-of-memory error:
If you make your data larger and larger, you will eventually run out of memory. Maybe reduce the MiniBatchSize setting.
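For example, something along these lines (the solver and the batch size of 32 are only illustrative; keep your other options and keep halving MiniBatchSize until training fits):
% Smaller mini-batches mean smaller activation arrays on the GPU
options = trainingOptions('adam', ...
    'MiniBatchSize', 32, ...
    'ExecutionEnvironment', 'gpu');
[net, netinfo] = trainNetwork(trainInput, trainTarget, Layers, options);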
13 Comments
Joss Knight
2023-5-13
Edited: Joss Knight
2023-5-14
Seems fairly clear-cut to me. In your first image, fc2 alone takes up 7.4 GB, so you're definitely going to struggle, especially for training, because you need 8 GB for the weights, 8 GB for their gradients, and probably 8 more for temporaries while you're updating the weights. You need a smaller network. Try adding more convolution layers rather than relying on a massive fully connected layer to do most of the work. Look at the Total Number of Learnables at the top of the Network Analyzer window and multiply it by 4 to get the number of bytes your network will need.
Your other network is much smaller, a 'mere' 1.4 GB for the fully connected layers.
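As a quick sanity check in code (a sketch, assuming Layers is the layer array you pass to trainNetwork; the layer sizes below are made up for illustration):
analyzeNetwork(Layers)   % the summary at the top reports the Total Number of Learnables

% Back-of-the-envelope: a fully connected layer mapping 44100 inputs to
% 44100 outputs (hypothetical sizes) already needs
nIn = 44100; nOut = 44100;
learnables = nIn*nOut + nOut;                                  % weights + biases
fprintf('%.1f GB in single precision\n', 4*learnables/2^30)   % about 7.2 GB
% and training needs several multiples of that for the gradients, solver
% state, and temporaries.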
More Answers (0)