How do I use multiple GPUs for a GAN?
In the example mentioned in the MATLAB documentation, "Train Generative Adversarial Network (GAN) - MATLAB & Simulink (mathworks.com)", how and where should the code be changed so that it uses multiple GPUs?
Although "auto" is used, not all GPUs are used by default. I have 4 GPUs and want to use all of them.
1 Comment
Shuaibin WAN
2021-11-25
Hi Shaw,
I have also encountered this problem. Have you found a solution yet?
Many thanks!
Answers (2)
Antti
2021-10-12
Edited: Antti
2021-10-12
Hi! You should change the 'ExecutionEnvironment' option to 'multi-gpu'. More info here. Before doing that, you might want to check whether your GPUs are detected by running:
>> numGPU = gpuDeviceCount("available")
If you don't get 4 as a result, then your GPUs are not supported by MATLAB, or there is a driver issue. Please accept my answer formally if this worked for you.
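For reference, when the network is trained with trainNetwork rather than a custom training loop, the change is made in trainingOptions. A minimal sketch, assuming the "adam" solver; the mini-batch size, epoch count, and the commented trainNetwork call are placeholders, not values from the GAN example:

options = trainingOptions("adam", ...
    "ExecutionEnvironment","multi-gpu", ...  % spread each mini-batch across all local GPUs
    "MiniBatchSize",128, ...                 % each mini-batch is divided among the GPUs
    "MaxEpochs",50);
% net = trainNetwork(XTrain,YTrain,layers,options);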
Antti
2021-10-12
It appears that when using custom training loops (as in the example), the 'multi-gpu' option is not supported. However, you can still take advantage of multiple GPUs by launching parallel MATLAB workers, where each worker uses a GPU of its own. See this example: https://se.mathworks.com/help/deeplearning/ug/train-network-in-parallel-with-custom-training-loop.html. A rough sketch of that approach is shown below. Please formally accept my answer if this solves your problem.
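A minimal sketch of the worker-per-GPU setup, assuming spmdIndex (R2022b or later; use labindex on earlier releases). The comments describe the workflow of the linked example rather than code from the GAN example:

numGPUs = gpuDeviceCount("available");
parpool(numGPUs);              % start one parallel worker per GPU

spmd
    gpuDevice(spmdIndex);      % select a distinct GPU on each worker
    % Each worker then runs the custom training loop on its own partition
    % of the data, moving mini-batches to the GPU with gpuArray/dlarray
    % and aggregating gradients across workers before the update step.
end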
0 Comments