Multi GPU option for LSTM/GRU Layers

2 views (last 30 days)
Hello,
I know that it is currently not possible to combine LSTM layers with the multi-GPU option when training deep learning networks. Is this a feature that will be implemented in the near future? I would really like to use MATLAB for my current research, but with the size of my data and the restriction to a single GeForce 1080 Ti, training simply takes too long.
Thanks,
Barry

Accepted Answer

Bhargavi Maganuru 2020-7-10
Hi,
Parallel (multi-GPU) training is not currently supported for networks with LSTM layers. This request has been forwarded to the development team and may be considered for a future release.
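In the meantime, an LSTM network can still be trained on a single GPU by setting the 'ExecutionEnvironment' option to 'gpu' in trainingOptions. The sketch below is a minimal single-GPU example using standard Deep Learning Toolbox layers; the layer sizes and data here are placeholders, not part of the original question.

```matlab
% Minimal sketch: train an LSTM on ONE GPU (multi-GPU is not supported here).
% Dummy sequence data: 100 sequences, 12 features, 2 classes (placeholder values).
numObservations = 100;
numFeatures = 12;
XTrain = arrayfun(@(~) rand(numFeatures, 50), 1:numObservations, ...
    'UniformOutput', false)';
YTrain = categorical(randi(2, numObservations, 1), [1 2], {'A','B'});

layers = [
    sequenceInputLayer(numFeatures)
    lstmLayer(64, 'OutputMode', 'last')   % 64 hidden units (arbitrary choice)
    fullyConnectedLayer(2)
    softmaxLayer
    classificationLayer];

% 'ExecutionEnvironment','gpu' runs training on a single GPU;
% 'multi-gpu' would error for networks that contain LSTM layers.
options = trainingOptions('adam', ...
    'ExecutionEnvironment', 'gpu', ...
    'MaxEpochs', 10, ...
    'MiniBatchSize', 32, ...
    'Verbose', false);

net = trainNetwork(XTrain, YTrain, layers, options);
```

If the bottleneck is data size rather than compute, reducing sequence length or mini-batch size can also shorten training time on a single GPU.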

More Answers (0)
