predict function in Deep Learning Toolbox does not use GPU
I use a network pre-trained in TensorFlow 2.0 to predict a depth image from an RGB image. The code is:
dlX = dlarray(double(I)./255,'SSCB');   % normalize to [0,1], label dims spatial-spatial-channel-batch
dlY = predict(dlnet,dlX);               % run inference
The code works, but it is very slow. It appears to run only on the CPU instead of the GPU.
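For example, checking where the underlying data lives confirms that it stays in host memory:
class(extractdata(dlX))   % returns 'double', i.e. CPU memory; GPU-resident data would report 'gpuArray'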
From the online help documentation, it seems that the default way to run predict is to use a GPU. My GPU seems to be available in MATLAB; running a quick GPU test such as:
gpuDevice                                   % query the selected GPU
A = gpuArray([1 0 1; -1 -2 0; 0 1 -1]);     % move a small matrix to the GPU
e = eig(A);                                 % run a computation on the device
It works fine, and gpuDevice reports:
Name: 'GeForce RTX 2060'
Index: 1
ComputeCapability: '7.5'
SupportsDouble: 1
DriverVersion: 11.2000
ToolkitVersion: 11
MaxThreadsPerBlock: 1024
MaxShmemPerBlock: 49152
MaxThreadBlockSize: [1024 1024 64]
MaxGridSize: [2.1475e+09 65535 65535]
SIMDWidth: 32
TotalMemory: 6.4425e+09
AvailableMemory: 4.9872e+09
MultiprocessorCount: 30
ClockRateKHz: 1200000
ComputeMode: 'Default'
GPUOverlapsTransfers: 1
KernelExecutionTimeout: 1
CanMapHostMemory: 1
DeviceSupported: 1
DeviceAvailable: 1
DeviceSelected: 1
Is there any way to deal with this problem? Thank you very much.
Answers (1)
Joss Knight
2021-8-14
That is the documentation for DAGNetwork, not dlnetwork. dlnetwork does not have an ExecutionEnvironment option; it chooses its execution environment the same way other GPU operations do, by reacting to the incoming data. As KSSV points out, converting the input to a gpuArray is the correct solution in this case.
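A minimal sketch of that fix, reusing I and dlnet from the question (the cast to single is an added suggestion, since GeForce cards are much faster in single precision than in double):
dlX = dlarray(gpuArray(single(I))./255,'SSCB');   % input data now lives on the GPU
dlY = predict(dlnet,dlX);                         % predict follows the data onto the GPU
Y = gather(extractdata(dlY));                     % copy the result back to host memory
For networks loaded as DAGNetwork or SeriesNetwork objects, predict does accept the documented name-value option instead, for example predict(net,X,'ExecutionEnvironment','gpu').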