Out of memory on device using GPU

I'm trying to classify images using VGG19 and I have this error: "Out of memory on device".
How can I fix it?
  2 Comments
Felipe Assunção 2019-9-21
ans =
CUDADevice with properties:
Name: 'GeForce RTX 2060'
Index: 1
ComputeCapability: '7.5'
SupportsDouble: 1
DriverVersion: 10.1000
ToolkitVersion: 9.1000
MaxThreadsPerBlock: 1024
MaxShmemPerBlock: 49152
MaxThreadBlockSize: [1024 1024 64]
MaxGridSize: [2.1475e+09 65535 65535]
SIMDWidth: 32
TotalMemory: 6.4425e+09
AvailableMemory: 4.6098e+09
MultiprocessorCount: 30
ClockRateKHz: 1710000
ComputeMode: 'Default'
GPUOverlapsTransfers: 1
KernelExecutionTimeout: 1
CanMapHostMemory: 1
DeviceSupported: 1
DeviceSelected: 1
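
The listing above is the output of gpuDevice, showing roughly 6.4 GB of total memory but only about 4.6 GB free. As a minimal sketch (the reset call is just one common cleanup step, not something from the original post), you can check how much memory is actually available and clear any gpuArray data left over from earlier runs before classifying:

% Query the current GPU and report free memory (same call that produced the listing above)
g = gpuDevice;
fprintf('Available GPU memory: %.2f GB\n', g.AvailableMemory/1e9);

% Clear cached allocations left on the device by earlier runs.
% Note: reset wipes all gpuArray variables, so gather anything you still need first.
reset(gpuDevice);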


Answers (1)

neal paze 2021-12-12
Have you solved the problem? I have the same problem.
  1 Comment
Felipe Assunção 2021-12-12
I don't remember whether I solved this problem, but I think you need to review your dataset's size and characteristics. I got so frustrated with Matlab that I preferred to migrate to Python, using Google Colab. My life changed for the better 🙏🏼
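
For reference, the usual way around "Out of memory on device" when classifying with a pretrained VGG19 is to reduce how many images are sent to the GPU per step. A minimal sketch, assuming the images are in a folder called 'myImages' (that path and the batch size are placeholders, not from the original post):

net = vgg19;                              % pretrained network; requires the VGG-19 support package
inputSize = net.Layers(1).InputSize;      % 224x224x3 for VGG19

imds = imageDatastore('myImages');        % placeholder folder of images to classify
augimds = augmentedImageDatastore(inputSize(1:2), imds);  % resize images on the fly

% Smaller MiniBatchSize -> fewer images held on the GPU at once, at the cost of speed
pred = classify(net, augimds, 'MiniBatchSize', 8);

If it still runs out of memory even with a small batch, classify also accepts 'ExecutionEnvironment', 'cpu' to avoid the GPU entirely.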

