Solve out of memory problem

Hello. I have a Dell Precision M4800 laptop with an Intel Core i7 processor (2.8 GHz) and 16 GB of RAM. I later installed an SSD and an additional 8 GB of RAM.
When I use it for classification with large-scale data, it runs out of memory.
I tried to solve this problem by increasing the system swap space, changing double variables to single precision, and clearing unused variables, but the problem still appears.
Are there other solutions I can use to solve this problem?
Thanks in advance.

Accepted Answer

Ameer Hamza 2020-11-5
Edited: Ameer Hamza 2020-11-5
This shows that the dataset is too large to fit in the available RAM. The solution is not to read the whole dataset at once. Instead, create an image datastore: https://www.mathworks.com/help/matlab/ref/matlab.io.datastore.imagedatastore.html. Functions such as trainNetwork() accept an image datastore as input, so data is read in mini-batches rather than loaded entirely into memory. This link will also be useful: https://www.mathworks.com/matlabcentral/answers/291597-best-way-to-deal-with-large-data-for-deep-learning.
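For example, a minimal sketch of this approach (the folder path, image size, and layer choices are placeholders, not part of the original answer; it assumes the images are stored in one subfolder per class):

% Read images lazily from disk instead of loading them all into RAM.
imds = imageDatastore('C:\data\trainingImages', ...   % placeholder path
    'IncludeSubfolders', true, ...
    'LabelSource', 'foldernames');    % labels taken from subfolder names

% Resize each image on the fly to match the network input size.
augimds = augmentedImageDatastore([224 224], imds);

% A small example network (replace with your own architecture).
layers = [
    imageInputLayer([224 224 3])
    convolution2dLayer(3, 16, 'Padding', 'same')
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    fullyConnectedLayer(numel(categories(imds.Labels)))
    softmaxLayer
    classificationLayer];

options = trainingOptions('sgdm', 'MiniBatchSize', 32, 'MaxEpochs', 5);

% trainNetwork pulls one mini-batch at a time from the datastore, so the
% full dataset never has to fit in RAM at once.
net = trainNetwork(augimds, layers, options);

augmentedImageDatastore is used here only to resize images on the fly; if your images already match the network input size, you can pass imds to trainNetwork directly.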

More Answers (0)
