Matlab program out of memory on 64GB RAM Linux but not on 8GB RAM Windows
I get unexpected out-of-memory errors when using MATLAB 2013a (64-bit) on a supercomputer. My program uses no more than 5 GB of memory, far less than the 64 GB of RAM available on one node of the cluster, and swap space is available on top of that. The same program runs fine on a personal Windows computer with only 8 GB of RAM. I am unable to check how much memory MATLAB can use, because the 'memory' command is unavailable on Unix platforms. My stack space is set to unlimited, although I am not sure whether that has any effect on MATLAB. Could you offer me any assistance? Is there a way to check how much memory MATLAB can use, and if so, a way to increase it?
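Since the `memory` command is Windows-only, the usual way to answer "how much memory can the process use" on Linux is with OS-level tools before launching MATLAB. A minimal sketch, assuming a typical GNU/Linux node with bash (these are generic commands, not specific to this cluster):

```shell
# Check physical RAM and swap actually visible on this node, in GiB
free -g

# Per-process resource limits that the MATLAB process will inherit:
ulimit -s   # stack size (the asker reports this as "unlimited")
ulimit -v   # virtual address space limit, in KiB -- this, not the stack
            # limit, is what typically caps large heap allocations
ulimit -m   # resident set size limit (often not enforced by the kernel)
```

If `ulimit -v` prints a finite number rather than `unlimited`, that cap applies to all memory the process allocates, regardless of the stack setting.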
To be more specific, I get the out-of-memory errors when using the "train" function of the Neural Network Toolbox. The error's stack trace points to a function with "mex" in its name; MEX stands for MATLAB executable, i.e. compiled C code callable from MATLAB. I wonder whether MATLAB hands some of the computation to C code, and that C code runs out of memory. I thought such a scenario would be prevented by my stack space being unlimited. If anyone has experience running MATLAB on Linux from a terminal window, I would appreciate any advice.
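One point worth checking: an unlimited stack does not govern MEX allocations, because MEX files allocate on the heap (via malloc/mxMalloc), which is capped by the address-space limit, not the stack limit. Also, if this job runs under a batch scheduler (an assumption about this cluster, e.g. SLURM or PBS), the scheduler may impose a per-job memory cap well below the node's 64 GB. A sketch of how to inspect the limits inside the actual job environment:

```shell
# Inspect the effective limits of the current process (and hence of any
# MATLAB session launched from it). On a scheduler-managed cluster these
# may differ from what an interactive login shell reports.
grep -i -E 'stack|address|resident' /proc/self/limits
```

Running this from the same job script that launches MATLAB shows the limits MATLAB actually inherits; "Max address space" is the value that matters for heap allocations inside MEX code.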
Answers (0)