Why does an interactive MATLAB job require less memory than a non-interactive job on the cluster?
Hi everyone
I am running MATLAB on my school's cluster (Linux). The original data read into MATLAB is up to 4 GB, and one array in my code needs 24 GB during the calculation. I requested 12 cores and 24 GB of memory for an interactive MATLAB job with this command (qsh -pe smp 12 -l h_vmem=2G matlab=12), and the job runs successfully.
However, when I requested 12 cores with 50 GB for a non-interactive job, it failed partway through my code. I then increased the memory to 80 GB and the job ran further, but it still stopped. Even using the clear command to remove the big arrays did not help.
Can anyone tell me what is wrong with the non-interactive job?
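For reference, a minimal sketch of a matching batch submission, assuming a Grid Engine scheduler (implied by the qsh command above; the script name my_script.m is a placeholder). Note that h_vmem is usually enforced per slot, so the per-slot value times the slot count gives the job's total memory, which is a common source of mismatch between interactive and batch requests:

    #!/bin/bash
    # Request the smp parallel environment with 12 slots; h_vmem is a
    # per-slot limit, so 2G x 12 slots = 24 GB of memory in total.
    #$ -pe smp 12
    #$ -l h_vmem=2G
    #$ -cwd
    # Run MATLAB without a display; my_script.m is a placeholder name.
    matlab -nodisplay -nosplash -r "my_script; exit"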
2 Comments
Kojiro Saito
2018-1-13
Which function do you use for the non-interactive job: parfor, batch, or spmd? One point is that there is a transparency restriction in parfor, so please take a look at the documentation on Transparency in parfor.
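As an illustration of that restriction (a minimal sketch, assuming the code uses parfor; the variable name is made up): calling clear inside a parfor body violates transparency and raises an error, so memory there has to be released by reassignment instead:

    parfor i = 1:4
        A = rand(5000);   % large temporary array inside the loop body
        % clear A         % would raise a transparency violation error here
        A = [];           % reassigning to empty is the allowed way to free it
    end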
Answers (0)