Memory grows when using parfor

Hi, this is my first approach to parallel computing, and I wrote MATLAB code to take advantage of this feature. My problem is that memory usage during the parallel run (using a parfor call) grows until I get an out-of-memory error.
My code is something like:
DATA = rand(1000, 1000);
for k = 1:1000
    A = DATA(k, :);
    B = DATA(k, :);
    parfor i = 1:100
        do_something;
    end
end
I tried clearing all unused variables and passing only the necessary part of the data to the function inside the parfor, to keep the overhead as small as possible, but the code still seems to occupy more memory on each run. Furthermore, the whos output during each run didn't reveal the culprit (all the variables stay the same size).
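For reference, a minimal sketch (the call signature of do_something is hypothetical) of how the loop above can pass only the needed rows into the parfor body, so that each worker copies the two row vectors rather than the full DATA matrix, and the per-iteration output is released explicitly:

```matlab
DATA = rand(1000, 1000);
for k = 1:1000
    A = DATA(k, :);            % only the rows needed this iteration
    B = DATA(k, :);
    results = zeros(1, 100);   % preallocated sliced output variable
    parfor i = 1:100
        % A and B are broadcast variables: only these two row
        % vectors are sent to the workers, not the full DATA matrix.
        results(i) = do_something(A, B, i);
    end
    clear results              % release the per-iteration output
end
```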
Any ideas/suggestions to help me solve the problem?
Regards d

5 comments

Your example is incomplete and does not work: the size of DATA is 100 by 100, but you are trying to access the 1000th row of DATA in the outer for loop.
Of course, it was a typo that I have corrected.
Thanks.
We cannot reproduce your problem unless you define what "do_something" is doing.
Yes, you are right. I'm sorry, but the actual code is a bit more complex; this was an attempt to simplify it.
do_something is a function where A and B are cross-correlated (using xcorr).
The cross-correlation results are appended to a text file (one for each A-B pair).
Once the for loop has ended, the text files are read, processed, and deleted (I'm planning to use the memmapfile function for this part).
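Based on that description, do_something might look roughly like the sketch below (the function signature and file-name pattern are assumptions, not the poster's actual code). One detail worth checking in the real code is that the file handle is closed on every iteration, since file handles left open inside a loop are a common cause of steadily growing memory:

```matlab
function do_something(A, B, idx)
    % Cross-correlate the two rows (xcorr, Signal Processing Toolbox).
    c = xcorr(A, B);

    % Append the result to a text file for this A-B pair.
    % fclose is essential: an unclosed handle per iteration
    % accumulates until the process runs out of resources.
    fid = fopen(sprintf('pair_%04d.txt', idx), 'a');
    fprintf(fid, '%g\n', c);
    fclose(fid);
end
```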
Alvaro 2023-1-25
Edited: Alvaro 2023-1-25
In this case the code might be too simplified to figure out what the issue is. Could you post more of the actual code? It would be good to know how you are writing to that text file and which variables you are using to store the results.


Answers (0)


Release: R2022b
