How to avoid memory leaks with py.pandas.DataFrame objects
25 views (last 30 days)
I was thrilled to see that pandas DataFrames let us pass MATLAB tables to Python and read pandas DataFrames back into MATLAB. However, these objects cause memory leaks that grow out of control when dealing with large amounts of data.
For example, the code below causes memory use to increase and never go down:
for ii = 1:1000
    T = array2table(rand(100000,10));
    pyT = py.pandas.DataFrame(T);
    clear pyT
end
I've tried this to no avail:
for ii = 1:1000
    T = array2table(rand(100000,10));
    pyT = py.pandas.DataFrame(T);
    clear pyT
    py.gc.collect();
end
I've also tried this:
for ii = 1:1000
    T = array2table(rand(100000,10));
    pyT = py.pandas.DataFrame(T);
    delete(pyT)
end
I know this is still a very fresh release and it's a new feature, but does anyone have ideas on how to use DataFrames in MATLAB without causing memory issues? When I close MATLAB, the memory is deallocated just fine.
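Since closing MATLAB frees the memory, one mitigation worth trying is a lighter-weight version of the same thing: run Python out of process and restart the interpreter process periodically. The sketch below is untested; pyenv's ExecutionMode option and terminate(pyenv) are documented MATLAB functions, but whether restarting the process actually reclaims the leaked memory here is an assumption.
% Untested sketch: with an out-of-process Python, terminating the
% interpreter process should release its memory. The ExecutionMode
% must be set before the first py.* call in the session.
pyenv(ExecutionMode="OutOfProcess");
for ii = 1:1000
    T = array2table(rand(100000,10));
    pyT = py.pandas.DataFrame(T);
    clear pyT
    if mod(ii, 100) == 0
        terminate(pyenv)   % Python restarts automatically on the next py.* call
    end
end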
2 Comments
cui,xingxing
2024-4-3
I reproduced your problem. One workaround is to create the data directly in Python, which avoids the unbounded memory growth:
for ii = 1:1000
    % T = array2table(rand(100000,10));
    T = py.numpy.random.rand(int32(100000), int32(10)); % use this instead
    pyT = py.pandas.DataFrame(T);
end
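If you still need the result in MATLAB, the conversion in the other direction that the question mentions should let you bring it back once, after the Python-side work, instead of on every iteration. A minimal sketch, assuming the R2024a DataFrame-to-table conversion:
% Sketch (assumes R2024a pandas support): convert back to MATLAB once.
pyT = py.pandas.DataFrame(py.numpy.random.rand(int32(100), int32(3)));
T = table(pyT);   % pandas.DataFrame -> MATLAB table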
As for why passing large arrays from MATLAB to Python leaks memory, I'd ask technical support. If you learn anything new, please post it below.
Answers (0)