matlab 2016a (linux) memory leak

3 views (last 30 days)
mikhailo
mikhailo 2016-9-15
Commented: mikhailo 2021-4-25
Dear community,
Several months ago I installed ver. 2016a on our Linux (Ubuntu) workstation. Over time I noticed that this version continuously slows down during use and eventually becomes unusable; restarting MATLAB helps.
Tracing down the issue, I saw that this MATLAB version uses more and more memory over time (~100 GB of RAM in under 10 minutes while analyzing data, even after clear all).
This is a very strong hint of a memory leak, especially since ver. 2014b does not have this problem.
I have not tracked down where exactly this happens, but my scripts use a lot of binary and text file read/write operations, such as fopen, fwrite, etc. This may be the cause.
So far, using ver. 2014b helps.
Has anybody else experienced this, and is anyone aware of patches?
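Since the question mentions heavy fopen/fwrite use: a common cause of steadily growing memory in file-heavy scripts is file identifiers that are never closed. A minimal sketch (the file name is a placeholder, not the poster's code) that guards every fopen and checks for leaked handles:

```matlab
% Sketch only: guard each fopen with an fclose that runs even on error,
% then verify no identifiers were leaked. 'data.bin' is a placeholder.
fid = fopen('data.bin', 'w');
if fid == -1
    error('Could not open file');
end
closer = onCleanup(@() fclose(fid));   % fires even if an error is thrown

fwrite(fid, rand(1, 1024), 'double');
clear closer                           % closes the file now

leaked = fopen('all');                 % identifiers still open, if any
assert(isempty(leaked), 'Some file identifiers were not closed');
```

If fopen('all') keeps growing between runs, the leak is in the script's file handling rather than in MATLAB itself; fclose('all') is a blunt way to recover.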
  1 Comment
José-Luis
José-Luis 2016-9-15
Edited: José-Luis 2016-9-15
Maybe you could try narrowing it down to a specific function? If it is actually a memory leak, that might be valuable information for TMW.


Answers (4)

Vassilis Lemonidis
Vassilis Lemonidis 2021-4-25
Is there any news on this issue? It still seems to affect R2020b parallel processing on Linux: clear all does not release the workers' memory, and MATLAB needs to be manually closed and restarted after every code execution to forcibly release it. If you believe it is a problem with my code, could you please direct me to a method I could add to the classes I have constructed and am using, which could handle the garbage collection process? Also, is there a tool to identify weak pointers that clear all cannot handle well?
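Regarding the worker-memory part of this: clear all only clears the client workspace, not the worker processes. A hedged sketch of releasing worker memory without restarting MATLAB by shutting the pool down:

```matlab
% Sketch: terminate the parallel pool's worker processes so the OS
% reclaims their memory, without closing the MATLAB client session.
pool = gcp('nocreate');      % current pool, [] if none is running
if ~isempty(pool)
    delete(pool);            % shuts down the workers
end
% Start a fresh pool before the next parallel run, e.g.:
% parpool('local');
```

This does not fix the underlying leak, but it avoids a full MATLAB restart between runs.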
  3 Comments
Vassilis Lemonidis
Vassilis Lemonidis 2021-4-25
Unfortunately I am bound to a more recent version of MATLAB, as I have MAT-files from newer versions that would lose information if converted for an older one, and I am also using toolboxes that were previously not available... Thank you for the suggestion all the same; it is a bit annoying that nothing has been done to fix this behavior after so many versions and years have passed...
mikhailo
mikhailo 2021-4-25
If you can nail down the problem, you could share the code with support. In my case, that was not an option.
Worst case, MATLAB is not the only option for scientific computing nowadays.



Haiying Tang
Haiying Tang 2017-11-7
Edited: Haiying Tang 2017-11-7
We hit this issue too!
Actually, it occurred in both R2015b and R2016b for Linux, but R2014b is OK!
We tested the same code in R2014b, R2015b, and R2016b on CentOS 7. Our MATLAB code (exported to a Java package by MCC) uses the Parallel Computing Toolbox and runs under the MCR on servers with 16-core CPUs and 16 GB of RAM. In R2014b each worker stays at about 500 MB and runs normally, while in R2015b and R2016b memory keeps increasing until the worker exits (killed by the OS) after running for about 24 hours.
Because of this memory issue, we have to keep using R2014b for now!
Hope someone can help us, and thanks a lot!
(Posted by my technical workmate: Pingmin Fenlly Liu)
  3 Comments
Steven Lord
Steven Lord 2017-11-7
Please contact Technical Support using the Contact Us link in the upper-right corner of this page. Provide them a code segment or workflow with which you can reproduce this memory increase and work with them to determine if this is actually a bug and if so how to resolve it.
Pingmin Liu
Pingmin Liu 2017-11-8
This is my other account, with my company email.



mikhailo
mikhailo 2017-11-7
Edited: mikhailo 2017-11-7
Same for me. I just keep using 2014b. I remember trying to find the problem, but it does not seem to be an easy one.
Folks from MathWorks: please, as you can see this is not fake; fix it. Two years and no response. After all, that is your responsibility, and the issue is serious.
  7 Comments
mikhailo
mikhailo 2017-11-9
I confirm the issue. I had trouble when using a parfor loop. I never found the reason, as code is not debuggable from within parfor. There was no issue after replacing parfor with a for loop.
Unfortunately, the code cannot be shared as it is commercial. I remember spending about an hour searching for the cause but did not find it, and ended up continuing to use the 2014b version.
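The workaround described above can be sketched as follows; N and doWork are placeholders for the real loop bound and body, not code from the thread:

```matlab
% Sketch of the parfor -> for workaround. On the affected releases the
% parfor version reportedly grew worker memory; the plain for loop did not.
N = 100;
doWork = @(i) i^2;        % placeholder for the real per-iteration work

results = zeros(1, N);
% parfor i = 1:N          % variant that showed the memory growth
for i = 1:N               % serial variant used as the workaround
    results(i) = doWork(i);
end
```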
Steven Lord
Steven Lord 2017-11-9
I searched the Bug Reports for any bugs that mention "parfor" and existed in release R2016a, but none seemed to deal with memory leaking.
As I stated above, I strongly recommend that if possible you send a small sample of the code that shows this behavior to Technical Support using the Contact Us link in the upper-right corner of this page so they (and the development team) can investigate.
Even if you can't isolate the problem down to a particular segment of code, you might want to contact them and ask what steps they recommend to use to investigate the problem. They may be able to help you narrow down the location of the problem to the point where they can pinpoint what's going on.
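One way to follow that advice on Linux, where MATLAB's memory function is unavailable (it is Windows-only), is to sample the process's resident set size from /proc between candidate code sections. This is a sketch that assumes a standard Linux /proc layout:

```matlab
% Sketch: sample this process's resident set size (RSS) from /proc to see
% which code section grows memory. Linux-specific; save as residentKB.m.
% Usage:
%   before = residentKB();
%   ... run one suspect section ...
%   fprintf('RSS grew by %d kB\n', residentKB() - before);
function kb = residentKB()
    txt = fileread('/proc/self/status');
    tok = regexp(txt, 'VmRSS:\s*(\d+)\s*kB', 'tokens', 'once');
    kb  = str2double(tok{1});
end
```

Bisecting the script with measurements like this can localize the growth even when the full code cannot be shared.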



Zhuo Chen
Zhuo Chen 2017-11-9
Hi,
I understand that your MATLAB is running slowly. I have a few things for you to try regarding this issue.
First, please navigate to the Java Heap Memory preferences and increase the allocated value to 2048MB. Then restart MATLAB and see if the lag occurs on start-up and throughout your work.
Secondly, please disable the source control integration for MATLAB. You can find this at Preferences > MATLAB > General > Source Control and select "None". Restart MATLAB and then see if there is a change in the performance.
I strongly recommend that if possible you post a small sample of the code that shows this behavior here.
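To check whether it is the Java heap (rather than MATLAB array memory) that is filling up, the JVM can be queried directly from the MATLAB prompt; this sketch uses standard java.lang.Runtime calls:

```matlab
% Sketch: report current Java heap usage from within MATLAB. If 'used'
% climbs toward 'max' while MATLAB slows down, the Java Heap Memory
% preference is the right knob; otherwise the leak is elsewhere.
rt     = java.lang.Runtime.getRuntime();
usedMB = (rt.totalMemory() - rt.freeMemory()) / 2^20;
maxMB  = rt.maxMemory() / 2^20;
fprintf('Java heap: %.0f MB used of %.0f MB max\n', usedMB, maxMB);
```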
  2 Comments
Pingmin Fenlly Liu
Pingmin Fenlly Liu 2017-11-10
It's not that MATLAB responds slowly; it is the memory leak (most likely in the MCR). Thank you all the same.
Pingmin Fenlly Liu
Pingmin Fenlly Liu 2017-11-15
OK, I'll try it as you suggested sometime, and thanks again.

