How to solve Matlab memory issue
Hi all
I am working with a large data set: 367 files, each a 1024×736 matrix. The approximate RAM required for processing is 16 GB. I am using 64-bit MATLAB on a Linux machine with 16 GB of RAM, but MATLAB freezes at around 4 GB. What is causing the problem? Is MATLAB not configured to use the system memory at all? Are MATLAB and the system not working together to manage memory? How do I get rid of this?
Thanks for all your support.
2 Comments
Walter Roberson
2012-11-21
Are you sure about the processing memory? If the matrices are double precision, then the total memory of what you outline is a bit over 2 GB to load them. What kind of operation do you do on them after they are loaded?
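For reference, the load size Walter describes can be checked directly, assuming the matrices are stored as doubles (8 bytes per element):

```matlab
% Rough memory estimate for 367 double-precision matrices of size 1024x736
nFiles         = 367;
bytesPerMatrix = 1024 * 736 * 8;          % 8 bytes per double element
totalGB        = nFiles * bytesPerMatrix / 2^30;
fprintf('Total to load: %.2f GB\n', totalGB);   % about 2.06 GB
```

That is well under 16 GB, which suggests the working memory is going elsewhere (intermediate copies, growing arrays, or the processing itself).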
Answers (2)
Jan
2012-11-21
What is the output of:
[a,b,c] = computer
The memory can also be exhausted by other problems, e.g. when a pre-allocation is forgotten. Then the final amount of memory might be only a few GB, but temporarily thousands of GB could be allocated. So please post a minimal example which reproduces the problem. Usually rand is sufficient to simulate reading large files.
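As a minimal illustration of the pre-allocation point (using rand to stand in for file reads, as Jan suggests):

```matlab
nFiles = 367;
data = zeros(1024, 736, nFiles);        % pre-allocate the full array once
for k = 1:nFiles
    data(:,:,k) = rand(1024, 736);      % stands in for reading file k
end
% Growing the array inside the loop instead, e.g.
%   data = cat(3, data, rand(1024, 736));
% forces MATLAB to reallocate and copy the array on every iteration,
% which can transiently consume far more memory than the final result.
```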
3 Comments
Matt J
2012-11-22
Edited: Matt J
2012-11-22
To do FDK reconstruction, you don't have to load the entire data set into memory at the same time. Traditionally, it is done by processing one projection view at a time. If your 367 files each contain a projection view, why not just load one of them into RAM at a time, do the processing needed to update your image volume, and then throw it away?
Also, how do you rationalize
memo = 8*(4*nx*ny*nz + 2*nu*nv);
If your image volume is being held as 4-byte singles, shouldn't it just be
memo = 4*nx*ny*nz;
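The one-view-at-a-time scheme Matt J describes might be sketched like this; the file naming and the backprojection update are placeholders, not the asker's actual code:

```matlab
% Hypothetical sketch: nx, ny, nz are the volume dimensions
vol = zeros(nx, ny, nz, 'single');               % image volume, 4-byte singles
for k = 1:367
    s = load(sprintf('view_%03d.mat', k));       % hypothetical file name
    proj = s.proj;                               % one 1024x736 projection view
    % ... filter/weight the projection, then accumulate it, e.g. ...
    % vol = vol + backproject(proj, k);          % placeholder update step
    clear s proj                                 % only one view in RAM at a time
end
```

With this structure the peak footprint is roughly the volume plus a single projection, rather than all 367 views at once.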
0 Comments