Best practices for large memory requirements: how to read/write to file

8 views (last 30 days)
I'm trying to scale up a FEM problem that a coworker of mine developed. The way the code runs now, we allocate a vector of N nodes x 413 timesteps and record the concentration for each node and timestep. Previously the problem was small enough that we could easily hold the full N x 413 array in memory until the main function ended, then write the object to a .mat file. However, the scale I'm looking at now is much larger (~2 GB per timestep). It is completely infeasible to keep all timesteps in memory, so I need to change the code to dump results to a file during the run instead of saving everything at the very end.
To do this, what is a good write method? Ideally I would like to write to the same file throughout, to avoid a lot of file open/close operations, since that is not well optimized on the platform I'm working on. Instead, I'd like to open one file and write to it once per timestep. I could write one unique file per timestep, but I suspect that would be hard on the file system. Are there any good file I/O methods that can write and read only portions of a very large file at a time? I know this is getting outside MATLAB's strengths, and we should probably move to an environment better suited to our problem, but learning a little MATLAB I/O is much easier than rewriting something that already works in a new FEM framework.

Answers (1)

Walter Roberson 2016-7-28
If the nodes are all exactly consistent in the length and type of each field (relative to the other nodes), then you could use memmapfile() or fwrite().
In terms of programming, it is a lot easier to use fixed-length fields and memmapfile().
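A minimal sketch of the fwrite() route, assuming each timestep is a fixed-length column of n doubles; the file name, n, and computeTimestep() are placeholders, not part of the original code:

```matlab
n = 1e6;        % number of nodes (placeholder)
nsteps = 413;   % number of timesteps

fid = fopen('C_datadump.bin', 'w');   % open once, write raw binary doubles
for t = 1:nsteps
    C = computeTimestep(t);           % hypothetical solver step returning n doubles
    fwrite(fid, C, 'double');         % append this timestep's column
end
fclose(fid);

% Later, read back a single timestep without loading the whole file:
fid = fopen('C_datadump.bin', 'r');
t = 100;
fseek(fid, (t-1)*n*8, 'bof');         % 8 bytes per double
Ct = fread(fid, n, 'double');
fclose(fid);
```

Because every timestep has the same fixed length, the byte offset of timestep t is simply (t-1)*n*8, which is exactly what makes both the fseek()/fread() approach and memmapfile() workable.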
  2 Comments
Ryan Woodall 2016-7-28
Yes, the dimensions of the data stay the same for each iteration, so I think memmapfile will do very nicely. Reading the documentation for that function, what is the best way to instantiate the object? I don't want to have to allocate the full array in memory and then save it to file first. Can you tell the memmap the size of the structure you are ultimately going to feed it without first putting that structure into memory?
Would this work, or is there a better way to initialize the memmap object?
n = <really big number>;
nsteps = 413;
m = memmapfile('C_datadump.mat', ...
    'Format', {'double', [n nsteps], 'C'}, ...
    'Writable', true);
% Then change the data using the following
for t = 1:413
    C = 1:n;
    m.Data.C(:,t) = C;
end
Walter Roberson 2016-7-28
memmapfile() will extend the file if necessary. I suspect, however, that you would gain a little efficiency if you first write something out at the maximum location, creating a file of the full size up front, so that writes can be positioned directly without the file having to grow on every timestep.
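One way to pre-size the file before mapping it, as suggested above. This is a sketch: the file name, n, and computeTimestep() are placeholders, and a plain binary file is used rather than a .mat file, since memmapfile treats the file as raw bytes either way:

```matlab
n = 1e6;        % number of nodes (placeholder)
nsteps = 413;

% Write a single double at the final slot to extend the file to its full size.
fid = fopen('C_datadump.bin', 'w');
fseek(fid, n*nsteps*8 - 8, 'bof');    % 8 bytes per double
fwrite(fid, 0, 'double');
fclose(fid);

m = memmapfile('C_datadump.bin', ...
    'Format', {'double', [n nsteps], 'C'}, ...
    'Writable', true);
for t = 1:nsteps
    m.Data.C(:, t) = computeTimestep(t);   % hypothetical per-timestep result
end
```

Note that a file written this way is raw doubles, not a MAT-file, so it must be read back with memmapfile() or fread() rather than load().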

