Appending to a very large file
Hi,
I'm having trouble writing very large files to disk. I'm appending 64 smaller files (each ~1 GB) into a single giant matrix, so I expect the output file to be ~64 GB, but I run into an "Out of memory" error during processing. Is there a more efficient way to do this without loading all of the smaller files into memory before writing one monster file to disk? For example, can I load one file at a time, append it to the output file, clear memory, and then load the next?
Current code looks like this:
clear
close all
clc

% Loop over all 64 channels and stack each one as a row of signal_mat
for i = 1:64
    fprintf('i = %d\n', i);
    [Samples, Header] = Nlx2MatCSC(['CSC' num2str(i, '%02.f') '.ncs'], ...
        [0 0 0 0 1], 1, 1, []);
    % Flatten the samples matrix into a single row vector
    temp_2 = reshape(Samples, [], 1)';
    if exist('signal_mat', 'var')
        signal_mat = vertcat(signal_mat, temp_2);
    else
        signal_mat = temp_2;
    end
    clear Samples Header temp_2
end
clear i

% Demedian the data (subtract the per-sample median across channels)
fprintf('Demedian data\n');
signal_med = median(signal_mat);
signal_mat_demed = signal_mat - signal_med;

%% Write to file for KS2
fprintf('Write data\n');
fid = fopen('myNewFile.dat', 'w');
fwrite(fid, signal_mat, 'int16');
fclose(fid);

fid = fopen('myNewFile_demed.dat', 'w');
fwrite(fid, signal_mat_demed, 'int16');
fclose(fid);

clear
fprintf('Done\n');
Accepted Answer
Jan
2021-1-7
This line makes the problem worse:
signal_mat = vertcat(signal_mat, temp_2);
In the last iteration, for example, you concatenate a 63 GB array with a 1 GB array and copy the result into a new 64 GB array, so that single step needs 63 + 64 = 127 GB of RAM.
Pre-allocation would avoid this repeated copying. In your case it could work with 64 + X GB of RAM, where X might be 8 or 20. But even then this is a huge signal. How much RAM do you have?
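As a rough, untested sketch of what that pre-allocation could look like (assuming every CSC file yields the same number of samples, and reusing the Nlx2MatCSC call and file naming from your code):
% Read the first channel once to learn the number of samples per channel
[Samples, Header] = Nlx2MatCSC('CSC01.ncs', [0 0 0 0 1], 1, 1, []);
nSamples = numel(Samples);
nChannels = 64;

% Allocate the full matrix exactly once, instead of growing it with vertcat
signal_mat = zeros(nChannels, nSamples);
signal_mat(1, :) = reshape(Samples, 1, []);
clear Samples Header

for i = 2:nChannels
    fprintf('i = %d\n', i);
    [Samples, Header] = Nlx2MatCSC(['CSC' num2str(i, '%02.f') '.ncs'], ...
        [0 0 0 0 1], 1, 1, []);
    signal_mat(i, :) = reshape(Samples, 1, []);   % fill row i in place
    clear Samples Header
end
Here zeros() allocates the matrix only once and each channel is written into its row in place, so no intermediate copies of the full signal are created. If the channels can differ in length, this sketch would need to be adapted.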