Importdata fails to import large files

5 views (last 30 days)
Hi everybody.
I have a bunch of large files like the following one:
# x 1 2 3 4 ...
# y 5 6 7 8 ...
# z 10 11 12 13 ...
# Time
1 87 85 82 81 ...
2 67 86 19 34 ...
... ... ... ... ...
Since I don't know either the number of rows or the number of columns, I usually import them using
data=importdata(filePath,' ',4);
but for some of them (the biggest ones, 1.32 and 1.7 GB) this command doesn't work and data ends up as an empty variable.
I managed to work around this with an ad hoc solution: deleting the first 4 rows and then using the command
data=importdata(filePath,' ');
Since I have a large number of these files, I would like a solution that works for all of them. What can I do?
Thank you

Accepted Answer

Walter Roberson 2013-10-7
fid = fopen('YourFile.txt');
for K = 1 : 4 %skip 4 lines
  fgetl(fid);
end
here = ftell(fid); %remember where we are
fields = regexp( fgetl(), '\s+', 'split'); %read line, split it into columns
numcols = length(fields); %count them
fseek(fid, here, 'bof'); %reposition to prior line
fmt = repmat('%f', 1, numcols); %maybe %d if entries are integral
datacell = textscan( fid, fmt, 'CollectOutput', 1); %read file
fclose(fid); %we are done with it
data = datacell{1};
1 Comment
Luca Amerio 2013-10-7
That's PERFECT!!!
Just for future readers: there's a very small error in
fields = regexp( fgetl(), '\s+', 'split');
which must be corrected to
fields = regexp( fgetl(fid ), '\s+', 'split');
Apart from that, it's some of the best advice I've ever received. It reduced my memory usage and the time required to import the data by almost 60-70%.
That's awesome!
Thank you sooooo much!
Luca
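
For future readers, here is a minimal consolidated sketch that combines the accepted answer with the fgetl(fid) correction above and loops over a whole set of files, since the question mentions having many of them. The folder path and the *.txt pattern are assumptions; adjust them to match your own files.
folder   = 'C:\myDataFolder';                   % hypothetical folder containing the files
fileList = dir(fullfile(folder, '*.txt'));      % hypothetical file name pattern
allData  = cell(1, numel(fileList));
for f = 1 : numel(fileList)
  filePath = fullfile(folder, fileList(f).name);
  fid = fopen(filePath);
  for K = 1 : 4                                 % skip the 4 header lines (# x, # y, # z, # Time)
    fgetl(fid);
  end
  here = ftell(fid);                            % remember where the numeric block starts
  fields = regexp(fgetl(fid), '\s+', 'split');  % note fgetl(fid), as corrected in the comment
  numcols = length(fields);                     % number of columns
  fseek(fid, here, 'bof');                      % rewind to the start of the numeric block
  fmt = repmat('%f', 1, numcols);               % one %f per column
  datacell = textscan(fid, fmt, 'CollectOutput', 1);
  fclose(fid);
  allData{f} = datacell{1};                     % numeric matrix for this file
end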


More Answers (0)
