saving variables in a single .mat file

3 views (last 30 days)
Hello,
I have 360 .mat files, each containing the same variable 'in' with different data: a row vector of size 1x3800000. Together, all 360 files are 9.84 GB.
Now I want to save them all in one .mat file as a matrix out(360x3800000).
How can I do it?
  2 comments
Daniel Shub
Daniel Shub 2012-3-26
What problems are you running into?
zozo
zozo 2012-3-26
I need to extract some data from each of these 360 files (row vectors) and do further processing. So I do not want to load them one by one, extract data, load again, and so on. Having them all together in a cell/array makes it far easier.


Accepted Answer

ndiaye bara
ndiaye bara 2012-3-26
Try this code (replace Fname with the base name of your .mat files):
m = [];
for k = 1:360
    eval(sprintf('load Fname_%d.mat', k));    % loads the variable Fname_k
    eval(sprintf('y = Fname_%d(:);', k));     % copy it into y as a column vector
    clear(sprintf('Fname_%d', k));
    disp(k);
    m = [m, y];                               % append as a new column
end
save File.mat m   % save the new .mat file
  4 comments
zozo
zozo 2012-3-27
The same variable 'in' was saved each time, but with different file names: data(1).mat, data(2).mat, data(3).mat, ..., data(360).mat.
Please suggest the syntax for my case.
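For file names of that form, the loop can load each file into a struct, which avoids EVAL entirely. A sketch, assuming each data(k).mat holds the 1 x 3800000 row vector 'in' and that roughly 11 GB of free RAM is available (the output file name all_data.mat is made up for illustration):

```matlab
% Preallocate the output: one column per file (3800000 x 360 needs ~10.9 GB of RAM)
out = zeros(3800000, 360);
for k = 1:360
    S = load(sprintf('data(%d).mat', k));   % S is a struct with the field 'in'
    out(:, k) = S.in(:);                    % store file k as column k
    disp(k);
end
save('all_data.mat', 'out', '-v7.3');       % -v7.3 is needed for variables over 2 GB
```

The 3800000 x 360 orientation (one vector per column) matches Jan's advice elsewhere in this thread; transpose at the end if the 360 x 3800000 layout is really required.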
Jan
Jan 2012-3-27
Dear zozo, accepting an answer means that it solves your problem.
Daniel and I have warned you that you cannot load 11 GB of data (360 * 3800000 * 8 bytes per double) efficiently if you have only 4 GB of RAM. I assume you need 32 GB of RAM to work efficiently with such large data; 64 GB is safer.
The above EVAL approach is cruel.
You still have not specified in which format you want to store the data: DOUBLEs or SINGLEs, an integer type, as a cell or a matrix. Anyhow, I'm convinced that it is the wrong approach due to the limited memory.
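Given the memory limits discussed here, one alternative worth noting (not mentioned in the thread) is MATFILE, which writes into a -v7.3 file incrementally, so only one ~29 MB vector needs to be in RAM at a time. A sketch, assuming the files are named data(1).mat ... data(360).mat, each holding the row vector 'in', and a made-up output name all_data.mat:

```matlab
% Build the combined matrix on disk instead of in RAM
m = matfile('all_data.mat', 'Writable', true);
for k = 1:360
    S = load(sprintf('data(%d).mat', k));   % one ~29 MB vector at a time
    m.out(k, 1:3800000) = S.in;             % write row k straight to the file
end
```

Reading back can also be partial, e.g. m.out(5, 1:1000) pulls only those elements from disk, which fits the "extract some data from each file" goal without ever holding 11 GB in memory.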


More Answers (2)

Jan
Jan 2012-3-26
Do you have a 64-bit Matlab version? How much RAM do you have installed? Do you want to store the values in one 360 x 3'800'000 array, in a most likely more useful 3'800'000 x 360 array, or as separate vectors, e.g. in a {1 x 360} cell? The latter has the advantage that it does not need a contiguous free block of memory.
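The {1 x 360} cell variant mentioned above could be built like this (a sketch; the data(%d).mat names and the variable 'in' come from later comments in this thread). Note that it avoids the need for one contiguous ~11 GB block, but the total RAM demand is unchanged:

```matlab
c = cell(1, 360);                           % 360 separate vectors, no contiguous block
for k = 1:360
    S = load(sprintf('data(%d).mat', k));
    c{k} = S.in;                            % each cell holds one 1 x 3800000 vector
end
```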
  4 comments
Jan
Jan 2012-3-26
4GB RAM is very lean for such a big chunk of data. If it is really necessary to keep all values in the RAM simultaneously, buy more RAM. Implementing workarounds to process the data in pieces will be more expensive.
Jan
Jan 2012-3-26
@Siva: Does your comment concern the current topic? If so, please explain the connection. If not, please delete the comment and post it as a new question - with more details. Thanks.



Daniel Shub
Daniel Shub 2012-3-26
In a comment to Jan you say you have 4 GB of RAM. Loading 9+ GB of data is going to bring your computer to a screeching halt.
Try and create an array of the required size and see what happens ...
x = randn(360, 3800000);
  2 comments
zozo
zozo 2012-3-26
Yes, I think I will load 7x(50x3800000)+10 at a time. But I am having problems loading them into one file as suggested by @ndiaye.
Daniel Shub
Daniel Shub 2012-3-27
Why? Nobody wants a 1+GB data file. Leave the files small and load them as needed. I doubt there is much of a benefit of doing a single huge load.
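Loading on demand, as suggested here, keeps only one ~29 MB vector in memory at a time. A sketch with a placeholder processing step (mean is purely illustrative; the data(%d).mat names come from the comments above):

```matlab
result = zeros(1, 360);                     % e.g. one scalar result per file
for k = 1:360
    S = load(sprintf('data(%d).mat', k));   % load exactly one file
    result(k) = mean(S.in);                 % replace with the real processing
end                                         % S is overwritten on the next pass
```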

