About manipulating a very large array

Hello everybody,
I have 3500 .mat files, each containing a 128*4408 matrix (128 is the dimension and 4408 is the number of feature vectors), and I need to reduce the dimensionality of the features by PCA (from 128 to 64, for example).
To do this, I concatenate all 3500 matrices into a 128*15428000 matrix X (15428000 = 3500*4408) and then use
[~, Xnew] = pcares(X,64);
for PCA reconstruction.
The problem is that the memory needed to store X and Xnew is about 128*15428000*8 bytes ~ 15 GB each, while I only have 4 GB of RAM.
Could anybody please suggest a way to overcome this problem?
I tried to use VVAR but did not really understand how it works. I first tried to create a 10000*10000 matrix:
vvar.createFile(10000*10000);
x=vvar(10000,10000);
but got the following error:
Error using vvar (line 251)
The file created by "vvar.createFile(N)" has insufficient space left
Thank you in advance for your help.

Accepted Answer

John D'Errico on 5 Jan 2014
Edited: John D'Errico on 5 Jan 2014
I think it is time for you to start learning about PCA: for example, how to build the necessary covariance matrix from those pieces, rather than ever forming one HUGE array.
Start reading. I like Ted Jackson's book on PCA, but we worked together, so I might be biased.
In addition, I would get more RAM. It is cheap, so why waste your own time?
Finally, make sure you are using a 64-bit version of MATLAB, to make better use of your memory and to get past the hard limits of a 32-bit address space.
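A minimal sketch of the piecewise approach John describes: because PCA on 128-dimensional data only needs a 128*128 covariance matrix, you can accumulate it one file at a time and never hold the full 128*15428000 array. The file names (data_0001.mat, ...) and the variable name X inside each file are assumptions here; substitute your own.

```matlab
% Pass 1: accumulate the 128x128 second-moment matrix and the column sum,
% loading one .mat file at a time.
nFiles = 3500;
d = 128;                       % feature dimension
S  = zeros(d, d);              % running sum of X*X'
mu = zeros(d, 1);              % running sum of all columns
n  = 0;                        % total number of columns seen
for k = 1:nFiles
    f = load(sprintf('data_%04d.mat', k));   % hypothetical file names
    X = f.X;                                 % assumed variable name, 128x4408
    S  = S + X*X';
    mu = mu + sum(X, 2);
    n  = n + size(X, 2);
end
mu = mu / n;                   % mean vector
C  = S/n - mu*mu';             % covariance (population form; scale by n/(n-1) if preferred)

% Eigen-decomposition of the small covariance matrix; keep the top 64 directions.
[V, D] = eig(C);
[~, idx] = sort(diag(D), 'descend');
W = V(:, idx(1:64));

% Pass 2: project each file separately; each result is only 64x4408.
for k = 1:nFiles
    f = load(sprintf('data_%04d.mat', k));
    Y = W' * bsxfun(@minus, f.X, mu);   % reduced representation
    % save or process Y here before moving to the next file
end
```

The peak memory use is one 128*4408 block plus a few 128*128 matrices, which fits comfortably in 4 GB.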
  1 Comment
f10w on 20 Mar 2014
I'm so sorry John, I completely forgot about this post. Thanks for your answer.


More Answers (0)
