Huge array size problem

Hello,
I am trying to create a matrix of size 110,000 x 110,000 (91.1 GB) that should contain small double values.
I will be using the values in this huge matrix as inputs to a specific algorithm. I am aware that this exceeds the maximum array size and that it cannot fit into memory, so I am looking for a fast way to access this data while overcoming the size problem (the algorithm accesses the data an enormous number of times).
Could you propose a solution for this problem?
I am using MATLAB R2016b.
This is the code I used to generate the matrix:
clc
clear
load('Gw8_matrices.mat','Gw8');
size_group = 110592;
distance_table_gw8 = zeros(size_group, size_group);
% Fill the table with pairwise Frobenius distances
for i = 1:size_group
    i   % display progress
    for j = 1:size_group
        temp1 = Gw8(:,:,i) - Gw8(:,:,j);
        temp2 = norm(temp1, 'fro');
        distance = round(temp2, 4);   % most important: round to 4 decimals
        distance_table_gw8(i,j) = distance;
    end
end
save('distance_table_gw8.mat','distance_table_gw8','-v7.3');
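For reference, a quick back-of-the-envelope check (8 bytes per double element) reproduces the 91.1 GB figure:
size_group = 110592;
bytes_needed = size_group^2 * 8;                                       % 8 bytes per double element
fprintf('Full matrix needs about %.1f GiB\n', bytes_needed / 2^30);    % prints roughly 91.1 GiB
Even switching to single precision would only halve this, which is still far beyond typical workstation RAM.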
Thank you.
  1 Comment
rough93 2019-9-25
Could you not just have several sub-matrices and access the one you need? You could set up two for loops to create a square of submatrices and just insert your matrix creation in the middle (with small parameters of course).
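A minimal sketch of that block-wise idea, assuming the table is filled one block of rows at a time and written straight to a -v7.3 MAT-file with matfile (the block size and the reuse of the original variable names are illustrative, not tested at full scale):
size_group = 110592;
blk = 1024;                                           % illustrative block height
m = matfile('distance_table_gw8.mat', 'Writable', true);
m.distance_table_gw8(size_group, size_group) = 0;     % grow the variable on disk up front
G = reshape(Gw8, [], size_group);                     % each column holds one Gw8 slice
for r = 1:blk:size_group
    rows = r:min(r+blk-1, size_group);
    D = zeros(numel(rows), size_group);
    for k = 1:numel(rows)
        % Frobenius distance between slice rows(k) and every slice
        D(k,:) = sqrt(sum((G - G(:,rows(k))).^2, 1));
    end
    m.distance_table_gw8(rows, 1:size_group) = round(D, 4);
end
Reading it back later works the same way: index into m.distance_table_gw8 to pull only the block you need, at the cost of disk I/O on every access.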


Accepted Answer

Matt J 2019-9-25
Edited: Matt J 2019-9-25
You probably should not pre-store this matrix. You should probably just (re)compute chunks of the matrix as you need them. Note, for example, that the complete j-th column of your matrix can be obtained quite efficiently as follows,
G = reshape(Gw8, [], size_group);
jthColumn = round( vecnorm(G - G(:,j), 2, 1), 4 ).';
and will be even faster on the GPU if you have the Parallel Computing Toolbox.
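Note that vecnorm was introduced in R2017b, so on the R2016b release mentioned in the question an equivalent sketch (relying on the implicit expansion that R2016b does support) would be:
G = reshape(Gw8, [], size_group);
jthColumn = round( sqrt(sum((G - G(:,j)).^2, 1)), 4 ).';
% Possible gpuArray variant (untested sketch), assuming the Parallel Computing Toolbox is installed:
% Gg  = gpuArray(G);
% col = sqrt(sum(bsxfun(@minus, Gg, Gg(:,j)).^2, 1));
% jthColumn = round(gather(col), 4).';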
