How to average more than 50 3D matrices using nanmean

1 view (last 30 days)
Hi, I am trying to average a large number of 3D matrices using nanmean. I have tried using cat, but my 3D matrices are huge (351x400x400), which uses a lot of memory. Is there a better way to do this?
7 comments
Adam Danz 2019-11-14
Edited: Adam Danz 2019-11-14
Hmmmm... concatenating 50 arrays that each have more than 56 million elements isn't going to happen.
Off the bat I can think of a couple ideas.
1) Using two loops, you can loop through each file and partially load one 351 x 400 slice at a time, so you hold 50 of those matrices, which is ~7 million data points. If that's still too large, you could partially load each 351 x 1 column instead. Then do element-wise averaging, accumulating the values as you proceed through the loops. That's 50 x 400 iterations, which isn't a big deal.
2) You can reorganize your data as tall arrays, which are designed for data too large to fit in memory.


Accepted Answer

Matt J 2019-11-14
Edited: Matt J 2019-11-15
Here's what I would do, I suppose. It assumes each of your .mat files stores the volume under the variable name 'a'.
Summation = 0;
NCounter = 0;
files = dir(fullfile('yourFolder','*.mat'));
for i = 1:numel(files)
    S = load(fullfile('yourFolder', files(i).name));
    map = isnan(S.a);                % locate NaNs in this volume
    S.a(map) = 0;                    % zero them so they don't poison the sum
    Summation = Summation + S.a;     % running element-wise sum
    NCounter = NCounter + ~map;      % running count of non-NaN contributions
end
result = Summation ./ NCounter;      % element-wise NaN-aware mean
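As a quick sanity check, the same running-sum technique can be sketched in Python/NumPy (toy 2x2 arrays stand in for the real 351x400x400 volumes, and the in-memory list stands in for loading one .mat file at a time):

```python
import numpy as np

# Toy stand-ins for the 50 large volumes; in practice each would be
# loaded from disk one at a time so only one is in memory at once.
volumes = [
    np.array([[1.0, np.nan], [3.0, 4.0]]),
    np.array([[5.0, 2.0], [np.nan, 6.0]]),
]

summation = np.zeros_like(volumes[0])    # running element-wise sum
ncounter = np.zeros_like(volumes[0])     # running count of non-NaN values

for a in volumes:
    mask = np.isnan(a)
    summation += np.where(mask, 0.0, a)  # NaNs contribute 0 to the sum
    ncounter += ~mask                    # ...and 0 to the count

result = summation / ncounter            # NaN-aware element-wise mean
```

Each element of `result` is the mean of the non-NaN values at that position, e.g. the top-left element is (1 + 5) / 2 = 3 and the top-right is 2 / 1 = 2, matching what nanmean would give over the stacked arrays.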

More Answers (0)

