MATLAB Answers


How to perform arithmetic operations on large arrays without running out of memory

Asked by manishika rawat on 18 Nov 2019 at 12:25
Latest activity Commented on by manishika rawat on 19 Nov 2019 at 10:39
Hi, I am working on code that generates large array values which grow with every iteration of a Monte Carlo simulation. To get an accurate result, I need to run the code for a large number of iterations. However, somewhere while running, the code stops working, stating "out of memory". At the end of the loop I need to perform some arithmetic operations on the values of the array generated. Can you please help me with how I can work with large arrays?

  3 Comments

We can't help unless you give us more details of exactly what you are doing, what your variables are, and how you are building up your arrays.
manishika rawat's "Answer" moved here:
I would like to rephrase the question for better understanding. I have MATLAB code that generates samples of random values for a given set of iterations. I save them in a cell array, as the number of values generated in each iteration is different. At the end of the loop, I need to convolve all the rows of the cell array with each other to find the final result.
However, since each cell array entry has significant size, convolution leads to a large array which grows after every successive convolution with the next entry. At this stage MATLAB stops and gives an "out of memory" error. Thus I wanted to find some way so that the array is not kept in memory but can still be used for arithmetic operations. I tried the "matfile" option, but it seems that to save a matfile I need to declare a variable first, which would occupy memory. I cannot simply restrict the size of the array, as that would lead to loss of information.
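To make the setup concrete, here is a minimal sketch of the workflow described above; the variable names and sizes are made up for illustration and are not from the original code:

```matlab
% Minimal sketch of the workflow described above (names and sizes
% are illustrative, not the original code).
nIter = 100;                 % number of Monte Carlo iterations
C = cell(nIter, 1);

for k = 1:nIter
    n = randi([1e3 1e4]);    % a different number of samples each iteration
    C{k} = rand(1, n);       % store this iteration's samples
end

% Successive convolution: the result grows by (numel(C{k}) - 1) elements
% at every step, which is what eventually exhausts memory.
result = C{1};
for k = 2:nIter
    result = conv(result, C{k});
end
```

The length of `result` after all steps is the sum of the individual lengths minus (nIter - 1), so with many iterations of sizable vectors the in-memory result becomes very large.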


1 Answer

Answer by Walter Roberson on 19 Nov 2019 at 5:51

It is not possible to tell MATLAB to store variables on disk, at least not in the way you are describing.
You can tell your system to increase your available swap space, and your operating system will then automatically swap to and from memory. However, this tends to be slow.
You might be able to convert your cell array entries into tall() arrays and convolve those. Remember, though, that your thesis is that your convolved arrays are too big for memory, so the final result is probably going to be too big for memory. And then what do you do with it?

  3 Comments

The final values that I get are the mean values of the mixture model. I will get the final probability distribution by adding the individual distributions along those values. Converting cell arrays into tall arrays does not help in reducing the array size, or in restricting the memory space required.
You convert the individual cell arrays into tall arrays. When you convolve the tall arrays, the result will be a tall array, and MATLAB would automatically manage the RAM and disk use for that. Leave the result of the first step as a tall array and convolve with the next cell-turned-into-tall-array, and MATLAB will manage the RAM and disk use for that operation, and so on.
As long as the cell array elements fit into memory long enough to build tall arrays from them, then the convolution of them would not need to fit into memory.
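The steps described above can be sketched as follows; this is a sketch only, assuming `conv` accepts tall inputs in the MATLAB release being used (the answer itself hedges on this), and `C` is the existing cell array:

```matlab
% C is the existing cell array of sample vectors.
% Convert only one entry at a time to a tall array; the running result
% stays tall, so MATLAB manages RAM and disk use for each convolution.
result = tall(C{1});
for k = 2:numel(C)
    result = conv(result, tall(C{k}));
end

% gather() materializes the result in memory; skip it (and operate on
% the tall array directly) if the final array is itself too large.
finalResult = gather(result);
```

Tall array operations are evaluated lazily, so the chain of convolutions is only computed when the result is actually needed.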
Thank you for your suggestion. I tried to work with tall arrays but am not able to convert the cell array to a tall array in a loop over successive iterations, since a cell array can hold rows of different sizes in successive iterations, which is not the case with a tall array. Do I need to convert each cell array entry individually to a tall array? This would not be feasible when I need to run a large number of iterations.
