Running a large array
26 views (last 30 days)
Hi,
I am trying to load an array (362x332x75x1032, 34.7 GB), which exceeds the maximum array size preference (16.0 GB). How do I go about loading it?
It is a variable in a .nc dataset.
0 comments
Answers (3)
Star Strider
about 8 hours ago
I have rarely needed to use them, so I have little experience with them.
2 comments
Star Strider
about 5 hours ago
I was not aware that it was a CDF file. There is a set of functions to work with netCDF Files, and another set of functions to work with CDF Files. They may have options that would work. (I rarely use CDF files, so I do not have extensive experience with them and functions using them.)
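To see which family of functions applies, it helps to inspect the file first. A minimal sketch, assuming the file is named 'data.nc' (substitute your actual filename); `ncdisp` and `ncinfo` are the netCDF inspection functions, while `cdfinfo`/`cdfread` are the corresponding CDF ones:

```matlab
% netCDF files: list all variables, dimensions, and attributes
ncdisp('data.nc');
info = ncinfo('data.nc');          % same information as a struct
varNames = {info.Variables.Name};  % cell array of variable names

% CDF files (a different format, despite the similar name):
% cdfInfo = cdfinfo('data.cdf');
% data    = cdfread('data.cdf', 'Variables', {'someVariable'});
```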
John D'Errico
about 4 hours ago
Edited: John D'Errico
about 4 hours ago
Memory is cheap. Get more memory.
I'm sorry, but if you want to work with big data, you will often need sufficient resources to handle that data. No matter what, working with huge arrays without sufficient memory will be slow. So your next question will be: how can I make my code run faster? Again ... get sufficient memory. Or solve smaller problems.
2 comments
Walter Roberson
4 minutes ago
Did the professor assign the hardware and say "you must run it on this hardware" ?
If not, then:
- if it is your own hardware, then there is always the option of upgrading it
- if it is university hardware, then there is always negotiating with the university to obtain an upgrade
Voss
about 2 hours ago
You can read and process the file in sections, one section at a time, in a loop, by specifying the start and count input arguments to ncread.
And/or specify the stride argument to read only every Nth value instead of all of them.
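The chunked approach above can be sketched as follows. This is a minimal example, assuming the file is 'data.nc' and the variable is named 'myvar' (substitute your actual names); it loops over the fourth (largest) dimension in blocks so only a fraction of the 34.7 GB is in memory at once:

```matlab
% Read the 362x332x75x1032 variable in chunks along dimension 4
info  = ncinfo('data.nc', 'myvar');
dims  = info.Size;                 % expected: [362 332 75 1032]
chunk = 50;                        % slices per read (~1.7 GB per block)

for t = 1:chunk:dims(4)
    n     = min(chunk, dims(4) - t + 1);
    start = [1 1 1 t];             % begin at slice t of dimension 4
    count = [dims(1) dims(2) dims(3) n];
    block = ncread('data.nc', 'myvar', start, count);
    % ... process block here; keep only reduced results, not raw data ...
end
```

To subsample instead, pass a stride as the fourth argument, e.g. `ncread('data.nc','myvar',[1 1 1 1],[Inf Inf Inf Inf],[1 1 1 10])` to keep every 10th slice along dimension 4.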
0 comments