Unable to import large data txt file due to memory limits

2 views (last 30 days)
Hi,
I have been struggling to import the data I need to process in MATLAB. It contains 3 fundamental cycles of 20 ms each, and I just need the last 20 ms. However, MATLAB will still not allow me to import that data section. I have tried increasing the Java heap size, but it is still not enough. Is there a way I can process my data without losing too many valuable data points?
The file is 18,142,928 KB, and each variable in the set has 101,799,346 values. The data goes from 0 to 60 ms, so I guess I only need 2/3 of the dataset, and I only need 3 variables. Again, I have tried taking just one variable at a time, but 2/3 of the variable is still too large.

Answers (1)

Venkat Siddarth Reddy
Edited: Venkat Siddarth Reddy 2024-5-6
Hi,
You can try using "datastore", which is designed for data that is too large to fit into memory. It lets you read the data in smaller portions that do fit in memory, i.e., it supports incremental import so you can access the data one block at a time.
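As a minimal sketch of this approach (assuming a delimited text file named "data.txt" with a time column and three needed variables called Var1, Var2, and Var3; replace these names with the ones in your actual file):

% Build a datastore over the text file and import only the 3 needed columns.
ds = tabularTextDatastore("data.txt");
ds.SelectedVariableNames = ["Var1", "Var2", "Var3"];  % assumed variable names
ds.ReadSize = 1e6;                                    % rows per chunk, sized to fit in memory

% Accumulate only the last 20 ms (t >= 40 ms of the 0-60 ms record).
lastCycle = table();
while hasdata(ds)
    chunk = read(ds);                         % read one block into memory
    keep  = chunk.Var1 >= 40e-3;              % assuming Var1 is time in seconds
    lastCycle = [lastCycle; chunk(keep, :)];  %#ok<AGROW> keep only the final cycle
end

Because only three columns are read and each chunk is filtered before it is stored, the peak memory use stays far below that of importing the whole file at once.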
To learn more about datastore, refer to the MATLAB documentation.
I hope it helps!

Categories

Find more on Large Files and Big Data in Help Center and File Exchange

Release

R2022b

