working with 1 TB or greater data files

7 views (last 30 days)
I would like to know the best way to work with very large data files ranging from 1 TB to 30 TB. The files contain in-phase and quadrature (IQ) data captured from a real-time signal analyzer. I am running MATLAB R2014b on a 64-bit Windows 8 computer with 64 GB of RAM.
I would like to be able to read in the data files to conduct basic signal processing and analysis such as FFTs, as well as more advanced routines specific to RF analysis such as error vector magnitude, adjacent channel power ratio, etc.
I am not familiar with MATLAB's parallel computing capabilities or other 'big data' capabilities such as mapreduce, memmapfile, or datastore.
Any information, feedback, or suggestions on recommended practices would be most welcome.
thanks, JimB
3 Comments
yashwanth annapureddy
Yes, it would be good to know what type of files you are dealing with. datastore and mapreduce work with tabular text files and MAT-files of a specific format.
Please refer to the documentation for datastore and mapreduce, and let us know if you have any questions about using them.
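For context, here is a minimal sketch of how datastore and mapreduce fit together on tabular text. The file pattern captures*.csv, the variable name Power, and the two helper functions are hypothetical, and maxMapper/maxReducer would need to be saved as functions on the MATLAB path:

% Hypothetical CSV inputs with a numeric column named Power
ds = datastore('captures*.csv', 'SelectedVariableNames', 'Power');
outds = mapreduce(ds, @maxMapper, @maxReducer);
readall(outds)        % one key ('maxPower') with the overall maximum

function maxMapper(data, ~, intermKVStore)
    % Emit the maximum of each chunk of the datastore under one key
    add(intermKVStore, 'maxPower', max(data.Power));
end

function maxReducer(key, intermValIter, outKVStore)
    % Combine the per-chunk maxima into a single global maximum
    m = -Inf;
    while hasnext(intermValIter)
        m = max(m, getnext(intermValIter));
    end
    add(outKVStore, key, m);
end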
James Buxton 2014-12-5
The file is a binary file. I can easily read data from a small file using 'fread'.
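The same fread approach can scale to the full files if the read is done in fixed-size chunks, so that only one block is ever in memory. A rough sketch follows; the file name, the single-precision interleaved I/Q layout, and the chunk size are assumptions about the capture format:

fid      = fopen('capture.bin', 'r');        % hypothetical file name
chunkLen = 2^22;                             % I/Q pairs per chunk; tune to available RAM
while true
    raw = fread(fid, 2*chunkLen, 'single=>single');   % interleaved [I1 Q1 I2 Q2 ...]
    if isempty(raw)
        break;                               % end of file reached
    end
    iq = complex(raw(1:2:end), raw(2:2:end));          % complex baseband samples

    % Per-chunk processing, e.g. accumulate an averaged power spectrum;
    % EVM/ACPR routines can likewise be fed one chunk at a time.
    X = fft(double(iq));
    P = abs(X).^2 / numel(iq);
    % ... accumulate / average P across chunks here ...
end
fclose(fid);

memmapfile is an alternative worth considering when the analysis needs random access into the file rather than a single sequential pass.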


1 Answer

Darek 2014-11-14
Don't use Matlab. It's a waste of your time. Use AWS Kinesis with Redshift.
