Huge Data File Import/Export

1 view (last 30 days)
Saurav Agarwal 2013-8-6
I have a huge data file as .txt. Each element is separated by a ';' and each row by a newline character. The data was generated by a C++ program, and the number of columns in each row is not constant. I need to read each row in MATLAB. I converted the file to .xlsx and used the following code:
A=xlsread('file.xlsx','Sheet1','2:2');
This command takes a long time to execute. The data is approximately 30,000 × 500.
1. How do I read the data in a faster way? I tried using csvread but could not get it to work.
2. How do I run the xlsread call in a loop, from '2:2' up to '30000:30000'? Importing the whole file at once causes an out-of-memory failure.
3. In what format should I generate the data from C++ so that it is fast to import into Excel?
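Since the rows have different lengths, one common alternative to xlsread is to read the text file directly, line by line, and parse each line. A minimal sketch (the filename 'file.txt' is a placeholder for your actual file):

```matlab
% Read a ';'-delimited text file row by row, where rows may have
% different numbers of columns. Each row becomes one numeric vector
% in a cell array, which handles the ragged structure.
fid = fopen('file.txt', 'r');
rows = {};                  % one numeric row vector per cell
tline = fgetl(fid);
while ischar(tline)
    % sscanf cycles the '%f;' format, parsing every number on the line
    rows{end+1} = sscanf(tline, '%f;').';  %#ok<AGROW>
    tline = fgetl(fid);
end
fclose(fid);
```

This avoids the .xlsx conversion entirely and never holds more than one line of text in memory at a time, though growing the cell array in a loop has some overhead for very large files.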
1 Comment
Cedric 2013-8-6
Edited: Cedric 2013-8-6
A 3E4-by-3E4 matrix of doubles occupies 7.2 GB in memory.
Huge files like these should be stored/managed in a binary format (your own, HDF5, netCDF, etc.) if you want to be efficient.
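MATLAB has built-in HDF5 support (h5create/h5write/h5read). Since the rows here have different lengths, one simple layout is a flat value vector plus a row-length vector; the sketch below uses a placeholder file 'data.h5' and tiny example data:

```matlab
% Write ragged rows to HDF5 as a flat vector plus per-row lengths.
vals = [1 2 3 4 5];   % all values, rows concatenated (example data)
len  = [2 3];         % row lengths: row 1 has 2 values, row 2 has 3
h5create('data.h5', '/vals', numel(vals));
h5create('data.h5', '/len',  numel(len));
h5write('data.h5', '/vals', vals);
h5write('data.h5', '/len',  len);

% Read back and rebuild the ragged rows as a cell array
v = h5read('data.h5', '/vals');
n = h5read('data.h5', '/len');
rows = mat2cell(v(:).', 1, n(:).');   % {1x2 vector, 1x3 vector}
```

Binary formats like this skip text parsing entirely, so reading is typically much faster than xlsread or csvread, and HDF5 additionally supports reading slices of a dataset without loading the whole file.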


Answers (0)
