Reading a large Excel file using the "read" command takes ~5 minutes. Is this expected performance?
4 views (last 30 days)
I am reading simulation/test output data from a .xlsx file into a MATLAB table through a datastore variable. The test data contains 450+ variables, each with 20000+ samples (i.e., 450+ columns and 20000+ rows), all numeric. I created a datastore on the Excel file, modified the selected variables and variable type properties, and used the read command to read the file into a MATLAB table; it took about 5 minutes. When I tried the readtable command on the Excel file directly, it took about the same time. However, when I read the file interactively using the MATLAB import dialog, it took less than 30 seconds, so I am wondering if there is any way to achieve the same level of efficiency programmatically.
0 Comments
Accepted Answer
J. Alex Lee
2020-9-6
Try manually creating the import options with spreadsheetImportOptions().
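A minimal sketch of that approach, assuming the data occupy 450 numeric columns starting at cell A2 of a file named 'testdata.xlsx' (the file name, variable count, and ranges are placeholders; adjust them to match your file):

```matlab
% Fully specify the import so MATLAB can skip the slow auto-detection pass
% that readtable/datastore run on every column of a wide spreadsheet.
numVars = 450;
opts = spreadsheetImportOptions("NumVariables", numVars);
opts.VariableNames = "Var" + (1:numVars);          % or your real names
opts.VariableTypes = repmat("double", 1, numVars); % all columns numeric
opts.DataRange = "A2";                             % data starts on row 2
T = readtable("testdata.xlsx", opts);
```

Because every variable name, type, and range is declared up front, the reader goes straight to pulling values, which is essentially what the interactive import tool does after its one-time detection.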
2 Comments
J. Alex Lee
2020-9-7
Yes, the idea is to fully specify the import parameters so that they don't have to be auto-detected.
More Answers (0)