Error when using xlsread on 9000 files
Hello dear friends,
I have a problem using xlsread on a large number of files (over 9000 Excel files). When I try to read every single one of them, I get a seemingly random error after about 5000 files. When I manually continue with the exact same file that caused the error, it reads just fine. But I want to read all the files automatically, without having to correct things by hand each time I load them into my workspace.
So what I tried was a try/catch construct, something like this (pseudo-code):
i = 1;
while i < MaxNumberOfFiles
    try
        data(i) = xlsread(...);
        i = i + 1;    % only advances on success
    catch
        % error swallowed; the same file is retried on the next pass
    end
end
But it won't work, since the loop never ends (it always catches the same error again). I think it might have something to do with restarting the whole "MATLAB-running-the-process": when I correct the error manually, it works. Has anyone experienced something like this before, and could I avoid the error by using something like a timeout or refreshing memory?
Looking forward to your ideas
Roman
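One way out of the endless catch is to advance the index in the catch branch as well, so a failing file is logged and skipped instead of retried forever. A minimal sketch, assuming fileList is a cell array holding the names of the files (not a variable from the original question):

```matlab
% Sketch only: advance through all files and record failures instead of
% retrying the same file forever. 'fileList' is an assumed cell array.
data   = cell(1, numel(fileList));
failed = {};                              % files that could not be read
for i = 1:numel(fileList)
    try
        data{i} = xlsread(fileList{i});
    catch err
        failed{end+1} = fileList{i};      %#ok<AGROW> log and move on
        warning('Skipping %s: %s', fileList{i}, err.message);
    end
end
```

The failed list can then be retried in a second pass once the underlying cause is fixed.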
Accepted Answer
More Answers (2)
Walter Roberson
2016-2-1
0 votes
Some MATLAB versions have failed to close xls files after reading them, which causes the session to run out of open file descriptors. If that is happening to you, then a periodic fclose('all') can be useful (but of course that will close any file you might have wanted left open, such as a file you were writing results to, so you need to take that into account).
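A sketch of that periodic cleanup, assuming fileList is a cell array of file names (fclose('all') is the MATLAB call that closes open file identifiers):

```matlab
% Sketch only: close leaked file identifiers every 500 files. Note that
% fclose('all') also closes any file you opened yourself with fopen.
data = cell(1, numel(fileList));
for i = 1:numel(fileList)
    data{i} = xlsread(fileList{i});
    if mod(i, 500) == 0
        fclose('all');    % release any file descriptors left open
    end
end
```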
15 Comments
Image Analyst
2016-2-1
I'm pretty certain the "non-closing" of Excel between calls to xlsread() started with R2015b, because that's when it started breaking my apps.
What version do you have and what operating system? If you're using Windows, you can use ActiveX to read the files and close the workbook and use the same file descriptor (I think) if what Walter says is true.
Guillaume
2016-2-1
If the problem is indeed that workbooks are not being closed, then it's fairly easy to force close them (assuming they've not been modified):
excel = actxserver('Excel.Application');
excel.Workbooks.Close
will close all open workbooks.
It would be useful to know what the actual error is, though.
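For completeness, a sketch of the full COM cleanup (close the workbooks, then quit Excel and release the handle) so the Excel process itself does not linger:

```matlab
% Sketch only: close all open workbooks, then shut down the Excel
% instance and release the COM handle on the MATLAB side.
excel = actxserver('Excel.Application');
excel.Workbooks.Close    % force-close all open (unmodified) workbooks
excel.Quit               % shut down the Excel process
delete(excel)            % release the COM object in MATLAB
```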
Walter Roberson
2016-2-1
Image Analyst, R2015b is the version where they started deliberately caching Excel file connections. But versions before that have sometimes just forgotten to close open files.
Jan
2016-2-1
@Roman: Please explain "didn't work" with details. Did you get an error message? Where did you insert these 2 lines?
Stephen23
2016-2-1
Roman's "Answer" moved here:
Thanks for the answers so far, but neither fclose('all') nor excel.Workbooks.Close works.
The first produces the same random error (after about 5000 files); the latter produces the same random error, just at an earlier stage (after about 2000 files).
Any more ideas?
Stephen23
2016-2-1
What file format are the files? What delimiter is used? Are the columns fixed width? Are there leading spaces? Are the strings surrounded by quotation marks? Do the date strings contain space characters? What format are the date strings? Are there trailing spaces?
So much that we don't know, because a "description" does not give enough information. That is why I asked for an upload of a sample file.
You don't have to upload one of your restricted files: just create one that has the same layout, with some fake data. It does not even have to be the same number of rows, just fake a bit of data that looks the same and is arranged in the same way. Then we have something to test our suggestions on.
Stephen23
2016-2-2
Thank you. I wrote an Answer using your CSV file.
Image Analyst
2016-2-1
0 votes
Processing 9000 files with xlsread() will basically take ... well ... forever. This is because it must launch Excel, do the reading, and then shut down Excel, for every single file. You know how long it takes to launch Excel? Well, imagine doing that 9000 times. There is no way you should be doing that with R2015a. I use ActiveX for anything over about 3 files. You must use ActiveX if you want to get done this year. I attach a demo.
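The attached demo is not reproduced here, but the approach can be sketched as launching Excel once and reading every workbook through the same COM instance (fileList is an assumed cell array of full file paths):

```matlab
% Sketch only: one Excel instance for all files, instead of one per call.
excel = actxserver('Excel.Application');
excel.DisplayAlerts = false;                % suppress Excel dialog boxes
data = cell(1, numel(fileList));
for i = 1:numel(fileList)
    wb = excel.Workbooks.Open(fileList{i}, 0, true);   % open read-only
    data{i} = wb.ActiveSheet.UsedRange.Value;          % values as a cell array
    wb.Close(false);                                   % close without saving
end
excel.Quit
delete(excel)
```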
2 Comments
Image Analyst
2016-2-1
You might try readtable() instead of xlsread(). It returns a table instead of a cell array, which uses far less memory, so it might be faster than xlsread(). Worth trying anyway. Time it with tic and toc.
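A quick way to run that comparison, as a sketch (fileList assumed, timing only a small sample):

```matlab
% Sketch only: time readtable() against xlsread() on a few files before
% committing to either approach for all 9000 files.
tic
for i = 1:10
    t = readtable(fileList{i});    % table output instead of a cell array
end
toc                                % repeat the loop with xlsread to compare
```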