I'd appreciate someone pointing me in the right direction.
I am currently using urlread to pull data from a password-protected database. I read all the data in at once as a single block and then break it up with textscan. For now, I place a limit on the amount of data I pull in; you can see the call below (password removed):
url = 'http://assa104/eurofot/csv';
block = urlread(url, 'Authentication', 'Basic', 'Post', ...
    {'login', '***', 'passwd', '***', ...
     'trips', '1332244409', ...
     'signals', '1.AVL_BRTORQ_SUM.AVL_BRTORQ_SUM_DVCH', ...
     'limit', '300000', 'content_type', 'text'});
When I remove this limit, however, I get the following error:
Error using urlreadwrite (line 97)
Error downloading URL. Your network connection may be down
or your proxy settings improperly configured.
Error in urlread (line 36)
[s,status] = urlreadwrite(mfilename,catchErrors,url,varargin{:});
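From what I've found searching, this error can apparently also mean the request timed out before the full response arrived, which would fit: dropping the limit makes the response much larger. In case it is a timeout, one thing I plan to try, if my MATLAB is new enough to have webread/weboptions (R2014b or later), is raising the timeout explicitly. This is an untested sketch; the field names just mirror my urlread call above:

% Untested: webwrite POSTs the form fields and returns the response body.
% weboptions' default Timeout is only 5 seconds, so raise it explicitly.
opts = weboptions('Username', '***', 'Password', '***', ...  % HTTP Basic auth
                  'Timeout', 120, ...                        % seconds
                  'ContentType', 'text');
block = webwrite('http://assa104/eurofot/csv', ...
                 'login', '***', 'passwd', '***', ...
                 'trips', '1332244409', ...
                 'signals', '1.AVL_BRTORQ_SUM.AVL_BRTORQ_SUM_DVCH', ...
                 'content_type', 'text', opts);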
My question is two-fold:
1. Any idea why I'm getting this error? Is it really a timeout, or something else?
2. This method is slow (almost a minute with the limit at 300k), and I want to read in even more data in the future. Is there an alternative to my current approach, ideally one that doesn't hold the whole set in one block but reads it in a row at a time? A rough sketch of what I have in mind is below.
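For question 2, something like the following is what I was imagining: using MATLAB's Java interface to POST the same form fields and then read the response one line at a time instead of as one block. Completely untested, and the body just mirrors the fields from my urlread call:

% Untested sketch: stream the CSV response row by row via Java.
u    = java.net.URL('http://assa104/eurofot/csv');
conn = u.openConnection();
conn.setReadTimeout(10 * 60 * 1000);   % generous read timeout, in ms
conn.setDoOutput(true);                % makes this a POST

% Same form fields as the urlread call, minus 'limit'. If the server
% also insists on an HTTP Basic Authorization header, that would have
% to be added here as well.
body = ['login=***&passwd=***&trips=1332244409' ...
        '&signals=1.AVL_BRTORQ_SUM.AVL_BRTORQ_SUM_DVCH&content_type=text'];
conn.setRequestProperty('Content-Type', 'application/x-www-form-urlencoded');
w = java.io.OutputStreamWriter(conn.getOutputStream());
w.write(body);
w.close();

reader = java.io.BufferedReader(java.io.InputStreamReader(conn.getInputStream()));
line = reader.readLine();              % MATLAB returns this as a char array
while ~isempty(line)                   % readLine gives [] at end of stream
    % ...parse/accumulate this one row here, instead of running textscan
    % over one huge block at the end...
    line = reader.readLine();
end
reader.close();

Would something along those lines be sensible, or is there a cleaner built-in way?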
I'm not very experienced with urlread, so any assistance would be greatly appreciated.
Thanks in advance.