limits in array size for filtfilt?
Just wondering: is there a hard size limit for the filtfilt function? I was looking at an ECG time series collected at 500 Hz over several hours, an array of ~14e7 entries in double precision. To extract the QRS complex I first run it through a Butterworth bandpass filter, designed with N = 3 and Wn = [5 15]*2/fs in butter(N,Wn). It runs fine for up to 6810691 entries; anything longer gives an all-NaN output. Have I missed something obvious? I am running on a machine with 16 GB RAM and an i7-4xxx, and the process doesn't hit the memory or CPU limits. Thank you!
Regards,
Alan
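For reference, a minimal sketch of the setup described above (the signal variable name `ecg` is hypothetical; only the parameters come from the question):

```matlab
% Minimal sketch of the described pipeline, assuming the raw signal is in 'ecg'
fs = 500;                        % sampling rate, Hz
N  = 3;                          % Butterworth order
Wn = [5 15]*2/fs;                % normalized passband edges
[b, a] = butter(N, Wn);          % two-element Wn gives a bandpass design
filtered = filtfilt(b, a, ecg);  % zero-phase forward-backward filtering
```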
Accepted Answer
Star Strider
2019-3-5
Use the isnan function to check to be certain that none of the data are NaN:
NaNsum = nnz(isnan(your_signal))
A single NaN will propagate through the entire filtered signal, since filtfilt runs both forward and backward. If this result is 0, I have no other suggestions or an explanation.
If there are any, use:
NaNidx = find(isnan(your_signal));
to discover their locations.
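To see why even one NaN matters, a small self-contained check (filter parameters here are arbitrary, not the ones from the question):

```matlab
% One NaN in the input contaminates the whole filtfilt output:
% the forward pass carries it to the end, the backward pass to the start.
[b, a] = butter(3, 0.1);
x = randn(1000, 1);
x(500) = NaN;             % a single bad sample
y = filtfilt(b, a, x);
nnz(isnan(y))             % expect this to equal numel(y)
```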
As a side note, the normal EKG (without arrhythmias) has a spectrum of 0 to 50 Hz, so the passband you’re using will eliminate a significant amount of detail.
4 comments
Star Strider
2019-3-6
I have never encountered the problem you're reporting. The documentation for filtfilt doesn't mention anything about support for Tall Arrays (link), assuming your data meet those criteria, and filtfilt is not listed among the Functions That Support Tall Arrays (A - Z) (link) either. The documentation also doesn't mention any sort of length restriction.
If filtfilt works with shorter vectors, consider filtering the data in segments and concatenating the results, provided your code does not generate significant transients at either end of any segment. There may also be other options, such as designing your own overlap-and-add approach.
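One way to sketch the segment idea: filter overlapping chunks and discard the overlap, so each chunk's edge transients never reach the output. All names and the segment/overlap lengths below are illustrative, not from the thread:

```matlab
% Filter a long vector x in overlapping chunks; edge transients fall in the
% discarded overlap regions rather than in the kept samples.
[b, a] = butter(3, [5 15]*2/500);
segLen  = 1e6;    % samples kept per chunk (illustrative)
overlap = 1e4;    % extra samples filtered on each side, then discarded
n = numel(x);
y = zeros(size(x));
for s = 1:segLen:n
    e  = min(n, s + segLen - 1);      % last kept sample of this chunk
    lo = max(1, s - overlap);         % chunk start including left padding
    hi = min(n, e + overlap);         % chunk end including right padding
    seg = filtfilt(b, a, x(lo:hi));
    y(s:e) = seg(s-lo+1 : e-lo+1);    % keep only the unpadded middle
end
```

The overlap should comfortably exceed the filter's settling time; for a 3rd-order filter a few thousand samples at 500 Hz is generous.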
I have no idea what the problem could be.
More Answers (1)
Walter Roberson
2019-3-6
Yes, there is a hard size threshold inside filtfilt: data with fewer than 10000 points are put through a different routine that handles the endpoints differently. I suspect the routine for larger arrays exists to be more space efficient, but I am not positive.
Other than that: NO, there is no built-in limit other than what your memory can hold.
Note that several temporary arrays are needed during filtering, so do not expect to be able to process anything more than roughly 1/5th of your memory limit.
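As a rough check of that rule of thumb against the array size in the question:

```matlab
% Back-of-the-envelope memory arithmetic for a ~14e7-sample double array
bytesPerSample = 8;                          % double precision
signalGB  = 14e7 * bytesPerSample / 2^30;    % roughly 1 GiB for the signal
workingGB = 5 * signalGB;                    % roughly 5 GiB with temporaries
```

That working-set estimate is still well under 16 GB, which is consistent with the questioner not hitting the memory limit.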