How to speed up large data calculations

15 views (last 30 days)
Hi, I'm working with a big array, data, of up to N = 10,000,000 points, together with its time index t. From it I am trying to form a new array Intp of length 2*N by interpolating the original data. I use a Gaussian kernel to select the sub-array (only 100 to 200 data points) that enters each calculation, as the code below shows. Because the data set is so large, the calculation can take days. At first glance it looks like a convolution, but it isn't. If the data were small I could use meshgrid, etc. Right now I'm using a for loop, and on my Windows 7 machine with MATLAB R2013a and 8 GB of memory it takes 2.5 hours to compute just 1M data points. I am wondering whether I can replace this loop with vector or matrix operations, or use the Parallel Computing Toolbox, etc. My code is listed below (here I use a sinusoidal signal in place of my raw data). The program needs too much computation time, so I want to increase the speed. Does anybody have a better idea to speed up the code? Thanks a lot for your help.
clear all
N = 1e5;                           % number of samples (up to 1e7 in the real data)
f = 500;
t = linspace(0,1,N);               % uniform time index
NA = 5e+05;                        % (unused in this example)
data = sin(2*pi*f*t);              % stand-in for the raw data
Intp = zeros(1,2*N);               % preallocate the interpolated output
for k = 1:2*N
    Comp = abs(t - (k-1)/(2*N));                             % distance from the k-th interpolation point
    idx  = find(0.0025 - Comp >= 0);                         % keep only points inside the kernel window
    Intp(k) = sum(data(idx).*exp(-0.00005*Comp(idx).^2));    % Gaussian-weighted sum
end
%figure(1)
%plot(Intp)
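For what it's worth, here is one way the loop could be sped up (my own sketch, not from the thread). Because t = linspace(0,1,N) is uniformly spaced, the indices that fall inside the 0.0025 window can be computed directly from the window edges, so the full-length abs() and find() on every iteration are avoided and only the points inside the window are ever touched. The names halfWidth, lo and hi are mine:

N = 1e5;
f = 500;
t = linspace(0,1,N);
data = sin(2*pi*f*t);
M = 2*N;
Intp = zeros(1,M);
halfWidth = 0.0025;                                % kernel support from the original code
for k = 1:M
    c  = (k-1)/M;                                  % current interpolation point
    lo = max(1, ceil((c - halfWidth)*(N-1)) + 1);  % first sample inside the window
    hi = min(N, floor((c + halfWidth)*(N-1)) + 1); % last sample inside the window
    d  = t(lo:hi) - c;
    Intp(k) = sum(data(lo:hi).*exp(-0.00005*d.^2));
end

The output should match the original loop (up to floating-point edge cases exactly on the window boundary), but each iteration now does work proportional to the window size instead of to N.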

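Since the question also mentions the Parallel Computing Toolbox: the iterations of that loop are independent and Intp(k) is a sliced output variable, so (assuming the toolbox is installed) the same loop body should run under parfor with no other changes. A minimal sketch, continuing from the variables defined above:

parfor k = 1:M
    c  = (k-1)/M;
    lo = max(1, ceil((c - halfWidth)*(N-1)) + 1);
    hi = min(N, floor((c + halfWidth)*(N-1)) + 1);
    d  = t(lo:hi) - c;
    Intp(k) = sum(data(lo:hi).*exp(-0.00005*d.^2));   % Intp is sliced; t and data are broadcast
end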
Answers (1)

Sagar Zade 2019-12-12
