I need to insert multiple 32x32 arrays into a 1024x1024 array at predetermined random locations. How can I do this efficiently?
I need to insert many (10^6) 32x32 arrays into a single 1024x1024 array and sum them. The insertion locations are random but predetermined. A simple for loop results in a long execution time. How can I optimize this? I do have parallel processing and a GPU available.
Nx = ceil(rand(1e6,1)*(1024-32)); % avoiding partial arrays at edges
Ny = ceil(rand(1e6,1)*(1024-32));
smallArray = ones(32);
largeArray = zeros(1024);
for ii = 1:1e6
    largeArray(Nx(ii):Nx(ii)+31, Ny(ii):Ny(ii)+31) = ...
        largeArray(Nx(ii):Nx(ii)+31, Ny(ii):Ny(ii)+31) + smallArray;
end
2 Comments
Ahmet Cecen
2016-5-29
On the other hand, I am not entirely sure why this would take any longer than 2 seconds. Is that slow?
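For reference, a quick way to check how long it actually takes is to wrap the question's loop in tic/toc (just a timing sketch, reusing the same Nx, Ny and smallArray defined above):
% Time the original loop from the question.
tic
largeArray = zeros(1024);
for ii = 1:1e6
    largeArray(Nx(ii):Nx(ii)+31, Ny(ii):Ny(ii)+31) = ...
        largeArray(Nx(ii):Nx(ii)+31, Ny(ii):Ny(ii)+31) + smallArray;
end
toc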
Accepted Answer
Joss Knight
2016-6-8
Use accumarray (and randi):
N = 1024;
Nx = randi([1 (N-32)], 1e5, 1);                      % random top-left corners
Ny = randi([1 (N-32)], 1e5, 1);
linIndStart = sub2ind([N N], Nx, Ny);                % linear index of each corner
smallArray = gpuArray.ones(32);
[I, J] = ndgrid(1:32);
linOffsets = sub2ind([N N], I(:), J(:)) - 1;         % linear offsets covering one 32x32 block
linInd = bsxfun(@plus, linIndStart, linOffsets');    % all target indices, one row per block
values = repmat(smallArray(:)', numel(Nx), 1);       % Replace with actual vals
largeArray = accumarray(linInd(:), values(:), [N*N 1]);  % sums repeated indices in one shot
largeArray = reshape(largeArray, N, N);
If your submatrices are genuinely all just ones, then just replace the values(:) with 1.
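For instance, with the variables defined above, the all-ones case reduces to counting how many blocks cover each pixel:
% All-ones case: accumarray just counts block coverage per pixel.
largeArray = accumarray(linInd(:), 1, [N*N 1]);
largeArray = reshape(largeArray, N, N);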
2 Comments
Joss Knight
2016-6-9
Oh well done - I knew I'd answered basically the same question before but I couldn't unearth the old answer!
More Answers (1)
Ahmet Cecen
2016-5-29
If your small array will always be the same, you might be able to do this lightning fast with a convolution. Instead of using the random indices as corners, you would use them as centers. Do something like:
LargeArray(randindexvector) = 1;
You can separately find repetitions using unique and add them in, in case you happen to get the same center twice. Much smaller for loop.
Then you convolve with a 32x32 ones matrix (or with the small array itself, if it is not all ones).
There might be some rough edges in this approach that need some smoothing, but if you can get it to work, it would take milliseconds.
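A minimal sketch of this idea, assuming the small array is all ones and keeping the corner coordinates Nx, Ny from the question (cropping the full convolution sidesteps the centring detail for an even-sized kernel):
% Count how many blocks start at each corner, then spread each count
% into a 32x32 block with a single convolution.
N = 1024;
cornerCounts = accumarray([Nx Ny], 1, [N N]);    % repeated corners are summed
spread = conv2(cornerCounts, ones(32), 'full');  % each corner fills rows r:r+31, cols c:c+31
largeArray = spread(1:N, 1:N);                   % crop back to 1024x1024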
2 Comments
Ahmet Cecen
2016-5-29
You can cascade over stages of uniqueness and vectorize the fill. Say you created 1000 random values, and 100 of them are duplicates. You first work on the 900 unique ones, then the 100, then any duplicates within those 100, and so on. You do something like:
for i = 1:BiggestNumberOfRepetitions
    % WORK HERE
end
Hopefully this is an orders-of-magnitude smaller iteration. It also allows you to iterate over the elements of the smaller matrix instead of over the number of random positions. This will burst the memory here, but you can work around it. It should look something like:
RandInd = sub2ind([1024 1024], Nx, Ny);   % linear index of each top-left corner
for i = 1:32*32
    idx = RandInd + floor((i-1)/32)*1024 + mod(i-1,32);  % element i of every block
    largeArray(idx) = largeArray(idx) + smallArray(i);
end
There is some nasty indexing there, so double check it. Note also that indexed addition only adds once per repeated index, which is why the uniqueness cascade above matters; a sketch combining the two is below. Your iteration count is now MaximumRepetitionCount*numel(smallArray), which in most cases should still be orders of magnitude smaller.
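A hypothetical sketch of that combination, assuming Nx, Ny, smallArray and largeArray are set up as in the question (remaining and stageInd are illustrative names): each pass takes one copy of every distinct corner, so the indexed addition never hits the same pixel twice within a pass, and leftover duplicates are handled in later passes.
% Cascade over stages of uniqueness, vectorizing the fill within each stage.
RandInd = sub2ind([1024 1024], Nx, Ny);
remaining = RandInd;
while ~isempty(remaining)
    [stageInd, firstPos] = unique(remaining);   % one copy of each distinct corner
    for i = 1:32*32                             % loop over block elements only
        idx = stageInd + floor((i-1)/32)*1024 + mod(i-1,32);
        largeArray(idx) = largeArray(idx) + smallArray(i);
    end
    remaining(firstPos) = [];                   % keep only the extra duplicates
end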