Reducing running time in image processing

2 views (last 30 days)
I wrote code that finds circles in an image using imfindcircles and performs some further calculations on the detected circles. I plan to apply it to 250,000 images. The current code takes 0.8 seconds per image, and the processing of each image is completely independent of the others. I am aware of parfor, but I would prefer to avoid it because my code is already complex enough. Is there a way to run the script in parallel so as to reduce the total time (rather than the 0.8 seconds per image)? It should be noted that in some parts of the code I also take advantage of the GPU.

Accepted Answer

Walter Roberson 2013-8-31
parfor() and related commands such as spmd() are the main approach. Otherwise, especially if you are on Linux or OS X, run a script that spins off a number of separate MATLAB processes, each with slightly different parameters. Though if you are already keeping a GPU busy, it is not certain that running multiple such routines in parallel would be any faster.
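One way to follow this suggestion without restructuring the per-image code is to submit independent batch jobs (Parallel Computing Toolbox). This is only a sketch: `processChunk` is a hypothetical wrapper around the existing per-image pipeline that loops over one list of files, and the image folder and count of jobs are assumptions.

```matlab
% Sketch: split the image set into chunks and run each chunk as an
% independent batch job, leaving the per-image script unchanged.
files = dir(fullfile('images', '*.png'));    % assumed image location
nJobs = 4;                                    % e.g. one job per core
edges = round(linspace(0, numel(files), nJobs + 1));
for k = 1:nJobs
    chunk  = files(edges(k)+1 : edges(k+1));  % this job's share of files
    job(k) = batch(@processChunk, 0, {chunk});% hypothetical wrapper
end
wait(job(end));   % wait on each job before collecting any results
```

The same partitioning idea works with a shell script that launches several MATLAB processes, each given a different chunk index as a parameter.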
The usual method is to (A) optimize the algorithm and (B) vectorize the code.

More Answers (0)
