Reducing running time in image processing
I wrote code that finds circles in an image using imfindcircles and then performs some further calculations on the detected circles. I plan to apply it to 250,000 images. The code currently takes 0.8 seconds per image, and the processing of each image is completely independent of all the others. I am aware of parfor, but I would prefer to avoid it because my code is already complex enough and I do not want to make it more so. Is there a way to run the script in parallel to reduce the total time (rather than the 0.8-second running time per image)? Note that some parts of the code already use the GPU.
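One option worth noting: because each image is independent, a parfor loop does not have to touch the per-image logic at all. If the existing analysis is wrapped in a single function, only the outer loop changes. This is a minimal sketch under that assumption; `processOneImage` is a hypothetical name standing in for the existing imfindcircles-and-calculations code, and the `images` folder and `*.png` pattern are placeholders. Be aware that if the per-image code uses the GPU, multiple parallel workers will contend for the same device, so the speedup may be smaller than the worker count.

```matlab
% Sketch: parallelize the outer loop only, leaving the per-image code intact.
% processOneImage is a hypothetical wrapper around the existing analysis.
files   = dir(fullfile('images', '*.png'));   % assumed image location/pattern
results = cell(numel(files), 1);

parfor k = 1:numel(files)
    img        = imread(fullfile(files(k).folder, files(k).name));
    results{k} = processOneImage(img);        % existing imfindcircles logic
end
```

An alternative that avoids parfor entirely is to launch several independent MATLAB sessions (or use `batch` from Parallel Computing Toolbox), each processing a disjoint subset of the 250,000 files; since the images are independent, the subsets never need to communicate.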