Classification of a matrix and binary mask
Hello everyone,
To improve the ability to discriminate between two correlation peaks (the peaks obtained by correlating a scene containing the two letters E and F against a target containing only the letter F), I derived from a theoretical calculation a matrix of the form:
M(u,v) = |D(u,v)| * cos(Phi_D(u,v) - Phi_T(u,v)) - epsilon * |T(u,v)|
where D(u,v) is the Fourier transform of the centered letter E;
T(u,v) is the Fourier transform of the centered letter F;
Phi_D(u,v) is the phase of D(u,v) and Phi_T(u,v) is the phase of T(u,v).
* The matrix is summed over u, v = 0 to (image size)/2.
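For reference, here is a minimal sketch of how I compute M(u,v) and this sum in MATLAB; the image file names, the value of epsilon, and the variable names are only placeholders:

```matlab
% Minimal sketch (file names, epsilon value and variable names are placeholders)
E_img   = double(imread('E.png'));    % centered letter E, assumed grayscale
F_img   = double(imread('F.png'));    % centered letter F, assumed grayscale
epsilon = 0.5;                        % weighting constant, value to be tuned

D = fftshift(fft2(E_img));            % centered FT of the letter E
T = fftshift(fft2(F_img));            % centered FT of the letter F

PhiD = angle(D);                      % phase of D(u,v)
PhiT = angle(T);                      % phase of T(u,v)

M = abs(D) .* cos(PhiD - PhiT) - epsilon .* abs(T);

% Sum over u, v = 0 .. (image size)/2, i.e. one quarter of the plane
[nr, nc] = size(M);
S = sum(sum(M(1:floor(nr/2), 1:floor(nc/2))));
```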
1 - I want to classify this matrix as follows: if the sum is greater than 0, I eliminate entries from the positive region, and if the sum is less than 0, I eliminate entries from the negative region. The program should repeat this until the sum equals 0 (a rough sketch of what I mean is shown after question 2).
2 - How can I code a binary mask that blocks certain frequencies in order to optimize the correlation, and on what basis should I choose the frequencies to block?
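To make question 1 concrete, here is a rough sketch of the iteration I have in mind, together with one possible way of applying the resulting binary mask to the correlation. The stopping tolerance, the rule of blocking one coefficient per iteration, and the final correlation step are only my own assumptions, not a working solution:

```matlab
% Rough sketch of the iterative elimination (question 1) and the binary
% mask (question 2). Tolerance and elimination rule are assumptions.
mask = true(size(M));                 % binary mask, true = frequency kept
tol  = 1e-6;                          % the sum rarely hits exactly 0

for k = 1:numel(M)
    S = sum(M(mask));                 % sum over the frequencies still kept
    if abs(S) < tol
        break;                        % sum is (approximately) zero: stop
    elseif S > 0
        % sum > 0: eliminate from the positive region (largest positive entry)
        Mp = M;  Mp(~mask | M <= 0) = -Inf;
        [~, idx] = max(Mp(:));
    else
        % sum < 0: eliminate from the negative region (most negative entry)
        Mn = M;  Mn(~mask | M >= 0) = Inf;
        [~, idx] = min(Mn(:));
    end
    mask(idx) = false;                % block this frequency
end

% The mask can then block those frequencies in the correlation filter:
T_masked  = T .* mask;                                 % masked target spectrum
corrPlane = fftshift(ifft2(ifftshift(D .* conj(T_masked))));
peak      = max(abs(corrPlane(:)));                    % correlation peak height
```

Is this the right way to set up such a mask, or should the blocked frequencies be chosen on some other criterion?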
Thank you in advance