
matchFeaturesInRadius

Find matching features within specified radius

Since R2021a

Description

indexPairs = matchFeaturesInRadius(features1,features2,points2,centerPoints,radius) returns the indices of the features most likely to correspond between the input feature sets within the specified radius or radii around each expected match location.


[indexPairs,matchMetric] = matchFeaturesInRadius(___) also returns the distance between the features in a matched pair in indexPairs.

[indexPairs,matchMetric] = matchFeaturesInRadius(___,Name,Value) specifies options using one or more name-value arguments in addition to the input arguments in any of the previous syntaxes.
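For illustration only, a minimal sketch of the name-value syntax. The variables are assumed to already exist in the workspace; a complete, runnable example follows in the Examples section.

% Sketch: match within a 10-pixel radius around each expected location and
% also return the match distances. features1, features2, points2, and
% centerPoints are assumed to exist, as in the example below.
[indexPairs,matchMetric] = matchFeaturesInRadius(features1,features2, ...
    points2,centerPoints,10,'MatchThreshold',40,'MaxRatio',0.9);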

Examples


Load a MAT file containing images and camera data into the workspace.

data = load('matchInRadiusData.mat');

Convert the camera pose to extrinsics.

orientation = data.cameraPose2.Rotation;
location = data.cameraPose2.Translation;
[rotationMatrix,translationVector] = cameraPoseToExtrinsics(orientation,location);

Project the 3-D world points associated with feature set one onto the second image.

centerPoints = worldToImage(data.intrinsics,rotationMatrix,translationVector,data.worldPoints);

Match features between the two feature sets within spatial constraints.

indexPairs1 = matchFeaturesInRadius(data.features1,data.features2, ...
        data.points2,centerPoints,data.radius,'MatchThreshold',40, ...
        'MaxRatio',0.9);

Match features between the two feature sets without using spatial constraints.

indexPairs2 = matchFeatures(data.features1,data.features2, ...
        'MatchThreshold',40,'MaxRatio',0.9);

Visualize and compare the results between the two ways of matching features.

figure
subplot(2,1,1)
showMatchedFeatures(data.I1,data.I2,data.points1( ...
    indexPairs1(:,1)),data.points2(indexPairs1(:,2)));
title(sprintf('%d pairs matched with spatial constraints',size(indexPairs1,1)));

subplot(2,1,2)
showMatchedFeatures(data.I1,data.I2,data.points1( ...
    indexPairs2(:,1)),data.points2(indexPairs2(:,2)));
title(sprintf('%d pairs matched without spatial constraints',size(indexPairs2,1)));

Figure: Two stacked panels of matched features. Top panel: 144 pairs matched with spatial constraints. Bottom panel: 130 pairs matched without spatial constraints.

Input Arguments


features1

Feature set one, specified as a binaryFeatures object or an M1-by-N matrix. The matrix contains M1 features, and N corresponds to the length of each feature vector.

You can obtain the binaryFeatures object using the extractFeatures function with the fast retina keypoint (FREAK), oriented fast and rotated brief (ORB), or binary robust invariant scalable keypoints (BRISK) descriptor method.

Data Types: logical | int8 | uint8 | int16 | uint16 | int32 | uint32 | single | double | binaryFeatures object

features2

Feature set two, specified as a binaryFeatures object or an M2-by-N matrix. The matrix contains M2 features, and N corresponds to the length of each feature vector.

You can obtain the binaryFeatures object using the extractFeatures function with the fast retina keypoint (FREAK), oriented fast and rotated brief (ORB), or binary robust invariant scalable keypoints (BRISK) descriptor method.

Data Types: logical | int8 | uint8 | int16 | uint16 | int32 | uint32 | single | double | binaryFeatures object

points2

Feature points of feature set two, specified as either an M2-by-2 matrix in the format [x y] or an M2-element feature point array. For a list of point feature types, see Point Feature Types.

Data Types: single | double | point feature type

centerPoints

Expected matched locations in the second image that correspond to the feature points from features1, specified as an M1-by-2 matrix of coordinates in the format [x y].

Data Types: single | double

radius

Search radius associated with the center points, specified as a scalar or an M1-element vector. When you specify the radius as a scalar value, the function uses the same search radius for all center points. When you specify it as an M1-element vector, the function uses a different search radius for each corresponding center point.

Data Types: single | double
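As a sketch of the two forms, assuming features1, features2, points2, and centerPoints already exist in the workspace (the per-point values below are illustrative only):

% Same 10-pixel search radius around every center point.
indexPairsA = matchFeaturesInRadius(features1,features2,points2,centerPoints,10);

% One radius per center point (M1-element vector), for example a wider
% search around the first five projected locations.
radii = 10*ones(size(centerPoints,1),1);
radii(1:5) = 20;
indexPairsB = matchFeaturesInRadius(features1,features2,points2,centerPoints,radii);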

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: 'Metric','SSD' specifies the sum of squared differences metric for feature matching.

MatchThreshold

Matching threshold, specified as a scalar value in the range (0,100]. The default values are 10.0 for binary feature vectors or 1.0 for nonbinary feature vectors. You can use the matching threshold to select the strongest matches. The threshold represents a percent of the distance from a perfect match.

Two feature vectors match when the distance between them is less than the threshold set by 'MatchThreshold'. The function rejects a match when the distance between the features is greater than the value of 'MatchThreshold'. Increase the value to return more matches.

Inputs that are binaryFeatures objects typically require a larger value for the match threshold. The extractFeatures function returns a binaryFeatures object when extracting FREAK, ORB, or BRISK descriptors.

MaxRatio

Ratio threshold, specified as a scalar value in the range (0, 1]. Use the ratio to reject ambiguous matches. Increase this value to return more matches.
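A hedged sketch of how these two options interact, assuming binary (for example, ORB) descriptors as in the earlier example; the specific values are illustrative only:

% Binary descriptors typically need a larger MatchThreshold. Lowering
% MaxRatio rejects more ambiguous matches, so fewer pairs are returned.
loosePairs = matchFeaturesInRadius(features1,features2,points2, ...
    centerPoints,radius,'MatchThreshold',40,'MaxRatio',0.9);
strictPairs = matchFeaturesInRadius(features1,features2,points2, ...
    centerPoints,radius,'MatchThreshold',40,'MaxRatio',0.5);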

Metric

Feature matching metric, specified as either 'SAD' or 'SSD'.

  • 'SAD': Sum of absolute differences
  • 'SSD': Sum of squared differences

This argument applies only when you specify the input feature sets, features1 and features2, as matrices. When you specify the features as binaryFeatures objects, the function uses the Hamming distance to compute the similarity metric.
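As a rough sketch of what the two metrics compute (not the library implementation), assuming features1 and features2 are nonbinary matrices; per the matchMetric description below, the feature vectors are normalized to unit length before the distance is taken:

% Illustrative distance between the first feature of each set.
f1 = double(features1(1,:));  f1 = f1/norm(f1);   % normalize to a unit vector
f2 = double(features2(1,:));  f2 = f2/norm(f2);
sad = sum(abs(f1 - f2));      % 'SAD', range [0, 2*sqrt(size(features1,2))]
ssd = sum((f1 - f2).^2);      % 'SSD', range [0, 4]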

Unique

Unique matches, specified as a logical 0 (false) or 1 (true). Set this value to true to return only unique matches between features1 and features2.

When you set Unique to false, the function returns all matches between features1 and features2. Multiple features in features1 can match to one feature in features2.

When you set Unique to true, the function performs a forward-backward match to select a unique match. After matching features1 to features2, it matches features2 to features1 and keeps the best match.
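A brief sketch contrasting the two settings, with the same assumed inputs as above:

% One feature in features2 can appear in several rows of indexPairsAll.
indexPairsAll = matchFeaturesInRadius(features1,features2,points2, ...
    centerPoints,radius,'Unique',false);

% Forward-backward matching keeps only one-to-one matches.
indexPairsUnique = matchFeaturesInRadius(features1,features2,points2, ...
    centerPoints,radius,'Unique',true);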

Output Arguments


indexPairs

Indices of corresponding features between the two input feature sets, returned as a P-by-2 matrix. P is the number of matched pairs of features. Each index pair corresponds to a matched feature between the features1 and features2 inputs. The first element indexes the feature in features1. The second element indexes the matching feature in features2.

matchMetric

Distance between matching features, returned as a P-by-1 vector. The ith element in matchMetric corresponds to the ith row in the indexPairs output matrix. The values of the distances are based on the metric selected, but a perfect match is always 0. When Metric is set to either SAD or SSD, the feature vectors are normalized to unit vectors before computation. The function returns matchMetric as a double data type when features1 and features2 are of type double. Otherwise, the returned vector is of type single.

Metric     Range
SAD        [0, 2*sqrt(size(features1,2))]
SSD        [0, 4]
Hamming    [0, features1.NumBits]

Note

You cannot select the Hamming metric. The function selects it automatically when the features1 and features2 inputs are binaryFeatures objects.

Data Types: single | double
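For example, a small sketch (same assumed inputs as in the examples above) that uses matchMetric to keep only the strongest matches, since a smaller distance means a better match:

[indexPairs,matchMetric] = matchFeaturesInRadius(features1,features2, ...
    points2,centerPoints,radius);
[~,order] = sort(matchMetric);                 % best (smallest distance) first
numKeep   = min(50,size(indexPairs,1));        % keep up to the 50 strongest pairs
bestPairs = indexPairs(order(1:numKeep),:);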

Tips

  • Use this function when the 3-D world points that correspond to feature set one, features1, are known. You can obtain centerPoints by projecting the 3-D world points onto the second image, and you can obtain the 3-D world points by triangulating matched image points from two stereo images (see the sketch after this list).

  • You can specify a circular search area in feature set two to match against feature set one. Specify the center of each area using centerPoints and its radius using radius. Specify the feature points of feature set two to search as points2.

    Figure: Matched projected points from image 1 and image 2.
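One possible way to obtain the 3-D world points and centerPoints mentioned in the first tip, sketched under the assumption that matched image points and the extrinsics of two calibrated views are already available; the variable names here are hypothetical:

% Triangulate matched image points from two calibrated views.
camMatrix1 = cameraMatrix(intrinsics,rotationMatrix1,translationVector1);
camMatrix2 = cameraMatrix(intrinsics,rotationMatrix2,translationVector2);
worldPoints = triangulate(matchedPoints1,matchedPoints2,camMatrix1,camMatrix2);

% Project the 3-D points into a new view to obtain the expected match
% locations (centerPoints) in that image.
centerPoints = worldToImage(intrinsics,rotationMatrix3,translationVector3,worldPoints);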



Version History

Introduced in R2021a