Automatically Find Image Rotation and Scale

This example shows how to automatically determine the geometric transformation between two images. When one image is distorted relative to another by rotation and scaling, the functions detectSURFFeatures and estgeotform2d can be used to find the rotation angle and scale factor. You can then transform the distorted image to recover the original image.

Read Image

Load an image into the workspace.

original = imread("cameraman.tif");
imshow(original);
text(size(original,2),size(original,1)+15, ...
    "Image courtesy of Massachusetts Institute of Technology", ...
    FontSize=7,HorizontalAlignment="right");

Resize and Rotate the Image

Vary the scale factor.

scale = 0.7;
J = imresize(original,scale);

% Adjust the rotation angle, theta. The function imrotate rotates images
% counterclockwise for positive values of theta. To rotate the image
% clockwise, use a negative value for theta.
theta = 30;
distorted = imrotate(J,-theta);
figure
imshow(distorted)

You can experiment by varying the scale and rotation of the input image. However, note that there is a limit to how much you can change the scale before the feature detector fails to find enough features for an accurate match.

Find Matching Features Between Images

Detect features in both images.

ptsOriginal = detectSURFFeatures(original);
ptsDistorted = detectSURFFeatures(distorted);

Extract feature descriptors from the original and distorted features.

[featuresOriginal,validPtsOriginal] = extractFeatures(original,ptsOriginal);
[featuresDistorted,validPtsDistorted] = extractFeatures(distorted,ptsDistorted);

Match features by using their descriptors.

indexPairs = matchFeatures(featuresOriginal,featuresDistorted);

Retrieve locations of corresponding points for each image.

matchedOriginal = validPtsOriginal(indexPairs(:,1));
matchedDistorted = validPtsDistorted(indexPairs(:,2));

Show putative point matches.

figure
showMatchedFeatures(original,distorted,matchedOriginal,matchedDistorted);
title("Putatively matched points (including outliers)");

Estimate Transformation

Find a transformation corresponding to the matching point pairs using the M-estimator Sample Consensus (MSAC) algorithm, which is a variant of the RANSAC algorithm. The MSAC algorithm removes outliers while computing the transformation matrix. Because the algorithm relies on random sampling, you may see a varying result in the computed transformation from run to run.

[tform,inlierIdx] = estgeotform2d(matchedDistorted,matchedOriginal,"similarity");
inlierDistorted = matchedDistorted(inlierIdx,:);
inlierOriginal = matchedOriginal(inlierIdx,:);

Display matching point pairs used in the computation of the transformation.

figure;
showMatchedFeatures(original,distorted,inlierOriginal,inlierDistorted);
title("Matching points (inliers only)");
legend("ptsOriginal","ptsDistorted");
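Because MSAC draws random samples, the inlier set and the fitted transformation can vary slightly between runs. As a sketch (not part of the original example), seeding the random number generator before the estimation makes the result repeatable:

```matlab
% Seed MATLAB's random number generator so that the random sampling
% inside MSAC is repeatable, then re-estimate the similarity transformation.
rng("default")
[tform,inlierIdx] = estgeotform2d(matchedDistorted,matchedOriginal,"similarity");
```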

Solve for Scale and Angle

Use the geometric transform, tform, to recover the scale and angle. Because the transformation was computed from the distorted image to the original image, compute its inverse to recover the distortion.

Let sc = s*cos(theta)
Let ss = s*sin(theta)

Then, Ainv = [ sc  ss  tx;
              -ss  sc  ty;
                0   0   1 ]

where tx and ty are the x and y translations, respectively.
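As a quick sanity check of these relations, substituting the known distortion values from earlier in this example (s = 0.7, theta = 30 degrees) recovers both parameters:

```matlab
% Verify that hypot and atan2d invert the sc/ss definitions, using the
% scale and angle applied in the distortion step.
s = 0.7; theta = 30;
sc = s*cosd(theta);
ss = s*sind(theta);
scaleCheck = hypot(ss,sc)    % returns 0.7000, the scale s
thetaCheck = atan2d(ss,sc)   % returns 30, the angle theta
```

Note that the recovery code below uses atan2d(-ss,sc) because, with the sign convention of Ainv above, a positive recovered angle corresponds to a clockwise rotation.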

Compute the inverse transformation matrix.

invTform = invert(tform);
Ainv = invTform.A;

ss = Ainv(1,2);
sc = Ainv(1,1);
scaleRecovered = hypot(ss,sc);
disp(["Recovered scale: ",num2str(scaleRecovered)])

% Recover the rotation in which a positive value represents a rotation in
% the clockwise direction.
thetaRecovered = atan2d(-ss,sc);
disp(["Recovered theta: ",num2str(thetaRecovered)])
    "Recovered scale: "    "0.70126"

    "Recovered theta: "    "29.3451"

The recovered values should match the scale and angle values you selected when resizing and rotating the image. You can also read the scale and rotation angle directly from the Scale and RotationAngle properties of the simtform2d object.

disp(["Scale: " num2str(invTform.Scale)])
disp(["RotationAngle: " num2str(invTform.RotationAngle)])
    "Scale: "    "0.70126"

    "RotationAngle: "    "29.3451"

Recover the Original Image

Recover the original image by transforming the distorted image.

outputView = imref2d(size(original));
recovered = imwarp(distorted,tform,OutputView=outputView);

Compare the recovered image to the original image by viewing them side-by-side in a montage.

figure
imshowpair(original,recovered,"montage")

The quality of the recovered (right) image does not match that of the original (left) image because of the distortion and recovery process. In particular, shrinking the image causes a loss of information. The artifacts around the edges are due to the limited accuracy of the transformation. Detecting more points when finding matching features between the images would improve the accuracy of the transformation. For example, a corner detector such as detectFASTFeatures can complement the SURF feature detector, which finds blobs. Image content and image size also affect the number of detected features.
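As a sketch of that idea (hypothetical code, not part of the original example), you could pool FAST corners with SURF blobs by extracting SURF descriptors for both point sets and concatenating the results before matching:

```matlab
% Detect blobs (SURF) and corners (FAST), then describe both point sets
% with SURF descriptors so they can be matched together. Passing the
% corner locations as an M-by-2 matrix returns the valid points as a
% matrix of coordinates.
ptsSURF = detectSURFFeatures(original);
ptsFAST = detectFASTFeatures(original);
[featSURF,validSURF] = extractFeatures(original,ptsSURF,Method="SURF");
[featFAST,validFAST] = extractFeatures(original,ptsFAST.Location,Method="SURF");

% Concatenate descriptors and point locations from the two detectors.
featuresAll = [featSURF; featFAST];
locationsAll = [validSURF.Location; validFAST];
```

The combined features and locations can then replace the SURF-only sets in the matching and estimation steps above.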