Based on the SURF algorithm, I am stitching three images with overlapping regions.
Based on the SURF algorithm, I am stitching three images that have overlapping regions: img1 overlaps img2, and img2 overlaps img3. I already have code that stitches two images. Stitching img1 and img2 works and gives result1, and stitching img2 and img3 works and gives result2. However, when I attempt to stitch result1 and result2 together, problems arise. How should I proceed?
Answers (1)
Garmit Pant
2024-7-12
Hello yanxin,
From what I gather, you are trying to stitch three images together using feature-based image registration and are not getting an appropriate output.
Kindly refer to the MATLAB example "Feature Based Panoramic Image Stitching" in the Computer Vision Toolbox documentation, which demonstrates creating panoramas using feature-based stitching.
Please use that workflow rather than stitching result1 and result2: those are already-warped images, so their overlap is distorted and harder to match reliably. The code snippet below applies the example's workflow to your three images, estimating a transformation between each consecutive pair, composing the transformations into a single reference frame, and warping all three images onto one panorama.
imageFolder = fullfile(pwd); % Assuming images are in the current directory
imageFiles = {'b1.png', 'b2.png', 'b3.png'};
imds = imageDatastore(fullfile(imageFolder, imageFiles));
% Read the first image from the image set.
I = readimage(imds,1);
% Initialize features for I(1)
grayImage = im2gray(I);
points = detectSURFFeatures(grayImage);
[features, points] = extractFeatures(grayImage,points);
% Initialize all the transformations to the identity matrix. Note that the
% projective transformation is used here because the building images are fairly
% close to the camera. For scenes captured from a further distance, you can use
% affine transformations.
numImages = numel(imds.Files);
tforms(numImages) = projtform2d;
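% (Sketch, not in the original example) For scenes captured from far away, an
% affine model could be used instead: initialize with tforms(numImages) = affinetform2d;
% and pass 'affine' instead of 'projective' to estgeotform2d inside the loop.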
% Initialize variable to hold image sizes and record the size of the first image
% (otherwise imageSize(1,:) stays at zeros and skews the output-limit computation).
imageSize = zeros(numImages,2);
imageSize(1,:) = size(grayImage);
% Iterate over remaining image pairs
for n = 2:numImages
    % Store points and features for I(n-1).
    pointsPrevious = points;
    featuresPrevious = features;

    % Read I(n).
    I = readimage(imds, n);

    % Convert image to grayscale.
    grayImage = im2gray(I);

    % Save image size.
    imageSize(n,:) = size(grayImage);

    % Detect and extract SURF features for I(n).
    points = detectSURFFeatures(grayImage);
    [features, points] = extractFeatures(grayImage, points);

    % Find correspondences between I(n) and I(n-1).
    indexPairs = matchFeatures(features, featuresPrevious, 'Unique', true);
    matchedPoints = points(indexPairs(:,1), :);
    matchedPointsPrev = pointsPrevious(indexPairs(:,2), :);

    % Estimate the transformation between I(n) and I(n-1).
    tforms(n) = estgeotform2d(matchedPoints, matchedPointsPrev, ...
        'projective', 'Confidence', 99.9, 'MaxNumTrials', 2000);

    % Compute T(1) * T(2) * ... * T(n-1) * T(n).
    tforms(n).A = tforms(n-1).A * tforms(n).A;
end
% Compute the output limits for each transformation.
for i = 1:numel(tforms)
    [xlim(i,:), ylim(i,:)] = outputLimits(tforms(i), [1 imageSize(i,2)], [1 imageSize(i,1)]);
end
avgXLim = mean(xlim, 2);
[~,idx] = sort(avgXLim);
centerIdx = floor((numel(tforms)+1)/2);
centerImageIdx = idx(centerIdx);
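% The image whose average x-limit falls in the middle of the sorted list is the
% center of the scene. Inverting its transformation and applying that inverse to
% every tform below makes the center image the reference frame, which keeps the
% overall projective distortion of the panorama small.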
Tinv = invert(tforms(centerImageIdx));
for i = 1:numel(tforms)
    tforms(i).A = Tinv.A * tforms(i).A;
end
% Recompute the output limits using the updated transformations.
for i = 1:numel(tforms)
    [xlim(i,:), ylim(i,:)] = outputLimits(tforms(i), [1 imageSize(i,2)], [1 imageSize(i,1)]);
end
maxImageSize = max(imageSize);
% Find the minimum and maximum output limits.
xMin = min([1; xlim(:)]);
xMax = max([maxImageSize(2); xlim(:)]);
yMin = min([1; ylim(:)]);
yMax = max([maxImageSize(1); ylim(:)]);
% Width and height of panorama.
width = round(xMax - xMin);
height = round(yMax - yMin);
% Initialize the "empty" panorama.
panorama = zeros([height width 3], 'like', I);
blender = vision.AlphaBlender('Operation', 'Binary mask', ...
'MaskSource', 'Input port');
% Create a 2-D spatial reference object defining the size of the panorama.
xLimits = [xMin xMax];
yLimits = [yMin yMax];
panoramaView = imref2d([height width], xLimits, yLimits);
% Create the panorama.
for i = 1:numImages
    I = readimage(imds, i);

    % Transform I into the panorama.
    warpedImage = imwarp(I, tforms(i), 'OutputView', panoramaView);

    % Generate a binary mask.
    mask = imwarp(true(size(I,1),size(I,2)), tforms(i), 'OutputView', panoramaView);

    % Overlay the warpedImage onto the panorama.
    panorama = step(blender, panorama, warpedImage, mask);
end
figure
imshow(panorama)
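If the panorama still comes out wrong, the usual cause is poor feature matches in one of the overlapping pairs. Below is a minimal diagnostic sketch (assuming two of your images are named b1.png and b2.png, as above) that visualizes the SURF correspondences with showMatchedFeatures so you can check whether the overlap actually produces enough matches:
% Diagnostic sketch: visualize SURF correspondences between one overlapping pair.
I1 = im2gray(imread('b1.png'));
I2 = im2gray(imread('b2.png'));
p1 = detectSURFFeatures(I1);
p2 = detectSURFFeatures(I2);
[f1, vp1] = extractFeatures(I1, p1);
[f2, vp2] = extractFeatures(I2, p2);
pairs = matchFeatures(f1, f2, 'Unique', true);
figure
showMatchedFeatures(I1, I2, vp1(pairs(:,1)), vp2(pairs(:,2)), 'montage')
title('Matched SURF features between the pair')
If only a handful of matches appear, estgeotform2d cannot recover a reliable transformation for that pair, and you may need more overlap between the photographs or different detector settings.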
I hope you find the above explanation and suggestions useful!