Image undistortion with fixed camera position and single calibration image

17 views (last 30 days)
Dear all,
I want to undistort an image. The camera is fixed on a tripod. The camera captures the reflected light of particles within a thin light sheet. Usually, the camera is positioned perpendicular to this light sheet, and a high-quality, low-distortion lens is used. In this case, there is no need for undistortion. But it may also happen that a lower-quality lens is used or that the camera cannot be positioned perpendicular to the light sheet. In this case, I would take a picture of a calibration target positioned at the location of the light sheet. I would like to detect the pattern of the calibration target and use that information to undistort images of the light sheet.
What approach should I use in Matlab? The camera calibrator expects at least two calibration images, taken from different perspectives. But my camera is fixed, and the position of the light sheet and the calibration target is fixed too. The image of the calibration target might, for example, look like this (I am free to use any pattern I want):
Thanks for your input!
2 Comments
Matt J
Matt J 2023-2-26
Edited: Matt J 2023-2-26
From your posted image, it is hard to understand why you say the light sheet is unmovable. It looks like you could change its position by hand easily.
William Thielicke
William Thielicke 2023-2-27
Edited: Matt J 2023-2-27
Yes, I could change it, but I don't want to. Maybe this picture explains the experimental setup a bit better. I want to be able to compensate for the distortion when the camera is not perpendicular to the light sheet. What is the best approach? I think this is called orthorectification, and I think it is different from lens distortion. Probably both compensations have to be applied one after the other. How would you do this in Matlab?


Accepted Answer

Matt J
Matt J 2023-2-27
Edited: Matt J 2023-2-27
My recommendation is that you first calibrate your camera for intrinsic and lens distortion parameters. These parameters do not depend on camera position, so this calibration can be done independently of the apparatus you have posted. In other words, you should be able to disconnect the camera from your apparatus, take it into a different room, calibrate, then put it back, and the lens distortion and intrinsic parameters will still be valid. Moreover, the calibration fixture that you use for this step need not be part of the apparatus in your post. If you can obtain a checkerboard like the examples in the documentation, you can use the Camera Calibrator app just like in those examples.
Once the camera has been calibrated for intrinsics and lens distortion, you can reconnect it to your apparatus and image a fixture of points like you were originally planning. You then use undistortPoints to obtain their coordinates with distortion corrected. Then you can use fitgeotrans(___,'projective') to figure out the transform that rectifies your image.
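A minimal, untested sketch of that workflow (here, I, cameraParams, and worldPoints are placeholders for your image, your separately obtained calibration result, and the known fixture coordinates):
% Detect the fixture points in one image taken from the fixed camera position
imagePoints = detectCheckerboardPoints(I);
% Remove the lens distortion from the detected point coordinates
undistortedPoints = undistortPoints(imagePoints, cameraParams.Intrinsics);
% Fit the projective transform that maps the undistorted points to their known world coordinates
tform = fitgeotrans(undistortedPoints, worldPoints, 'projective');
% Undistort the image, then rectify it with the fitted transform
J = imwarp(undistortImage(I, cameraParams.Intrinsics), tform);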
3 Comments
William Thielicke
William Thielicke 2024-3-22
Dear @Matt J , I finally found some time to pick this up again (and hopefully finish implementing something soon). I applied your hints with code copy-pasted from the MATLAB help, like this:
clc
clearvars
images = imageDatastore(fullfile(toolboxdir("vision"),"visiondata","calibration","gopro")); %get some worst case data
imageFileNames = images.Files;
% Detect calibration pattern in images
[imagePoints,boardSize] = detectCheckerboardPoints(imageFileNames,'HighDistortion',true);
% Read the first image to obtain image size
originalImage = imread(imageFileNames{6}); %this image will be undistorted and rectified
[mrows, ncols, ~] = size(originalImage);
% Generate world coordinates for the planar pattern keypoints
squareSize = 250; %checkerboard square size in world units (mm); the rectifying transform maps into these world coordinates, so this also scales the output image size
worldPoints = generateCheckerboardPoints(boardSize,squareSize);
% Calibrate the camera
[cameraParams, imagesUsed, estimationErrors] = estimateCameraParameters(imagePoints, worldPoints, ...
'EstimateSkew', false, 'EstimateTangentialDistortion', false, ...
'NumRadialDistortionCoefficients', 2, 'WorldUnits', 'millimeters', ...
'InitialIntrinsicMatrix', [], 'InitialRadialDistortion', [], ...
'ImageSize', [mrows, ncols]);
points = detectCheckerboardPoints(originalImage,'PartialDetections',false); %must be false, otherwise NaNs appear and the next step fails
undistortedPoints = undistortPoints(points,cameraParams.Intrinsics);
%% Here, a distorted image will be undistorted by interpolation, result saved in variable "undistortedImage"
[undistortedImage, newIntrinsics] = undistortImage(originalImage,cameraParams.Intrinsics,'interp','cubic');
tform = fitgeotform2d(undistortedPoints,worldPoints,'Projective');
%% Here, an already-interpolated image is interpolated a second time
undistorted_rectified = imwarp(undistortedImage,tform);
imshow(undistorted_rectified)
This is just a first attempt to get an image that is undistorted and properly aligned. It works, but I wonder if there are ways to achieve this with less interpolation. Currently the image is interpolated twice, which will introduce a lot of artifacts in the data I am planning to work with (images with particles that are 3-10 pixels in diameter). Is there a way to combine the two steps (undistortImage and imwarp) into a single interpolation step? And how would I determine the output resolution of my undistorted and rectified image? Thanks!
Matt J
Matt J 2024-3-23
Edited: Matt J 2024-3-23
The only thing I can think of is that you forego imwarp and implement your own warping using griddedInterpolant or interp2. In that case, you would have control over how the deformed image pixel locations get computed. You can compute them as the succession of two point transforms: the undistortion and the rectification. Once the final deformed pixel locations (incorporating both transforms) have been computed, you can do a single interpolation to get the final image.
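A minimal, untested sketch of such a single-interpolation warp, reusing the variables from the code above (originalImage, cameraParams, worldPoints, tform) and assuming the radial-only distortion model that was estimated there (two radial coefficients, no tangential distortion, no skew):
% Output grid in rectified (world) coordinates; the spacing sets the output resolution
pixelSize = 1;                                    % world units (here mm) per output pixel, choose freely
[xw, yw] = meshgrid(min(worldPoints(:,1)):pixelSize:max(worldPoints(:,1)), ...
                    min(worldPoints(:,2)):pixelSize:max(worldPoints(:,2)));
% Inverse of the rectification: rectified (world) coordinates -> undistorted image coordinates
[xu, yu] = transformPointsInverse(tform, xw, yw);
% Re-apply the lens distortion: undistorted image coordinates -> original (distorted) image coordinates
f = cameraParams.Intrinsics.FocalLength;          % [fx fy]
c = cameraParams.Intrinsics.PrincipalPoint;       % [cx cy]
k = cameraParams.Intrinsics.RadialDistortion;     % [k1 k2]
xn = (xu - c(1))/f(1);                            % normalized image coordinates
yn = (yu - c(2))/f(2);
r2 = xn.^2 + yn.^2;
s  = 1 + k(1)*r2 + k(2)*r2.^2;                    % radial distortion factor
xd = f(1)*(xn.*s) + c(1);                         % distorted pixel coordinates
yd = f(2)*(yn.*s) + c(2);
% Single interpolation directly from the original image
I = im2double(originalImage);
undistorted_rectified = zeros([size(xw), size(I,3)]);
for ch = 1:size(I,3)
    undistorted_rectified(:,:,ch) = interp2(I(:,:,ch), xd, yd, 'cubic', 0);
end
imshow(undistorted_rectified)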


More Answers (1)

Matt J
Matt J 2023-2-27
Edited: Matt J 2023-2-27
You could also browse the File Exchange for submissions that do single image calibration. Here is one example, though I have not used it myself.
An important thing to keep in mind, however, is that if you are going to pursue calibration from a single image, it is necessary that the landmark points in your calibration fixture not be coplanar in 3D. A single plane of points is not enough to determine the camera parameters. This is an old theoretical result.
