xyzPoints = mapPoints(vslam)
builds a sparse 3-D map of world points, xyzPoints, from the input
images in the visual simultaneous localization and mapping (vSLAM) object
vslam.
Perform monocular visual simultaneous localization and mapping (vSLAM) using the data from the TUM RGB-D Benchmark. You can download the data to a temporary directory using a web browser or by running this code:
baseDownloadURL = "https://cvg.cit.tum.de/rgbd/dataset/freiburg3/rgbd_dataset_freiburg3_long_office_household.tgz";
dataFolder = fullfile(tempdir,"tum_rgbd_dataset",filesep);
options = weboptions(Timeout=Inf);
tgzFileName = dataFolder+"fr3_office.tgz";
folderExists = exist(dataFolder,"dir");
% Create a folder in a temporary directory to save the downloaded file
if ~folderExists
    mkdir(dataFolder)
    disp("Downloading fr3_office.tgz (1.38 GB). This download can take a few minutes.")
    websave(tgzFileName,baseDownloadURL,options);

    % Extract contents of the downloaded file
    disp("Extracting fr3_office.tgz (1.38 GB) ...")
    untar(tgzFileName,dataFolder);
end
Create an imageDatastore object to store all the RGB images.
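For completeness, a sketch of creating the datastore and the monovslam object is shown below. The image folder name is assumed from the extracted dataset layout, and the camera intrinsics values are the published calibration for the TUM freiburg3 sequences; treat both as assumptions rather than part of this reference page.

```matlab
% Folder of RGB images (name assumed from the extracted dataset layout)
imageFolder = dataFolder + "rgbd_dataset_freiburg3_long_office_household/rgb/";
imds = imageDatastore(imageFolder);

% Camera intrinsics for the TUM freiburg3 sequences: focal length,
% principal point, and image size (values from the benchmark's
% published calibration, assumed here)
intrinsics = cameraIntrinsics([535.4 539.2],[320.1 247.6],[480 640]);

% Create the monocular vSLAM object
vslam = monovslam(intrinsics);
```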
Process each image frame, and visualize the camera poses and 3-D map points. Note that the monovslam object runs several parts of the algorithm on separate threads, which can introduce latency between adding an image frame with the addFrame function and its processing.
for i = 1:numel(imds.Files)
    addFrame(vslam,readimage(imds,i))
    if hasNewKeyFrame(vslam)
        % Display 3-D map points and camera trajectory
        plot(vslam);
    end

    % Get current status of system
    status = checkStatus(vslam);
end
Plot intermediate results and wait until all images are processed.
while ~isDone(vslam)
    if hasNewKeyFrame(vslam)
        plot(vslam);
    end
end
After all the images are processed, you can collect the final 3-D map points and camera poses for further analysis.
xyzPoints = mapPoints(vslam);
[camPoses,addedFramesIdx] = poses(vslam);
% Reset the system
reset(vslam)
Compare the estimated camera trajectory with the ground truth to evaluate the accuracy.
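One possible comparison is sketched below, assuming the standard TUM groundtruth.txt file layout (timestamp, translation, quaternion per row). Because monocular vSLAM recovers the trajectory only up to scale, the estimate must be scaled before computing an error; the simple median-norm alignment used here is a heuristic, and a rigorous evaluation would also match ground-truth rows to key frames by timestamp rather than by index, as done below for brevity.

```matlab
% Ground-truth file name assumed from the TUM dataset layout
gTruthFile = dataFolder + "rgbd_dataset_freiburg3_long_office_household/groundtruth.txt";
gTruthData = readmatrix(gTruthFile);   % columns: timestamp tx ty tz qx qy qz qw
gTruthLoc  = gTruthData(:,2:4);

% Estimated key frame camera locations from the rigidtform3d poses
estLoc = vertcat(camPoses.Translation);

% Heuristic global scale alignment (monocular scale ambiguity),
% assuming index-based correspondence between frames and ground truth
gLoc  = gTruthLoc(addedFramesIdx,:);
scale = median(vecnorm(gLoc,2,2)) / median(vecnorm(estLoc,2,2));
rmse  = sqrt(mean(vecnorm(scale*estLoc - gLoc,2,2).^2));
disp("Approximate RMSE of key frame trajectory (m): " + rmse)
```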