Constructing Horizon Lines for a Surface Robot Field of View in Multiple Object Detection

1. Background
I am working on a project that deals with detecting floating waste in oceans, rivers, or lakes using an autonomous floating robot. I've attached a figure below to illustrate a typical target environment scene.
The USV will, of course, sit at a lower height than the viewpoint shown in the image above, and a stereo camera will be attached to it. We will apply an edge-detection algorithm, mainly Canny, to detect the floating debris. Running a simulation on this image after filtering out some noise yields the result below:
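For reference, a minimal sketch of that edge-detection step (the file name, smoothing sigma, and default Canny thresholds below are only illustrative placeholders):

% Load the scene image and convert it to grayscale
rgb = imread('scene.jpg');            % illustrative file name
gray = rgb2gray(rgb);

% Suppress some noise before edge detection
smoothed = imgaussfilt(gray, 2);      % Gaussian smoothing, sigma = 2 (placeholder)

% Canny edge detection; thresholds would need tuning for real scenes
edges = edge(smoothed, 'canny');
imshow(edges)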
2. Problem Statement
My goal is to detect the objects that lie in the front field of view of the USV and to neglect the objects located behind it or outside its line of sight. In other words, I hope to segment the image by considering only the objects in the front (shown with a blue circle) and removing the others from the detection scheme.
The way I thought about it is to construct the horizon lines (the two intersecting red lines) that bound the field of view in the direction of motion (red arrow) of the USV; these lines should intersect at what is known as the vanishing point.
Thus, anything that falls inside the horizon lines should be kept in the detection framework, and everything else should be neglected. In this way, we would prioritize the detection of the front-row objects as required.
I have two questions:
1. Is there a better alternative, or an added suggestion to my idea, that could make it more efficient for this purpose?
2. How can the construction of the horizon lines be done in MATLAB?

Answer (1)

Akshat Dalal 2023-9-12
Hi Tarek,
I understand you would like some input on how to construct the horizon lines using MATLAB. You can do so with the following approach:
  1. Import the image as an RGB Matrix.
  2. Find the reference position of the robot in the matrix; since the camera looks forward, this corresponds to the centre column of the bottom (last) row of the image.
  3. You could then define the base points of your horizon lines relative to that position and move upwards in the matrix (decreasing row index) by manipulating indices until the two lines intersect at the vanishing point (see the sketch after this list).
  4. You could use the area between the horizon lines for your detection framework.
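A minimal sketch of that masking idea is shown below. The vanishing point here is simply placed at the image centre as a placeholder; in practice it would come from the intersection of the estimated horizon lines, and the file name is illustrative:

% Read the scene and define a placeholder vanishing point (assumed, not estimated here)
rgb = imread('scene.jpg');                 % illustrative file name
[rows, cols, ~] = size(rgb);
vp = [cols/2, rows/2];                     % [x, y] vanishing point (placeholder)

% Base points of the two horizon lines at the bottom corners of the image
baseLeft  = [1, rows];
baseRight = [cols, rows];

% Triangular region between the two lines (front field of view)
xPoly = [baseLeft(1), vp(1), baseRight(1)];
yPoly = [baseLeft(2), vp(2), baseRight(2)];
fovMask = poly2mask(xPoly, yPoly, rows, cols);

% Keep only detections inside the mask, e.g. applied to a Canny edge map
edges = edge(imgaussfilt(rgb2gray(rgb), 2), 'canny');
frontEdges = edges & fovMask;

% Visualize the region, the two lines, and the filtered edges
imshow(labeloverlay(rgb, fovMask))
hold on
plot([baseLeft(1) vp(1)], [baseLeft(2) vp(2)], 'r-', 'LineWidth', 2)
plot([baseRight(1) vp(1)], [baseRight(2) vp(2)], 'r-', 'LineWidth', 2)
hold off
figure, imshow(frontEdges)

Any detection step you run afterwards (edge linking, blob analysis, etc.) can then be restricted to pixels where fovMask is true.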
I hope this helps in achieving your goal. Apart from this, you could also use depth sensors to filter out objects that are far away from the robot.
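Since the question mentions a stereo camera, a hedged sketch of that depth-based filtering idea is given below. The file names, focal length, baseline, and distance threshold are placeholder values, and the left/right images are assumed to be already rectified:

% Rectified left/right grayscale images from the stereo camera (assumed)
IL = rgb2gray(imread('left.jpg'));    % illustrative file names
IR = rgb2gray(imread('right.jpg'));

% Dense disparity map (Semi-Global Matching, Computer Vision Toolbox)
disparityMap = disparitySGM(IL, IR);

% Convert disparity to depth: depth = f * B / disparity
f = 700;      % focal length in pixels (placeholder)
B = 0.12;     % stereo baseline in metres (placeholder)
depthMap = f * B ./ disparityMap;

% Keep only pixels closer than a chosen range, e.g. 10 m
nearMask = depthMap > 0 & depthMap < 10;

% If detection runs on the rectified left image, this mask can be combined
% with the field-of-view mask from the previous sketch, e.g.:
% frontEdges = edges & fovMask & nearMask;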
