High-Level Workflow
This flowchart shows the high-level workflow for using target specifications, sensor specifications, and the task-oriented multi-sensor, multi-target JIPDA tracker.
Define and Configure Target Specifications
The toolbox provides out-of-the-box target specifications which you can use to configure the targets. This hierarchy shows the list of pre-built target specifications available in the toolbox as of R2024b.
You can create and configure target specifications using the trackerTargetSpec function. Use tab-completion at each input to see the available options.
Define Target Specification
In this example, you track cars and trucks on a highway.
Create a car specification.
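A minimal sketch of the creation call, assuming trackerTargetSpec takes a domain, a target type, and a driving-context string; the exact option strings are assumptions.
carSpec = trackerTargetSpec("automotive","car","highway-driving")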
carSpec =
HighwayCar with properties:
ReferenceFrame: 'ego'
MaxSpeed: 50 m/s
MaxAcceleration: 4 m/s²
MaxYawRate: 5 deg/s
MaxYawAcceleration: 20 deg/s²
YawLimits: [-10 10] deg
LengthLimits: [3.6 5.6] m
WidthLimits: [1.7 2] m
HeightLimits: [1.4 2] m
Create a truck specification.
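Assuming the same syntax for the truck:
truckSpec = trackerTargetSpec("automotive","truck","highway-driving")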
truckSpec =
HighwayTruck with properties:
ReferenceFrame: 'ego'
MaxSpeed: 40 m/s
MaxAcceleration: 3 m/s²
MaxYawRate: 4 deg/s
MaxYawAcceleration: 10 deg/s²
YawLimits: [-10 10] deg
LengthLimits: [16 22] m
WidthLimits: [2 2.6] m
HeightLimits: [3.5 4.2] m
Configure Target Specifications for Application
This example uses a stationary global reference frame. Cars on the highway of interest typically travel at a maximum speed of 80 mph and turn at a maximum yaw rate of 4.5 degrees per second. Use the default settings for trucks.
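The sketch below applies these settings; treating "global" as a valid ReferenceFrame choice is an assumption.
% Apply the application-specific reference frame and car limits
mph2mps = 0.44704;                   % miles per hour to meters per second
carSpec.ReferenceFrame = "global";   % assumed stationary global frame option
truckSpec.ReferenceFrame = "global";
carSpec.MaxSpeed = 80*mph2mps;       % approximately 35.8 m/s
carSpec.MaxYawRate = 4.5;            % deg/s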
Define and Configure Sensor Specifications
The toolbox also provides out-of-the-box sensor specifications which you can use to configure the sensors. This hierarchy shows the list of pre-built sensor specifications available in the toolbox as of R2024b.
Define Sensor Specification
You can create and configure sensor specifications using the trackerSensorSpec function. Use tab-completion at each input to see the available options.
In this example, you track vehicles using a radar reporting clustered detections of objects and a camera reporting bounding boxes in the image space.
Create a camera specification.
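A sketch of the creation call, assuming trackerSensorSpec takes a domain, a sensor type, and a report-type string; the exact option strings are assumptions.
cameraSpec = trackerSensorSpec("automotive","camera","bounding-boxes")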
cameraSpec =
AutomotiveCameraBoxes with properties:
ReferenceFrame: 'ego'
MaxNumMeasurements: 64
MountingLocation: [0 0 0] m
MountingAngles: [0 1 0] deg
EgoOriginHeight: 0.3 m
Intrinsics: [3×3 double]
ImageSize: [480 640] pixels
MaxRange: 100 m
CenterAccuracy: 10 pixels
HeightAccuracy: 10 pixels
WidthAccuracy: 10 pixels
DetectionProbability: 0.9
NumFalsePositivesPerImage: 0.01
NumNewTargetsPerImage: 0.01
Create a radar specification.
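Assuming the same syntax for the radar:
radarSpec = trackerSensorSpec("automotive","radar","clustered-points")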
radarSpec =
AutomotiveRadarClusteredPoints with properties:
ReferenceFrame: 'ego'
MaxNumMeasurements: 64
MountingLocation: [0 0 0] m
MountingAngles: [0 0 0] deg
FieldOfView: [60 20] deg
MaxRange: 120 m
MaxRangeRate: 30 m/s
DetectionProbability: 0.9
NumFalsePositivesPerScan: 1
NumNewTargetsPerScan: 1
HasElevation: 1
Configure Sensors for Application
Configure the camera specification.
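The values below are illustrative; obtain the mounting geometry from your own vehicle and camera calibration, and note that the "global" ReferenceFrame choice is an assumption.
cameraSpec.ReferenceFrame = "global";          % assumed to match the target specifications
cameraSpec.MountingLocation = [1.9 0 1.3];     % hypothetical mounting position, in meters
cameraSpec.MountingAngles = [0 1 0];           % hypothetical mounting angles, in degrees
cameraSpec.MaxRange = 100;                     % meters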
Configure the radar specification.
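Again with illustrative values:
radarSpec.ReferenceFrame = "global";           % assumed to match the target specifications
radarSpec.MountingLocation = [3.7 0 0.2];      % hypothetical mounting position, in meters
radarSpec.FieldOfView = [60 20];               % azimuth and elevation field of view, in degrees
radarSpec.MaxRange = 120;                      % meters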
Configure Tracker
Use the multiSensorTargetTracker function to create a multi-sensor, multi-target JIPDA tracker.
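A sketch of the constructor call, assuming it accepts cell arrays of target and sensor specifications followed by the "jipda" algorithm name:
targetSpecs = {carSpec, truckSpec};
sensorSpecs = {cameraSpec, radarSpec};
tracker = multiSensorTargetTracker(targetSpecs, sensorSpecs, "jipda")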
tracker =
fusion.tracker.JIPDATracker with properties:
TargetSpecifications: {[1×1 HighwayCar] [1×1 HighwayTruck]}
SensorSpecifications: {[1×1 AutomotiveCameraBoxes] [1×1 AutomotiveRadarClusteredPoints]}
MaxMahalanobisDistance: 5
ConfirmationExistenceProbability: 0.9000
DeletionExistenceProbability: 0.1000
Understand Sensor Data Format
Use the dataFormat
function to determine the format of inputs required by the tracker.
The camera requires bounding boxes in the image space, along with the position and orientation of the ego vehicle at the time of the camera observation.
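A sketch of the query, assuming dataFormat accepts the tracker and returns one structure per sensor, in the order of the sensor specifications (the cell-array return type is an assumption):
sensorDataFormats = dataFormat(tracker);
cameraData = sensorDataFormats{1}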
cameraData = struct with fields:
Time: 0
BoundingBox: [4×64 double]
The radar requires the positions and range rates of the clusters in radar coordinates, along with the position and orientation of the ego vehicle at the time of the radar observation.
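Under the same assumption, the second element describes the radar format:
radarData = sensorDataFormats{2}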
radarData = struct with fields:
Time: 0
Azimuth: [1×64 double]
Elevation: [1×64 double]
Range: [1×64 double]
RangeRate: [1×64 double]
AzimuthAccuracy: [1×64 double]
ElevationAccuracy: [1×64 double]
RangeAccuracy: [1×64 double]
RangeRateAccuracy: [1×64 double]
EgoPosition: [0 0 0]
EgoVelocity: [0 0 0]
EgoAngularVelocity: [0 0 0]
EgoOrientation: [0 0 0]
Continuously Update Tracker
In this section, you iterate through the recorded sensor data and continuously update the tracker.
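A sketch of the loop, assuming the recorded data is available as structure arrays cameraLog and radarLog (both hypothetical names) whose elements follow the formats above, and that the tracker object is stepped with one data structure per sensor:
viewer = createTrackingViewer(cameraSpec, radarSpec);
for k = 1:numel(radarLog)
    % Read the recorded camera and radar data for this step
    cameraData = cameraLog(k);
    radarData = radarLog(k);
    % Update the tracker with the data in the order of the sensor specifications
    tracks = tracker(cameraData, radarData);
    % Visualize the results using viewer.TrackPlotter and viewer.DetectionPlotter
    % (plotting details depend on the track output format)
end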
Supporting Functions
function viewer = createTrackingViewer(cameraSpec, radarSpec)
% Colors
clrs = lines(7);
% Create theater plot in the panel
tp = theaterPlot('XLimits',[0 105],...
'YLimits',[-25 25],...
'ZLimits',[-25 25]);
view(tp.Parent,-90,90);
% Track plotter
trp = trackPlotter(tp,'DisplayName','Tracks','ConnectHistory','off',...
'ColorizeHistory','off','MarkerFaceColor',clrs(1,:),...
'HistoryDepth',10,'FontSize',10,'LabelOffset',[2 0 0]);
% Radar detection plotter
dp = detectionPlotter(tp,'DisplayName','Radar Clusters','MarkerFaceColor',clrs(2,:));
% Plot radar coverage
radarCoverage = struct('Index',1,...
'LookAngle',[0 0],...
'FieldOfView',radarSpec.FieldOfView,...
'ScanLimits',[-radarSpec.FieldOfView(1)/2 radarSpec.FieldOfView(1)/2],...
'Range',radarSpec.MaxRange,...
'Position',radarSpec.MountingLocation,...
'Orientation',radarSpec.MountingAngles);
cp = coveragePlotter(tp,'DisplayName','Radar Coverage','Alpha',[0.1 0.1]);
cp.plotCoverage(radarCoverage);
% Plot camera coverage
cameraCoverage = struct('Index',2,...
'LookAngle',[0 0],...
'FieldOfView',[27.8523 15.0668],...
'ScanLimits',[0 0],...
'Range',cameraSpec.MaxRange,...
'Position',cameraSpec.MountingLocation,...
'Orientation',cameraSpec.MountingAngles);
cp = coveragePlotter(tp,'DisplayName','Camera Coverage','Alpha',[0.1 0.1]);
cp.plotCoverage(cameraCoverage);
% Assemble viewer as a struct
viewer = struct;
viewer.TheaterPlot = tp;
viewer.TrackPlotter = trp;
viewer.DetectionPlotter = dp;
end