
JIPDATracker

Joint integrated probabilistic data association tracker

Since R2024b

Description

The JIPDATracker System object™ is a tracker capable of processing detections of multiple targets from multiple sensors using the joint integrated probabilistic data association (JIPDA) assignment algorithm. The tracker applies a soft assignment where multiple detections can contribute to each track. The tracker initializes, confirms, corrects, predicts (performs coasting), and deletes tracks. Inputs to the tracker are data reported by the sensors, and you can determine the data format required by the tracker by using the dataFormat function of the sensor specification objects. The tracker estimates the state vector and state estimate error covariance matrix for each track. If the tracker cannot assign a detection to any existing track, it creates a new track.

Track confirmation and deletion is based on the probability of track existence. Any new track starts in a tentative state. If the existence probability of a tentative track exceeds the threshold specified by the ConfirmationExistenceProbability property, the status for the tentative track changes to confirmed. When the tracker confirms a track, it considers the track as representing a physical object. If the existence probability of a track drops below the threshold specified by the DeletionExistenceProbability property, the tracker deletes the track.
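For example, you can tune both thresholds on an existing tracker. This sketch assumes a tracker created with multiSensorTargetTracker; because both properties are tunable, you can change them even after calling the tracker.

```matlab
% Confirm tracks only when they are very likely to exist, and keep
% coasting tracks alive longer before deletion.
tracker.ConfirmationExistenceProbability = 0.95;
tracker.DeletionExistenceProbability = 0.05;
```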

To track targets using this object:

  1. Create the JIPDATracker object and set its properties.

  2. Call the object with arguments, as if it were a function.

To learn more about how System objects work, see What Are System Objects?

Creation

To create a JIPDATracker System object, use the multiSensorTargetTracker function with the "jipda" algorithm. For example:

tracker = multiSensorTargetTracker(carSpec,cameraSpec,"jipda")

Properties


Unless otherwise indicated, properties are nontunable, which means you cannot change their values after calling the object. Objects lock when you call them, and the release function unlocks them.

If a property is tunable, you can change its value at any time.

For more information on changing property values, see System Design in MATLAB Using System Objects.

TargetSpecifications
Target specifications, specified as a cell array of target specification objects. You can use the trackerTargetSpec function to create the pre-built target specifications provided in the toolbox.

Note

The target specification properties are tunable, but you cannot change the target specification type.

SensorSpecifications
Sensor specifications, specified as a cell array of sensor specification objects. You can use the trackerSensorSpec function to create the pre-built sensor specifications provided in the toolbox.

Note

The sensor specification properties are tunable, but you cannot change the sensor specification type.

ConfirmationExistenceProbability
Threshold for track confirmation, specified as a scalar in the range (0,1). The tracker confirms a tentative track if its probability of existence is greater than or equal to this threshold.

Tunable: Yes

Data Types: single | double

DeletionExistenceProbability
Threshold for track deletion, specified as a scalar in the range (0,1). The tracker deletes a track if its probability of existence is less than or equal to this threshold.

Tunable: Yes

Data Types: single | double

MaxMahalanobisDistance
Maximum Mahalanobis distance for detection assignment, specified as a positive real scalar. The tracker can assign a detection to a track only if the Mahalanobis distance between the detection and the track is less than this value. For more information, see Mahalanobis Distance (Statistics and Machine Learning Toolbox).

Tunable: Yes

Data Types: single | double
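As an illustration of the gating test, this sketch computes the Mahalanobis distance between a measurement and a track's predicted measurement using illustrative values (not tracker internals), and compares it against the default threshold of 5:

```matlab
z = [10.4; 5.2];       % measurement
zPred = [10.0; 5.0];   % track's predicted measurement
S = [0.25 0; 0 0.25];  % innovation covariance
d = sqrt((z - zPred)' / S * (z - zPred)); % Mahalanobis distance
canAssign = d < 5;     % gate against MaxMahalanobisDistance (default 5)
```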

Usage

To process detections and update tracks, call the tracker with arguments, as if it were a function.

Description

confirmedTracks = tracker(sensor1Data,...,sensorNData) returns a list of confirmed tracks that are updated from the detection data of multiple sensors. You can use the dataFormat function to determine the input sensor data format.


confirmedTracks = tracker(sensor1Data,...,sensorNData,targetData) additionally lets you specify any targetData required by the target specification. You can use the dataFormat function to determine the input target data format.


[confirmedTracks,tentativeTracks,allTracks] = tracker(___) also provides a list of tentative tracks and a list of all tracks.
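For example, with a tracker created from camera and radar specifications, a single update looks like this sketch, where the cameraData and radarData structures follow the formats returned by dataFormat:

```matlab
[confirmedTracks,tentativeTracks,allTracks] = tracker(cameraData,radarData);
```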

Input Arguments


Sensor data reported by each sensor, specified as comma-separated structures. You can use the dataFormat function to determine the input sensor data format.

Target data required by the target specifications, specified as a structure. You can use the dataFormat function to determine the input target data format.

Output Arguments


Confirmed tracks, returned as an array of objectTrack objects in MATLAB® or as an array of structures in code generation. In code generation, the field names of the returned structure are identical to the property names of objectTrack.

The tracker confirms a track if it satisfies the confirmation threshold specified in the ConfirmationExistenceProbability property. In that case, the IsConfirmed property of the object or field of the structure is true.

The target specification with which the tracker is initialized determines the state convention of the tracks.

Data Types: struct | object

Tentative tracks, returned as an array of objectTrack objects in MATLAB or as an array of structures in code generation. In code generation, the field names of the returned structure are identical to the property names of objectTrack.

A track is tentative if it does not satisfy the confirmation threshold specified in the ConfirmationExistenceProbability property. In that case, the IsConfirmed property of the object or field of the structure is false.

The target specification with which the tracker is initialized determines the state convention of the tracks.

Data Types: struct | object

All tracks, returned as an array of objectTrack objects in MATLAB or as an array of structures in code generation. In code generation, the field names of the returned structure are identical to the property names of objectTrack. allTracks consists of confirmed and tentative tracks.

The target specification with which the tracker is initialized determines the state convention of the tracks.

Data Types: struct | object

Object Functions

To use an object function, specify the System object as the first input argument.


step        Run System object algorithm
release     Release resources and allow changes to System object property values and input characteristics
reset       Reset internal states of System object
isLocked    Determine if System object is in use
clone       Create duplicate System object
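For example, this sketch applies the object functions to a previously created tracker:

```matlab
trackerCopy = clone(tracker); % duplicate the tracker and its current state
tf = isLocked(tracker);       % true after the tracker has been called
reset(tracker);               % clear all tracks and internal states
release(tracker);             % unlock so property values and input sizes can change
```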

Examples


High-Level Workflow

This flowchart provides a high-level workflow for using the target specifications, sensor specifications, and the task-oriented, multi-sensor, multi-target JIPDATracker.

Define and Configure Target Specifications

The toolbox provides out-of-the-box target specifications which you can use to configure the targets. This hierarchy shows the list of pre-built target specifications available in the toolbox as of R2024b.

You can create and configure target specifications using the trackerTargetSpec function. You can use tab-completion at each input to go through the available fields.

Define Target Specification

In this example, you track cars and trucks on a highway.

Create a car specification.

carSpec = trackerTargetSpec("automotive","car","highway-driving")
carSpec = 
  HighwayCar with properties:

        ReferenceFrame: 'ego'              
              MaxSpeed: 50           m/s   
       MaxAcceleration: 4            m/s²  
            MaxYawRate: 5            deg/s 
    MaxYawAcceleration: 20           deg/s²
             YawLimits: [-10 10]     deg   
          LengthLimits: [3.6 5.6]    m     
           WidthLimits: [1.7 2]      m     
          HeightLimits: [1.4 2]      m     

Create a truck specification.

truckSpec = trackerTargetSpec("automotive","truck","highway-driving")
truckSpec = 
  HighwayTruck with properties:

        ReferenceFrame: 'ego'              
              MaxSpeed: 40           m/s   
       MaxAcceleration: 3            m/s²  
            MaxYawRate: 4            deg/s 
    MaxYawAcceleration: 10           deg/s²
             YawLimits: [-10 10]     deg   
          LengthLimits: [16 22]      m     
           WidthLimits: [2 2.6]      m     
          HeightLimits: [3.5 4.2]    m     

Configure Target Specification for Application

This example uses a fixed, stationary global reference frame. Cars on the highway of interest typically move at a maximum speed of 80 mph and have a maximum yaw rate of 4.5 degrees per second. Use the default settings for trucks.

carSpec.ReferenceFrame = "global";
truckSpec.ReferenceFrame = "global";
carSpec.MaxSpeed = 80*0.44704; % 80 mph in m/s
carSpec.MaxYawRate = 4.5; % deg/s

Define and Configure Sensor Specifications

The toolbox also provides out-of-the-box sensor specifications which you can use to configure the sensors. This hierarchy shows the list of pre-built sensor specifications available in the toolbox as of R2024b.

Define Sensor Specification

You can create and configure sensor specifications using the trackerSensorSpec function. You can use tab-completion at each input to go through the available fields.

In this example, you track vehicles using a radar reporting clustered detections of objects and a camera reporting bounding boxes in the image space.

Create a camera specification.

cameraSpec = trackerSensorSpec('automotive','camera','bounding-boxes')
cameraSpec = 
  AutomotiveCameraBoxes with properties:

               ReferenceFrame: 'ego'                 
           MaxNumMeasurements: 64                    
             MountingLocation: [0 0 0]         m     
               MountingAngles: [0 1 0]         deg   
              EgoOriginHeight: 0.3             m     
                   Intrinsics: [3⨯3 double]          
                    ImageSize: [480 640]       pixels
                     MaxRange: 100             m     
               CenterAccuracy: 10              pixels
               HeightAccuracy: 10              pixels
                WidthAccuracy: 10              pixels
         DetectionProbability: 0.9                   
    NumFalsePositivesPerImage: 0.01                  
        NumNewTargetsPerImage: 0.01                  

Create a radar specification.

radarSpec = trackerSensorSpec('automotive','radar','clustered-points')
radarSpec = 
  AutomotiveRadarClusteredPoints with properties:

              ReferenceFrame: 'ego'         
          MaxNumMeasurements: 64            
            MountingLocation: [0 0 0]    m  
              MountingAngles: [0 0 0]    deg
                 FieldOfView: [60 20]    deg
                    MaxRange: 120        m  
                MaxRangeRate: 30         m/s
        DetectionProbability: 0.9           
    NumFalsePositivesPerScan: 1             
        NumNewTargetsPerScan: 1             
                HasElevation: 1             

Configure Sensors for Application

Configure the camera specification.

cameraSpec.MountingLocation = [3.7920 0 1.1];
cameraSpec.MountingAngles = [0 1 0];
cameraSpec.Intrinsics = [1814.81         0     320
                            0        1814.81   240
                            0         0         1];
cameraSpec.ImageSize = [480 900];
cameraSpec.MaxRange = 100;

Configure the radar specification.

radarSpec.ReferenceFrame = "global";
radarSpec.MountingLocation = [3.7290 0 0.2];
radarSpec.MountingAngles = [0 0 0];
radarSpec.FieldOfView = [20 5];
radarSpec.MaxRange = 100;
radarSpec.MaxRangeRate = 100;
radarSpec.DetectionProbability = 0.9;
radarSpec.NumFalsePositivesPerScan = 0.08;
radarSpec.NumNewTargetsPerScan = 0.08;

Configure Tracker

Use the multiSensorTargetTracker function to create a multi-sensor multi-object JIPDATracker.

tracker = multiSensorTargetTracker({carSpec truckSpec}, {cameraSpec radarSpec}, "jipda")
tracker = 
  fusion.tracker.JIPDATracker with properties:

                TargetSpecifications: {[1×1 HighwayCar]  [1×1 HighwayTruck]}
                SensorSpecifications: {[1×1 AutomotiveCameraBoxes]  [1×1 AutomotiveRadarClusteredPoints]}
              MaxMahalanobisDistance: 5
    ConfirmationExistenceProbability: 0.9000
        DeletionExistenceProbability: 0.1000

Understand Sensor Data Format

Use the dataFormat function to determine the format of inputs required by the tracker.

The camera requires time-stamped bounding boxes in the image space.

cameraData = dataFormat(cameraSpec)
cameraData = struct with fields:
           Time: 0
    BoundingBox: [4×64 double]

The radar requires the positions (azimuth, elevation, and range) and range rates of the clusters in radar coordinates, along with the position, velocity, and orientation of the ego vehicle at the time at which the radar observed the clusters.

radarData = dataFormat(radarSpec)
radarData = struct with fields:
                  Time: 0
               Azimuth: [1×64 double]
             Elevation: [1×64 double]
                 Range: [1×64 double]
             RangeRate: [1×64 double]
       AzimuthAccuracy: [1×64 double]
     ElevationAccuracy: [1×64 double]
         RangeAccuracy: [1×64 double]
     RangeRateAccuracy: [1×64 double]
           EgoPosition: [0 0 0]
           EgoVelocity: [0 0 0]
    EgoAngularVelocity: [0 0 0]
        EgoOrientation: [0 0 0]
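Before calling the tracker, fill the returned format structures with measured values. This sketch uses illustrative numbers; the bounding-box convention and units shown in the comments are assumptions for illustration.

```matlab
cameraData.Time = 0.1;                            % s
cameraData.BoundingBox(:,1) = [250; 200; 80; 60]; % one box, in pixels
radarData.Time = 0.1;        % s
radarData.Azimuth(1) = 2.5;  % deg
radarData.Range(1) = 45;     % m
radarData.RangeRate(1) = -3; % m/s
```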

Continuously Update Tracker

In this section, you iterate through the recorded sensor data and continuously update the tracker.

% Create a viewer for visualizing data 
viewer = createTrackingViewer(cameraSpec, radarSpec);

% Load recorded data
load('RecordedSensorData.mat','radarDataLog','cameraDataLog','timestamps');

for i = 1:numel(timestamps)
    % Captured camera and radar data
    cameraData = cameraDataLog(i);
    radarData = radarDataLog(i);    
    % Update tracker
    [tracks,tentativeTracks,allTracks] = tracker(cameraData, radarData);
    % Update visualization
    updateViewer(viewer, radarSpec, radarData, tracks)
end

The figure shows a bird's-eye view plot with X (m) and Y (m) axes, displaying the tracks with their history, the radar clusters, and the radar and camera coverage areas.

Supporting Functions

function viewer = createTrackingViewer(cameraSpec, radarSpec)

% Colors
clrs = lines(7);

% Create theater plot in the panel
tp = theaterPlot('XLimits',[0 105],...
    'YLimits',[-25 25],...
    'ZLimits',[-25 25]);

view(tp.Parent,-90,90);

% Track plotter
trp = trackPlotter(tp,'DisplayName','Tracks','ConnectHistory','off',...
    'ColorizeHistory','off','MarkerFaceColor',clrs(1,:),...
    'HistoryDepth',10,'FontSize',10,'LabelOffset',[2 0 0]);

% Radar detection plotter
dp = detectionPlotter(tp,'DisplayName','Radar Clusters','MarkerFaceColor',clrs(2,:));

% Plot radar coverage
radarCoverage = struct('Index',1,...
    'LookAngle',[0 0],...
    'FieldOfView',radarSpec.FieldOfView,...
    'ScanLimits',[-radarSpec.FieldOfView(1)/2 radarSpec.FieldOfView(1)/2],...
    'Range',radarSpec.MaxRange,...
    'Position',radarSpec.MountingLocation,...
    'Orientation',radarSpec.MountingAngles);
cp = coveragePlotter(tp,'DisplayName','Radar Coverage','Alpha',[0.1 0.1]);
cp.plotCoverage(radarCoverage);

% Plot camera coverage
cameraCoverage = struct('Index',2,...
    'LookAngle',[0 0],...
    'FieldOfView',[27.8523 15.0668],...
    'ScanLimits',[0 0],...
    'Range',cameraSpec.MaxRange,...
    'Position',cameraSpec.MountingLocation,...
    'Orientation',cameraSpec.MountingAngles);
cp = coveragePlotter(tp,'DisplayName','Camera Coverage','Alpha',[0.1 0.1]);
cp.plotCoverage(cameraCoverage);

% Assemble viewer as a struct
viewer = struct;
viewer.TheaterPlot = tp;
viewer.TrackPlotter = trp;
viewer.DetectionPlotter = dp;

end
function updateViewer(viewer, radarSpec, radarData, tracks)

% Plot track data on the bird's-eye view
[pos, posCov] = getTrackPositions(tracks, 'ctrv');
vel = getTrackVelocities(tracks, 'ctrv');
egoPosition = radarData.EgoPosition;
egoOrient = rotmat(quaternion(flip(radarData.EgoOrientation),'eulerd','ZYX','frame'),'frame');
pos = (pos - egoPosition)*egoOrient';
dimensions = arrayfun(@(x)...
    struct('Length',x.State(8),...
    'Width',x.State(9),...
    'Height',x.State(10),...
    'OriginOffset',[0 0 0]),tracks);
trkYaw = arrayfun(@(x)x.State(4),tracks) - radarData.EgoOrientation(3);
orient = quaternion([trkYaw zeros(numel(tracks),2)],'eulerd','ZYX','frame');
labels = string([tracks.TrackID]);
viewer.TrackPlotter.plotTrack(pos, posCov, vel, dimensions, orient, labels);

% Plot radar detections in ego vehicle coordinates
[x, y, z] = sph2cart(deg2rad(radarData.Azimuth), deg2rad(radarData.Elevation), radarData.Range);
pos = [x;y;z];
Rsensor = rotmat(quaternion([radarSpec.MountingAngles],'eulerd','ZYX','frame'),'frame');
pos = Rsensor'*pos + radarSpec.MountingLocation(:);
viewer.DetectionPlotter.plotDetection(pos');

end


Version History

Introduced in R2024b