trackingSensorConfiguration
Represent sensor configuration for tracking
Description
The trackingSensorConfiguration
object creates the configuration
for a sensor used with a trackerPHD
System object™ or a trackerGridRFS
System object. You can use the trackingSensorConfiguration
object to specify sensor
parameters such as clutter density, sensor limits, and sensor resolution. You can also specify
how a tracker perceives the detections from the sensor using properties such as
FilterInitializationFcn
, SensorTransformFcn
, and
SensorTransformParameters
. See Create a Tracking Sensor Configuration for more details.
When used with a trackerPHD
System object, the trackingSensorConfiguration
object enables the tracker to perform four main operations:
Evaluate the probability of detection at points in state-space.
Compute the expected number of detections from a target.
Initiate components in the probability hypothesis density.
Obtain the clutter density of the sensor.
When used with a trackerGridRFS
System object, the trackingSensorConfiguration
object assists the tracker in projecting sensor
data onto a 2-D grid. The tracker uses the SensorTransformParameters
property
to calculate the location and orientation of the sensor in the tracking coordinate frame. The
tracker uses the SensorLimits
property to calculate the field of view and
the maximum range of the sensor. The SensorTransformFcn
and
FilterInitializationFcn
properties are not relevant for the
trackerGridRFS
System object.
Creation
Syntax
config = trackingSensorConfiguration(sensorIndex)
config = trackingSensorConfiguration(sensor)
config = trackingSensorConfiguration(sensor,platformPose)
config = trackingSensorConfiguration(sensorBlock)
config = trackingSensorConfiguration(sensorBlock,platformPose)
configs = trackingSensorConfiguration(platform)
configs = trackingSensorConfiguration(scenario)
___ = trackingSensorConfiguration(___,Name,Value)
Description
config = trackingSensorConfiguration(sensorIndex) returns a trackingSensorConfiguration object with a specified sensor index, sensorIndex, and default property values.
config = trackingSensorConfiguration(sensor) returns a trackingSensorConfiguration object based on a sensor object, sensor.
config = trackingSensorConfiguration(sensor,platformPose) additionally specifies the pose of the sensor mounting platform. In this case, the SensorTransformParameters property includes the coordinate transform information from the scenario frame to the sensor platform frame.
config = trackingSensorConfiguration(sensorBlock) returns a trackingSensorConfiguration object based on a sensor block in Simulink.
config = trackingSensorConfiguration(sensorBlock,platformPose) additionally specifies the pose of the sensor mounting platform. In this case, the SensorTransformParameters property includes the coordinate transform information from the scenario frame to the sensor platform frame.
configs = trackingSensorConfiguration(platform) returns a cell array of trackingSensorConfiguration objects, one for each sensor mounted on the Platform object, platform.
configs = trackingSensorConfiguration(scenario) returns a cell array of trackingSensorConfiguration objects, one for each sensor in the trackingScenario object, scenario.
___ = trackingSensorConfiguration(___,Name,Value) sets properties using one or more name-value arguments. Use this syntax with any of the previous syntaxes.
Inputs
sensorIndex
— Sensor Index
positive integer
Unique sensor index, specified as a positive integer.
sensor
— Sensor object
sensor object
Sensor object, specified as one of these objects:
- fusionRadarSensor
- monostaticLidarSensor
- radarDataGenerator (Radar Toolbox)
- drivingRadarDataGenerator (Automated Driving Toolbox)
- lidarPointCloudGenerator (Automated Driving Toolbox)
- visionDetectionGenerator (Automated Driving Toolbox)
platformPose
— Platform pose information
structure
Platform pose information, specified as a structure. The structure has these fields.
| Field Name | Description |
| --- | --- |
| Position | Position of the platform with respect to the scenario frame, specified as a three-element vector. |
| Velocity | Velocity of the platform with respect to the scenario frame, specified as a three-element vector. |
| Orientation | Orientation of the platform frame with respect to the scenario frame, specified as a 3-by-3 rotation matrix or a quaternion. |
Alternatively, you can specify the structure using these fields.
| Field Name | Description |
| --- | --- |
| Position | Position of the platform with respect to the scenario frame, specified as a three-element vector. |
| Velocity | Velocity of the platform with respect to the scenario frame, specified as a three-element vector. |
| Yaw | Yaw angle of the platform frame with respect to the scenario frame, specified as a scalar in degrees. The yaw angle corresponds to the z-axis rotation. |
| Pitch | Pitch angle of the platform frame with respect to the scenario frame, specified as a scalar in degrees. The pitch angle corresponds to the y-axis rotation. |
| Roll | Roll angle of the platform frame with respect to the scenario frame, specified as a scalar in degrees. The roll angle corresponds to the x-axis rotation. |
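For example, you can construct a pose structure using the angle-based fields as follows. The position, velocity, and angle values here are illustrative:

```matlab
% Platform pose with orientation given as yaw-pitch-roll angles.
% All numeric values are illustrative.
platformPose = struct( ...
    "Position",[100 30 0], ... % platform position in the scenario frame
    "Velocity",[-5 4 0], ...   % platform velocity in the scenario frame
    "Yaw",45, ...              % z-axis rotation, in degrees
    "Pitch",0, ...             % y-axis rotation, in degrees
    "Roll",0);                 % x-axis rotation, in degrees
```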
sensorBlock
— Simulink sensor block
handle of a valid Simulink sensor block | path of a valid Simulink sensor block
Simulink sensor block, specified as the handle or the path of a valid Simulink sensor block. A valid sensor block is one of these Simulink blocks:
Driving Radar Data Generator (Automated Driving Toolbox)
Radar Data Generator (Radar Toolbox)
Vision Detection Generator (Automated Driving Toolbox)
Lidar Point Cloud Generator (Automated Driving Toolbox)
platform
— Platform
Platform
object
Platform, specified as a Platform
object.
scenario
— Tracking scenario
trackingScenario
object
Tracking scenario, specified as a trackingScenario
object.
Outputs
config
— Tracking sensor configuration
trackingSensorConfiguration
object
Tracking sensor configuration, returned as a
trackingSensorConfiguration
object.
configs
— Tracking sensor configurations
N-element cell array of
trackingSensorConfiguration
objects
Tracking sensor configurations, returned as an N-element
cell array of trackingSensorConfiguration
objects.
N is the number of sensors on the platform or in the
scenario.
Properties
SensorIndex
— Unique sensor identifier
positive integer
Unique sensor identifier, specified as a positive integer. This property distinguishes data that come from different sensors in a multi-sensor system.
Note
If you specify the platform
or
scenario
input argument, the object ignores the name-value input argument for this property.
Example: 2
Data Types: double
IsValidTime
— Indicate detection reporting status
false
(default) | true
Indicate the detection reporting status of the sensor, specified as false or true. Set this property to true when the sensor must report detections within its sensor limits to the tracker. If the sensor was expected to detect a track or target but reported no detections, the tracker counts this evidence against the probability of existence of the track when the IsValidTime property is set to true.
Data Types: logical
FilterInitializationFcn
— Filter initialization function
@initcvggiwphd
(default) | function handle | string scalar
Filter initialization function, specified as a function handle or as a string scalar
containing the name of a valid filter initialization function. The function initializes
the PHD filter used by trackerPHD
.
The function must support the following syntaxes:
filter = filterInitializationFcn()
filter = filterInitializationFcn(detections)
filter
is a valid PHD filter with components for new-born targets, and
detections
is a cell array of objectDetection
objects. The first syntax allows you to specify the
predictive birth density in the PHD filter without using detections. The second syntax
allows the filter to initialize the adaptive birth density using detection information.
See the BirthRate property of
trackerPHD
for more details.
If you create your own FilterInitializationFcn
, you must also
provide a transform function using the SensorTransformFcn
property.
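As an illustration, a custom filter initialization function that supports both required syntaxes can wrap a built-in initialization function such as initcvgmphd. The function name myInitFilterFcn is a placeholder:

```matlab
function filter = myInitFilterFcn(varargin)
% Sketch of a custom FilterInitializationFcn supporting both syntaxes.
% myInitFilterFcn is a placeholder name; initcvgmphd is the built-in
% constant-velocity GM-PHD initialization function.
if nargin == 0
    filter = initcvgmphd;             % predictive birth density, no detections
else
    detections = varargin{1};         % cell array of objectDetection objects
    filter = initcvgmphd(detections); % adaptive birth density from detections
end
end
```

You can then pass the function handle when constructing the configuration, for example with FilterInitializationFcn=@myInitFilterFcn.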
In addition to the default filter initialization function initcvggiwphd, Sensor Fusion and Tracking Toolbox™ provides other initialization functions, such as initctrectgmphd, initctgmphd, initcvgmphd, initcagmphd, initctggiwphd, and initcaggiwphd.
Data Types: function_handle
| char
SensorTransformFcn
— Sensor transform function
@cvmeas
| function handle | character vector
Sensor transform function, specified as a function handle or as a character vector containing the name of a valid sensor transform function. The function transforms a track's state into the sensor's detection state. For example, the function transforms the track's state in the scenario Cartesian frame to the sensor's spherical frame. You can create your own sensor transform function, but it must support this syntax:
detStates = SensorTransformFcn(trackStates,params)
params
are the parameters stored in the SensorTransformParameters
property. Notice that the signature of the function is similar to a measurement
function. Therefore, you can use a measurement function (such as cvmeas
, ctmeas
, or cameas
) as the SensorTransformFcn
.
Depending on the filter type and the target type, the output detStates must be returned differently.

- When you use the object with gmphd for non-extended targets or with ggiwphd, detStates is an N-by-M matrix, where N is the number of rows in the SensorLimits property and M is the number of input states in trackStates.
- When you use the object with gmphd for extended targets, the SensorTransformFcn allows you to specify multiple detStates per trackState. In this case, detStates is an N-by-M-by-S matrix, where S is the number of detectable sources on the extended target. For example, if the target is described by a rectangular state, the detectable sources can be the corners of the rectangle. If any of the sources falls inside the SensorLimits, the target is declared detectable. The function uses the spread (maximum value − minimum value) of each detStates and the ratio between the spread and the sensor resolution on each sensor limit to calculate the expected number of detections from each extended target. You can override this default behavior by returning an optional second output from the SensorTransformFcn:

[..., Nexp] = SensorTransformFcn(trackStates, params)

where Nexp is the expected number of detections from each extended track state.
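As an illustration, the following sketch returns azimuth and range detStates for the four corners of a rectangular target. The assumed state layout [x; y; speed; theta; omega; length; width] matches the rectangular target state used by initctrectgmphd, but the corner-based logic and the simplification that the sensor sits at the scenario origin (ignoring params) are assumptions of this sketch:

```matlab
function detStates = rectangleSensorTransformFcn(trackStates, params) %#ok<INUSD>
% Illustrative SensorTransformFcn for a rectangular extended target.
% Assumes the state layout [x; y; speed; theta(deg); omega; L; W] and,
% for simplicity, a sensor located at the scenario origin.
M = size(trackStates, 2);
detStates = zeros(2, M, 4);   % N = 2 limits (azimuth, range), S = 4 corners
for i = 1:M
    x = trackStates(1,i); y = trackStates(2,i);
    theta = deg2rad(trackStates(4,i));
    L = trackStates(6,i); W = trackStates(7,i);
    corners = [ L/2  L/2 -L/2 -L/2;   % corner offsets in the body frame
                W/2 -W/2 -W/2  W/2];
    R = [cos(theta) -sin(theta); sin(theta) cos(theta)];
    world = R*corners + [x; y];       % corners in the tracking frame
    for s = 1:4
        detStates(1,i,s) = atan2d(world(2,s), world(1,s)); % azimuth (deg)
        detStates(2,i,s) = hypot(world(1,s), world(2,s));  % range (m)
    end
end
end
```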
The default SensorTransformFcn is the sensor transform function of the filter returned by FilterInitializationFcn. For example, the initcvggiwphd function returns the default cvmeas, whereas the initctggiwphd and initcaggiwphd functions return ctmeas and cameas, respectively.
Data Types: function_handle
| char
SensorTransformParameters
— Parameters for sensor transform function
structure | array of structures
Parameters for the sensor transform function, specified as a structure or an array of structures. If you need to transform the state only once, specify it as a structure. If you need to transform the state multiple times, specify it as an N-by-1 array of structures. For example, to transform a state from the scenario frame to the sensor frame, you usually need to first transform the state from the scenario rectangular frame to the platform rectangular frame, and then transform the state from the platform rectangular frame to the sensor spherical frame. The structure contains these fields.
| Field | Description |
| --- | --- |
| Frame | Child coordinate frame type, specified as 'Rectangular' or 'Spherical'. |
| OriginPosition | Child frame origin position expressed in the parent frame, specified as a 3-by-1 vector. |
| OriginVelocity | Child frame origin velocity expressed in the parent frame, specified as a 3-by-1 vector. |
| Orientation | Relative orientation between frames, specified as a 3-by-3 rotation matrix. If you set the IsParentToChild field to true, the matrix represents the frame rotation from the parent frame to the child frame. Otherwise, it represents the frame rotation from the child frame to the parent frame. |
| IsParentToChild | Flag to indicate the direction of rotation between the parent and child frames, specified as true or false. |
| HasAzimuth | Indicates whether outputs contain azimuth components, specified as true or false. |
| HasElevation | Indicates whether outputs contain elevation components, specified as true or false. |
| HasRange | Indicates whether outputs contain range components, specified as true or false. |
| HasVelocity | Indicates whether outputs contain velocity components, specified as true or false. |
The scenario frame is the parent frame of the platform frame, and the platform frame is the parent frame of the sensor frame.
The default value for SensorTransformParameters is a 2-by-1 array of structures.
| Fields | Struct 1 | Struct 2 |
| --- | --- | --- |
| Frame | 'Spherical' | 'Rectangular' |
| OriginPosition | [0;0;0] | [0;0;0] |
| OriginVelocity | [0;0;0] | [0;0;0] |
| Orientation | eye(3) | eye(3) |
| IsParentToChild | false | false |
| HasAzimuth | true | true |
| HasElevation | true | true |
| HasRange | true | true |
| HasVelocity | false | true |
In this table, Struct 2 accounts for the transformation from the scenario rectangular frame to the platform rectangular frame, and Struct 1 accounts for the transformation from the platform rectangular frame to the sensor spherical frame, given that the IsParentToChild field is set to false.
Note
If you use a custom sensor transformation function in the
SensorTransformFcn
property, you can specify this property in
any format as long as the sensor transformation function accepts it.
Note
If you specify the platform
or
scenario
input argument, the object ignores the name-value input argument for this property.
Data Types: struct
SensorLimits
— Sensor's detection limits
3-by-2 matrix (default) | N-by-2 matrix
Sensor's detection limits, specified as an N-by-2 matrix, where N is the output dimension of the sensor transform function. The matrix must describe the lower and upper detection limits of the sensor in the same order as the outputs of the sensor transform function.
If you use cvmeas, cameas, or ctmeas as the sensor transform function, then you must provide the sensor limits in this order:

SensorLimits = [minAz maxAz; minEl maxEl; minRng maxRng; minRr maxRr]

This table describes the limits and their default values. The default value for SensorLimits is a 3-by-2 matrix containing the first six values in the table. If you use one of these three functions, you can also specify a matrix of a different size (1-by-2, 2-by-2, or 4-by-2), but you must specify the limits in the same sequence.
| Limits | Description | Default values |
| --- | --- | --- |
| minAz | Minimum detectable azimuth in degrees. | -10 |
| maxAz | Maximum detectable azimuth in degrees. | 10 |
| minEl | Minimum detectable elevation in degrees. | -2.5 |
| maxEl | Maximum detectable elevation in degrees. | 2.5 |
| minRng | Minimum detectable range in meters. | 0 |
| maxRng | Maximum detectable range in meters. | 1000 |
| minRr | Minimum detectable range rate in meters per second. | N/A |
| maxRr | Maximum detectable range rate in meters per second. | N/A |
Note
If you specify the platform
or
scenario
input argument, the object ignores the name-value input argument for this property.
Data Types: double
SensorResolution
— Resolution of sensor
[4;2;10]
(default) | N-element positive-valued vector
Resolution of the sensor, specified as an N-element positive-valued vector, where N is the number of parameters specified in the SensorLimits property. To assign only one resolution cell to a parameter, specify its resolution as the difference between the maximum limit and the minimum limit of the parameter.
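For example, given elevation limits of [-2.5 2.5] degrees, you can assign a single resolution cell to elevation by setting its resolution to the full 5-degree span. The azimuth and range values here are illustrative:

```matlab
sensorLimits = [-10 10; -2.5 2.5; 0 1000];      % [az; el; range] limits
elSpan = sensorLimits(2,2) - sensorLimits(2,1); % 5 degrees: one cell spans all elevations
sensorResolution = [4; elSpan; 10];             % elevation now has a single resolution cell
```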
Note
If you specify the platform
or
scenario
input argument, the object ignores the name-value input argument for this property.
Data Types: double
MaxNumDetections
— Maximum number of detections
Inf
(default) | positive integer
Maximum number of detections that the sensor can report, specified as Inf or a positive integer.
Data Types: double
MaxNumDetsPerObject
— Maximum number of detections per object
Inf
(default) | positive integer
Maximum number of detections that the sensor can report per object, specified as Inf or a positive integer.
Data Types: double
ClutterDensity
— Expected number of false alarms per unit volume
1e-3
(default) | positive scalar
Expected number of false alarms per unit volume from the sensor, specified as a positive scalar.
Data Types: double
DetectionProbability
— Probability of detecting target inside the coverage limits
0.9
(default) | scalar in the range (0,1]
Probability of detecting a target inside the coverage limits, specified as a scalar in the range (0, 1].
Example: 0.75
Data Types: single
| double
MinDetectionProbability
— Probability of detecting track estimated to be outside of sensor limits
0.05
(default) | positive scalar
Probability of detecting a target estimated to be outside of the sensor limits, specified as a positive scalar. This property enables a trackerPHD object to account for the possibility that a target estimated to be outside the sensor limits is still detectable.
Note
If you specify the platform
or
scenario
input argument, the object ignores the name-value input argument for this property.
Example: 0.03
Data Types: double
Examples
Create Radar Sensor Configuration
Consider a radar with the following sensor limits and sensor resolution.
azLimits = [-10 10];
elLimits = [-2.5 2.5];
rangeLimits = [0 500];
rangeRateLimits = [-50 50];
sensorLimits = [azLimits; elLimits; rangeLimits; rangeRateLimits];
sensorResolution = [5 2 10 3];
Specify the sensor transform function that transforms the Cartesian coordinates [x;y;vx;vy] in the scenario frame to spherical coordinates [az;el;range;rr] in the sensor's frame. Use the measurement function cvmeas
as the sensor transform function.
transformFcn = @cvmeas;
To specify the parameters required for cvmeas
, use the SensorTransformParameters
property. Here, you assume the sensor is mounted at the center of the platform and the platform located at [100;30;20] is moving with a velocity of [-5;4;2] units per second in the scenario frame.
The first structure defines the sensor's location, velocity, and orientation in the platform frame.
params(1) = struct("Frame","Spherical", ...
    "OriginPosition",[0;0;0], ...
    "OriginVelocity",[0;0;0], ...
    "Orientation",eye(3), ...
    "HasRange",true, ...
    "HasVelocity",true);
The second structure defines the platform location, velocity, and orientation in the scenario frame.
params(2) = struct("Frame","Rectangular", ...
    "OriginPosition",[100;30;20], ...
    "OriginVelocity",[-5;4;2], ...
    "Orientation",eye(3), ...
    "HasRange",true, ...
    "HasVelocity",true);
Create the configuration.
config = trackingSensorConfiguration(SensorIndex=3, ...
    SensorLimits=sensorLimits, ...
    SensorResolution=sensorResolution, ...
    SensorTransformParameters=params, ...
    SensorTransformFcn=@cvmeas, ...
    FilterInitializationFcn=@initcvggiwphd)
config = 
  trackingSensorConfiguration with properties:

                  SensorIndex: 3
                  IsValidTime: 0
                 SensorLimits: [4x2 double]
             SensorResolution: [4x1 double]
           SensorTransformFcn: @cvmeas
    SensorTransformParameters: [1x2 struct]
      FilterInitializationFcn: @initcvggiwphd
             MaxNumDetections: Inf
          MaxNumDetsPerObject: Inf
               ClutterDensity: 1.0000e-03
         DetectionProbability: 0.9000
      MinDetectionProbability: 0.0500
Create Tracking Sensor Configuration for fusionRadarSensor
Create a fusionRadarSensor
object and specify its properties.
sensor = fusionRadarSensor(1, ...
    FieldOfView=[20 5], ...
    RangeLimits=[0 500], ...
    HasRangeRate=true, ...
    HasElevation=true, ...
    RangeRateLimits=[-50 50], ...
    AzimuthResolution=5, ...
    RangeResolution=10, ...
    ElevationResolution=2, ...
    RangeRateResolution=3);
Specify the cvmeas
function as the sensor transform function.
transformFcn = @cvmeas;
Create a trackingSensorConfiguration
object.
config = trackingSensorConfiguration(sensor,SensorTransformFcn=transformFcn)
config = 
  trackingSensorConfiguration with properties:

                  SensorIndex: 1
                  IsValidTime: 0
                 SensorLimits: [4x2 double]
             SensorResolution: [4x1 double]
           SensorTransformFcn: @cvmeas
    SensorTransformParameters: [2x1 struct]
      FilterInitializationFcn: []
             MaxNumDetections: Inf
          MaxNumDetsPerObject: 1
               ClutterDensity: 1.0485e-07
         DetectionProbability: 0.9000
      MinDetectionProbability: 0.0500
Create trackingSensorConfiguration
Using Sensor and Platform Pose Inputs
Create a monostatic lidar sensor object.
sensor = monostaticLidarSensor(1);
Define the pose of the sensor platform with respect to the scenario frame.
platformPose = struct("Position",[10 -10 0], ...
    "Velocity",[1 1 0], ...
    "Orientation",eye(3));
Create a trackingSensorConfiguration
object based on the sensor and the platform pose input.
config = trackingSensorConfiguration(sensor,platformPose)
config = 
  trackingSensorConfiguration with properties:

                  SensorIndex: 1
                  IsValidTime: 0
                 SensorLimits: [3x2 double]
             SensorResolution: [3x1 double]
           SensorTransformFcn: []
    SensorTransformParameters: [2x1 struct]
      FilterInitializationFcn: []
             MaxNumDetections: Inf
          MaxNumDetsPerObject: Inf
               ClutterDensity: 1.0000e-03
         DetectionProbability: 0.9000
      MinDetectionProbability: 0.0500
Create trackingSensorConfiguration
Using Platform Input
Create a trackingScenario
object and add a platform.
scene = trackingScenario;
plat = platform(scene);
Add two sensors to the platform.
plat.Sensors = {fusionRadarSensor(1);monostaticLidarSensor(2)};
Create trackingSensorConfiguration
objects using the platform.
configs = trackingSensorConfiguration(plat)
configs=2×1 cell array
{1x1 trackingSensorConfiguration}
{1x1 trackingSensorConfiguration}
Create trackingSensorConfiguration
from Simulink Block
Open a saved Simulink model that contains a Fusion Radar Sensor block.
open_system("sensorModel");
Get the path and handle of the block.
blockPath = getfullname(gcb);
blockHandle = getSimulinkBlockHandle(blockPath);
Create a trackingSensorConfiguration
object based on the block path.
tscByBlockPath = trackingSensorConfiguration(blockPath)
tscByBlockPath = 
  trackingSensorConfiguration with properties:

                  SensorIndex: 1
                  IsValidTime: 0
                 SensorLimits: [2x2 double]
             SensorResolution: [2x1 double]
           SensorTransformFcn: []
    SensorTransformParameters: [2x1 struct]
      FilterInitializationFcn: []
             MaxNumDetections: 100
          MaxNumDetsPerObject: 1
               ClutterDensity: 5.2536e-13
         DetectionProbability: 0.9000
      MinDetectionProbability: 0.0500
Create a trackingSensorConfiguration
object based on the block handle.
tscByBlockHandle = trackingSensorConfiguration(blockHandle)
tscByBlockHandle = 
  trackingSensorConfiguration with properties:

                  SensorIndex: 1
                  IsValidTime: 0
                 SensorLimits: [2x2 double]
             SensorResolution: [2x1 double]
           SensorTransformFcn: []
    SensorTransformParameters: [2x1 struct]
      FilterInitializationFcn: []
             MaxNumDetections: 100
          MaxNumDetsPerObject: 1
               ClutterDensity: 5.2536e-13
         DetectionProbability: 0.9000
      MinDetectionProbability: 0.0500
Create trackingSensorConfiguration
Using Scenario Input
Create a trackingScenario
object and add two platforms.
scene = trackingScenario;
plat1 = platform(scene);
plat2 = platform(scene);
Add two sensors to the first platform and one sensor to the second platform.
plat1.Sensors = {fusionRadarSensor(1);monostaticLidarSensor(2)};
plat2.Sensors = {fusionRadarSensor(3)};
Create trackingSensorConfiguration
objects using the scenario.
configs = trackingSensorConfiguration(scene)
configs=3×1 cell array
{1x1 trackingSensorConfiguration}
{1x1 trackingSensorConfiguration}
{1x1 trackingSensorConfiguration}
More About
Create a Tracking Sensor Configuration
To create the configuration for a sensor, you first need to specify the sensor transform function, which is usually given as:

Y = f(x, p)

where x denotes the tracking state, Y denotes the detection states, and p denotes the required parameters. For object tracking applications, you mainly focus on obtaining an
object's tracking state. For example, a radar sensor can measure an object's azimuth,
elevation, range, and possibly range-rate. Using a trackingSensorConfiguration
object, you
can specify a radar's transform function using the SensorTransformFcn
property and specify the radar's mounting location, orientation, and velocity using
corresponding fields in the SensorTransformParameters
property. If the
object is moving at a constant velocity, constant acceleration, or constant turning, you can
use the built-in measurement function – cvmeas
, cameas
, or ctmeas
, respectively – as the SensorTransformFcn
. To set
up the exact outputs of these three functions, specify the HasAzimuth, HasElevation, HasRange, and HasVelocity fields as true or false in the SensorTransformParameters property.
To set up the configuration of a sensor, you also need to specify the sensor's detection
ability. Primarily, you need to specify the sensor's detection limits. For all the outputs
of the sensor transform function, you need to provide the detection limits in the same order as these outputs using the SensorLimits
property. For example, for a
radar sensor, you might need to provide its azimuth, elevation, range, and range-rate
limits. You can also specify the radar's SensorResolution
and
MaxNumDetsPerObject
properties if you want to consider extended
object detection. You can also specify other properties, such as ClutterDensity, IsValidTime, and MinDetectionProbability, to further describe the sensor's detection ability.
Extended Capabilities
C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.
Version History
Introduced in R2019a

R2022b: Create tracking sensor configuration from a sensor, platform, or tracking scenario

You can now create trackingSensorConfiguration objects, which are required by the trackerPHD and trackerGridRFS System objects, from a sensor, a platform, or a tracking scenario. Specifically, you can:

- Generate a trackingSensorConfiguration object directly based on one of these sensor objects:
  - lidarPointCloudGenerator (Automated Driving Toolbox)
  - visionDetectionGenerator (Automated Driving Toolbox)
  Previously, you could directly generate a trackingSensorConfiguration object from the fusionRadarSensor, radarDataGenerator (Radar Toolbox), or drivingRadarDataGenerator (Automated Driving Toolbox) object.
- Generate a trackingSensorConfiguration object based on one of these Simulink blocks:
  - Driving Radar Data Generator (Automated Driving Toolbox)
  - Radar Data Generator (Radar Toolbox)
  - Vision Detection Generator (Automated Driving Toolbox)
  - Lidar Point Cloud Generator (Automated Driving Toolbox)
- Generate trackingSensorConfiguration objects for all sensors mounted on a Platform object.
- Specify a platform pose input representing the coordinate transformation from the scenario frame to the sensor frame when you create the trackingSensorConfiguration object.
- Generate trackingSensorConfiguration objects for all sensors in a trackingScenario object.

The trackingSensorConfiguration object has a new property, MaxNumDetections. Use this property to set the maximum number of detections reported by a specific sensor.
See Also
trackerPHD | trackerGridRFS | ggiwphd | cvmeas | cameas | ctmeas