
AutomotiveCameraBoxes

Sensor specification for vehicle-mounted camera that reports images with 2-D bounding boxes

Since R2024b

    Description

    An AutomotiveCameraBoxes object contains a sensor specification for a vehicle-mounted camera that has a built-in box detector and reports images with 2-D bounding boxes. You can use the AutomotiveCameraBoxes object as an input to multiSensorTargetTracker.

    Creation

    To create an AutomotiveCameraBoxes object, use the trackerSensorSpec function with the "automotive", "camera", and "bounding-boxes" input arguments. For example:

    spec = trackerSensorSpec("automotive","camera","bounding-boxes")

    Properties


    Reference coordinate frame, specified as "ego" or "global".

    When you specify this property as

    "ego" — The origin of the coordinate system is fixed to an ego vehicle, and the position and orientation of the target are modeled with respect to this ego vehicle. To account for a moving ego vehicle, the tracker applies ego motion compensation; you must provide this ego motion information as the last input to the tracker. See the More About section and the dataFormat function for more information.

    "global" — The origin of the coordinate system is fixed to a stationary global reference frame, and the position and orientation of the target are modeled with respect to this stationary frame. See the More About section and the dataFormat function for more information on how to provide the ego pose.

    Tip

    Choose the "global" reference frame if either of the following is true:

    • All sensors are mounted on the same ego vehicle, and you know the position and orientation of that ego vehicle in some global reference frame. Additionally, you want the tracks described in that same global reference frame.

    • Sensors are mounted on different ego vehicles, which can be either stationary or moving.

    You may need to change the YawLimits property of your target specification to account for the yaw of the ego vehicle in the global reference frame. For example:

    targetSpec.YawLimits = egoYaw + [-10 10];
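
    The setup described in this tip can be sketched as follows, using the ReferenceFrame and YawLimits properties shown on this page; the egoYaw value is illustrative, not taken from this documentation:

```matlab
% Use a global reference frame on both the sensor and the target specification.
cameraSpec = trackerSensorSpec("automotive","camera","bounding-boxes");
cameraSpec.ReferenceFrame = "global";
carSpec = trackerTargetSpec("automotive","car","highway-driving");
carSpec.ReferenceFrame = "global";     % same frame on every specification

egoYaw = 45;                           % ego yaw in the global frame, degrees (illustrative)
carSpec.YawLimits = egoYaw + [-10 10]; % shift the yaw limits by the ego yaw
```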

    Note

    You must use the same ReferenceFrame value in every target specification and sensor specification that you use to initialize a tracker.

    Example: "global"

    Data Types: char | string

    Maximum number of measurements the sensor can detect in one scan, specified as a positive integer.

    Example: 20

    Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64

    Sensor location on the ego vehicle, specified as a 1-by-3 real-valued vector of the form [x y z]. This property defines the coordinates of the sensor with respect to the ego vehicle origin. The default value, [0 0 0], places the sensor origin at the origin of the ego vehicle. Units are in meters.

    Example: [1.25 -0.1 0.8]

    Data Types: single | double

    Orientation of the sensor with respect to the ego vehicle, specified as a 1-by-3 real-valued vector of the form [yaw pitch roll]. Each element of the vector corresponds to an intrinsic Euler angle rotation that carries the body axes of the ego vehicle to the sensor axes. The three elements describe rotations around the z-, y-, and x-axes, sequentially. The default value, [0 1 0], reflects a positive pitch of 1 degree in the mounting angle. Units are in degrees.

    Data Types: single | double
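
    The intrinsic z-y-x rotation convention described above can be sketched in plain MATLAB; this is an illustration of the convention, not part of the AutomotiveCameraBoxes API:

```matlab
% Intrinsic z-y-x (yaw-pitch-roll) rotation, angles in degrees.
angles = [0 1 0];   % [yaw pitch roll], the default MountingAngles
Rz = @(a) [cosd(a) -sind(a) 0; sind(a) cosd(a) 0; 0 0 1];  % yaw about z
Ry = @(a) [cosd(a) 0 sind(a); 0 1 0; -sind(a) 0 cosd(a)];  % pitch about y
Rx = @(a) [1 0 0; 0 cosd(a) -sind(a); 0 sind(a) cosd(a)];  % roll about x
% Rotation that carries the ego-vehicle body axes to the sensor axes
R = Rz(angles(1)) * Ry(angles(2)) * Rx(angles(3));
```

    In toolboxes that provide it, eul2rotm(deg2rad([yaw pitch roll]),"ZYX") computes the same matrix.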

    Height of the ego vehicle coordinate frame origin with respect to the ground, specified as a real scalar. Units are in meters.

    Example: 1.2

    Data Types: single | double

    Camera intrinsic matrix, specified as a 3-by-3 nonnegative real-valued matrix.

    Data Types: single | double
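
    The intrinsic matrix follows the standard pinhole form [fx 0 cx; 0 fy cy; 0 0 1], where fx and fy are the focal lengths in pixels and (cx, cy) is the principal point. The sketch below builds one from illustrative values (not defaults):

```matlab
fx = 800; fy = 600;   % focal lengths in pixels (illustrative)
cx = 320; cy = 240;   % principal point in pixels (illustrative)
K = [fx  0 cx;
      0 fy cy;
      0  0  1];
cameraSpec = trackerSensorSpec("automotive","camera","bounding-boxes");
cameraSpec.Intrinsics = K;
```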

    Size of the image reported by the camera, specified as a 1-by-2 positive real-valued vector of the form [height width]. Units are in pixels.

    Example: [1080 1920]

    Data Types: single | double

    Maximum detection range of the sensor, specified as a positive scalar. The sensor does not detect targets that are outside this range. Units are in meters.

    Example: 110

    Data Types: single | double

    Standard deviation of the measurement errors in the bounding box center reported by the detector, specified as a positive scalar. Units are in pixels.

    Example: 20

    Data Types: single | double

    Standard deviation of the measurement errors in the bounding box height reported by the detector, specified as a positive scalar. Units are in pixels.

    Example: 20

    Data Types: single | double

    Standard deviation of the measurement errors in the bounding box width reported by the detector, specified as a positive scalar. Units are in pixels.

    Example: 20

    Data Types: single | double

    Probability of detecting a target inside the coverage limits, specified as a scalar in the range (0, 1].

    Example: 0.75

    Data Types: single | double

    Average number of false positives per image reported by the detector, specified as a positive scalar.

    Example: 0.03

    Data Types: single | double

    Expected number of new targets per image reported by the detector, specified as a positive scalar.

    Example: 0.04

    Data Types: single | double

    Object Functions

    dataFormat — Structure for data format required by task-oriented tracker

    Examples


    Create a specification for a camera mounted on an ego vehicle.

    cameraSpec = trackerSensorSpec("automotive","camera","bounding-boxes")
    cameraSpec = 
      AutomotiveCameraBoxes with properties:
    
                   ReferenceFrame: 'ego'                 
               MaxNumMeasurements: 64                    
                 MountingLocation: [0 0 0]         m     
                   MountingAngles: [0 1 0]         deg   
                  EgoOriginHeight: 0.3             m     
                        Intrinsics: [3×3 double]          
                        ImageSize: [480 640]       pixels
                         MaxRange: 100             m     
                   CenterAccuracy: 10              pixels
                   HeightAccuracy: 10              pixels
                    WidthAccuracy: 10              pixels
             DetectionProbability: 0.9                   
        NumFalsePositivesPerImage: 0.01                  
            NumNewTargetsPerImage: 0.01                  
    
    

    Configure the camera specification based on your application. In this example, the camera of interest is mounted at [3.7920 0 1.1] meters with respect to the ego vehicle origin and points downward with a pitch angle of 1 degree. The resolution of the camera is 640 pixels in width and 480 pixels in height. Its intrinsic matrix is [800 0 320; 0 600 240; 0 0 1], and its maximum range is 80 meters. The height of the ego vehicle frame origin with respect to the ground is 0.4 meters.

    cameraSpec.MountingLocation = [3.7920 0 1.1];
    cameraSpec.MountingAngles = [0 1 0];
    cameraSpec.ImageSize = [480 640];
    cameraSpec.Intrinsics = [800         0        320
                               0       600        240
                               0         0         1];
    cameraSpec.MaxRange = 80;
    cameraSpec.EgoOriginHeight = 0.4;

    You can use the camera specification as an input to the multiSensorTargetTracker function, along with your target specifications, to create a JIPDA tracker.

    carSpec = trackerTargetSpec("automotive","car","highway-driving");
    tracker = multiSensorTargetTracker(carSpec,cameraSpec,"jipda")
    tracker = 
      fusion.tracker.JIPDATracker with properties:
    
                    TargetSpecifications: {[1×1 HighwayCar]}
                    SensorSpecifications: {[1×1 AutomotiveCameraBoxes]}
                  MaxMahalanobisDistance: 5
        ConfirmationExistenceProbability: 0.9000
            DeletionExistenceProbability: 0.1000
    
    

    Use the dataFormat function to determine the format of inputs required by the tracker.

    The camera requires bounding boxes in image space.

    cameraData = dataFormat(cameraSpec)
    cameraData = struct with fields:
               Time: 0
        BoundingBox: [4×64 double]
    
    
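    A sketch of populating this struct with one scan follows. The per-column layout of BoundingBox and the handling of unused columns are assumptions here; consult the dataFormat reference for the exact convention:

```matlab
% Fill the struct returned by dataFormat with one camera scan.
cameraSpec = trackerSensorSpec("automotive","camera","bounding-boxes");
cameraData = dataFormat(cameraSpec);
cameraData.Time = 0.1;                             % scan time in seconds
cameraData.BoundingBox(:,1) = [320; 240; 40; 80];  % one box, in pixels (illustrative layout)
% Pass the populated struct to the tracker created from the same specification:
% tracks = tracker(cameraData);
```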

    More About


    Version History

    Introduced in R2024b