Automated Driving Toolbox
Design, simulate, and test ADAS and autonomous driving systems
Automated Driving Toolbox™ provides algorithms and tools for designing, simulating, and testing ADAS and autonomous driving systems. You can design and test vision and lidar perception systems, as well as sensor fusion, path planning, and vehicle controllers. Visualization tools include a bird’s-eye-view plot and scope for viewing sensor coverage, detections, and tracks, as well as displays for video, lidar data, and maps. The toolbox lets you import and work with HERE HD Live Map data and OpenDRIVE® road networks.
Using the Ground Truth Labeler app, you can automate the labeling of ground truth to train and evaluate perception algorithms. For hardware-in-the-loop (HIL) and desktop simulation of sensor fusion, path planning, and control logic, you can author driving scenarios and simulate radar and camera sensor outputs.
Automated Driving Toolbox provides reference application examples for common ADAS and automated driving features, including forward collision warning (FCW), autonomous emergency braking (AEB), adaptive cruise control (ACC), lane keeping assist (LKA), and automated parking valet. The toolbox supports C/C++ code generation of sensor fusion, tracking, path planning, and vehicle controller algorithms for rapid prototyping and HIL testing.
Use reference applications as a basis for developing automated driving functionality. Automated Driving Toolbox includes reference applications for FCW, LKA, and automated parking valet.
Scenario Generation and Sensor Models
Test automated driving algorithms using authored scenarios and synthetic detections from radar and camera sensor models.
Author Driving Scenarios
Define road networks, actors, and sensors using the Driving Scenario Designer app. Import prebuilt Euro NCAP tests and OpenDRIVE road networks.
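The same kind of scenario can also be authored programmatically with the drivingScenario API. The sketch below is a minimal illustration; the road centers, waypoints, speeds, and sample time are placeholder values rather than settings from any shipped example.

scenario = drivingScenario('SampleTime', 0.05);
roadCenters = [0 0 0; 100 0 0];                 % straight 100 m road segment
road(scenario, roadCenters, 'Lanes', lanespec(2));

egoVehicle = vehicle(scenario, 'ClassID', 1);   % ClassID 1 = car
trajectory(egoVehicle, [1 -2 0; 99 -2 0], 20);  % waypoints (m) and speed (m/s)

leadCar = vehicle(scenario, 'ClassID', 1);
trajectory(leadCar, [30 -2 0; 99 -2 0], 12);    % slower lead vehicle

plot(scenario)                                  % visualize road and actors
while advance(scenario)                         % step the simulation
    pause(scenario.SampleTime)
end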
Sensor Models
Model and simulate the output of automotive radar and vision systems.
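As a minimal sketch of these sensor models, the following configures a statistical radar and a monocular camera detection generator and steps them once. The mounting locations, ranges, and fields of view are illustrative, and scenario and egoVehicle are assumed to come from the scenario sketch above.

radar = radarDetectionGenerator('SensorIndex', 1, ...
    'SensorLocation', [3.7 0], 'MaxRange', 160, 'FieldOfView', [20 5]);
camera = visionDetectionGenerator('SensorIndex', 2, ...
    'SensorLocation', [1.9 0], 'MaxRange', 80);

targets = targetPoses(egoVehicle);     % other actors in ego coordinates
time = scenario.SimulationTime;
[radarDets, numRadarDets, radarIsValid] = radar(targets, time);
[visionDets, numVisionDets, visionIsValid] = camera(targets, time);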
Test Algorithms Using Synthetic Data
Test and validate perception, sensor fusion, and control algorithms in open- and closed-loop settings using simulated data from driving scenarios and sensor models.
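A minimal open-loop test harness might look like the following, assuming the scenario, egoVehicle, and radar objects from the sketches above; processDetections is a hypothetical stand-in for the algorithm under test.

restart(scenario)
while advance(scenario)
    time = scenario.SimulationTime;
    targets = targetPoses(egoVehicle);
    dets = radar(targets, time);               % synthetic detections
    result = processDetections(dets, time);    % hypothetical algorithm under test
end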
Ground Truth Labeling
Automate labeling of ground truth data and compare output from an algorithm under test with ground truth data.
Ground Truth Labeling
Interactive and automated ground truth labeling for object detection, semantic segmentation, and scene classification.
Testing Perception Algorithms
Evaluate the performance of perception algorithms by comparing ground truth data against algorithm outputs.
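For example, detector output can be scored against labeled ground truth with the Computer Vision Toolbox miss-rate metric; detectionResults and groundTruthBoxes below are assumed tables of per-frame bounding boxes (and scores for the detector).

[logAvgMissRate, fppi, missRate] = ...
    evaluateDetectionMissRate(detectionResults, groundTruthBoxes);
loglog(fppi, missRate)                         % miss rate vs. false positives per image
xlabel('False positives per image')
ylabel('Miss rate')
title(sprintf('Log-average miss rate = %.2f', logAvgMissRate))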
Perception with Computer Vision and Lidar
Develop and test vision and lidar processing algorithms for automated driving.
Vision System Design
Develop computer vision algorithms for vehicle and pedestrian detection, lane detection, and classification.
Lidar Processing
Use lidar data to detect obstacles and segment ground planes.
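A minimal lidar processing sketch, assuming an organized point cloud read with the Computer Vision Toolbox Velodyne file reader (the file name and distance threshold are illustrative): remove ground returns, then cluster the remaining points into obstacles.

veloReader = velodyneFileReader('lidarData.pcap', 'HDL32E');  % hypothetical file
ptCloud = readFrame(veloReader);                              % organized point cloud
groundIdx = segmentGroundFromLidarData(ptCloud);              % logical ground mask
nonGround = select(ptCloud, ~groundIdx, 'OutputSize', 'full');
[labels, numClusters] = segmentLidarData(nonGround, 1.0);     % 1 m distance threshold
pcshow(nonGround.Location, labels)                            % color points by cluster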
Sensor Fusion and Tracking
Perform multisensor fusion using a multi-object tracking framework with Kalman filters.
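A minimal tracking sketch: construct a tracker with a constant-velocity extended Kalman filter initializer and update it with objectDetection measurements. The detection times and positions are illustrative values.

tracker = multiObjectTracker('FilterInitializationFcn', @initcvekf, ...
    'AssignmentThreshold', 30);

detections = {objectDetection(0.1, [10; 0; 0]); ...   % [x; y; z] position (m)
              objectDetection(0.1, [20; 5; 0])};
confirmedTracks = updateTracks(tracker, detections, 0.1);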
Access and visualize high-definition map data from the HERE HD Live Map service, and display vehicle and object locations on streaming map viewers.
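A minimal sketch, assuming valid HERE HD Live Map credentials are already configured; the latitude and longitude are illustrative and only select which map tiles the reader loads.

reader = hereHDLMReader(42.30, -83.68);          % latitude, longitude
topology = read(reader, 'TopologyGeometry');     % road topology layer
plot(topology)

player = geoplayer(42.30, -83.68, 'HistoryDepth', 10);   % streaming map viewer
plotPosition(player, 42.30, -83.68)                      % plot a vehicle position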
Plan driving paths by using vehicle costmaps and motion-planning algorithms.
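For example, an RRT* planner can be run over a vehicle costmap as sketched below; the cost matrix, cell size, and start and goal poses are illustrative values.

costVal = zeros(50, 50);                           % 50 m x 50 m of free space
costmap = vehicleCostmap(costVal, 'CellSize', 1);  % 1 m cells
planner = pathPlannerRRT(costmap);

startPose = [5 5 0];                               % [x (m), y (m), theta (deg)]
goalPose  = [45 45 90];
refPath = plan(planner, startPose, goalPose);
poses = interpolate(refPath);                      % poses along the planned path
plot(costmap)
hold on
plot(poses(:,1), poses(:,2), 'b-')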
Use lateral and longitudinal controllers to follow a planned trajectory.
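A minimal lateral-control sketch using the Stanley controller; the reference pose would normally come from the planned path, and the poses and velocity below are illustrative. (Longitudinal control is typically handled separately, for example with a Simulink controller block.)

refPose = [10 5 45];          % [x (m), y (m), theta (deg)] on the reference path
currPose = [10.5 4.2 40];     % current vehicle pose
currVelocity = 8;             % m/s
steerCmd = lateralControllerStanley(refPose, currPose, currVelocity);  % steering angle (deg)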
HERE HD Live Map Reader
Read and visualize data from high-definition maps designed for automated driving applications
Read driving scenarios into Simulink to test vehicle controllers and sensor fusion algorithms
Bird's-Eye Scope for Simulink
Analyze sensor coverages, detections, and tracks in your model
Prebuilt Driving Scenarios
Test driving algorithms using Euro NCAP and other prebuilt scenarios
Plan driving paths using an RRT* path planner and costmap