Get Started with Automated Driving Toolbox
Automated Driving Toolbox™ provides algorithms and tools for designing, simulating, and testing ADAS and autonomous driving systems. You can design and test vision and lidar perception systems, as well as sensor fusion, path planning, and vehicle controllers. Visualization tools include a bird’s-eye-view plot and scope for sensor coverage, detections, and tracks, as well as displays for video, lidar, and maps. The toolbox lets you import and work with HERE HD Live Map data and ASAM OpenDRIVE® road networks.
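For example, a bird's-eye plot can overlay a sensor's coverage area with its detections. The following MATLAB sketch is illustrative only; the sensor mounting position, range, field of view, and detection coordinates are assumed placeholder values rather than output from a real sensor model.

```matlab
% Sketch: bird's-eye plot with a radar coverage area and example detections.
% Mounting position, range, field of view, and detections are placeholders.
bep = birdsEyePlot('XLim',[0 60],'YLim',[-20 20]);

% Coverage area of a forward-facing radar mounted near the front bumper
% (x = 3.7 m, y = 0 m), 100 m range, 0-degree yaw, 40-degree field of view.
caPlotter = coverageAreaPlotter(bep,'DisplayName','Radar coverage', ...
    'FaceColor','blue');
plotCoverageArea(caPlotter,[3.7 0],100,0,40);

% A few example detection positions in ego vehicle coordinates (meters).
detPlotter = detectionPlotter(bep,'DisplayName','Radar detections');
plotDetection(detPlotter,[30 2; 45 -4; 20 8]);
```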
Using the Ground Truth Labeler app, you can automate the labeling of ground truth to train and evaluate perception algorithms. For hardware-in-the-loop (HIL) testing and desktop simulation of perception, sensor fusion, path planning, and control logic, you can generate and simulate driving scenarios. You can simulate camera, radar, and lidar sensor output in a photorealistic 3D environment and sensor detections of objects and lane boundaries in a 2.5D simulation environment.
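The cuboid scenario workflow can also be scripted. The sketch below is a minimal illustration, not any shipped example: it builds a straight road with an ego vehicle and a slower lead vehicle, attaches a synthetic vision sensor, and steps the simulation. The road geometry, speeds, and sensor parameters are assumptions.

```matlab
% Sketch: programmatic driving scenario with a synthetic vision sensor.
% Road geometry, speeds, and sensor parameters are illustrative only.
scenario = drivingScenario('SampleTime',0.1);
road(scenario,[0 0; 100 0]);              % straight 100 m road segment

egoVehicle = vehicle(scenario,'ClassID',1);
trajectory(egoVehicle,[1 0; 99 0],20);    % ego travels at 20 m/s

leadVehicle = vehicle(scenario,'ClassID',1);
trajectory(leadVehicle,[30 0; 99 0],10);  % slower lead vehicle at 10 m/s

% Forward-facing vision sensor that reports object detections.
sensor = visionDetectionGenerator('SensorIndex',1, ...
    'SensorLocation',[3.7 0],'MaxRange',80);

while advance(scenario)
    % Target poses are reported in the ego vehicle coordinate frame.
    dets = sensor(targetPoses(egoVehicle),scenario.SimulationTime);
    % ... pass dets to a tracker or a bird's-eye plot here
end
```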
Automated Driving Toolbox provides reference application examples for common ADAS and automated driving features, including forward collision warning, autonomous emergency braking, adaptive cruise control, lane keeping assist, and parking valet. The toolbox also supports C/C++ code generation of sensor fusion, tracking, path planning, and vehicle controller algorithms for rapid prototyping and HIL testing.
Tutorials
- Get Started with Ground Truth Labelling
Interactively label multiple lidar and video signals simultaneously.
- Create Driving Scenario Programmatically
Programmatically create ground truth driving scenarios for synthetic sensor data and tracking algorithms.
- Create Driving Scenario Interactively and Generate Synthetic Sensor Data
Use the Driving Scenario Designer app to create a driving scenario and generate sensor detections and point cloud data from the scenario.
- Simulate Simple Driving Scenario and Sensor in Unreal Engine Environment
Learn the basics of configuring and simulating scenes, vehicles, and sensors in a virtual environment rendered using the Unreal Engine® from Epic Games®.
- Overview of Simulating RoadRunner Scenarios with MATLAB and Simulink
Learn the workflow for simulating RoadRunner scenarios with MATLAB® and Simulink®.
- Visual Perception Using Monocular Camera
Construct a monocular camera sensor simulation capable of lane boundary and vehicle detections.
- Train a Deep Learning Vehicle Detector
Train a vision-based vehicle detector using deep learning.
- Multiple Object Tracking Tutorial
Perform automatic detection and motion-based tracking of moving objects in a video by using a multi-object tracker (a minimal tracker sketch follows this list).
- Design Lidar SLAM Algorithm Using Unreal Engine Simulation Environment
Develop a simultaneous localization and mapping (SLAM) algorithm using synthetic lidar sensor data recorded from the Unreal Engine simulation environment.
- Develop Visual SLAM Algorithm Using Unreal Engine Simulation
Develop a visual simultaneous localization and mapping (SLAM) algorithm using image data from the Unreal Engine simulation environment.
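As a companion to the Multiple Object Tracking Tutorial item above, the following sketch shows the basic shape of a tracker update loop. The measurement values and tracker thresholds are assumptions for illustration, not the tutorial's exact configuration.

```matlab
% Sketch: multi-object tracker updated with hand-made detections.
% Measurement values and tracker settings are illustrative assumptions.
tracker = multiObjectTracker('FilterInitializationFcn',@initcvkf, ...
    'AssignmentThreshold',30,'ConfirmationThreshold',[4 5]);

for time = 0.1:0.1:1
    % Fake [x; y; z] position measurements for two moving objects.
    detections = {
        objectDetection(time,[10 + 5*time;  2; 0]);
        objectDetection(time,[20 + 3*time; -2; 0])};
    confirmedTracks = updateTracks(tracker,detections,time);
end
disp(confirmedTracks)   % confirmed tracks after the final update
```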
Related example categories: Ground Truth Labeling, Driving Scenario Design, Detection and Tracking, Localization and Mapping
About Automated Driving
- Coordinate Systems in Automated Driving Toolbox
Understand coordinate systems for automated driving; a brief coordinate-conversion sketch follows.
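As a small illustration of moving between coordinate systems, the sketch below converts geographic waypoints to local East-North-Up coordinates about a reference origin; the latitude, longitude, and origin values are made up.

```matlab
% Sketch: convert geographic coordinates to local ENU coordinates.
% The waypoints and origin below are made-up example values.
origin = [42.3001 -71.3504 0];      % [lat lon alt] of the local origin
lat = [42.3005; 42.3010];
lon = [-71.3500; -71.3496];
alt = [0; 0];
[xEast,yNorth,zUp] = latlon2local(lat,lon,alt,origin);
localWaypoints = [xEast yNorth zUp] % meters, East-North-Up frame
```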
Videos
Sensor Simulation and Virtual Scene Design with the Driving Scenario Designer App, Part 1
Create virtual driving scenarios and import scenarios into the app.
Sensor Simulation and Virtual Scene Design with the Driving Scenario Designer App, Part 2
Generate synthetic sensor detections and export them to MATLAB.
Design Lidar-Based SLAM Using Unreal Engine Simulation Environment
Build a map from lidar data using SLAM.
How to Simulate Automated Driving Systems: Adaptive Cruise Control
Simulate and test an adaptive cruise control application for automated driving.