Automotive Research Association of India Enables Virtual Testing of ADAS Applications with Real-World Simulation Scenarios
With 300 million vehicles operating on the third largest road network in the world, India sees a significant number of traffic accidents and fatalities each year. A recent study found that more than three-fourths of those accidents were due to driver error. With safety features such as forward collision warning, automated emergency braking, driver monitoring, and blind spot detection, advanced driver assistance systems (ADAS) can help reduce the risk of such errors.
Engineers developing ADAS applications for the Indian market must account for scenarios that are often encountered by the country’s drivers, including high traffic volumes, unique traffic patterns, and weather uncertainty—as well as infrastructure challenges such as narrow bridges and broken pavement.
To meet the challenges of developing and testing ADAS applications—including extensively validating their performance across a vast number of country-specific scenarios—engineers at Automotive Research Association of India (ARAI) have established a new workflow. Based on MATLAB® and Simulink®, this workflow helps accelerate the delivery of ADAS functionality by enabling virtual testing via simulations derived from real-world driving scenarios (Figure 1). The workflow is broadly divided into three main phases: collecting vehicle sensor data, creating virtual scenarios based on that data, and using the scenarios to test ADAS functionality.
Collecting Vehicle Sensor Data
Simulating realistic driving scenarios requires real-world data collected from vehicles equipped with multiple sensors, including camera, lidar, and global positioning system (GPS) devices (Figure 2). ARAI has collected data from various locations across India—each with unique environmental conditions—at different times of the day, thus creating an extensive database of recorded sensor data.
Before the vehicle-mounted sensors could be used to collect data, they first had to be calibrated. ARAI engineers calibrated the camera and lidar using the Lidar Camera Calibrator app, which enabled them to estimate the rigid transformation between the two devices and save it in MATLAB.
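A quick way to sanity-check such a calibration is to project lidar points onto the matching camera frame using the estimated transform. The sketch below is illustrative only and assumes the intrinsics and the rigid transformation exported from the app were saved to a hypothetical calibration.mat file:

% Load the camera intrinsics and lidar-to-camera transform exported from the
% Lidar Camera Calibrator app (file and variable names are assumptions).
load("calibration.mat", "intrinsics", "tform");   % cameraIntrinsics, rigidtform3d

ptCloud = pcread("frame0001.pcd");                % one recorded lidar frame
im      = imread("frame0001.png");                % the matching camera frame

% Project the lidar points into image coordinates with the estimated transform.
imPts = projectLidarPointsOnImage(ptCloud, intrinsics, tform);

% Overlay the projected points to verify the alignment visually.
figure
imshow(im)
hold on
plot(imPts(:,1), imPts(:,2), ".", "MarkerSize", 4)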
Once the sensors were calibrated, the team was ready to begin recording data. Using ROS, ARAI captured data synchronously from all vehicle-mounted sensors into a rosbag file. This data was visualized with ROS Toolbox, which was also used to extract the recording of each individual sensor. The team could then analyze synchronized recordings from multiple sensors in MATLAB (Figure 3).
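For illustration, extracting individual sensor streams from a recorded rosbag might look like the following sketch; the file name and ROS topic names are assumptions, not ARAI's actual configuration:

% Open the recorded rosbag and select one topic per sensor.
bag = rosbag("arai_drive_recording.bag");
lidarSel  = select(bag, "Topic", "/velodyne_points");
cameraSel = select(bag, "Topic", "/camera/image_raw");
gpsSel    = select(bag, "Topic", "/gps/fix");

% Read the messages as structures for offline analysis in MATLAB.
lidarMsgs  = readMessages(lidarSel,  "DataFormat", "struct");
cameraMsgs = readMessages(cameraSel, "DataFormat", "struct");
gpsMsgs    = readMessages(gpsSel,    "DataFormat", "struct");

% Convert the first lidar message to a point cloud and the first camera
% message to an image for inspection.
ptCloud = pointCloud(rosReadXYZ(lidarMsgs{1}));
im      = rosReadImage(cameraMsgs{1});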
Creating Virtual Scenarios from Recorded Sensor Data
The creation of driving scenarios requires integrating global positioning data from the recording vehicle (also known as the ego vehicle) with road data (from OpenStreetMap®, for example) along with lidar-based tracks of other vehicles on the road (Figure 4).
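As a sketch of how these ingredients come together, an OpenStreetMap road network can be imported into a driving scenario and the ego vehicle's GPS track converted to local scenario coordinates; the file name, origin, and waypoints below are hypothetical:

% Import an OpenStreetMap road network into a driving scenario.
scenario = drivingScenario;
roadNetwork(scenario, "OpenStreetMap", "test_route.osm");

% Convert the recorded GPS track of the ego vehicle to local coordinates
% (lat, lon, alt, and origin would come from the recording).
% [xEast, yNorth, zUp] = latlon2local(lat, lon, alt, origin);

% Add the ego vehicle and give it a trajectory along (hypothetical) waypoints.
egoVehicle = vehicle(scenario, "ClassID", 1);
waypoints  = [0 0 0; 30 0 0; 60 2 0];
trajectory(egoVehicle, waypoints, 12);   % 12 m/s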
In this phase of the workflow, ARAI engineers begin by visualizing and selecting the recorded data to be used in creating the scenario, which in turn will be used to test a specific aspect of ADAS functionality, such as the detection of a vehicle in the driver’s blind spot. Once this is done, the lidar data must be labeled so that non-ego vehicles can be tracked. To simplify this part of the process, ARAI engineers use the Lidar Labeler app, which employs point cloud temporal interpolation to help automate annotation of vehicles of interest. They then use the OpenStreetMap road network data to create driving scenarios that combine the GPS data for the ego vehicle with the synchronized labeled lidar data for non-ego vehicles (Figure 5). The team is then able to export the road network, vehicles, and vehicle trajectories in their driving scenario to the ASAM OpenSCENARIO® 1.0 file format, for optional interoperability with other third-party simulators supporting ASAM OpenSCENARIO import.
Figure 5. Driving scenario with synchronized video.
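The export step described above can also be performed programmatically on a drivingScenario object, assuming a release of Automated Driving Toolbox that supports ASAM OpenSCENARIO export; the output file names here are hypothetical:

% Export the scenario to ASAM OpenSCENARIO 1.0 for use in third-party simulators.
export(scenario, "OpenSCENARIO", "blind_spot_scenario.xosc");

% The road network can be exported separately to ASAM OpenDRIVE if the
% target simulator needs both files.
export(scenario, "OpenDRIVE", "blind_spot_scenario.xodr");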
The team has used this approach to create not only scenarios that replicate real-world recorded data but also scenario variants for vehicle crashes and other events unlikely to be captured in day-to-day driving. To create a crash scenario in which the ego vehicle collides with another vehicle, for example, the engineers modified an existing scenario by sharply reducing the velocity of a non-ego vehicle that the ego vehicle was following. This type of scenario could then be used to test forward collision warning (FCW) and automatic emergency braking (AEB) features.
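A minimal sketch of such a variant is shown below; the road geometry, waypoints, and speeds are hypothetical, with the lead vehicle's speed dropping sharply so that the ego vehicle closes in on it:

% Build a simplified two-vehicle scenario on a straight two-lane road.
scenario = drivingScenario;
road(scenario, [0 0 0; 200 0 0], "Lanes", lanespec(2));

egoVehicle  = vehicle(scenario, "ClassID", 1, "Position", [0 -2 0]);
leadVehicle = vehicle(scenario, "ClassID", 1, "Position", [40 -2 0]);

% Ego vehicle holds 20 m/s; the lead vehicle decelerates sharply, creating
% the conflict used to exercise FCW and AEB logic.
trajectory(egoVehicle,  [0 -2 0; 100 -2 0; 200 -2 0], 20);
trajectory(leadVehicle, [40 -2 0; 90 -2 0; 110 -2 0], [20 20 3]);

% Step through the scenario; in the full workflow these poses feed the
% Simulink test bench.
while advance(scenario)
end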
Testing ADAS Functionality with Virtual Scenarios
In the final phase of the workflow, ARAI engineers use the virtual scenarios to test specific ADAS functionality. This begins with a testbench created in Simulink. A testbench for an AEB system, for example, would include blocks for the scenario and associated sensor outputs to be used in the test, as well as blocks for the sensor fusion and tracking algorithms, decision logic, controls, and vehicle dynamics (Figure 6).
Engineers visualize the results of scenario-based tests both during and after their execution via the Bird’s-Eye Scope in Automated Driving Toolbox™ and via plots of individual signals (Figure 7).
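The Bird's-Eye Scope is opened from the Simulink toolstrip; for post-simulation, programmatic visualization, a birdsEyePlot can serve a similar purpose. In the sketch below, the coverage area and detection positions are hypothetical:

% Create an ego-centric plot with a sensor coverage area and detections.
bep = birdsEyePlot("XLim", [0 60], "YLim", [-15 15]);

covPlotter = coverageAreaPlotter(bep, "DisplayName", "Sensor coverage");
plotCoverageArea(covPlotter, [1 0], 50, 0, 30);   % mounted at [1 0], 50 m range, 30 deg FOV

detPlotter = detectionPlotter(bep, "DisplayName", "Detections");
plotDetection(detPlotter, [35 -2; 48 3]);          % detected object positions [x y]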
Of course, verification of an ADAS application requires running many tests across a wide range of scenarios to ensure all requirements are satisfied. In the ARAI workflow, engineers use the Requirements Editor app to author requirements and configure tests associated with those requirements in Test Manager (Figure 8). Engineers can run tests sequentially, or concurrently on a multicore workstation with Parallel Computing Toolbox™. Once all the tests are complete, the engineers generate a report showing which tests passed and which failed; this report can be shared with other groups for further analysis and follow-up.
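As an illustration of the test-automation portion of this step, a previously authored test file can be loaded, run, and reported on through the Simulink Test programmatic interface; the file names below are assumptions:

% Load the test file authored in Test Manager, run it, and generate a report.
sltest.testmanager.load("AEB_scenario_tests.mldatx");
results = sltest.testmanager.run;

% Produce a shareable report covering the passed and failed tests.
sltest.testmanager.report(results, "AEB_test_report.pdf");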
Having established a workflow for developing and testing ADAS applications via simulation of real-world scenarios, ARAI is well-positioned to extend it—for example, by adding support for software-in-the-loop and hardware-in-the-loop testing and the development of synthetic scenarios.
Published 2023