
Automotive Research Association of India Enables Virtual Testing of ADAS Applications with Real-World Simulation Scenarios


Bringing the Real World to Simulation for Virtual Testing of Automated Driving (AD)

Ninad Pachhapurkar, Automotive Research Association of India (ARAI)
Jyoti Kale, Automotive Research Association of India (ARAI)

With 300 million vehicles operating on the third-largest road network in the world, India sees a significant number of traffic accidents and fatalities each year. A recent study found that more than three-fourths of those accidents were due to driver error. With safety features such as forward collision warning, automated emergency braking, driver monitoring, and blind spot detection, advanced driver assistance systems (ADAS) can help reduce the risk of such errors.

Engineers developing ADAS applications for the Indian market must account for scenarios that the country's drivers frequently encounter, including high traffic volumes, unique traffic patterns, and unpredictable weather, as well as infrastructure challenges such as narrow bridges and broken pavement.

To meet the challenges of developing and testing ADAS applications—including extensively validating their performance across a vast number of country-specific scenarios—engineers at Automotive Research Association of India (ARAI) have established a new workflow. Based on MATLAB® and Simulink®, this workflow helps accelerate the delivery of ADAS functionality by enabling virtual testing via simulations derived from real-world driving scenarios (Figure 1). The workflow is broadly divided into three main phases: collecting vehicle sensor data, creating virtual scenarios based on that data, and using the scenarios to test ADAS functionality.


Figure 1. A real-world driving scenario recreated for virtual testing.

Collecting Vehicle Sensor Data

Simulating realistic driving scenarios requires real-world data collected from vehicles equipped with multiple sensors, including camera, lidar, and global positioning system (GPS) devices (Figure 2). ARAI has collected data from various locations across India—each with unique environmental conditions—at different times of the day, thus creating an extensive database of recorded sensor data.


Figure 2. Car equipped with camera, lidar, and other sensor devices.

Before the vehicle-mounted sensors could be used to collect data, they first had to be calibrated. ARAI engineers performed this calibration for the camera and lidar using the Lidar Camera Calibrator app, which enabled them to estimate the rigid transformation between the two devices and save that transformation in MATLAB.
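For teams that prefer a scripted workflow, the same calibration can be sketched with Lidar Toolbox functions. The following is a minimal sketch, not ARAI's exact script: the file locations, intrinsics file, and checkerboard square size are illustrative placeholders, and exact function signatures vary by release.

```matlab
% Minimal sketch of lidar-camera calibration with Lidar Toolbox functions.
% Paths, the intrinsics file, and the square size are placeholders.
imds = imageDatastore('calib/images');                  % checkerboard photos
pcds = fileDatastore('calib/pcds', 'ReadFcn', @pcread); % matching point clouds
ld   = load('cameraIntrinsics.mat');                    % from a prior camera calibration
squareSize = 81;                                        % checkerboard square size (mm)

% Detect checkerboard corners in the images and the matching board planes
% in the lidar point clouds.
[imageCorners3d, boardDim, used] = ...
    estimateCheckerboardCorners3d(imds.Files, ld.intrinsics, squareSize);
lidarPlanes = detectRectangularPlanePoints(pcds.Files(used), boardDim);

% Estimate the rigid transformation between the lidar and the camera, then
% save it for later fusion of the recorded data.
[tform, errors] = estimateLidarCameraTransform(lidarPlanes, imageCorners3d, ...
    'CameraIntrinsics', ld.intrinsics);
save('lidarCameraTform.mat', 'tform');
```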

Once the sensors were calibrated, the team was ready to begin recording data. ARAI captured data synchronously from all vehicle-mounted sensors into a rosbag file using ROS. The engineers used ROS Toolbox™ to visualize this data and to extract the recordings of individual sensors, and they could then analyze synchronized recordings from multiple sensors in MATLAB (Figure 3).
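The rosbag-reading step might look like the following minimal sketch, assuming ROS Toolbox; the bag file name and topic names are placeholders for the actual recording.

```matlab
% Minimal sketch of extracting recorded sensor data from a rosbag with
% ROS Toolbox. The bag file and topic names are placeholders.
bag = rosbag('drive_recording.bag');

% Select individual sensor topics from the synchronized recording.
imgSel   = select(bag, 'Topic', '/camera/image_raw');
lidarSel = select(bag, 'Topic', '/velodyne_points');

% Read the messages as structures and convert them to MATLAB data types.
imgMsgs   = readMessages(imgSel,   'DataFormat', 'struct');
lidarMsgs = readMessages(lidarSel, 'DataFormat', 'struct');

firstImage = rosReadImage(imgMsgs{1});   % image as a MATLAB array
xyz        = rosReadXYZ(lidarMsgs{1});   % N-by-3 lidar points

% Visualize one synchronized camera/lidar frame pair.
figure; imshow(firstImage);
figure; pcshow(pointCloud(xyz));
```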


Figure 3. View of the real-world driving scenario as captured through the recording camera (left) and the corresponding lidar data (right).

Creating Virtual Scenarios from Recorded Sensor Data

The creation of driving scenarios requires integrating global positioning data from the recording vehicle (also known as the ego vehicle) with road data (from OpenStreetMap®, for example) along with lidar-based tracks of other vehicles on the road (Figure 4).


Figure 4. Data flow in the creation of virtual scenarios from recorded data.

In this phase of the workflow, ARAI engineers begin by visualizing the recorded data and selecting the segments to be used in creating the scenario, which in turn will be used to test a specific aspect of ADAS functionality, such as the detection of a vehicle in the driver's blind spot. Next, the lidar data must be labeled so that non-ego vehicles can be tracked. To simplify this part of the process, ARAI engineers use the Lidar Labeler app, which employs point cloud temporal interpolation to help automate the annotation of vehicles of interest. They then use the OpenStreetMap road network data to create driving scenarios that combine the GPS data for the ego vehicle with the synchronized, labeled lidar data for non-ego vehicles (Figure 5). Finally, the team can export the road network, vehicles, and vehicle trajectories in a driving scenario to the ASAM OpenSCENARIO® 1.0 file format for interoperability with third-party simulators that support ASAM OpenSCENARIO import.

Figure 5. Driving scenario with synchronized video.
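A condensed sketch of this scenario-assembly step with Automated Driving Toolbox might look as follows. The OpenStreetMap file, waypoints, and speeds are illustrative placeholders; in the actual workflow, the ego trajectory comes from the GPS track and the non-ego trajectories from the labeled lidar data.

```matlab
% Sketch of assembling a driving scenario from recorded data
% (placeholder map file, waypoints, and speeds).
scenario = drivingScenario;

% Import the road network for the recorded route from OpenStreetMap data.
roadNetwork(scenario, 'OpenStreetMap', 'recorded_route.osm');

% Placeholder trajectories; these would be derived from the GPS track
% (ego) and the labeled lidar tracks (non-ego vehicles).
egoWaypoints    = [0 0 0; 50 0 0; 100 0 0];
targetWaypoints = [20 0 0; 70 0 0; 120 0 0];

ego    = vehicle(scenario, 'ClassID', 1);
target = vehicle(scenario, 'ClassID', 1);
trajectory(ego,    egoWaypoints,    15);   % constant speed in m/s
trajectory(target, targetWaypoints, 15);

% Export the scenario for use in third-party simulators.
export(scenario, 'OpenSCENARIO', 'recorded_scenario.xosc');
```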

The team has used this approach to create not only scenarios that replicate real-world recorded data but also scenario variants for vehicle crashes and other events unlikely to be captured in day-to-day driving. To create a crash scenario in which the ego vehicle collides with another vehicle, for example, the engineers modified an existing scenario, sharply reducing the velocity of a non-ego vehicle that the ego vehicle was trailing. Scenarios of this type can then be used to test forward collision warning (FCW) and automatic emergency braking (AEB) features.
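Continuing the sketch above, a crash variant might be derived along these lines; the speeds and the point at which the lead vehicle slows are illustrative.

```matlab
% Sketch of a crash-scenario variant: the lead (non-ego) vehicle's recorded
% speed profile is sharply reduced so that the trailing ego vehicle closes
% in, exercising FCW and AEB logic. Values are illustrative.
leadSpeeds = repmat(15, size(targetWaypoints, 1), 1);  % recorded cruise speed (m/s)
leadSpeeds(2:end) = 3;                                 % sharp slowdown partway along

variant = drivingScenario;
roadNetwork(variant, 'OpenStreetMap', 'recorded_route.osm');
egoV  = vehicle(variant, 'ClassID', 1);
leadV = vehicle(variant, 'ClassID', 1);
trajectory(egoV,  egoWaypoints,    15);          % ego keeps its recorded speed
trajectory(leadV, targetWaypoints, leadSpeeds);  % lead vehicle brakes hard
```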

Testing ADAS Functionality with Virtual Scenarios

In the final phase of the workflow, ARAI engineers use the virtual scenarios to test specific ADAS functionality. This begins with a testbench created in Simulink. A testbench for an AEB system, for example, would include blocks for the scenario and associated sensor outputs to be used in the test, as well as blocks for the sensor fusion and tracking algorithms, decision logic, controls, and vehicle dynamics (Figure 6).


Figure 6. A Simulink testbench for an AEB system based on an Automated Driving Toolbox example.
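Running such a testbench over a virtual scenario can be scripted; the sketch below is loosely patterned on the Automated Driving Toolbox AEB example, and the model name, block path, and block parameter name are assumptions for illustration rather than a documented interface.

```matlab
% Sketch of exercising an AEB testbench over one virtual scenario.
% The model name, block path, and parameter name are hypothetical.
model = 'AEBTestBench';
open_system(model);

% Point the scenario-reading block at the scenario under test
% (hypothetical block path and parameter name).
set_param([model '/Scenario Reader'], ...
    'ScenarioFileName', 'crash_variant.mat');

% Simulate and keep the logged signals for analysis.
out = sim(model, 'StopTime', '15');
```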

Engineers visualize the results of scenario-based tests both during and after their execution via the Bird’s-Eye Scope in Automated Driving Toolbox™ and via plots of individual signals (Figure 7).

Figure 7. Visualization using Bird’s-Eye Scope views (left) and plots of key signals, such as velocity, stopping time, FCW status, and AEB status (right).
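Continuing the sketch above, the logged signals can be plotted after a run; the signal names here are placeholders for whatever the testbench actually logs.

```matlab
% Sketch of post-run signal plots from the simulation output.
% Signal names are placeholders.
logs   = out.logsout;
egoVel = logs.get('ego_velocity').Values;  % timeseries of ego speed
aeb    = logs.get('AEB_status').Values;    % timeseries of AEB activation

subplot(2,1,1); plot(egoVel.Time, egoVel.Data); ylabel('Velocity (m/s)');
subplot(2,1,2); plot(aeb.Time, aeb.Data);       ylabel('AEB status');
xlabel('Time (s)');
```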

Of course, verification of an ADAS application requires running many tests across a wide range of scenarios to ensure that all requirements are satisfied. In the ARAI workflow, engineers use the Requirements Editor app to author requirements and configure tests associated with those requirements in Test Manager (Figure 8). They can run tests sequentially or, with Parallel Computing Toolbox™, concurrently on a multicore workstation. Once all the tests are complete, the engineers generate a report showing which tests passed and which failed, which can be shared with other groups for further analysis and follow-up.
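Batch execution and reporting can likewise be scripted through the Simulink Test programmatic interface; in this minimal sketch, the test file name and report options are placeholders.

```matlab
% Sketch of batch test execution and reporting with Simulink Test.
% The test file name and report file are placeholders.
tf = sltest.testmanager.load('AEBScenarioTests.mldatx');

% Run all loaded test files; with Parallel Computing Toolbox available,
% Test Manager can also distribute test cases across workers.
results = sltest.testmanager.run;

% Generate a pass/fail report to share with other groups.
sltest.testmanager.report(results, 'AEBTestReport.pdf', ...
    'IncludeTestResults', 0, 'LaunchReport', false);
```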


Figure 8. Linking a requirement to an associated test with the Requirements Editor app.

Having established a workflow for developing and testing ADAS applications via simulation of real-world scenarios, ARAI is well-positioned to extend it—for example, by adding support for software-in-the-loop and hardware-in-the-loop testing and the development of synthetic scenarios.

Published 2023
