Radar Architecture: Part 2 – Test automation and requirements traceability

This example is the second part of a two-part series on how to design and test a radar system in Simulink® based on a set of performance requirements. It discusses testing of the model developed in Part 1 and verification of the initial requirements. It shows how to use Simulink Test™ to set up test suites that verify requirements linked to the components of the system. The example also explores a scenario in which the stated requirements are revised, leading to changes in the design and the tests.

Part 1 of this example starts with a set of performance requirements. It develops an architecture model of a radar system using Simulink System Composer™. This architecture model is employed as a virtual test bed for testing and verifying the radar system designs. Part 1 shows how to use Simulink Requirements™ to link the requirements to the components of the architecture. It also demonstrates how to implement the individual components of the architecture using Simulink.

Automated Testing

Prior to setting up the tests, we load the model constructed in Part 1 of the example.


Simulink Test Manager is a tool for creating test suites for the model. To access Test Manager, click Simulink Test in the Apps tab, then navigate to the Tests tab and click Simulink Test Manager. To get started with the tests, we create a new test file for the model by clicking New Test File. Then we add two separate test suites, one for each requirement. The test suites can be further configured by:

  • Adding a description to each test suite that briefly describes what functionality is being tested;

  • Linking the test suite to one or multiple requirements. The tests in the test suite must pass in order for the requirements to be verified;

  • Adding callbacks for setup before and cleanup after the test run. In this example we need a global variable added to the base workspace in order to aggregate the results of multiple Monte Carlo runs within a single test suite.

Next, we configure the tests within the test suites. The changes are made only in the System Under Test, Parameter Overrides, Iterations, and Custom Criteria sections.

  • In the System Under Test section the Model field must be set to the name of the model, which in this example is slexRadarArchitectureExample.

  • The Parameter Overrides section is used to assign different values to the parameters in the base workspace during a test execution. We use this section to specify the target parameters for the maximum range test and the range resolution test.

For the maximum range test we specify a single target with 1 m2 RCS at the range of 6000 m from the radar as stated in R1.

For the range resolution test, we specify two targets with different RCS that are separated in range by 70 m as required by R2.

  • Because of random noise and target fluctuation effects, we can only verify the averaged radar system performance collected over multiple test runs. The Iterations section of the test can be used to configure the test to run multiple times to implement Monte Carlo simulations. We add a custom script to the Scripted Iterations subsection. In this example we add only 10 iterations to the test. However, to robustly verify the performance of the system, more iterations would be required.

  • The Custom Criteria section lets us specify a custom rule that verifies the test results at the end of each iteration. We configure it to run the helperslexRadarArchitectureTestCriteria helper function, which processes the results of each test iteration and stores them in the detectionResults variable in the base workspace. The function computes the number of detection threshold crossings. If this number equals the number of targets in the test, the iteration is considered to have passed; otherwise, the iteration is declared failed. On the last iteration, helperslexRadarArchitectureTestCriteria computes the total number of passed iterations. The second argument to this helper function is the percentage of iterations that must pass for the entire test to pass. The maximum range test requires that at least 90% of all iterations pass. Since the range resolution test models two independent targets, it requires that at least 80% of all test iterations are successful.
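The aggregation logic described above can be sketched as follows. This is an illustrative Python sketch of the pass/fail criterion, not the actual helperslexRadarArchitectureTestCriteria implementation:

```python
# Illustrative sketch of the custom criteria logic: an iteration
# passes when the number of detection threshold crossings equals
# the number of targets, and the whole test passes when a required
# fraction of the Monte Carlo iterations passed.

def iteration_passed(num_threshold_crossings, num_targets):
    """One Monte Carlo iteration passes when every target, and
    nothing else, crossed the detection threshold."""
    return num_threshold_crossings == num_targets

def suite_passed(iteration_results, required_pass_fraction):
    """The test passes when at least the required fraction of all
    iterations passed (0.9 for the maximum range test, 0.8 for the
    range resolution test)."""
    passed = sum(iteration_results)  # True counts as 1
    return passed / len(iteration_results) >= required_pass_fraction

# Example: a single-target test where 9 of 10 iterations detected
# the target meets the 90% criterion.
crossings = [1, 1, 1, 1, 0, 1, 1, 1, 1, 1]
results = [iteration_passed(n, 1) for n in crossings]
print(suite_passed(results, 0.9))  # True
```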

Open this test suite in Test Manager.


After adding the tests and linking them to the requirements, the status of the requirements in Requirements Editor indicates that the verification has been added but the tests have not yet been executed.

Now the tests can be launched. After running both test suites, we can inspect the results of each individual iteration using Data Inspector. The custom criteria helper function also prints the status of each iteration to the command window.

Since both tests passed, Requirements Editor now shows that both requirements have been implemented and verified.

Revised Requirements

It is common that during a design process the initial requirements are revised and changed. In this example we assume that the maximum range has been increased to 8000 m and the range resolution has been set to 35 m. The updated requirements are:

  • R1: The radar must detect a Swerling 1 Case target with a radar cross section (RCS) of 1 m2 at the range of 8000 m with a probability of detection of 0.9 and the probability of false alarm of 1e-6;

  • R2: When returns are detected from two Swerling 1 Case targets separated in range by 35 m, with the same azimuth, the radar must resolve the two targets and generate two unique target reports 80 percent of the time;
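As a sanity check on R1, the single-pulse detection probability of a Swerling 1 target has the well-known closed form Pd = Pfa^(1/(1+SNR)), so the SNR required by R1 can be computed directly. This is an illustrative Python check, independent of the Simulink model:

```python
import math

# Single-pulse Swerling 1 detection: Pd = Pfa ** (1 / (1 + SNR)),
# with SNR as a linear power ratio. Solving for the SNR required
# by R1 (Pd = 0.9, Pfa = 1e-6):
Pd, Pfa = 0.9, 1e-6
snr = math.log(Pfa) / math.log(Pd) - 1  # linear ratio
snr_db = 10 * math.log10(snr)
print(f"Required single-pulse SNR: {snr_db:.1f} dB")  # about 21.1 dB
```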

Making changes to requirements in Requirements Editor will generate change issues and highlight the Summary status of the corresponding requirement in red. The links to the components that implement the changed requirement and to the tests that verify it are also highlighted. This way it is easy to identify which components of the design and which tests need to be updated in order to address the changes in the requirement and to test them.

The effects of changes in the requirements or in the component implementations can also be conveniently monitored through the requirements Traceability Matrix.

Updated System Parameters

The new maximum range requirement is beyond the current unambiguous range of the system, which equals 7494.8 m. To satisfy the new requirement we need to increase the unambiguous range. This can be accomplished by lowering the PRF. Setting the PRF to 16 kHz will result in an unambiguous range of 9368.5 m, which is well beyond the required maximum range of 8000 m.
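These unambiguous range figures follow directly from R_ua = c/(2·PRF). A quick Python check, assuming the original PRF is the 20 kHz implied by the stated 7494.8 m:

```python
# Unambiguous range R_ua = c / (2 * PRF): an echo must return
# before the next pulse is transmitted to be ranged unambiguously.
c = 299792458.0  # speed of light, m/s

for prf in (20e3, 16e3):  # assumed original PRF and lowered PRF, Hz
    r_ua = c / (2 * prf)
    print(f"PRF = {prf/1e3:.0f} kHz -> unambiguous range = {r_ua:.1f} m")
```

Lowering the PRF from 20 kHz to 16 kHz moves the unambiguous range from 7494.8 m to 9368.5 m, matching the values in the text.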

Since the current radar design transmits unmodulated rectangular pulses, the resolution limit of the system is determined by the pulse width. The current range resolution limit is 60 m. The new requirement of 35 m is almost a factor of two smaller. A rectangular pulse that satisfies this requirement would have to be almost twice as short, halving the energy available at the same range. The requirement analysis using the Radar Designer app shows that such a system would not be able to reach the required detection performance at the maximum range of 8000 m. To achieve the required maximum range and range resolution without increasing the peak transmitted power or the antenna gain, we need to adopt a new waveform with a time-bandwidth product larger than 1. Setting the pulse width to 1 μs and the bandwidth to 5 MHz will provide the desired resolution.

Open this design in Radar Designer app.


The Pulse Waveform Analyzer app can be used to select a radar waveform from several alternatives. In this example we use the LFM waveform.


Revised Design

A convenient way to modify the behavior of a system's components is to add an alternative design by creating a variant. This is done by right-clicking the component and selecting Add Variant Choice. We add a variant to Waveform Generator and define Simulink behavior for it that implements the LFM waveform generation.

The Linear FM block is configured such that the pulse width is set to the new value of 1 μs, the sweep bandwidth is set to 5 MHz, and the PRF property is set to the updated PRF value of 16 kHz. Now we can run the model with the LFM waveform.

% Set the model parameters

% Update the model parameters to use the LFM waveform

simOut = sim('slexRadarArchitectureExample.slx');
data = simOut.logsout{1}.Values.Data;

plot(range_gates, data(numel(range_gates)+1:end));
xlabel('Range (m)');
ylabel('Power (W)');
title('Signal Processor Output');
grid on;

Figure: Signal Processor Output.

Updated Tests

Before verifying that the radar system with LFM can satisfy the updated requirements, we need to make corresponding modifications to the tests by updating the target positions.

  • Set the target range in the maximum range test to 8000 m

  • Change target ranges in the range resolution test so the targets are positioned 35 m from each other

After updating the tests, we need to clear all change issues in Requirements Editor. This is done by clicking Show Links in the Requirements tab, then selecting the links and clicking the Clear All button in the Change Information section of the Details panel on the right. When the issues have been cleared, the tests can be launched. The new design will pass the updated tests and verify that the system satisfies the updated requirements, confirming the predictions made by the Radar Designer app.


This example is the second part of a two-part series on how to design and test a radar system in Simulink based on a set of performance requirements. It showed how to use Simulink Test to test the model developed in Part 1, how to link the test to the requirements, and how to verify that the requirements are satisfied by running Monte Carlo simulations. The example also illustrated how to trace changes in the requirements to the corresponding components and how to create alternative designs by adding variants to the model. Part 1 of this example started with the requirements that must be satisfied by the final design. It used Simulink System Composer to develop an architecture model of a radar system that can serve as a virtual test bed. Part 1 then showed how to use Simulink Requirements to link the requirements to the components and how to implement the individual components of the architecture using Simulink.