
Assess Requirements-Based Testing Quality by Using the Model Testing Dashboard

You can assess the status of your model testing activities by using the metrics in the Model Testing Dashboard. When you test your models against requirements, you maintain traceability between the requirements, models, test cases, and results. The dashboard helps you to track the status of these artifacts and the traceability relationships between them. Each metric in the dashboard measures a different aspect of the quality of the testing artifacts and reflects guidelines in industry-recognized software development standards, such as ISO 26262 and DO-178C. For more information, see Assess the Completeness of Requirements-Based Testing in Accordance with ISO 26262.

From the dashboard, you can identify and fix testing issues. Update the dashboard metrics to track your progress toward testing compliance.

Open the Project and Model Testing Dashboard

The Model Testing Dashboard shows data on the traceability and testing status of each unit in your project. A unit is a functional entity in your software architecture that you can execute and test independently or as part of larger system tests. Because you use models to design and test your algorithms, the dashboard represents each unit by a model. You can label models as units in the Model Testing Dashboard. If you do not specify which models are units, the dashboard considers a model to be a unit if it does not reference other models.
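If you want to preview which models the default rule would classify as units, you can approximate the check at the MATLAB command line. The following is a minimal sketch that uses the standard functions currentProject, load_system, and find_mdlrefs; the dashboard can also account for models that you explicitly label as units, so treat this only as an illustration of the default rule.

    % Approximate the default unit rule: a model that does not reference
    % other models is treated as a unit.
    prj = currentProject;                              % requires an open project
    filePaths = [prj.Files.Path];
    modelPaths = filePaths(endsWith(filePaths, [".slx" ".mdl"]));

    for p = modelPaths
        [~, mdl] = fileparts(p);
        load_system(p);                                % load without opening the editor
        refs = find_mdlrefs(mdl);                      % referenced models, plus mdl itself
        if numel(refs) == 1
            fprintf('%s has no referenced models (unit by default)\n', mdl);
        end
    end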

  1. Open the project that contains the models and testing artifacts. For this example, at the MATLAB® command line, enter dashboardCCProjectStart('incomplete'):

    dashboardCCProjectStart('incomplete')

  2. Open the Model Testing Dashboard by using one of these approaches:

    • On the Project tab, click Model Testing Dashboard.

    • At the MATLAB command line, enter modelTestingDashboard.

  3. The first time that you open the dashboard for the project, the dashboard must identify the artifacts in the project and collect traceability information.

    Dashboard identifying and tracing models, requirements, tests, and test results

    The dashboard performs an initial traceability analysis and collects metric results for the metrics available in your installation. Collecting results for all MathWorks® metrics requires licenses for Simulink® Check™, Requirements Toolbox™, and Simulink Test™. If metric results have been collected, viewing the results requires only a Simulink Check license.

Model Testing Dashboard showing results for unit db_DriverSwRequest

The dashboard analyzes the traceability links from the artifacts to the models in the project and populates the widgets with metric results for the unit that is selected in the Artifacts panel.
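You can also collect these metric results programmatically instead of waiting for the dashboard to collect them interactively. The following is a minimal sketch, assuming the metric API that ships with Simulink Check (metric.Engine, getAvailableMetricIds, and execute); if your release does not provide getAvailableMetricIds, pass the metric identifiers documented for your installation instead.

    % Collect metric results for the project that is currently open.
    metricEngine = metric.Engine();

    ids = getAvailableMetricIds(metricEngine);         % metric identifiers in this release
    execute(metricEngine, ids);                        % collect results for those metrics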

Assess Traceability of Artifacts

When the dashboard collects and reports metric data, it scopes the results to the artifacts in one unit in the project. Use the Artifacts panel to view each unit in the project, represented by the name of its model, and the artifacts that trace to it.

  1. In the Artifacts panel, click the unit db_DriverSwRequest. The dashboard widgets populate with metric data from the artifacts in this unit.

  2. In the Artifacts panel, expand the section for the unit by clicking the arrow to the left of db_DriverSwRequest. The expanded section shows the artifacts of each type that trace to the unit.

  3. Expand the Functional Requirements section and then expand the Implemented and Upstream sections. This unit implements the requirements in the file db_SoftwareReqs.slreqx and links to the upstream, system-level requirements in db_SystemReqs.slreqx. Click the arrow to the left of a file name to see the individual requirements that trace to the model.

You can explore the units and sections in the Artifacts panel to see which requirements, test cases, and test results trace to each unit in the project. For more information on how the dashboard analyzes this traceability, see Trace Artifacts to Units for Model Testing Analysis.
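You can complement the Artifacts panel with a command-line summary of requirement status. The following is a minimal sketch, assuming Requirements Toolbox and the db_SoftwareReqs.slreqx file from this example; the status query functions shown here and the fields they return can vary by release, so confirm them in your installation.

    % Summarize implementation and verification status for the unit requirements.
    reqSet = slreq.load('db_SoftwareReqs.slreqx');

    updateImplementationStatus(reqSet);                % refresh link-based status
    disp(getImplementationStatus(reqSet))              % counts of implemented requirements

    updateVerificationStatus(reqSet);
    disp(getVerificationStatus(reqSet))                % counts of verified requirements

    slreq.clear;                                       % unload requirements data when done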

Explore Metric Results for a Unit

  1. In the Artifacts panel, click the unit db_DriverSwRequest to view the Model Testing results. The dashboard widgets populate with the metric results for the unit.

  2. In the Test Case Analysis section of the dashboard, locate the Tests with Requirements widget. To view tooltips with details about the results, point to the sections of the gauge or to the percentage result.

    Tests with Requirements widget with tooltip indicating the number of tests that have links to requirements

  3. To explore the metric data in more detail, click an individual metric widget. For example, click the Tests with Requirements widget to view the Metric Details for the metric.

    Table in Metric Details with test cases and statuses of whether each test case is linked to requirements

    The table shows each test case for the unit, the test file containing each test case, and whether the test case is linked to requirements.

  4. The table shows that the test case Detect long decrement is not linked to any requirements. To open the test case in the Test Manager, in the Artifact column, click Detect long decrement.

  5. At the top of the Model Testing Dashboard, there is a breadcrumb trail from the Metric Details back to the Model Testing results.

    Breadcrumb trail buttons at the top of the dashboard tab

    Click the breadcrumb button for db_DriverSwRequest to return to the Model Testing results for the unit.

You can click any widget in the dashboard to view details about its metric results. Use the hyperlinks in the tables to open the artifacts and address testing gaps. For more information on using the data in the dashboard, see Explore Status and Quality of Testing Activities Using the Model Testing Dashboard.
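You can also query the collected metric results programmatically rather than clicking through each widget. Below is a minimal sketch, assuming the Simulink Check metric API and that results were already collected (by the dashboard or by an earlier call to execute); the MetricID and Value property names are assumed from the metric.Result interface, so confirm them in your release.

    % Query collected results for metrics whose identifiers mention requirements.
    metricEngine = metric.Engine();

    ids = getAvailableMetricIds(metricEngine);
    reqIds = ids(contains(ids, "Requirement"));        % for example, tests-with-requirements metrics

    results = getMetrics(metricEngine, reqIds);        % previously collected results
    for k = 1:numel(results)
        fprintf('%s:\n', results(k).MetricID);
        disp(results(k).Value)
    end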

Track Testing Status of a Project Using the Model Testing Dashboard

To use the Model Testing Dashboard to track your testing activities, set up and maintain your project using the best practices described in Manage Requirements-Based Testing Artifacts for Analysis in the Model Testing Dashboard. As you develop and test your models, use the dashboard to identify testing gaps, fix the underlying artifacts, and track your progress towards model testing completion. For more information on finding and addressing gaps in your model testing, see Fix Requirements-Based Testing Issues.
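To archive a snapshot of your progress that you can compare over time or share with reviewers, you can generate a report from the collected metrics. The following is a minimal sketch, assuming the generateReport function of the Simulink Check metric API and its 'html-file' report type; the file name ModelTestingReport.html is only an example.

    % Generate an HTML report of the current model testing metric results.
    metricEngine = metric.Engine();
    execute(metricEngine, getAvailableMetricIds(metricEngine));     % refresh results

    reportPath = generateReport(metricEngine, ...
        'Type', 'html-file', ...
        'Location', fullfile(pwd, 'ModelTestingReport.html'));      % example file name
    web(reportPath)                                                  % open the report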
