getMetrics

Access metric data for model testing artifacts

Description

results = getMetrics(metricEngine,metricIDs) returns the metric results that the metric.Engine object metricEngine collected for the metrics specified by metricIDs. To collect metric results, use the execute function. Then, to access the results, use the getMetrics function.

results = getMetrics(metricEngine,metricIDs,'ArtifactScope',scope) returns metric results for the artifacts in the specified scope. For example, you can specify scope to be a single design unit in your project, such as a Simulink® model or an entire model reference hierarchy. A unit is a functional entity in your software architecture that you can execute and test independently or as part of larger system tests.
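As a sketch of the ArtifactScope syntax, the scope is a cell array containing the full path to a file and the identifier of the object inside that file (for a unit model, the model file path and the block diagram name). The model path and name below are placeholders, not files shipped with this example:

```matlab
% Hypothetical sketch: restrict metric results to a single unit model.
unitPath = fullfile(pwd,'myUnitModel.slx');  % full path to the model file
scope = {unitPath,'myUnitModel'};            % {file path, block diagram name}

results = getMetrics(metricEngine,'OperatorCount', ...
    'ArtifactScope',scope);
```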

Examples

Use a metric.Engine object to collect design cost metric data on a model reference hierarchy in a project.

To open the project, use this command.

dashboardCCProjectStart

The project contains db_Controller, which is the top-level model in a model reference hierarchy. This model reference hierarchy represents one design unit.

Create a metric.Engine object.

metric_engine = metric.Engine();

Update the trace information for metric_engine to reflect any pending artifact changes.

updateArtifacts(metric_engine)

Create an array of metric identifiers for the metrics you want to collect. For this example, create a list of all available design cost estimation metrics.

metric_Ids = getAvailableMetricIds(metric_engine,...
    'App','DesignCostEstimation')
metric_Ids = 

  1×2 string array

    "DataSegmentEstimate"    "OperatorCount"

To collect results, execute the metric engine.

execute(metric_engine,metric_Ids);

Because you executed the engine without the ArtifactScope argument, the engine collects metrics for the entire db_Controller model reference hierarchy.

Use the getMetrics function to access the high-level design cost metric results.

results_OpCount = getMetrics(metric_engine,'OperatorCount');
results_DataSegmentEstimate = getMetrics(metric_engine,'DataSegmentEstimate');

disp(['Unit:  ', results_OpCount.Artifacts.Name])
disp(['Total Cost:  ', num2str(results_OpCount.Value)])

disp(['Unit:  ', results_DataSegmentEstimate.Artifacts.Name])
disp(['Data Segment Size (bytes):  ', num2str(results_DataSegmentEstimate.Value)])
Unit:  db_Controller
Total Cost:  334

Unit:  db_Controller
Data Segment Size (bytes):  151

The results show that for the db_Controller model, the estimated total cost of the design is 334 and the estimated data segment size is 151 bytes.

Use the generateReport function to access detailed metric results in a PDF report. Name the report 'MetricResultsReport.pdf'.

reportLocation = fullfile(pwd,'MetricResultsReport.pdf');
generateReport(metric_engine,...
    'App','DesignCostEstimation',...
    'Type','pdf',...
    'Location',reportLocation);

The report contains a detailed breakdown of the operator count and data segment estimate metric results.

Figure: Table of contents for the generated report.

Input Arguments

metricEngine — Metric engine

Metric engine object for which to access metric results, specified as a metric.Engine object.

metricIDs — Metric identifiers

Metric identifiers for metrics to access, specified as a character vector or cell array of character vectors. For a list of design cost metrics and their identifiers, see Design Cost Model Metrics. For a list of requirements-based model testing metrics and their identifiers, see Model Testing Metrics (Simulink Check).

Example: 'DataSegmentEstimate'

Example: {'DataSegmentEstimate', 'OperatorCount'}

scope — Artifact scope

Path and identifier of the project file for which to get metric results, specified as a cell array of character vectors. The first entry is the full path to a project file. The second entry is the identifier of the object inside the project file.

For a unit model, the first entry is the full path to the model file. The second entry is the name of the block diagram.

Example: {'C:\work\MyModel.slx', 'MyModel'}

Output Arguments

results — Metric results

Metric results, returned as an array of metric.Result objects.
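When getMetrics returns more than one result, you can loop over the array. This sketch assumes results was obtained as in the example above and uses only the Value and Artifacts properties shown there:

```matlab
% Sketch: display each metric.Result in an array returned by getMetrics.
for n = 1:numel(results)
    fprintf('%s: %s\n', results(n).Artifacts(1).Name, ...
        num2str(results(n).Value));
end
```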

Version History

Introduced in R2022a