Metric data for specified metric algorithm and requirements-based testing artifacts
A metric.Result object contains the metric data for a specified metric algorithm and the requirements-based testing artifacts that trace to the specified unit.
metric_result = metric.Result creates a handle to a metric result object.
Alternatively, if you collect results by executing a metric.Engine object, using the getMetrics function on the engine object returns the metric.Result objects in an array.
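For example, a minimal end-to-end sketch of this workflow, assuming a project is already open (the metric identifier 'RequirementsPerTestCase' matches the example later on this page):

metric_engine = metric.Engine();                                % engine for the open project
updateArtifacts(metric_engine)                                  % refresh artifact trace information
execute(metric_engine,'RequirementsPerTestCase');               % collect the metric results
results = getMetrics(metric_engine,'RequirementsPerTestCase');  % array of metric.Result objects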
MetricID — Metric identifier
string
Metric identifier for the metric algorithm that calculated the results, returned as a string.
Artifacts — Testing artifacts
structure | array of structures
Testing artifacts for which the metric is calculated, returned as a structure or an array of structures. For each artifact that the metric analyzed, the returned structure contains these fields (a short access example follows the list):
UUID — Unique identifier of the artifact.
Name — Name of the artifact.
Type — Type of artifact.
ParentUUID — Unique identifier of the file that contains the artifact.
ParentName — Name of the file that contains the artifact.
ParentType — Type of file that contains the artifact.
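For instance, given a metric.Result object stored in a variable named result (a hypothetical name used for illustration), you can read these fields directly:

artifact = result.Artifacts(1);    % first analyzed artifact
disp(artifact.Name)                % for example, a test case name
disp(artifact.ParentName)          % file that contains the artifact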
Value — Result value
integer | string | double vector | structure
Value of the metric result for the specified algorithm and artifacts, returned as an integer, string, double vector, or structure. For a list of metrics and their result values, see Model Testing Metrics.
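Because the data type of Value depends on the metric, convert nontext values before concatenating them into display text. A brief sketch, reusing the hypothetical result variable from above and assuming a scalar numeric result such as a requirements count:

disp(['Value: ',num2str(result.Value)])    % num2str converts a scalar numeric Value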
Scope — Scope of metric results
structure
Scope of the metric results, returned as a structure. The scope is the unit for which the metric collected results. The structure contains these fields (a short access example follows the list):
UUID — Unique identifier of the unit.
Name — Name of the unit.
Type — Type of unit.
ParentUUID — Unique identifier of the file that contains the unit.
ParentName — Name of the file that contains the unit.
ParentType — Type of file that contains the unit.
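For example, to display the unit that a result applies to, again using the hypothetical result variable:

disp(result.Scope.Name)    % name of the unit for which the metric collected results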
UserData — User data
string
User data provided by the metric algorithm, returned as a string.
Collect Metric Data on Testing Artifacts in a Project
Collect metric data on the requirements-based testing artifacts in a project. Then, access the data by using the metric.Result object properties.
Open the project that contains the models and testing artifacts. For this example, at the command line, type dashboardCCProjectStart.
Create a metric.Engine object for the project.
metric_engine = metric.Engine();
Update the trace information for metric_engine to ensure that the artifact information is up to date by using the updateArtifacts function.
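updateArtifacts(metric_engine)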
Collect results for the metric Requirements per test case by using the execute function on the metric.Engine object.
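execute(metric_engine,'RequirementsPerTestCase');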
Use the function getMetrics to access the results. Assign the array of result objects to the variable results.
results = getMetrics(metric_engine,'RequirementsPerTestCase');
Access the metric results data by using the properties of the
metric.Result objects in the array.
for n = 1:length(results)
    disp(['Test Case: ',results(n).Artifacts(1).Name])
    disp(['    Number of Requirements: ',num2str(results(n).Value)])
end
Test Case: Set button
    Number of Requirements: 0
Test Case: Decrement button hold
    Number of Requirements: 1
Test Case: Resume button
    Number of Requirements: 1
Test Case: Cancel button
    Number of Requirements: 1
Test Case: Decrement button short
    Number of Requirements: 2
Test Case: Increment button hold
    Number of Requirements: 1
Test Case: Increment button short
    Number of Requirements: 2
Test Case: Enable button
    Number of Requirements: 1