metric.Result
Description
A metric.Result object contains the results for a specified metric.
Creation
Description
metric_result = metric.Result creates a handle to a metric result object.
Alternatively, if you collect results by executing a metric.Engine object, using the getMetrics function on the engine object returns the collected metric.Result objects in an array.
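As a sketch of the second approach, you can create an engine, execute a metric, and retrieve the result array. This is a minimal outline; the metric identifier shown here is one of the metrics used in the examples below, and any metric identifier supported by your project works in its place.

```matlab
% Sketch: collect metric.Result objects through a metric.Engine object.
% Assumes a project is already open in MATLAB.
metric_engine = metric.Engine();
execute(metric_engine,'slcomp.OverallCyclomaticComplexity');
results = getMetrics(metric_engine,'slcomp.OverallCyclomaticComplexity');
```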
Properties
MetricID
— Metric identifier
string
Metric identifier for the metric that calculated the results, returned as a string.
Example:
'TestCasesPerRequirementDistribution'
Artifacts
— Project artifacts
structure | array of structures
Project artifacts for which the metric is calculated, returned as a structure or an array of structures. For each artifact that the metric analyzed, the returned structure contains these fields:
UUID — Unique identifier of the artifact
Name — Name of the artifact
ParentUUID — Unique identifier of the file that contains the artifact
ParentName — Name of the file that contains the artifact
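For instance, you can loop over the Artifacts structure array of a collected result to list each analyzed artifact and its containing file. This is a sketch; the variable result stands for a metric.Result object you have already collected, for example with getMetrics.

```matlab
% Sketch: list the artifacts a metric result was computed for.
% Assumes "result" is a metric.Result object you already collected.
for k = 1:numel(result.Artifacts)
    art = result.Artifacts(k);
    fprintf('%s (in file %s)\n', art.Name, art.ParentName);
end
```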
Value
— Result value
integer | string | double vector | structure
Value of the metric result for the specified metric and artifacts, returned as an integer, string, double vector, or structure. For a list of model testing metrics and their result values, see Model Testing Metrics.
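Because Value can be an integer, a string, a double vector, or a structure, code that displays results generically may need to branch on the returned type. A minimal sketch, where result again stands for a metric.Result object you have already collected:

```matlab
% Sketch: display a metric result value regardless of its type.
% Assumes "result" is a metric.Result object you already collected.
val = result.Value;
if isnumeric(val)
    disp(num2str(val));     % integer or double vector
elseif isstruct(val)
    disp(fieldnames(val));  % structure: show its fields
else
    disp(val);              % string
end
```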
Scope
— Scope of metric results
structure
Scope of the metric results, returned as a structure. The scope is the unit or component for which the metric collected results. The structure contains these fields:
UUID — Unique identifier of the unit or component
Name — Name of the unit or component
ParentUUID — Unique identifier of the file that contains the unit or component
ParentName — Name of the file that contains the unit or component
UserData
— User data
string
User data provided by the metric algorithm, returned as a string.
Examples
Collect Metric Results on Design Artifacts in a Project
Use a metric.Engine
object to collect metric results
on the design artifacts in a project.
Open a project containing models that you want to analyze. For this example, in the MATLAB® Command Window, enter:
openExample("slcheck/ExploreTestingMetricDataInModelTestingDashboardExample");
openProject("cc_CruiseControl");
Create a metric.Engine
object. You can use the
metric.Engine
object to collect the metric results for the current
project.
metric_engine = metric.Engine();
Collect results for the metric slcomp.OverallCyclomaticComplexity
by
executing the metric engine. For more information on the metric, see Model Maintainability Metrics.
execute(metric_engine,'slcomp.OverallCyclomaticComplexity');
Use the function getMetrics
to access the results. Assign the array of result objects to the results
variable.
results = getMetrics(metric_engine,'slcomp.OverallCyclomaticComplexity');
Access the metric results data by using the properties of the metric.Result
objects in the results
array.
for n = 1:length(results)
    disp(['Model: ',results(n).Scope.Name])
    disp(['   Overall Design Cyclomatic Complexity: ',num2str(results(n).Value)])
end
Model: cc_DriverSwRequest
   Overall Design Cyclomatic Complexity: 9
Model: cc_ThrottleController
   Overall Design Cyclomatic Complexity: 4
Model: cc_ControlMode
   Overall Design Cyclomatic Complexity: 22
Model: cc_CruiseControl
   Overall Design Cyclomatic Complexity: 1
Model: cc_LightControl
   Overall Design Cyclomatic Complexity: 4
For more information on how to collect metrics for design artifacts, see Collect Model Maintainability Metrics Programmatically.
Collect Metric Results on Testing Artifacts in a Project
Collect metric results on the requirements-based testing artifacts in a
project. Then, access the data by using the metric.Result
objects.
Open a project that contains models and testing artifacts. For this example, in the MATLAB Command Window, enter:
openExample("slcheck/ExploreTestingMetricDataInModelTestingDashboardExample");
openProject("cc_CruiseControl");
Create a metric.Engine
object. You can use the
metric.Engine
object to collect metric results for the current
project.
metric_engine = metric.Engine();
Update the trace information for metric_engine
to ensure that the artifact information is up to date.
updateArtifacts(metric_engine)
Collect results for the metric 'RequirementsPerTestCase'
by using the execute
function on the metric.Engine
object.
execute(metric_engine,'RequirementsPerTestCase');
Use the function getMetrics
to access the results. Assign the array of result objects to the results
variable.
results = getMetrics(metric_engine,'RequirementsPerTestCase');
Access the metric results data by using the properties of the metric.Result
objects in the array.
for n = 1:length(results)
    disp(['Test Case: ',results(n).Artifacts(1).Name])
    disp(['   Number of Requirements: ',num2str(results(n).Value)])
end
Version History
Introduced in R2020b
R2023a: Does not return fields Type and ParentType
metric.Result objects do not return the fields Type and ParentType for the properties Artifacts and Scope.