Address Missing Coverage
After you view your model coverage results, you might find that your model does not have 100% coverage. You can address missing coverage in your model.
Add a Test Case to Improve Coverage
In the slvnvdemo_powerwindow model, the And block power_window_control_system/validate_passenger/check_up/allow_action has 75% condition coverage. The true case of the first condition did not occur because, in the Signal Editor block Input at the model root level, the Active scenario parameter is set to Driver.
Change Active scenario to Passenger.
Simulate the model again by clicking Analyze Coverage.
The And block allow_action now receives 100% condition coverage. The first condition was true for 595 time steps and false for 2797 time steps. Additionally, the T1 and T2 links in the condition and MCDC tables point to the tests that satisfied each objective. For example, the true case of the first condition was satisfied by test run 2, T2. Click the link to scroll to the Aggregated Tests section of the report.
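You can also check this result at the command line by querying the condition coverage for the block directly from the coverage data. The following is a minimal sketch; it assumes the variables modelName and covDataRun2 from the Command-Line Information section at the end of this example.
% Sketch: query condition coverage for the And block from a cvdata object
% (assumes modelName and covDataRun2 exist, as in Command-Line Information)
blockPath = [modelName,'/power_window_control_system/validate_passenger/check_up/allow_action'];
[coverage,description] = conditioninfo(covDataRun2,blockPath);  % coverage is [satisfied total]
percentCondition = 100*coverage(1)/coverage(2)                  % percentage of condition outcomes satisfied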
Review Coverage Data in the Coverage Results Explorer
You can also review coverage data by using the Coverage Results Explorer. On the Coverage tab, click Results Explorer. The run data is in the left pane, under Current Cumulative Data. Click Run 1 and Run 2 to compare the coverage results.
Click Current Cumulative Data to see the aggregated results of these two runs.
The aggregated results show higher coverage percentages than either of the two individual test cases because the test cases satisfy different objectives in some blocks.
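You can make the same comparison programmatically. As a minimal sketch, assuming covData and covDataRun2 are the cvdata objects from Run 1 and Run 2 (the variable names used in the Command-Line Information section), cvhtml can generate a single HTML report that covers both runs:
% Sketch: report on both runs together (assumes covData and covDataRun2
% are the cvdata objects from Run 1 and Run 2)
cvhtml('slvnvdemo_powerwindow_aggregated',covData,covDataRun2)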
Filter Coverage Outcomes
If you analyze the coverage report and find missing coverage that you cannot address by changing the model or a test case, you can filter the missing outcomes so that they are not reported as missing coverage. Some possible reasons you might want to filter coverage outcomes include:
A block is tested by a different test suite, and is not applicable to the current coverage analysis.
A block is intended to catch edge cases that you think should not occur anyway. This type of model design is sometimes called defensive coding.
There are two types of coverage filters:
An exclusion filter rule can be applied to a model element, and causes that element to be ignored by coverage analysis. The excluded model elements appear dimmed in the highlighted model, like other elements that are not applicable to the metrics you selected.
A justification filter rule can be applied to a coverage outcome that is not satisfied. This filter rule allows Simulink® Coverage™ to analyze the rest of the model element but does not report the justified outcome as missing coverage. This filter rule allows you to improve your coverage for a model object without excluding it entirely.
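At the command line, both kinds of rules are built with the slcoverage.FilterRule class, which takes a selector, a rationale, and an optional filter mode. The sketch below is illustrative only; the block path reuses the allow_action block from this example, and the rationale strings are placeholders.
% Sketch: the two filter rule types (path and rationales are illustrative)
blockPath = [modelName,'/power_window_control_system/validate_passenger/check_up/allow_action'];
% Exclusion rule: ignore the whole block during coverage analysis
blockSel = slcoverage.BlockSelector(slcoverage.BlockSelectorType.BlockInstance,blockPath);
exclRule = slcoverage.FilterRule(blockSel,'Tested elsewhere',slcoverage.FilterMode.Exclude);
% Justification rule: analyze the block, but do not report this MCDC outcome as missing
metricSel = slcoverage.MetricSelector(slcoverage.MetricSelectorType.MCDCOutcome,blockPath,1,1);
justRule = slcoverage.FilterRule(metricSel,'Defensive coding');  % mode defaults to Justify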
Suppose that the And block condition 1 MCDC outcome has been tested by a different test suite and is not applicable for this case. You can justify the outcome so that it is not reported as missing coverage.
Click the And block allow_action to scroll to the relevant section in the coverage report. The MCDC condition C1 (allow_action In1) is incomplete because the TF case did not occur. To justify the C1 (allow_action In1) MCDC outcome, click the Add justification rule icon.
The Coverage Results Explorer opens the Filter Editor pane with a new untitled filter file. The filter file contains a justification rule for the specified outcome. You can add multiple filter rules to the same filter file.
In the Name field, enter slvnvdemo_powerwindow_filter. Under Filter Rules, double-click the Rationale field and enter Tested in a different test suite. Click Apply, then save the file. The model and coverage report automatically update to indicate that the outcome is justified.
In the Coverage Details pane, the justified outcome is highlighted in cyan and links to the justification rationale. Clicking J1 brings you to the section of the report titled Objects Filtered from Coverage Analysis. This section of the report only appears if you apply one or more filters to the coverage data.
Command-Line Information
To programmatically add a test and aggregate coverage, enter:
% Set the Signal Editor block to the Passenger scenario, rerun the
% simulation, and aggregate the new coverage with the previous run
blockPath = [modelName,'/Input'];
set_param(blockPath,'ActiveScenario','Passenger')
simOut2 = sim(simIn);
covDataRun2 = simOut2.covData;
cvmodelview(covDataRun2);
aggregatedCovData = covData + covDataRun2;
To programmatically filter coverage outcomes, enter:
% Create a filter file with a justification rule for the C1 MCDC outcome,
% then apply the filter to the aggregated coverage data
filt = slcoverage.Filter;
setFilterName(filt,'slvnvdemo_powerwindow_filter');
blockPath = [modelName,'/power_window_control_system/validate_passenger/check_up/allow_action'];
sel = slcoverage.MetricSelector(slcoverage.MetricSelectorType.MCDCOutcome,blockPath,1,1);
rule = slcoverage.FilterRule(sel,'Tested in a different test suite');
addRule(filt,rule);
save(filt,'slvnvdemo_powerwindow_filter')
topModelCovData = get(aggregatedCovData,modelName);
topModelCovData.filter = 'slvnvdemo_powerwindow_filter';
cvmodelview(aggregatedCovData);