Analyze Code and Perform Software-in-the-Loop Testing

You can analyze your code to detect errors, check for standards compliance, and evaluate key metrics such as length and cyclomatic complexity. When you write code by hand, you typically check for run-time errors by using static code analysis, run test cases that evaluate the code against requirements, and measure code coverage. Based on the results, you refine the code and add tests.

In this example, you generate code and demonstrate that executing the code produces results equivalent to simulating the model, using the same test cases and baseline results. Then you compare the code coverage to the model coverage. Based on the test results, you add tests and modify the model to regenerate the code.

The code analysis and software-in-the-loop testing process. You generate code, analyze the code, and perform testing to verify equivalence with your model. Then you analyze code coverage and report the results if they meet your requirements. Otherwise, you refine your code, tests, and model.

Analyze Code for Defects, Metrics, and MISRA C:2012

First, check that the model produces MISRA™ C:2012 compliant code, and analyze the generated code for code metrics and defects. To produce code that is compliant with MISRA C, you use the Code Generation Advisor and Model Advisor. To check whether the generated code is MISRA compliant, you use the Polyspace® MISRA C:2012 checker.

  1. Open the example project.

    openExample("shared_vnv/CruiseControlVerificationProjectExample");
    pr = openProject("SimulinkVerificationCruise");

  2. Open the simulinkCruiseErrorAndStandardsExample model.

    open_system("simulinkCruiseErrorAndStandardsExample");

    The simulinkCruiseErrorAndStandardsExample model includes the Compute target speed subsystem, which has 5 inputs and 2 outputs.

Run Code Generator Checks

Check your model by using the Code Generation Advisor. Configure the code generation parameters to generate code that is more compliant with MISRA C and more compatible with Polyspace.

  1. Right-click the Compute target speed subsystem and select C/C++ Code > Code Generation Advisor.

  2. In the left pane, select the Code Generation Advisor folder. In the right pane, under Available objectives select Polyspace and click the right arrow. The MISRA C:2012 guidelines objective is already selected.

    Code Generation Objectives dialog box

  3. Click Run Selected Checks.

    The Code Generation Advisor checks whether the model includes blocks or configuration settings that are not recommended for MISRA C:2012 compliance and Polyspace code analysis. For this model, the check for incompatible blocks passes, but some configuration settings are incompatible with MISRA compliance and Polyspace checking.

    Code Generation Advisor results

  4. Click Check model configuration settings against code generation objectives. Accept the parameter changes by selecting Modify Parameters.

  5. To rerun the check, click Run This Check.

Run Model Advisor Checks

Before you generate code from your model, use the Model Advisor to check your model for MISRA C and Polyspace compliance.

  1. At the bottom of the Code Generation Advisor window, click Model Advisor.

  2. In the Model Advisor, under the By Task folder, select Modeling Standards for MISRA C:2012.

  3. Click Run Checks and review the results.

  4. If any of the tasks fail, make the suggested modifications and rerun the checks until the MISRA modeling guidelines pass.

Generate and Analyze Code

After you check the model for compliance, you can generate the code. After you generate the code, you can use Polyspace to check the code for compliance with MISRA C:2012 and generate reports to demonstrate compliance with MISRA C:2012.

  1. In the Simulink® model, right-click the Compute target speed subsystem and click C/C++ Code > Build This Subsystem.

  2. After the code generates, in the Simulink Editor, right-click the Compute target speed subsystem and select Polyspace > Options.

  3. Click Configure to choose more advanced Polyspace analysis options. The Polyspace window opens.

  4. In the left pane, click Coding Standards & Code Metrics, then select Calculate Code Metrics to enable code metric calculations for your generated code. Select Check MISRA C:2012.

  5. Save and close the Polyspace window.

    Polyspace configuration dialog box

  6. In the model, right-click the Compute target speed subsystem and select Polyspace > Verify > Code Generated For Selected Subsystem.

    Polyspace Bug Finder™ analyzes the generated code for a subset of MISRA checks. You can see the progress of the analysis in the MATLAB® Command Window. After the analysis finishes, the Polyspace environment opens.

Review Results

The Polyspace environment shows you the results of the static code analysis. For example, expand the tree for rule 8.7 and click through the results. Rule 8.7 states that functions and objects should not be defined with external linkage if they are referenced in only one translation unit. These results refer to variables that other components also use, such as CruiseOnOff. You can annotate your code or your model to justify every result.

Polyspace Bug Finder dialog box

To configure the analysis to check only a subset of MISRA rules:

  1. In your model, right-click the Compute target speed subsystem and select Polyspace > Options.

  2. In the Configuration Parameters dialog, set Settings from to Project configuration.

  3. Click Apply.

  4. Click Configure.

  5. In the Polyspace window, in the left pane, click Coding Standards & Code Metrics. Then select Check MISRA C:2012 and, from the drop-down list, select single-unit-rules. Now Polyspace checks only the MISRA C:2012 rules that are applicable to a single unit.

  6. Save and close the Polyspace window.

  7. Rerun the analysis with the new configuration. When you limit the rules Polyspace checks to the single-unit subset, Polyspace finds only two violations.

    Polyspace Bug Finder results

Generate Report

To demonstrate compliance with MISRA C:2012 and report on your generated code metrics, you must export your results. If you want to generate a report every time you run an analysis, see Generate report (Polyspace Bug Finder).

  1. If they are not open already, open your results in the Polyspace environment.

  2. From the toolbar, select Reporting > Run Report.

  3. Select BugFinderSummary as your report type.

  4. Click Run Report. The report is saved in the same folder as your results.

  5. To open the report, select Reporting > Open Report.

Test Code Against Model Using Software-in-the-Loop Testing

Next, run the same test cases on the generated code to show that the code produces results equivalent to the original model and fulfills the requirements. Then compare the code coverage to the model coverage to see the extent to which the tests exercised the generated code.

  1. In MATLAB, in the Project pane, in the tests folder, open SILTests.mldatx. The file opens in the Test Manager.

  2. Review the test case. In the Test Browser pane, click SIL Equivalence Test Case. This equivalence test case runs two simulations for the simulinkCruiseErrorAndStandardsExample model using a test harness:

    • Simulation 1 is a model simulation in normal mode.

    • Simulation 2 is a software-in-the-loop (SIL) simulation. For the SIL simulation, the test case runs the code generated from the model instead of running the model.

    The equivalence test logs one output signal and compares the results from the simulations. The test case also collects coverage measurements for both simulations.

  3. To run the equivalence test, select the test case and click Run.

  4. Review the results in the Test Manager. In the Results and Artifacts pane, select SIL Equivalence Test Case. The test case passes and the code produces the same results as the model for this test case.

    Test Manager showing passed test cases and model and code coverage results.

  5. In the right pane, expand the Coverage Results section. The coverage measurements show the extent to which the test case exercises the model and the code.

When you run multiple test cases, you can view aggregated coverage measurements in the results for the whole run. Use the coverage results to add tests and meet coverage requirements, as shown in Perform Functional Testing and Analyze Test Coverage.

You can also test the generated code on your target hardware by running a processor-in-the-loop (PIL) simulation. By adding a PIL simulation to your test cases, you can compare the test results and coverage results from your model to the results from the generated code as it runs on the target hardware. For more information, see Code Verification Through Software-in-the-Loop and Processor-in-the-Loop Execution (Embedded Coder).
