Reduce Memory Usage and Time Taken by Polyspace Analysis

Issue

The verification is stuck at a certain point for a long time. Sometimes, after the period of inactivity exceeds an internal threshold, the verification stops or you get an error message:

The analysis has been stopped by timeout.

For large projects with several hundreds of source files or millions of lines of code, you might run into the same issue in another way. The verification stops with the error message:

Fatal error: Not enough memory

If you have a multicore system with more than four processors, try increasing the number of processors used by specifying the option -max-processes. By default, the verification uses up to four processors. If you have fewer than four processors, the verification uses the maximum available number. Allow at least 4 GB of RAM per processor. For instance, if your machine has 16 GB of RAM, do not use this option to specify more than four processors.

In the log file of a verification, Polyspace® generates estimates of maximum memory usage for different stages of the verification. For example, the log file shows messages such as this one:

Maximum Memory Usage: 573 MB

These estimates can help you assess whether additional memory can speed up your verification.

If the verification still takes too long, try one of the solutions below to speed it up.

Possible Cause: Temporary Folder on Network Drive

Polyspace produces some temporary files during analysis. If the folder used to store these files is on a network drive, the analysis can slow down.

Solution: Change Temporary Folder

Change your temporary folder to a path on a local drive.

To learn how Polyspace determines the temporary folder location, see Storage of Temporary Files During Polyspace Analysis.

Possible Cause: Anti-Virus Software

In some cases, anti-virus software checks can noticeably slow down a Polyspace analysis. This slowdown occurs because the anti-virus software scans the temporary files that the Polyspace analysis produces.

Solution: Configure Exceptions for Polyspace Processes

Check the running processes to see whether an anti-virus program is consuming a large amount of memory.

See Fix Errors or Slow Polyspace Runs from Disk Defragmentation and Anti-virus Software.

Possible Cause: Large and Complex Application

The verification time depends on the size and complexity of your code.

If the application contains more than 100,000 lines of code, the verification can sometimes take a long time. Even for smaller applications, the verification can take a long time if the code involves complexities such as structures with many levels of nesting or several levels of aliasing through pointers. You can see the number of lines excluding comments near the beginning of the analysis log in your results folder. Search for the string:

Number of lines without comments
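For example, you could search the log with grep. The log file name below is a stand-in (a sample file created here with one representative line); substitute the actual log file in your results folder:

```shell
# Stand-in for a real Polyspace analysis log: one representative line.
printf 'Number of lines without comments : 12500\n' > sample_log.txt

# Search the log the same way you would search the real one.
grep "Number of lines without comments" sample_log.txt
```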

However, if verification with the default options takes unreasonably long or stops altogether, there are multiple strategies to reduce the verification time. Each strategy involves reducing the complexity of verification in some way.

Solution: Use Polyspace Bug Finder First

Use Polyspace Bug Finder™ first to find defects in your code. Some defects that Polyspace Bug Finder finds can translate to a red error in Polyspace Code Prover™. Once you fix these defects, use Polyspace Code Prover for a more rigorous verification.

Solution: Modularize Application

You can divide the application into multiple modules. Verify each module independently of the other modules. You can review the complete results for one module while the verification of the other modules is still running.

When you divide your complete application into modules, each module has some information missing. For instance, one module can contain a call to a function that is defined in another module. The software makes certain assumptions about the undefined functions. If the assumptions are broader than the actual behavior of the function, you see an increase in orange checks from overapproximation. For instance, an error management function might return an int value that is either 0 or 1. However, when Polyspace cannot find the function definition, it assumes that the function returns any value allowed for an int variable. You can narrow down the assumptions by specifying external constraints.

When modularizing an application manually, you can follow your own modularization approach. For instance, you can copy only the critical files that you are concerned about into one module, and verify them. You can represent the remaining files through external constraints, provided you are confident that the constraints represent the missing code faithfully. For instance, the constraints on an undefined function represent the function faithfully if they represent the function return value and also reproduce other relevant side effects of the function. To specify external constraints, use the option Constraint setup (-data-range-specifications).

Solution: Choose Lower Precision Level or Verification Level

If your verification takes too long, use a lower precision level or a lower verification level. Fix the red errors found at that level and rerun verification.

  • The precision level determines the algorithm used for verification. Higher precision leads to a greater number of proven results but also requires more verification time. For more information, see Precision level (-O0 | -O1 | -O2 | -O3).

  • The verification level determines the number of times Polyspace runs on your source code. For more information, see Verification level (-to).

The verification results from lower precision can contain more orange checks. An orange check indicates that the analysis considers an operation suspect but cannot prove the presence or absence of a run-time error. You have to review an orange check thoroughly to determine if you can retain the operation. By increasing the number of orange checks, you are effectively increasing the time you spend reviewing the verification results. Therefore, use these strategies only if the verification is taking too long.

Solution: Reduce Code Complexity

Both for better readability of your code and for shorter verification time, you can reduce the complexity of your code. Polyspace calculates code complexity metrics from your application, and allows you to limit those metrics below predefined values.


Solution: Compute Global Variable Sharing and Usage Only

Run a less extensive Code Prover analysis on your application that computes global variable sharing and usage only. You can then run a full analysis component-by-component for run-time error detection. See Show global variable sharing and usage only (-shared-variables-mode).

Solution: Enable Approximations

Depending on your situation, you can choose scaling options that enable certain approximations. Often, warning messages during the analysis indicate that you can use those options to reduce the verification time.

  • Situation: Your code contains structures that are many levels deep.
    Option: Depth of verification inside structures (-k-limiting)

  • Situation: The verification log contains suggestions to inline certain functions.
    Option: Inline (-inline)

Solution: Remove Parts of Code

You can try omitting code from third-party libraries. The software uses stubs for functions that are not defined in the files specified for the Polyspace analysis.

Although the analysis time is reduced, you can see an increase in orange checks because of Polyspace assumptions about stubbed functions. See Code Prover Assumptions About Stubbed Functions.

Possible Cause: Too Many Entry Points for Multitasking Applications

If your code is intended for multitasking and you specify many entry points with the option Tasks, verification can take a long time. The following warning can appear:

Warning: Polyspace detected that the application has x tasks and y critical sections.
         The precision of the analysis will be reduced to avoid scaling issues.
         Check if you can reduce the number of tasks and critical sections in the configuration.

If you receive this warning, Polyspace is switching to a less precise analysis mode to complete the verification in a reasonable amount of time. In this less precise mode, the verification can consider some shared variables as full-range, causing orange checks from overapproximation.

Solution

See if you can reduce the number of entry points that you specify. If you know that some of your entry point functions do not execute concurrently, you do not have to specify them as separate entry points. You can call those functions sequentially in a wrapper function, and then specify the wrapper function as your entry point.

For instance, if you know that the entry point functions task1, task2, and task3 do not execute concurrently:

  1. Define a wrapper function task that calls task1, task2, and task3 in all possible sequences.

    void task() {
        volatile int random = 0;
        if (random) {
            task1();
            task2();
            task3();
        } else if (random) {
            task1();
            task3();
            task2();
        } else if (random) {
            task2();
            task1();
            task3();
        } else if (random) {
            task2();
            task3();
            task1();
        } else if (random) {
            task3();
            task1();
            task2();
        } else {
            task3();
            task2();
            task1();
        }
    }
    

  2. Instead of task1, task2, and task3, specify task for the option Tasks (-entry-points).

For an example of using a wrapper function as an entry point, see Configuring Polyspace Multitasking Analysis Manually.
