
padv.builtin.task.RunTestsPerTestCase Class

Namespace: padv.builtin.task
Superclasses: padv.Task

Task for running each test case using Simulink Test

Description

This class requires CI/CD Automation for Simulink Check.

The padv.builtin.task.RunTestsPerTestCase class provides a task that can run each test case using Simulink® Test™.

You can add the task to your process model by using the method addTask. After you add the task to your process model, you can run the task from the Process Advisor app or by using the function runprocess. The task runs each test case individually, and certain tests can generate code. You can control whether Simulink Test or the MATLAB® Unit Test framework executes the test cases by using the task property UseMATLABUnitTest.
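For example, after you add the task to your process model, you can run just this task programmatically. The following is a minimal sketch; it assumes the task keeps its default name in the process model and uses the Tasks name-value argument of runprocess:

% Run only the per-test-case test task for the current project
% (the task name below assumes the default; adjust it if you rename the task)
runprocess(Tasks = "padv.builtin.task.RunTestsPerTestCase")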

The Process Advisor app shows the names of both the test cases and the associated models under Run Tests in the Tasks column. If you only want to see the model names, use the padv.builtin.task.RunTestsPerModel task instead.

To generate a consolidated test results report and a merged coverage report for your model, you can use the built-in task padv.builtin.task.MergeTestResults.

To view the source code for this built-in task, in the MATLAB Command Window, enter:

open padv.builtin.task.RunTestsPerTestCase

The padv.builtin.task.RunTestsPerTestCase class is a handle class.

Note

Since this task runs each test case individually, the task only executes test-case level callbacks. The task does not execute test-file level callbacks or test-suite level callbacks.

Creation

Description

task = padv.builtin.task.RunTestsPerTestCase() creates a task for running test cases using Simulink Test.


task = padv.builtin.task.RunTestsPerTestCase(Name=Value) sets certain properties using one or more name-value arguments. For example, task = padv.builtin.task.RunTestsPerTestCase(Name = "MyRunTestsTask") creates a task with the specified name.

You can use this syntax to set property values for Name, InputQueries, IterationQuery, InputDependencyQuery, Licenses, LaunchToolAction, and LaunchToolText.

The padv.builtin.task.RunTestsPerTestCase class also has other properties, but you cannot set those properties during task creation.
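For example, this sketch creates the task with a custom name and a narrower iteration query. Both Name and IterationQuery appear in the list of properties you can set at creation, and FindTestCasesForModel is the built-in query used later in the Examples section:

% Sketch: create the task with a custom name and iterate only over
% the test cases that the query finds for models in the project
task = padv.builtin.task.RunTestsPerTestCase( ...
    Name = "MyRunTestsTask", ...
    IterationQuery = padv.builtin.query.FindTestCasesForModel());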

Properties


The RunTestsPerTestCase class inherits properties from padv.Task. The properties listed in Specialized Inherited Properties are padv.Task properties that the RunTestsPerTestCase task overrides.

The task also has properties for specifying Test Execution Options.

Specialized Inherited Properties

Unique identifier for task in process, specified as a string.

Example: "MyRunTestsTask"

Data Types: string

Human-readable name that appears in Process Advisor app, specified as a string.

Example: "My Run Tests Task"

Data Types: string

Task description, specified as a string.

When you point to a task in Process Advisor and click the information icon, the tooltip shows the task description.

Example: "This task uses Simulink Test to run the test cases associated with your model. The task runs the test cases on a test-by-test basis. Certain tests may generate code."

Data Types: string

Path to task documentation, specified as a string.

When you point to a task in Process Advisor, click the ellipsis (...), and click Help, Process Advisor opens the task documentation.

Example: fullfile(pwd,"taskHelpFiles","myTaskDocumentation.pdf")

Data Types: string

Type of artifact, specified as one or more of the values listed in this table. To specify multiple values, use an array.

Category            Artifact Type                    Description

MATLAB              "m_class"                        MATLAB class
                    "m_file"                         MATLAB file
                    "m_func"                         MATLAB function
                    "m_method"                       MATLAB class method
                    "m_property"                     MATLAB class property

Model Advisor       "ma_config_file"                 Model Advisor configuration file
                    "ma_justification_file"          Model Advisor justification file

Other               "coder_code_files"               Code files
                    "other_file"                     Other file

Process Advisor     "padv_dep_artifacts"             Related artifacts that the current artifact depends on
                    "padv_output_file"               Process Advisor output file

Project             "project"                        Current project file

Requirements        "mwreq_item"                     Requirement (since R2024b)
                    "sl_req"                         Requirement (R2024a and earlier)
                    "sl_req_file"                    Requirement file
                    "sl_req_table"                   Requirements Table

Stateflow®          "sf_chart"                       Stateflow chart
                    "sf_graphical_fcn"               Stateflow graphical function
                    "sf_group"                       Stateflow group
                    "sf_state"                       Stateflow state
                    "sf_state_transition_chart"      Stateflow state transition chart
                    "sf_truth_table"                 Stateflow truth table

Simulink            "sl_block_diagram"               Block diagram
                    "sl_data_dictionary_file"        Data dictionary file
                    "sl_embedded_matlab_fcn"         MATLAB function
                    "sl_library_file"                Library file
                    "sl_model_file"                  Simulink model file
                    "sl_protected_model_file"        Protected Simulink model file
                    "sl_subsystem"                   Subsystem
                    "sl_subsystem_file"              Subsystem file

System Composer™    "zc_block_diagram"               System Composer architecture
                    "zc_component"                   System Composer architecture component
                    "zc_file"                        System Composer architecture file

Tests               "harness_info_file"              Harness info file
                    "sl_harness_block_diagram"       Harness block diagram
                    "sl_harness_file"                Test harness file
                    "sl_test_case"                   Simulink Test case
                    "sl_test_case_result"            Simulink Test case result
                    "sl_test_file"                   Simulink Test file
                    "sl_test_iteration"              Simulink Test iteration
                    "sl_test_iteration_result"       Simulink Test iteration result
                    "sl_test_report_file"            Simulink Test result report
                    "sl_test_result_file"            Simulink Test result file
                    "sl_test_resultset"              Simulink Test result set
                    "sl_test_seq"                    Test Sequence
                    "sl_test_suite"                  Simulink Test suite
                    "sl_test_suite_result"           Simulink Test suite result

Example: "sl_test_case"

Example: ["sl_test_case" "other_file"]

Query that finds the artifacts that the task iterates over, specified as a padv.Query object or the name of a padv.Query object. When you specify IterationQuery, the task runs one time for each artifact returned by the query. In the Process Advisor app, the artifacts returned by IterationQuery appear under the task title.

For more information about task iterations, see Overview of Process Model.

Query that finds artifact dependencies for task inputs, specified as a padv.Query object or the name of a padv.Query object.

The build system runs the query specified by InputDependencyQuery to find the dependencies for the task inputs, because those dependencies can affect whether task results are up to date. For more information, see Overview of Process Model.

Example: padv.builtin.query.GetDependentArtifacts

List of licenses that the task requires, specified as a string.

Data Types: string

Function that launches a tool, specified as a function handle.

When you point to a task in the Process Advisor app, you can click the ellipsis (...) to see more options. For built-in tasks, you have the option to launch a tool associated with the task.

For the task RunTestsPerTestCase, you can launch Simulink Test Manager.

Data Types: function_handle

Description of the action that the LaunchToolAction property performs, specified as a string.

Data Types: string

Type of CI-compatible result files that the task itself generates when skipped, specified as either:

  • "JUnit" — JUnit-style XML report for task results.

  • "" — None. The build system generates a JUnit-style XML report for the task instead.

Type of CI-compatible result files that the task itself generates when run, specified as either:

  • "JUnit" — JUnit-style XML report for task results.

  • "" — None. The build system generates a JUnit-style XML report for the task instead.

Inputs to the task, specified as:

  • a padv.Query object

  • the name of a padv.Query object

  • an array of padv.Query objects

  • an array of names of padv.Query objects

By default, the task RunTestsPerTestCase gets the current test case that the task is iterating over by using the built-in query padv.builtin.query.GetIterationArtifact.

Location for standard task outputs, specified as a string.

The built-in tasks use tokens, like $DEFAULTOUTPUTDIR$, as placeholders for dynamic path resolution during run-time. For more information, see Dynamically Resolve Paths with Tokens.

Data Types: string
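For example, you can keep this task's outputs in a dedicated subfolder. This sketch assumes the property is the OutputDirectory property inherited from padv.Task and reuses the $DEFAULTOUTPUTDIR$ token described above:

% Sketch: redirect standard task outputs to a subfolder; the build system
% resolves the token at run time
task.OutputDirectory = fullfile("$DEFAULTOUTPUTDIR$", "per_test_case_results");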

Test Execution Options

Name of test result file, specified as a string.

The built-in tasks use tokens, like $ITERATIONARTIFACT$, as placeholders for dynamic path resolution during run-time. For more information, see Dynamically Resolve Paths with Tokens.

Data Types: string

Since R2023a

Simulation mode for running tests, specified as "Normal", "Accelerator", "Rapid Accelerator", "Software-in-the-Loop", or "Processor-in-the-Loop".

By default, the property is empty (""), which means the built-in task uses the simulation mode that you define in the test itself. If you specify a value other than "", the built-in task overrides the simulation mode set in Simulink Test Manager. You do not need to update the test parameters or settings to run the test in the new mode.

The task ignores this property when your MATLAB release is earlier than R2023a or when the UseMATLABUnitTest property is set to true (1).

Example: "Software-in-the-Loop"

Use the MATLAB Unit Test framework to execute test cases, specified as either:

  • true (1) — The task runs your test cases by using the MATLAB Unit Test framework: the task creates a test runner, creates a suite of tests from your test file, and runs the tests. If you use the pipeline generator, padv.pipeline.generatePipeline, and your pipeline generator options specify the GenerateJUnitForProcess property as true (1), the task uses the MATLAB Unit Test XML plugin to produce JUnit-style XML format test results that integrate into CI platforms.

  • false (0) — The task runs your test cases by using Simulink Test. Starting in R2023a, if you specify the task property SimulationMode, the task overrides the test simulation mode without requiring changes to the test definition.

Example: true

Data Types: logical
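For example, the following sketch contrasts the two execution paths using the SimulationMode and UseMATLABUnitTest properties described in this section; which option fits your process depends on your tests:

% Option 1: keep Simulink Test as the runner and override the simulation mode
task.UseMATLABUnitTest = false;
task.SimulationMode = "Software-in-the-Loop";  % ignored in releases before R2023a

% Option 2: run the same test cases through the MATLAB Unit Test framework
% (the task ignores SimulationMode in this mode)
task.UseMATLABUnitTest = true;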

Methods


Examples


Add a task to your process that can run test cases using Simulink Test.

Open the process model for your project. If you do not have a process model, open the Process Advisor app to automatically create a process model.

In the process model file, add the RunTestsPerTestCase task to your process model by using the addTask method.

runTestsPerTestCaseTask = pm.addTask(padv.builtin.task.RunTestsPerTestCase);

You can reconfigure the task behavior by using the task properties. For example, to specify a different file name for the test results:

runTestsPerTestCaseTask.ResultFileName = "$ITERATIONARTIFACT$_TestResultsFile";

If you want to generate a consolidated test results report and merged coverage reports, you can add the built-in task MergeTestResults to your process. By default, the built-in task MergeTestResults gets the current model and the outputs from the task RunTestsPerTestCase.
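A minimal sketch of that addition, using the same pm process-model object as the earlier call, is shown below. No extra wiring appears here because, as noted above, MergeTestResults gets the outputs of RunTestsPerTestCase by default:

% Consolidate per-test-case results and coverage into merged reports
mergeTestResultsTask = pm.addTask(padv.builtin.task.MergeTestResults);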

If you want the RunTestsPerTestCase task to only run on test cases that have a specific test tag, specify the IterationQuery using the built-in query padv.builtin.query.FindTestCasesForModel and specify the test tag using the Tags argument. For example, to have the task only run on test cases that were tagged with the test tag FeatureA:

runTestsPerTestCaseTask = pm.addTask(padv.builtin.task.RunTestsPerTestCase,...
    IterationQuery = padv.builtin.query.FindTestCasesForModel(Tags="FeatureA"));

Suppose that you only want the RunTestsPerTestCase task to run for tests that use a specific project label.

By default, the RunTestsPerTestCase task in the default process model uses the built-in query padv.builtin.query.FindTestCasesForModel as the IterationQuery. This means that the task runs once for each test case associated with models in the project.

To run the task for tests that use a specific project label, in the process model, you can change the IterationQuery for the task to:

  1. Use the built-in query padv.builtin.query.FindTestCasesForModel to find the test cases for the models in the project

  2. Specify the IncludeLabel argument of the query to only include test cases that use a specific project label. In this example, the project label is ModelTest and the project label category is TestType.

milTask = pm.addTask(padv.builtin.task.RunTestsPerTestCase());

% Specify which set of artifacts to run for
milTask.IterationQuery = ...
    padv.builtin.query.FindTestCasesForModel(...
        IncludeLabel = {'TestType','ModelTest'});

For more information on the built-in queries, see Find Artifacts with Queries. If you need to perform a query that is not already covered by a built-in query, see Create Custom Queries.
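Putting these pieces together, a process model that uses this task might look like the following sketch. The function signature and arguments block follow the standard processmodel.m pattern; the explicit dependsOn call is an assumption about how you might want to order the tasks and can be omitted if your process already establishes that dependency:

function processmodel(pm)
    % Define the project's process model (sketch)
    arguments
        pm padv.ProcessModel
    end

    % Run each test case labeled TestType: ModelTest individually
    runTestsTask = pm.addTask(padv.builtin.task.RunTestsPerTestCase( ...
        IterationQuery = padv.builtin.query.FindTestCasesForModel( ...
            IncludeLabel = {'TestType','ModelTest'})));

    % Merge the per-test-case results and coverage into consolidated reports
    mergeTask = pm.addTask(padv.builtin.task.MergeTestResults);

    % Optional ordering between the two tasks (assumption; see note above)
    mergeTask.dependsOn(runTestsTask);
end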