generateReport

Generate report file containing metric results

Since R2021a

    Description

    reportFile = generateReport(metricEngine) creates a PDF report in the root folder of the project for the metric results in the Model Testing Dashboard for the metric engine, metricEngine. Before you generate the report, collect metric results for the metric engine by using the execute function.

    reportFile = generateReport(metricEngine,'App','DashboardApp','Dashboard',dashboardIdentifier) creates a PDF report in the root folder of the project for the metric results in the dashboard specified by dashboardIdentifier for the metric engine, metricEngine. Before you generate the report, collect metric results for the engine by using the execute function.

    Note that there is also a function generateReport (Fixed-Point Designer) in the Fixed-Point Designer™ documentation.

    reportFile = generateReport(___,Name,Value) specifies options using one or more name-value arguments. For example, 'Type','html-file' generates an HTML report.
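
    For instance, a minimal sketch of the calling forms (this sketch assumes a metric.Engine object named metricEngine for which you already collected results with the execute function):

    % Default PDF report for the metric results in the Model Testing Dashboard
    reportFile = generateReport(metricEngine);
    % Report for a specific dashboard
    reportFile = generateReport(metricEngine,'App','DashboardApp', ...
        'Dashboard','ModelUnitTesting');
    % Same report generated as an HTML file
    reportFile = generateReport(metricEngine,'App','DashboardApp', ...
        'Dashboard','ModelUnitTesting','Type','html-file');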

    Examples

    Generate Report for Model Testing Results

    Analyze the testing artifacts in a project and generate a report file that contains the results.

    Open the project that you want to analyze. For this example, in the MATLAB® Command Window, enter:

    openExample("slcheck/ExploreTestingMetricDataInModelTestingDashboardExample");
    openProject("cc_CruiseControl");

    Create a metric.Engine object for the project.

    metric_engine = metric.Engine();

    Update the trace information for metric_engine to ensure that the artifact information is up to date.

    updateArtifacts(metric_engine)

    Create a list of the available metric identifiers for the Model Testing Dashboard by specifying the dashboard identifier as 'ModelUnitTesting'.

    metric_ids = getAvailableMetricIds(metric_engine, ...
        'App','DashboardApp', ...
        'Dashboard','ModelUnitTesting');

    Collect the results by executing the metric engine on the list of metric identifiers.

    execute(metric_engine, metric_ids);

    Generate a PDF report of the model testing results in the root folder of the project.

    generateReport(metric_engine,'App','DashboardApp', ...
        'Dashboard','ModelUnitTesting');

    The report opens automatically. To prevent the report from opening automatically, specify 'LaunchReport' as false when you call the generateReport function.
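
    For example, this variant of the call above generates the same report without opening it:

    generateReport(metric_engine,'App','DashboardApp', ...
        'Dashboard','ModelUnitTesting','LaunchReport',false);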

    For each unit in the report, an artifact summary table shows the number of requirements, design, and test artifacts associated with that unit.

    Generate Report for Model Maintainability Results

    Analyze the maintainability of artifacts in a project and generate a report file that contains the results.

    Open the project that you want to analyze. For this example, in the MATLAB Command Window, enter:

    openExample("slcheck/ExploreTestingMetricDataInModelTestingDashboardExample");
    openProject("cc_CruiseControl");

    Create a metric.Engine object for the project.

    metric_engine = metric.Engine();

    Update the trace information for metric_engine to ensure that the artifact information is up to date.

    updateArtifacts(metric_engine)

    Create a list of the metric identifiers for the Model Maintainability Dashboard by specifying the dashboard identifier as 'ModelMaintainability'.

    metric_ids = getAvailableMetricIds(metric_engine, ...
        'App','DashboardApp', ...
        'Dashboard','ModelMaintainability');

    Collect the results by executing the metric engine on the list of metric identifiers.

    execute(metric_engine, metric_ids);

    Generate an HTML report named maintainabilityResults.html in the current folder, which is the root folder of the project.

    reportLocation = fullfile(pwd, 'maintainabilityResults.html');
    generateReport(metric_engine,'App','DashboardApp', ...
        'Dashboard','ModelMaintainability', ...
        'Type','html-file','Location',reportLocation);

    The report opens automatically. To prevent the report from opening automatically, specify 'LaunchReport' as false when you call the generateReport function.

    To open the table of contents and navigate to results for each unit, click the menu icon in the top-left corner of the report.

    Input Arguments

    metricEngine
    Metric engine object for which you collected metric results, specified as a metric.Engine object.

    dashboardIdentifier
    Identifier for the dashboard, specified as one of the following:

    • "ModelMaintainability" for the Model Maintainability Dashboard

    • "ModelUnitPILTesting" for the PIL Code Testing dashboard

    • "ModelUnitSILTesting" for the SIL Code Testing dashboard

    • "ModelUnitTesting" for the Model Testing Dashboard

    Example: "ModelMaintainability"

    Data Types: char | string
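
    For example, a sketch that generates a report for the SIL Code Testing dashboard (assuming you already collected results for that dashboard with the execute function):

    generateReport(metric_engine,'App','DashboardApp', ...
        'Dashboard','ModelUnitSILTesting');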

    Name-Value Arguments

    Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

    Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

    Example: 'Type','html-file'
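
    For example, these calls are equivalent (the name=value form is available in R2021a and later):

    generateReport(metric_engine,'Type','html-file');
    generateReport(metric_engine,Type="html-file");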

    LaunchReport
    Option to automatically open the generated report, specified as true or false. By default, the report opens automatically.

    Example: false

    Data Types: logical

    Location
    Full file name for the generated report, specified as a character vector or string scalar. Use 'Location' to specify a custom folder and file name for the report.

    By default, the report name is the dashboardIdentifier value, followed by an underscore and the project name (for example, ModelUnitTesting_cc_CruiseControl.pdf), and the report is generated in the root folder of the project.

    Example: 'C:\MyProject\Reports\RBTResults.html'

    Type
    File type for the generated report, specified as 'pdf' (default) or 'html-file'.

    Example: 'html-file'

    Output Arguments

    reportFile
    Full file name of the generated report, returned as a character vector.
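
    For example, a sketch that captures the returned file name and opens the report later (assumes results were collected as in the examples above):

    reportFile = generateReport(metric_engine,'App','DashboardApp', ...
        'Dashboard','ModelUnitTesting','LaunchReport',false);
    open(reportFile)  % open the generated report from the returned path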

    Alternative Functionality

    App

    You can use the dashboard user interface to generate a report.

    To open the dashboard user interface, use one of these approaches:

    • To open the Model Maintainability Dashboard, in the Command Window, enter:

      modelDesignDashboard

    • To open the Model Testing Dashboard, in the Command Window, enter:

      modelTestingDashboard

    In the dashboard toolstrip, click the Report button. In the Create Metric Result Report dialog box, click Create to generate the report.

    For an example of how to use the dashboard user interface, see Monitor Design Complexity Using Model Maintainability Dashboard.

    Version History

    Introduced in R2021a