
Validate Sensitivity Analysis

You can validate sensitivity analysis by checking generated parameter values, evaluation results, and analysis results.

Inspect the Generated Parameter Set

To perform sensitivity analysis, you select model parameters for evaluation and generate a representative set of parameter values to explore the design space. You create the parameter set by specifying parameter distributions, such as normal or uniform, and you can also specify correlations between parameters. For more information, see Generate Parameter Samples for Sensitivity Analysis. After generating the parameter values, plot them to check whether the generated values match the specified distributions and correlations. This check is particularly important if you generate a small number of random samples for each parameter set.
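
For example, at the command line you can construct and sample a parameter space along the following lines. This is a minimal sketch: the sdoHydraulicCylinder example model and its Ac and K parameters stand in for your own model and parameters, the uniform distribution is an assumed specification (and requires Statistics and Machine Learning Toolbox for makedist), and the RankCorrelation property is an assumption about the sdo.ParameterSpace interface.

    % Select model parameters to explore. The sdoHydraulicCylinder example model
    % and the parameters Ac and K are placeholders; substitute your own.
    open_system('sdoHydraulicCylinder')
    p = sdo.getParameterFromModel('sdoHydraulicCylinder',{'Ac','K'});

    % Define the parameter space. The uniform distribution below is an assumed
    % specification (makedist requires Statistics and Machine Learning Toolbox).
    ps = sdo.ParameterSpace(p);
    ps = setDistribution(ps,'Ac',makedist('Uniform','lower',1e-3,'upper',2e-3));

    % Optionally specify a rank correlation between the parameters
    % (RankCorrelation is an assumed property name; check your release).
    ps.RankCorrelation = [1 0.6; 0.6 1];

    % Generate a representative set of parameter values (100 samples).
    x = sdo.sample(ps,100);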

If you see a discrepancy between the generated parameter values and the specified distributions and correlations, try one of the following:

  • Generate the random samples again until you achieve the specified distributions and correlations.

  • Increase the sample size at the expense of increasing the evaluation time.

  • Specify a different sampling method. Use the Latin hypercube sampling method for a more systematic space-filling approach than random sampling. If you have Statistics and Machine Learning Toolbox™ software, use the Sobol or Halton quasirandom sampling method for even better space-filling coverage than Latin hypercube sampling. A command-line sketch follows this list.
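
For example, assuming ps is the sdo.ParameterSpace object from the earlier sketch, you might switch sampling methods roughly as follows. The Options.Method property and the 'lhs' and 'sobol' values are assumptions about the interface; check the sdo.ParameterSpace and sdo.sample documentation for your release.

    % Assumed property: use Latin hypercube sampling instead of random sampling.
    ps.Options.Method = 'lhs';
    xLhs = sdo.sample(ps,100);

    % Assumed option value: use a quasirandom Sobol sequence for better
    % space-filling coverage (requires Statistics and Machine Learning Toolbox).
    ps.Options.Method = 'sobol';
    xSobol = sdo.sample(ps,100);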

To plot the generated parameters in the Sensitivity Analyzer:

  1. Select the generated parameter set in the Parameter Sets area of the app.

    Screenshot of the app, with the Parameter Sets area highlighted on the left, the Scatter Plot option highlighted on the Plots tab, and the parameter set table in the main area.

  2. In the Plots tab, select Scatter Plot.

    The generated plot displays a histogram of the generated values for each parameter on the diagonal and pairwise scatter plots of the parameters on the off-diagonals. For more information about the scatter plot, see Interact with Plots in the Sensitivity Analyzer.

    Scatter plot for ParamSet

  3. Inspect the histograms to ensure that the generated parameter values match the intended parameter distributions. Inspect the off-diagonal scatter plots to ensure that any specified correlations between parameters are present.

To plot the generated parameter values at the command line, use sdo.scatterPlot. Use functions such as mean to check the sample statistics.
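
For example, assuming x is the table of samples returned by sdo.sample in the earlier sketch, a minimal command-line check might look like this:

    % Pairwise scatter plots with per-parameter histograms on the diagonal.
    sdo.scatterPlot(x)

    % Compare sample statistics against the intended specifications.
    mean(x{:,:})    % sample mean of each parameter
    std(x{:,:})     % sample standard deviation of each parameter
    corr(x{:,:})    % pairwise correlations (requires Statistics and Machine Learning Toolbox)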

Check Evaluation Results

After generating a parameter set, you define a cost function by creating a design requirement on the model signals. You then evaluate the cost function at each set of parameter values. To validate the evaluation results, inspect the evaluated cost function values. If the cost function evaluations contain NaN values, that can indicate a problem, such as a simulation or requirement that could not be evaluated for those parameter values.

To check for NaN values in the Sensitivity Analyzer after the evaluation is complete:

  1. Open the evaluation results table if it is not already open.

    In the Results area of the app, right-click the evaluated result and select Open from the context menu.

    Screenshot of the Results area, showing Open selected in the context menu after right-clicking EvalResult.

    In the Evaluation Results table, each row lists a set of parameter values and the corresponding evaluated design requirement cost function values.

  2. Sort the evaluated requirement values in descending order by clicking the evaluated requirement column header twice. Any NaN values are listed at the top of the column.

    Screenshot of the EvalResult table, with the SignalMatching column at the end of the table selected.

  3. Inspect the parameter values that resulted in NaN values for the evaluated requirements. If you do not expect a NaN result for that row of parameter values, investigate your model further.

To view the evaluated results at the command line, inspect the cost function evaluation output of sdo.evaluate.
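
For example, assuming p and x are the parameters and sample table from the earlier sketches, and fcn is a handle to your own cost function (the design function name below is a hypothetical placeholder), you can evaluate and screen for NaN values along these lines:

    % fcn maps a set of parameter values to the evaluated requirement value(s).
    % sdoHydraulicCylinder_design is a placeholder for your own design function.
    fcn = @(pValues) sdoHydraulicCylinder_design(pValues);

    % Evaluate the cost function for every row of the sample table x.
    y = sdo.evaluate(fcn,p,x);

    % Screen for NaN values (assuming y is returned as a table; use y directly if numeric).
    iNaN = any(isnan(y{:,:}),2);

    % Inspect the parameter values that produced NaN evaluations.
    x(iNaN,:)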

Perform Sensitivity Analysis with a Different Parameter Set

After evaluation, you analyze the effect of the parameters on the design requirements and identify the most influential parameters. For more information, see Analyze Relation Between Parameters and Design Requirements. To validate the analysis results, generate a different parameter set and reevaluate the design requirements. If the analysis results are not consistent across parameter sets, consider increasing the number of samples in your parameter set.
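
One way to cross-check the analysis at the command line is to draw a second, independent sample set, reevaluate the requirement, and compare the analysis results, roughly as in the following sketch (reusing ps, p, fcn, x, and y from the earlier sketches):

    % Analyze the original evaluation results.
    s1 = sdo.analyze(x,y);

    % Draw a second, independent sample set and reevaluate the requirement.
    x2 = sdo.sample(ps,100);
    y2 = sdo.evaluate(fcn,p,x2);
    s2 = sdo.analyze(x2,y2);

    % Compare s1 and s2. If the parameter rankings differ noticeably between
    % the two analyses, increase the number of samples and repeat the analysis.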
