What Is Nonlinear Regression?
Nonlinear regression is a statistical technique for describing nonlinear relationships in experimental data. Nonlinear regression models are generally parametric: the model is expressed as an equation that is nonlinear in its parameters. For nonparametric nonlinear regression, machine learning methods are typically used instead.
Parametric nonlinear regression models the dependent variable (also called the response) as a function of a combination of nonlinear parameters and one or more independent variables (called predictors). The model can be univariate (single response variable) or multivariate (multiple response variables).
The model function can take an exponential, trigonometric, power, or any other nonlinear form. Because the parameters enter the model nonlinearly, the parameter estimates are typically computed with an iterative algorithm. The general model is
\[y=f(X,\beta)+\epsilon\]
where \(\beta\) represents the nonlinear parameters to be estimated and \(\epsilon\) represents the error term.
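To make the roles of response, predictor, parameters, and error concrete, here is a minimal simulation sketch in Python/NumPy (the exponential model \(y=\beta_1 e^{\beta_2 x}+\epsilon\) and all numeric values are hypothetical illustrations, not from this article):

```python
import numpy as np

# Hypothetical model that is nonlinear in its parameters:
#   y = beta1 * exp(beta2 * x) + epsilon
rng = np.random.default_rng(0)
beta_true = np.array([2.0, 0.5])              # illustrative "true" parameters
x = np.linspace(0.0, 4.0, 50)                 # independent variable (predictor)
f = beta_true[0] * np.exp(beta_true[1] * x)   # f(X, beta)
epsilon = rng.normal(scale=0.1, size=x.size)  # additive error term
y = f + epsilon                               # observed response
```

In a fitting problem, only `x` and `y` are observed; the task is to recover estimates of `beta_true` from them.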
Popular algorithms for fitting a nonlinear regression include:
- Gauss-Newton algorithm
- Gradient descent algorithm
- Levenberg-Marquardt algorithm
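To illustrate how these algorithms operate, the sketch below implements a minimal Levenberg-Marquardt loop in Python/NumPy for a hypothetical exponential model. It blends Gauss-Newton with gradient descent via an adaptive damping factor; this is a simplified illustration, not the implementation any particular toolbox uses.

```python
import numpy as np

def levenberg_marquardt(x, y, beta0, n_iter=50, lam=1e-2):
    """Fit y ~ beta[0] * exp(beta[1] * x) with a minimal Levenberg-Marquardt loop.

    The damping factor lam interpolates between Gauss-Newton (lam -> 0)
    and gradient descent (large lam): it is decreased when a step lowers
    the sum of squared residuals and increased when it does not.
    """
    beta = np.asarray(beta0, dtype=float)

    def resid(b):
        return y - b[0] * np.exp(b[1] * x)

    r = resid(beta)
    sse = r @ r
    for _ in range(n_iter):
        e = np.exp(beta[1] * x)
        J = np.column_stack([e, beta[0] * x * e])   # Jacobian of the model
        A = J.T @ J
        # Damped normal equations (Marquardt's diagonal scaling)
        step = np.linalg.solve(A + lam * np.diag(np.diag(A)), J.T @ r)
        r_new = resid(beta + step)
        sse_new = r_new @ r_new
        if sse_new < sse:    # accept the step, trust the Gauss-Newton part more
            beta, r, sse, lam = beta + step, r_new, sse_new, lam * 0.5
        else:                # reject the step, damp harder
            lam *= 2.0
    return beta

# Demo on simulated data (illustrative values)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 50)
y = 2.0 * np.exp(0.5 * x) + rng.normal(scale=0.1, size=x.size)
beta = levenberg_marquardt(x, y, [1.0, 0.3])
```

The accept/reject step is what makes Levenberg-Marquardt more robust than plain Gauss-Newton, which can diverge from a poor starting guess on models like this one.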
For these and other functions for parametric regression, as well as for stepwise, robust, univariate, and multivariate regression, see Statistics and Machine Learning Toolbox™. With it you can:
- Fit a nonlinear model to data and compare different models
- Generate predictions
- Evaluate parameter confidence intervals
- Evaluate goodness-of-fit
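The workflow in the list above can be sketched end to end in plain Python/NumPy (a hedged illustration using a hypothetical exponential model, not the MATLAB toolbox workflow itself): fit the model, form parameter confidence intervals from the estimated covariance, evaluate goodness-of-fit, and generate a prediction.

```python
import numpy as np

def model(x, beta):
    # Hypothetical exponential model: f(x, beta) = beta[0] * exp(beta[1] * x)
    return beta[0] * np.exp(beta[1] * x)

def jacobian(x, beta):
    e = np.exp(beta[1] * x)
    return np.column_stack([e, beta[0] * x * e])

# Simulated data (illustrative values, not from the article)
rng = np.random.default_rng(1)
x = np.linspace(0.0, 4.0, 60)
y = model(x, [2.0, 0.5]) + rng.normal(scale=0.1, size=x.size)

# Starting values from a log-linear fit, then Gauss-Newton refinement
slope, intercept = np.polyfit(x, np.log(y), 1)
beta = np.array([np.exp(intercept), slope])
for _ in range(30):
    J = jacobian(x, beta)
    step, *_ = np.linalg.lstsq(J, y - model(x, beta), rcond=None)
    beta = beta + step

# Parameter covariance: s^2 * inv(J'J), with s^2 the residual variance
J = jacobian(x, beta)
resid = y - model(x, beta)
s2 = (resid @ resid) / (x.size - beta.size)
se = np.sqrt(np.diag(s2 * np.linalg.inv(J.T @ J)))

# Approximate 95% confidence intervals (normal quantile 1.96; a t
# quantile is the usual choice for small samples)
ci = np.column_stack([beta - 1.96 * se, beta + 1.96 * se])

# Goodness of fit (R-squared) and a prediction at a new point
r2 = 1.0 - (resid @ resid) / np.sum((y - y.mean()) ** 2)
y_new = model(np.array([5.0]), beta)
```

Comparing competing models would repeat this fit for each candidate and contrast their goodness-of-fit measures on held-out or penalized terms.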
For nonparametric models using machine learning techniques such as neural networks, decision trees, and ensemble learning, see Deep Learning Toolbox™ and Statistics and Machine Learning Toolbox™.
To create a model that fits curves, surfaces, and splines to data, see Curve Fitting Toolbox™.
See also: machine learning, linear regression, data fitting, data analysis, mathematical modeling, What is linear regression?, Machine Learning Models, Time Series Regression, MANOVA