# Banana Function Minimization

$f\left(x\right)=100\left(x\left(2\right)-x\left(1\right)^{2}\right)^{2}+\left(1-x\left(1\right)\right)^{2}.$

$f\left(x\right)$ is called the banana function because of its curvature around the origin. It is notorious in optimization examples because of the slow convergence most methods exhibit when trying to solve this problem.

$f\left(x\right)$ has a unique minimum at the point $x = \left[1,1\right]$, where $f\left(x\right) = 0$. This example shows several ways to minimize $f\left(x\right)$ starting from the point $x0 = \left[-1.9,2\right]$.
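As a quick numerical sanity check, here is a short Python sketch (the example itself is MATLAB; the function name `banana` is ours) evaluating the function at the minimum and at the start point:

```python
# Rosenbrock "banana" function: f(x) = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2
def banana(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

print(banana([1.0, 1.0]))    # 0.0 at the unique minimum x = [1, 1]
print(banana([-1.9, 2.0]))   # ~267.62 at the start point x0 = [-1.9, 2]
```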

### Optimization Without Derivatives

The `fminsearch` function finds a minimum for a problem without constraints. It uses an algorithm that does not estimate any derivatives of the objective function. Rather, it uses a geometric search method, described in fminsearch Algorithm.

```
fun = @(x)(100*(x(2) - x(1)^2)^2 + (1 - x(1))^2);
options = optimset('OutputFcn',@bananaout,'Display','off');
x0 = [-1.9,2];
[x,fval,eflag,output] = fminsearch(fun,x0,options);
title 'Rosenbrock solution via fminsearch'
```

```
Fcount = output.funcCount;
disp(['Number of function evaluations for fminsearch was ',num2str(Fcount)])
```
```Number of function evaluations for fminsearch was 210 ```
```
disp(['Number of solver iterations for fminsearch was ',num2str(output.iterations)])
```
```Number of solver iterations for fminsearch was 114 ```

### Optimization with Estimated Derivatives

The `fminunc` function finds a minimum for a problem without constraints. It uses a derivative-based algorithm. The algorithm attempts to estimate not only the first derivative of the objective function, but also the matrix of second derivatives. `fminunc` is usually more efficient than `fminsearch`.

```
options = optimoptions('fminunc','Display','off',...
    'OutputFcn',@bananaout,'Algorithm','quasi-newton');
[x,fval,eflag,output] = fminunc(fun,x0,options);
title 'Rosenbrock solution via fminunc'
```

```
Fcount = output.funcCount;
disp(['Number of function evaluations for fminunc was ',num2str(Fcount)])
```
```Number of function evaluations for fminunc was 150 ```
```
disp(['Number of solver iterations for fminunc was ',num2str(output.iterations)])
```
```Number of solver iterations for fminunc was 34 ```

### Optimization with Steepest Descent

Setting the `HessUpdate` option to `'steepdesc'` causes `fminunc` to take steepest-descent steps instead of quasi-Newton steps. The high curvature of the banana function makes this approach slow; the run below stops after reaching the limit of 600 function evaluations.

```
options = optimoptions(options,'HessUpdate','steepdesc',...
    'MaxFunctionEvaluations',600);
[x,fval,eflag,output] = fminunc(fun,x0,options);
title 'Rosenbrock solution via steepest descent'
```
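As an illustrative aside (a Python sketch, not the `fminunc` implementation), a hand-rolled steepest-descent loop with a crude backtracking step shows the same slow progress on this function:

```python
# Hand-rolled steepest descent on the banana function (illustrative sketch).
def fun(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def grad(x):
    return [-400.0 * (x[1] - x[0]**2) * x[0] - 2.0 * (1.0 - x[0]),
            200.0 * (x[1] - x[0]**2)]

x = [-1.9, 2.0]
for _ in range(100):
    g = grad(x)
    t = 1.0
    # crude backtracking: halve the step until the function value decreases
    while t > 1e-12:
        xn = [x[0] - t * g[0], x[1] - t * g[1]]
        if fun(xn) < fun(x):
            x = xn
            break
        t *= 0.5

print(fun(x))  # decreased from ~267.62, but typically still well above 0
```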

```
Fcount = output.funcCount;
disp(['Number of function evaluations for steepest descent was ',...
    num2str(Fcount)])
```
```Number of function evaluations for steepest descent was 600 ```
```
disp(['Number of solver iterations for steepest descent was ',...
    num2str(output.iterations)])
```
```Number of solver iterations for steepest descent was 45 ```

### Optimization with Analytic Gradient

```
grad = @(x)[-400*(x(2) - x(1)^2)*x(1) - 2*(1 - x(1));
            200*(x(2) - x(1)^2)];
fungrad = @(x)deal(fun(x),grad(x));
options = resetoptions(options,{'HessUpdate','MaxFunctionEvaluations'});
options = optimoptions(options,'SpecifyObjectiveGradient',true,...
    'Algorithm','trust-region');
[x,fval,eflag,output] = fminunc(fungrad,x0,options);
title 'Rosenbrock solution via fminunc with gradient'
```
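The analytic gradient formula can be cross-checked against central finite differences. This Python sketch (the helper names `fd_grad` etc. are ours, not part of the example) reproduces the same expressions:

```python
# Cross-check the analytic gradient of the banana function
# against central finite differences.
def fun(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def grad(x):
    # Same formula as the MATLAB anonymous function `grad`
    return [-400.0 * (x[1] - x[0]**2) * x[0] - 2.0 * (1.0 - x[0]),
            200.0 * (x[1] - x[0]**2)]

def fd_grad(x, h=1e-6):
    g = []
    for i in range(2):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((fun(xp) - fun(xm)) / (2.0 * h))
    return g

x0 = [-1.9, 2.0]
print(grad(x0))     # analytic: approximately [-1229.4, -322.0]
print(fd_grad(x0))  # finite differences agree closely
```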

```
Fcount = output.funcCount;
disp(['Number of function evaluations for fminunc with gradient was ',...
    num2str(Fcount)])
```
```Number of function evaluations for fminunc with gradient was 32 ```
```
disp(['Number of solver iterations for fminunc with gradient was ',...
    num2str(output.iterations)])
```
```Number of solver iterations for fminunc with gradient was 31 ```

### Optimization with Analytic Hessian

```
hess = @(x)[1200*x(1)^2 - 400*x(2) + 2, -400*x(1);
            -400*x(1), 200];
fungradhess = @(x)deal(fun(x),grad(x),hess(x));
options.HessianFcn = 'objective';
[x,fval,eflag,output] = fminunc(fungradhess,x0,options);
title 'Rosenbrock solution via fminunc with Hessian'
```
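Likewise, the analytic Hessian entries can be verified by differencing the analytic gradient. A Python sketch (helper names ours):

```python
# Cross-check the analytic Hessian of the banana function against
# central finite differences of the analytic gradient.
def grad(x):
    return [-400.0 * (x[1] - x[0]**2) * x[0] - 2.0 * (1.0 - x[0]),
            200.0 * (x[1] - x[0]**2)]

def hess(x):
    # Same entries as the MATLAB anonymous function `hess`
    return [[1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
            [-400.0 * x[0], 200.0]]

def fd_hess(x, h=1e-6):
    # Row i holds d(grad)/dx_i; the Hessian is symmetric, so this matches.
    H = []
    for i in range(2):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        gp, gm = grad(xp), grad(xm)
        H.append([(gp[j] - gm[j]) / (2.0 * h) for j in range(2)])
    return H

x0 = [-1.9, 2.0]
print(hess(x0))     # analytic: approximately [[3534, 760], [760, 200]]
print(fd_hess(x0))  # finite differences agree closely
```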

```
Fcount = output.funcCount;
disp(['Number of function evaluations for fminunc with gradient and Hessian was ',...
    num2str(Fcount)])
```
```Number of function evaluations for fminunc with gradient and Hessian was 32 ```
```
disp(['Number of solver iterations for fminunc with gradient and Hessian was ',num2str(output.iterations)])
```
```Number of solver iterations for fminunc with gradient and Hessian was 31 ```

### Optimization with a Least Squares Solver

The banana function is itself a sum of squares, since $100\left(x\left(2\right)-x\left(1\right)^{2}\right)^{2} = \left(10\left(x\left(2\right)-x\left(1\right)^{2}\right)\right)^{2}$. Expressing it as a vector of the two residuals $10\left(x\left(2\right)-x\left(1\right)^{2}\right)$ and $1-x\left(1\right)$ lets `lsqnonlin` exploit this structure.

```
options = optimoptions('lsqnonlin','Display','off','OutputFcn',@bananaout);
vfun = @(x)[10*(x(2) - x(1)^2),1 - x(1)];
[x,resnorm,residual,eflag,output] = lsqnonlin(vfun,x0,[],[],options);
title 'Rosenbrock solution via lsqnonlin'
```
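The sum-of-squares reformulation can be sanity-checked numerically. A Python sketch (function names ours) confirms the scalar objective equals the squared norm of the residual vector:

```python
# The banana function equals the sum of squares of the two residuals
# passed to the least-squares solver: [10*(x(2) - x(1)^2), 1 - x(1)].
def fun(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def vfun(x):
    return [10.0 * (x[1] - x[0]**2), 1.0 - x[0]]

x0 = [-1.9, 2.0]
print(fun(x0), sum(r * r for r in vfun(x0)))  # both ~267.62
```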

```
Fcount = output.funcCount;
disp(['Number of function evaluations for lsqnonlin was ',...
    num2str(Fcount)])
```
```Number of function evaluations for lsqnonlin was 87 ```
```
disp(['Number of solver iterations for lsqnonlin was ',num2str(output.iterations)])
```
```Number of solver iterations for lsqnonlin was 28 ```

### Optimization with a Least Squares Solver and Jacobian

```
jac = @(x)[-20*x(1),10;
           -1,0];
vfunjac = @(x)deal(vfun(x),jac(x));
options.SpecifyObjectiveGradient = true;
[x,resnorm,residual,eflag,output] = lsqnonlin(vfunjac,x0,[],[],options);
title 'Rosenbrock solution via lsqnonlin with Jacobian'
```
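The Jacobian entries can be verified the same way as the gradient and Hessian. A Python sketch (helper names ours) compares the analytic Jacobian of the residual vector against central finite differences:

```python
# Cross-check the analytic Jacobian of the residual vector
# against central finite differences.
def vfun(x):
    return [10.0 * (x[1] - x[0]**2), 1.0 - x[0]]

def jac(x):
    # Same entries as the MATLAB anonymous function `jac`
    return [[-20.0 * x[0], 10.0],
            [-1.0, 0.0]]

def fd_jac(x, h=1e-6):
    J = [[0.0, 0.0], [0.0, 0.0]]
    for j in range(2):
        xp, xm = list(x), list(x)
        xp[j] += h
        xm[j] -= h
        fp, fm = vfun(xp), vfun(xm)
        for i in range(2):
            J[i][j] = (fp[i] - fm[i]) / (2.0 * h)
    return J

x0 = [-1.9, 2.0]
print(jac(x0))     # analytic: [[38.0, 10.0], [-1.0, 0.0]]
print(fd_jac(x0))  # finite differences agree closely
```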

```
Fcount = output.funcCount;
disp(['Number of function evaluations for lsqnonlin with Jacobian was ',...
    num2str(Fcount)])
```
```Number of function evaluations for lsqnonlin with Jacobian was 29 ```
```
disp(['Number of solver iterations for lsqnonlin with Jacobian was ',...
    num2str(output.iterations)])
```
```Number of solver iterations for lsqnonlin with Jacobian was 28 ```