# Minimization with Gradient and Hessian Sparsity Pattern

The objective function is

$$f(x)=\sum_{i=1}^{n-1}\left(\left(x_i^2\right)^{x_{i+1}^2+1}+\left(x_{i+1}^2\right)^{x_i^2+1}\right).$$
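Each summand couples only the neighboring variables $x_i$ and $x_{i+1}$, which is why the Hessian is tridiagonal. Differentiating the $i$-th summand gives the gradient contributions used by the helper function at the end of this example:

$$\frac{\partial}{\partial x_i}\left(x_i^2\right)^{x_{i+1}^2+1}=2\left(x_{i+1}^2+1\right)x_i\left(x_i^2\right)^{x_{i+1}^2},\qquad
\frac{\partial}{\partial x_i}\left(x_{i+1}^2\right)^{x_i^2+1}=2x_i\left(x_{i+1}^2\right)^{x_i^2+1}\ln\left(x_{i+1}^2\right),$$

with the symmetric expressions for $\partial/\partial x_{i+1}$. The full component $\partial f/\partial x_i$ sums the contributions from summands $i-1$ and $i$.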

```
n = 1000;
```

Load the sparsity pattern `Hstr` of the Hessian and visualize it with `spy`:

```
load brownhstr
spy(Hstr)
```
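The loaded `Hstr` is tridiagonal: summand $i$ of $f$ couples only $x_i$ and $x_{i+1}$, so $\partial^2 f/\partial x_i\,\partial x_j$ can be nonzero only when $|i-j|\le 1$. As an illustration (a Python/SciPy sketch, not the MAT-file shipped with the toolbox), an equivalent pattern can be built directly:

```python
import numpy as np
from scipy.sparse import diags

n = 1000
# Ones on the main diagonal and the two adjacent diagonals mark the
# positions where the Hessian of the objective may be nonzero.
Hstr = diags([np.ones(n - 1), np.ones(n), np.ones(n - 1)], [-1, 0, 1],
             format='csc')
print(Hstr.shape, Hstr.nnz)  # 3n - 2 = 2998 nonzeros
```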

Create options to use the trust-region algorithm with the objective gradient and the Hessian sparsity pattern:

```
options = optimoptions(@fminunc,'Algorithm','trust-region',...
    'SpecifyObjectiveGradient',true,'HessPattern',Hstr);
```

Set the starting point and the objective function:

```
xstart = -ones(n,1);
xstart(2:2:n,1) = 1;
fun = @brownfg;
```

Solve the problem:

```
[x,fval,exitflag,output] = fminunc(fun,xstart,options);
```

```
Local minimum found.

Optimization completed because the size of the gradient is less than
the value of the optimality tolerance.
```

```
disp(fval)
```

```
   7.4738e-17
```

```
disp(exitflag)
```

```
     1
```

```
disp(output)
```

```
         iterations: 7
          funcCount: 8
           stepsize: 0.0046
       cgiterations: 7
      firstorderopt: 7.9822e-10
          algorithm: 'trust-region'
            message: 'Local minimum found....'
    constrviolation: []
```

The components of the solution are all near zero:

```
disp(max(x))
```

```
   1.9955e-10
```

```
disp(min(x))
```

```
  -1.9955e-10
```

### Helper Function

This code creates the `brownfg` helper function, which computes the objective value and, when requested, its gradient:

```
function [f,g] = brownfg(x)
% BROWNFG Nonlinear minimization test problem
%
% Evaluate the function
n = length(x);
y = zeros(n,1);
i = 1:(n-1);
y(i) = (x(i).^2).^(x(i+1).^2+1) + ...
       (x(i+1).^2).^(x(i).^2+1);
f = sum(y);
% Evaluate the gradient if nargout > 1
if nargout > 1
    i = 1:(n-1);
    g = zeros(n,1);
    g(i) = 2*(x(i+1).^2+1).*x(i).* ...
           ((x(i).^2).^(x(i+1).^2)) + ...
           2*x(i).*((x(i+1).^2).^(x(i).^2+1)).* ...
           log(x(i+1).^2);
    g(i+1) = g(i+1) + ...
             2*x(i+1).*((x(i).^2).^(x(i+1).^2+1)).* ...
             log(x(i).^2) + ...
             2*(x(i).^2+1).*x(i+1).* ...
             ((x(i+1).^2).^(x(i).^2));
end
end
```
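As a quick cross-check (a Python/NumPy sketch, not part of the original MATLAB example), the analytic gradient computed by `brownfg` can be translated and compared against central finite differences at the example's starting point:

```python
import numpy as np

def brownfg(x):
    """Objective f(x) = sum_i (x_i^2)^(x_{i+1}^2+1) + (x_{i+1}^2)^(x_i^2+1)
    and its analytic gradient, translated from the MATLAB helper."""
    x = np.asarray(x, dtype=float)
    n = x.size
    a, b = x[:-1], x[1:]          # x_i and x_{i+1}
    f = np.sum((a**2)**(b**2 + 1) + (b**2)**(a**2 + 1))
    g = np.zeros(n)
    # d/dx_i of both terms in the i-th summand
    g[:-1] += 2*(b**2 + 1)*a*(a**2)**(b**2) \
            + 2*a*(b**2)**(a**2 + 1)*np.log(b**2)
    # d/dx_{i+1} of both terms in the i-th summand
    g[1:]  += 2*b*(a**2)**(b**2 + 1)*np.log(a**2) \
            + 2*(a**2 + 1)*b*(b**2)**(a**2)
    return f, g

# Same alternating starting point as the example, at a smaller size
n = 10
x0 = -np.ones(n)
x0[1::2] = 1.0
f0, g0 = brownfg(x0)

# Central finite-difference gradient for comparison
h = 1e-6
g_fd = np.array([(brownfg(x0 + h*e)[0] - brownfg(x0 - h*e)[0]) / (2*h)
                 for e in np.eye(n)])
print(float(np.max(np.abs(g0 - g_fd))))  # should be small
```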