Function lsqnonlin() has no optimization effect

It is a teaching example, but I can't get the right answer with my copy of MATLAB. How can I solve this problem?
4 Comments
Yuwen Zhu 2020-5-18
@Matt J @Ang Feng Thanks for your answers. I tried this code again, and it now works correctly.
Yuwen Zhu 2020-5-18
And I have another question: my code has no optimization effect.
% objfunc.m -- residual vector of the log-linear model
% ln(rNO) ≈ x(1) + x(2)*ln(PNO) + x(3)*ln(PO2)
function f=objfunc(x)
PNO=[60.8,60.8,60.8,60.8,60.8,60.8,15.2,30.4,60.8,121.59,151.99,202.65,60.8,60.8,60.8,60.8,60.8];
PO2=[253.3,506.6,1013.3,2026.5,5066.3,10132.5,2026.5,2026.5,2026.5,2026.5,2026.5,2026.5,2026.5,2026.5,2026.5,2026.5,2026.5];
rNO=[7.12e-05,9.66e-05,1.31e-04,1.77e-04,2.66e-04,3.60e-04,5.32e-05,9.72e-06,1.77e-04,3.25e-04,3.948e-04,5.07e-04,1.79e-04,1.77e-04,1.79e-04,1.76e-04,1.79e-04];
a=log(PNO);
b=log(PO2);
y=log(rNO);
f=x(1)+a*x(2)+b*x(3)-y;   % residuals passed to lsqnonlin
end

% In the Command Window (with objfunc.m on the path):
x0=[-16.800207,1.2008,0.4151];
[x,resnorm]=lsqnonlin(@objfunc,x0)
Local minimum found.
Optimization completed because the size of the gradient is less than
the value of the optimality tolerance.
<stopping criteria details>
x =
-16.8802 1.2008 0.4151
resnorm =
4.4249
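For completeness, the extra output arguments of lsqnonlin show why the solver stops almost immediately. A minimal diagnostic sketch (only standard lsqnonlin outputs are used; the comments describe the expected behaviour, not values from the original post):
% Request solver diagnostics
[x,resnorm,residual,exitflag,output] = lsqnonlin(@objfunc,x0);
exitflag                % 1: converged to a local minimum
output.iterations       % only a few iterations are needed from this x0
output.firstorderopt    % first-order optimality measure, below the tolerance at the solution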


Answers (1)

Matt J 2020-5-18
And I have another question. My code has no optimization effect
In this case, your initial guess is already optimal. You can see this by solving for the optimal x analytically, which is possible here because your model is linear in the unknowns and you have no bound constraints:
a=log(PNO(:));
b=log(PO2(:));
y=log(rNO(:));
x=[ones(size(a)),a,b]\y(:)   % linear least-squares fit via the backslash operator
x =
-16.8802
1.2008
0.4151
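As a cross-check (a minimal sketch, reusing objfunc from the question; the starting point below is arbitrary), lsqnonlin should reach the same coefficients even from a poor initial guess, because this is an unconstrained linear least-squares problem with a single minimum:
% Start far from the solution; the solver should still return the same fit
x0_far=[0,0,0];
[x_far,resnorm_far]=lsqnonlin(@objfunc,x0_far)
% Expected: x_far ≈ [-16.8802 1.2008 0.4151], resnorm_far ≈ 4.4249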
