- lsqcurvefit is better suited to a problem of this form.
- Are you supplying your own derivative calculations via the GradObj option (and the Hessian option, if applicable)? You should, since the analytical derivatives are easy here. lsqcurvefit has analogous options, e.g. Jacobian.
- How are you initializing the optimization? Because your model is log-linear, the initial guess generated by the code below is likely to be more effective than a random guess.
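The second bullet's advice can be sketched as follows. This is a minimal sketch, not the poster's actual code: it assumes Y, X1, and X2 are column vectors already in the workspace, p0 is some initial guess, and the name sseObj is illustrative.

```matlab
% Objective returning both the sum of squared errors and its analytical
% gradient, so fminunc does not have to estimate derivatives numerically.
function [f, g] = sseObj(p, Y, X1, X2)
    A  = [ones(numel(Y),1), X1(:), X2(:)];  % design matrix
    mu = exp(A*p);                          % model prediction, p = [a; b1; b2]
    r  = Y(:) - mu;                         % residuals
    f  = r.'*r;                             % sum of squared errors
    g  = -2*A.'*(r.*mu);                    % analytical gradient of f w.r.t. p
end

% At the call site:
opts = optimoptions('fminunc','GradObj','on','Algorithm','trust-region');
p    = fminunc(@(p) sseObj(p,Y,X1,X2), p0, opts);
```

The trust-region algorithm requires a user-supplied gradient, which is exactly what GradObj enables here.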
minimize a function fast
3 views (last 30 days)
Hi!
I am trying to minimize a function of the following form: sum((Y - exp(a + b_1*X1 + b_2*X2)).^2)
Y, X1, and X2 are all vectors, and I am trying to find the a, b_1, and b_2 that minimize this function. It's basically a nonlinear regression. So far I have always used fminunc, but it is very slow. I need to do this many times, so my program runs for more than a day. I appreciate your help. Thank you!
0 comments
Accepted Answer
Matt J
2014-10-23
Edited: Matt J
2014-10-24
n  = numel(Y);
x0 = [ones(n,1), X1(:), X2(:)]\log(Y(:));  % x0 = [a; b_1; b_2]; requires Y > 0
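For completeness, here is a minimal sketch of the lsqcurvefit route with an analytical Jacobian, using the log-linear x0 computed above as the starting point. The name modelJac is illustrative, and the sketch assumes Y, X1, X2 are column vectors with Y > 0.

```matlab
% Model function returning predicted values and the n-by-3 Jacobian,
% so lsqcurvefit's Jacobian option can be used (save as modelJac.m).
function [F, J] = modelJac(p, A)
    F = exp(A*p);                  % predicted values, p = [a; b_1; b_2]
    J = bsxfun(@times, F, A);      % dF_i/dp_j = F_i * A(i,j)
end

% At the call site:
A    = [ones(numel(Y),1), X1(:), X2(:)];
opts = optimoptions('lsqcurvefit','Jacobian','on');
p    = lsqcurvefit(@modelJac, x0, A, Y(:), [], [], opts);
```

lsqcurvefit minimizes the sum of squared residuals internally, so only the model values (not the squared error) are returned from modelJac.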
More Answers (0)
See Also
Categories
Find more on Nonlinear Optimization in Help Center and File Exchange