Question on optimization problem and fminsearch, fminunc, lsqnonlin

8 views (last 30 days)
Hey all, I am trying to do an optimization problem where I import real-life data and try to find the best combination of 6 unknown variables that describe that data. The function being run in the optimization call is a series of if/then statements and equations, and the output evaluation is based on the distance difference between the real data and the simulated data. There are as many equations as variables, plus the if/then statements. When I use fminsearch the program works just okay, but not ideally, to find the minimum. When I try fminunc or lsqnonlin, the output basically repeats the initial guess, which is not really close to the actual solution. Why are these functions so dependent on the initial guess? Which of these functions should I be using? Any ideas on what I could do to solve this problem in my optimization?
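For reference, a minimal sketch of how the objective is set up differently for the two solvers, assuming a hypothetical simModel(p) that returns the simulated values at the same points as the measured data yData, and an initial guess p0:

    % fminsearch minimizes a single scalar, e.g. the sum of squared errors
    objScalar = @(p) sum((simModel(p) - yData).^2);
    pBest = fminsearch(objScalar, p0);

    % lsqnonlin instead takes the vector of residuals and squares/sums them itself
    objResid = @(p) simModel(p) - yData;
    pBest2 = lsqnonlin(objResid, p0);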
2 Comments
Sargondjani
Sargondjani 2012-6-6
And as Sean notes: fminunc assumes your problem is differentiable... if it is not, then take his advice.
But if your problem is differentiable and fminunc returns exactly the initial guess, then something is wrong. You should check the exit message... it could be that the maximum number of function evaluations was reached, or something like that.
Wes
Wes 2012-6-6
It does not return the exact initial guess. It returns the first two parameters the same as the initial guess, but then changes the last 2-3 parameters to fit. Not sure if that makes any sense or not, but I will try one of those other function methods and see how that works.


Answers (2)

Sean de Wolski
Sean de Wolski 2012-6-5
The initial guess is important because the above-mentioned optimizers are trying to find a local minimum, i.e. the one closest to the initial guess that can be reached using derivatives. From your description, it sounds like there is a good chance that your function is not differentiable, and thus a genetic algorithm, global search, or pattern search is required to find the global minimum. These functions are in the Global Optimization Toolbox.
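A rough sketch of what that can look like, assuming a scalar objective obj (the simulation error as a function of the 6 parameters), a starting point p0, and the Global Optimization Toolbox; the bounds below are placeholders:

    % patternsearch: derivative-free, but still starts from the initial point
    pBest = patternsearch(obj, p0);

    % ga: genetic algorithm, only needs the number of variables (6 here);
    % bound constraints keep it searching a sensible region
    lb = -10*ones(1,6);   % hypothetical bounds -- replace with realistic ranges
    ub =  10*ones(1,6);
    pGA = ga(obj, 6, [], [], [], [], lb, ub);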

Geoff
Geoff 2012-6-5
Depending on how localised your minima are, you can sometimes get around this with a simplex-based solver like fminsearch. I start with a large simplex, run the solver and let it converge. Then I reduce the size of the simplex, "shake up" the result (offsetting by the simplex) and let the solution converge again. I repeat this several times. But then, I don't know if fminsearch does this already. The caveat is that I was using a Nelder-Mead implementation in C++, not MATLAB... I think you may be able to use optimset to configure fminsearch with a bit more of a manual feel.
If you're not time-constrained, you may want to try a large number of random initial guesses sampled across your solution space, solve each one, and choose the best (see the sketch below). But given that you have 6 unknowns, it doesn't take much partitioning before the problem blows up. And if some variables are unconstrained, this can become quite impractical.
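A minimal sketch of that multi-start idea, again assuming a scalar objective obj; the bounds and the number of restarts are placeholders:

    nStarts = 50;                        % number of random restarts
    lb = zeros(1,6);  ub = ones(1,6);    % hypothetical bounds -- use realistic ranges
    bestF = Inf;  bestP = [];
    for k = 1:nStarts
        p0 = lb + rand(1,6).*(ub - lb);  % random start inside the box
        [p, f] = fminsearch(obj, p0);
        if f < bestF
            bestF = f;  bestP = p;       % keep the best solution found so far
        end
    end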
