fmincon doesn't find a global minimum for a convex function
In my opinion, fmincon is a built-in MATLAB function for finding a local minimum. If the objective function is convex, there is only one basin, so a local minimum is also the global minimum. However, when starting from different initial points in my experiment, the algorithm converged to different minima. Does fmincon guarantee convergence to the global minimum for a convex problem? If not, are there other techniques I can use to solve convex optimization problems as fast as possible? Is running GlobalSearch a good idea for global optimization? Thanks.
Here is the code:
options = optimoptions('fmincon');
problem.options = options;
problem.solver = 'fmincon';
problem.objective = @(x) langBW(x, in_s, in_e, C1, a, p_ul);
problem.Aineq = ones(1,user_num);   % linear constraint: sum(x) <= BW2
problem.bineq = BW2;
problem.nonlcon = @(x) nonlConstr_bw(x,a,p_ul,T1,in_s,in_e);   % matches the function signature below
problem.x0 = ones(user_num,1);
[b_ul,fval] = fmincon(problem);
langBW is the objective function, which is a convex function of x; its code is as follows:
function fmin = langBW(x, in_s, in_e, C1, a, p_ul)
if size(x,1) < size(x,2)
    x = x';                                  % ensure x is a column vector
end
b_ul = x;
r_ul = b_ul .* log2(1 + a.*p_ul./b_ul);      % rate term b*log2(1 + a*p/b)
fmin = sum((in_s+in_e).*p_ul./r_ul) + sum(C1);
end
nonlConstr_bw defines the nonlinear constraints:
function [c,ceq] = nonlConstr_bw(x,a,p_ul,T1,in_s,in_e)
user_num = size(p_ul,1);
if size(x,1) < size(x,2)
    x = x';                                  % ensure x is a column vector
end
b_ul = x;
r_ul = b_ul .* log2(1 + a.*p_ul./b_ul);      % same rate term as in langBW
c1 = max(in_s./r_ul) + in_e./r_ul - T1;      % inequality constraints, enforced as c <= 0
c = c1;
ceq = zeros(user_num,1);                     % trivially satisfied; ceq = [] would also work
end
Except for x, all other variables are supplied. The problem is that when I set different `problem.x0` values, for example `problem.x0 = ones(user_num,1);` versus `problem.x0 = 2*ones(user_num,1);`, the solution returned by `[b_ul,fval] = fmincon(problem);` is different. That is what I am confused about.
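For reference, this is how the two runs are compared (a minimal sketch, assuming the problem structure above is already populated):
problem.x0 = ones(user_num,1);
[b_ul_1,fval_1] = fmincon(problem);
problem.x0 = 2*ones(user_num,1);
[b_ul_2,fval_2] = fmincon(problem);
fprintf('fval from x0 = ones:   %.8g\n', fval_1);
fprintf('fval from x0 = 2*ones: %.8g\n', fval_2);
fprintf('max(|b_ul_1 - b_ul_2|): %.3g\n', max(abs(b_ul_1 - b_ul_2)));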
2 Comments
Answers (2)
Aquatris
2018-5-10
fmincon is not designed to find only local minima, but it can get stuck in a local minimum due to the nature of its algorithms. You might want to try changing the algorithm used by fmincon (interior-point, trust-region-reflective, or sqp) or adjusting some of the optimization parameters if you do not get reasonable results; a sketch is given below. Evolutionary algorithms tend to be much better at finding a global minimum, so you might want to give them a try. MATLAB has a built-in genetic algorithm function if you have the required toolbox, and community-supplied particle swarm optimization code is also available.
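For example, a minimal sketch of switching the algorithm (option names as in optimoptions; the ga call is commented out and assumes the Global Optimization Toolbox):
% 'interior-point' is the default; 'sqp' is often a good alternative here,
% since 'trust-region-reflective' requires a user-supplied gradient and
% accepts only bound or linear equality constraints.
options = optimoptions('fmincon','Algorithm','sqp','Display','iter');
problem.options = options;
[b_ul,fval] = fmincon(problem);
% With the Global Optimization Toolbox, ga accepts nonlinear constraints
% in the same (c,ceq) form as fmincon:
% [b_ul_ga,fval_ga] = ga(problem.objective, user_num, problem.Aineq, ...
%     problem.bineq, [], [], [], [], problem.nonlcon);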
Matt J
2018-5-10
Edited: Matt J
2018-5-10
Even if the function is convex, it may have multiple global minima, and the one you arrive at will depend on the initial point. For example, the convex function f(x) = max(|x| - 1, 0) attains its minimum value of 0 at every point in [-1, 1].
Keep in mind as well that your selection of stopping parameters (TolX, TolFun, etc.) has an influence on where the algorithm quits. Even if your function has a unique minimum, the algorithm will try to converge to it but typically will not reach it exactly, so the final error will be x0-dependent.
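As a rough check (a sketch; these are the current optimoptions names, with TolX/TolFun/TolCon being the older equivalents), tightening the stopping tolerances should make the solutions from different x0 agree to more digits:
options = optimoptions('fmincon', ...
    'OptimalityTolerance',1e-12, ...
    'StepTolerance',1e-14, ...
    'ConstraintTolerance',1e-12);
problem.options = options;
% re-run fmincon from both starting points and compare fval and b_ul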
7 Comments
Matt J
2018-5-11
Edited: Matt J
2018-5-11
We cannot run the code, because the problem constants (in_s, in_e, etc.) have not been provided.
I will mention, however, that your nonlinear constraints, while they may be convex, are not differentiable because of the max() operation, and fmincon assumes differentiability. A differentiable equivalent of your constraint would be:
term1=in_s./r_ul;
term2=in_e./r_ul;
c = term1(:) + term2(:).' - T1;
ceq = [];
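As a sketch (untested, since the problem constants are not available), that reformulation can be dropped into a replacement constraint function; the name nonlConstr_bw_smooth is just illustrative:
function [c,ceq] = nonlConstr_bw_smooth(x,a,p_ul,T1,in_s,in_e)
if size(x,1) < size(x,2)
    x = x';                                  % ensure x is a column vector
end
b_ul = x;
r_ul = b_ul .* log2(1 + a.*p_ul./b_ul);
term1 = in_s./r_ul;
term2 = in_e./r_ul;
% Every pairwise sum term1(i) + term2(j) must be <= T1, which is equivalent
% to max(in_s./r_ul) + in_e./r_ul <= T1 but avoids the non-smooth max().
c = term1(:) + term2(:).' - T1;
c = c(:);                                    % fmincon takes a vector of inequality constraints
ceq = [];
end
and used as problem.nonlcon = @(x) nonlConstr_bw_smooth(x,a,p_ul,T1,in_s,in_e);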