FMINUNC CHECK GRADIENT FAILS
Hello everyone,
I'm trying to minimize this function with fminunc by running:
[ygrad, cost] = tvd_sim_grad(x, lam, Nit, t);
where x is a 4096x1 double and lam, Nit, and t are 1x1 doubles.
function [xden,fval] = tvd_sim_grad(y, lam, Nit,t)
rng default % For reproducibility
ycut=double(abs(y)-t>0); % outlier reduction: values above t are clipped to t (t = variance from robust covariance estimation)
yind=find(ycut==1);
y(yind)=t;
y=y+1; % necessary to get out of the neighborhood of zero
y0=y;
ObjectiveFunction = @(y) tvd_sim2(y,y0,lam);
options = optimoptions('fminunc','MaxIter',Nit,'ObjectiveLimit',0,'MaxFunEvals',Inf,'TolFun',1e-20,...
'TolX',1e-20,'UseParallel',false,'SpecifyObjectiveGradient',true,'CheckGradients',true,...
'FinDiffRelStep',1e-10,'DiffMinChange',0,'DiffMaxChange',Inf,'Diagnostics','off','Algorithm','quasi-newton',...
'HessUpdate','bfgs','FinDiffType','central','HessianFcn',[],...
'PlotFcns','optimplotfval','Display','final-detailed');
[xden,fval] = fminunc(ObjectiveFunction,y,options);
xden= xden-1; % zero realignment
end
function [TVD,mygrad] = tvd_sim2(x,y, lam)
TVD=1/2.*sum(abs((y-x).^2)) + lam.*sum(abs(diff(diff(-y./(1-x.*y-x.^2)))));
f=@(x) 1/2.*sum(abs((y-x).^2)) + lam.*sum(abs(diff(diff(-y./(1-x.*y-x.^2)))));
mygrad=gradient(f(x'));mygrad=mygrad';
end
This is a modification of total variation denoising that I created to make the objective itself differentiable (the original is not differentiable in its second term). The function is differentiable over all of real space except at 0, so, as you can see, I modified the dataset to avoid zeros; the data is now concentrated around the value 1.
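One thing worth noting: with 'SpecifyObjectiveGradient',true, fminunc expects the second output of the objective to be the analytic gradient of the objective with respect to x, whereas MATLAB's gradient function only takes numerical differences of an array of values. A minimal sketch (using a small hypothetical test signal, not the poster's data) of how an analytic gradient of the quadratic data-fidelity term can be checked against central differences:

```matlab
% Sketch: verify an analytic gradient against central finite differences.
% Only the first (quadratic) term of the objective is shown; the
% regularization term would need its own chain-rule derivation.
y = randn(16,1);                 % hypothetical small signal
x = y + 0.1*randn(16,1);
f = @(x) 0.5*sum((y - x).^2);    % data-fidelity term
g = x - y;                       % analytic gradient: d/dx 0.5*||y-x||^2

h = 1e-6;
gfd = zeros(size(x));            % finite-difference gradient for comparison
for k = 1:numel(x)
    e = zeros(size(x)); e(k) = h;
    gfd(k) = (f(x + e) - f(x - e)) / (2*h);
end
max(abs(g - gfd))                % small if the analytic gradient is right
```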
When I use:
'SpecifyObjectiveGradient',false
I obtain good results (red line is xden; plot omitted):
But when I use:
'SpecifyObjectiveGradient',true
it performs 0 iterations and fails, returning:
Optimization stopped because the objective function cannot be decreased in the
current search direction. Either the predicted change in the objective function,
or the line search interval is less than eps.
'CheckGradients',true
gives me :
Objective function derivatives:
Maximum relative difference between supplied
and finite-difference derivatives = 33382.1.
Supplied derivative element (1012,1): 0.480282
Finite-difference derivative element (1012,1): -33381.7
CheckGradients failed.
____________________________________________________________
Error using validateFirstDerivatives (line 102)
CheckGradients failed:
Supplied and finite-difference derivatives not within 1e-06.
How can I obtain the results above while supplying the gradient, and why doesn't it work?
Thanks!
13 comments
Torsten
2022-12-20
A one-sided finite difference approximation for the derivative instead of a centered one will halve the number of function calls...
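This suggestion could be applied as follows (a sketch; the question's other option settings are omitted here for brevity):

```matlab
% Sketch: switch the finite-difference scheme from central to forward,
% so each coordinate needs one extra function call instead of two.
options = optimoptions('fminunc', ...
    'SpecifyObjectiveGradient',false, ... % let fminunc estimate the gradient
    'FinDiffType','forward');             % one-sided differences
```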
Answers (0)