fminsearch doesn't converge to the right value
Hi all,
I am trying to fit the following exponential model and retrieve the exponential decay rate.
I am using fminsearch to minimize the sum-squared-error cost function, but somehow it is not really working. I am sure I am making some stupid mistake but I can't figure it out ... It should give me back the 0.1 value but never really gets it (even if the noise is 0).
Here the code:
x=0:100;
y=exp(-0.1*x)'+0.1*rand(101,1);
plot(x,y)
hold on
%%
[h, ff, s] = sc(0.5, x, y)
plot(x, y)
plot(x,h,x,y,x,ff)
% hold on
% s
function [h ff s]=sc(k, x, y)
co=[exp(-k*x)' ones(size(y))]\y;
%
% ff=co(1)*exp(-k*x)'+co(2)*ones(size(y));
% h=y-ff;
% s=sum(h.^2);
%
ff = @(k,x) co(1).*exp(-k.*x)'+co(2).*ones(size(y));
h = @(k) sum((y-ff(k,x)).^2);
options = optimset('PlotFcns',@optimplotfval,'Display','iter');
s = fminsearch(h,k,options);
ff=exp(-s*x)'+co(2)*ones(size(y));
h=y-ff;
end
Thanks for help,
E
0 comments
Accepted Answer
John D'Errico
2024-5-3
Edited: John D'Errico
2024-5-3
These things are always easier than you think. Well, except when they are just as hard as you think. ;-)
First, I'll create x as a COLUMN vector. That will avoid problems later, since you were forced to constantly transpose it. Just start out right from the beginning!
x = (0:100)';
Next, WHY IN THE NAME OF GOD AND LITTLE GREEN APPLES DID YOU USE RAND??????????????????
Do you understand that rand creates random numbers in the interval [0,1]? There will NEVER BE NEGATIVE NUMBERS COMING FROM RAND. USE RANDN INSTEAD!
y = exp(-0.1*x) + 0.1*rand(101,1); % THIS IS JUST BAD!!!!!
plot(x,y)
Next, plot it. Look at what you have. Note that here, the negative exponential is pretty much garbage noise for most of the curve, so the only useful signal comes from the initial part. But, ok. You still have enough data to get a fit out of it.
The model you have chosen is of the general form
y = a*exp(-k*x) + b
You MIGHT expect to find a = 1, b = 0, and k = 0.1. Except that you used rand! And that means I will expect to see approximately b = 0.05.
Of course, those parameters will be estimated from somewhat poor data, but we will still get a valid result. Just for kicks, I'll throw the curve fitting toolbox at it first.
mdl = fittype('a*exp(-k*x) + b','indep','x')
fittedmdl = fit(x,y,mdl)
Even without starting values provided, fit was happy. We got something reasonable. Note that the estimate of b would be expected to be roughly 0.05, NOT zero. This is because the mean of a random number with a uniform distribution on the interval [0, 0.1] is 0.05.
plot(fittedmdl,x,y)
I would not expect anything better. Now to look at your use of fminsearch for the fit. It appears you are trying to use a partitioned least squares, where you separate the conditionally linear parameters (a and b) from the nonlinear parameter (k). But you don't understand how to write that. (You could just download my fminspleas from the file exchange.)
k0 = 0.25;
[kest,abest]= pleasFit(k0, x, y)
As you can see, it returns the same estimates as did fit. Again, the estimate of b will be screwy because of your use of rand. The estimate I would expect to see for this model, with this data, is ...
y = 1*exp(-0.1*x) + 0.05
function [kest,abest] = pleasFit(k0, x, y)
% Partitioned least squares estimation for the model
% y = a*exp(-k*x) + b
%
% k will be provided as an initial value, x and y are column vectors
linearest = @(x,y,k) [exp(-k*x),ones(size(y))]\y;
obj = @(k) norm([exp(-k*x),ones(size(y))]*linearest(x,y,k) - y);
options = optimset('PlotFcns',@optimplotfval,'Display','iter');
[kest,err,exitflag] = fminsearch(obj,k0,options);
abest = linearest(x,y,kest);
end
In the end, there were two MAJOR issues. First, WHY DID YOU USE RAND? RAND generates UNIFORM numbers in the interval [0,1]. I think too often, people just assume a random number is a random number, and do not recognize the difference between them. The result is you will get confusing results, because the additive random part does not have a mean of zero when you used rand.
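As a quick numerical sanity check (not part of the fit itself), one can compare the sample means of the two noise generators to see where the bias comes from:

```matlab
% rand is uniform on [0,1], so 0.1*rand(...) has mean 0.05 -- a constant
% offset that the model must absorb into the intercept b.
mean(0.1*rand(1e6,1))   % roughly 0.05

% randn is standard normal, so 0.1*randn(...) has mean 0 -- the noise
% averages out and b can be estimated near its true value of zero.
mean(0.1*randn(1e6,1))  % roughly 0
```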
Next, your code for the fit using fminsearch was completely wrong. It needs to estimate the parameters a and b INSIDE the objective. So for every value of k that fminsearch tries, it needs to estimate a NEW set of estimates for a and b. This was your major problem of course.
A partitioned nonlinear least squares splits the problem into two sets of unknowns. The nonlinear parameter (k), and then two conditionally linear parameters, here a and b. If you knew the value of k, you could estimate a and b. In fact, you could then just use polyfit to estimate a and b. (I probably should have done so.) But the linear part of the fit needs to happen INSIDE the objective function.
2 comments
John D'Errico
2024-5-3
Exactly. I could have written it like this instead:
function [kest,abest] = pleasFit(k0, x, y)
% Partitioned least squares estimation for the model
% y = a*exp(-k*x) + b
%
% k will be provided as an initial value, x and y are column vectors
obj = @(k) norm(polyval(polyfit(exp(-k*x),y,1),exp(-k*x)) - y);
options = optimset('PlotFcns',@optimplotfval,'Display','iter');
[kest,err,exitflag] = fminsearch(obj,k0,options);
abest = polyfit(exp(-kest*x),y,1); % [a, b]: slope and intercept of the linear fit
end
I'll concede that few people even attempt ideas like partitioned least squares. So kudos to you for making a good stab at it.
Once you get the hang of it, the process works very well. Partitioned least squares is far more robust than a general nonlinear least squares, since it greatly reduces the size of the parameter space. Here, we have only a 1-variable problem to hand to fminsearch. That makes the search considerably faster, since you are only searching over a 1-d parameter space. It also makes the problem more robust, since there is no need to provide 3 starting values.
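For comparison, a direct (unpartitioned) fit would hand all three parameters to fminsearch at once. A minimal sketch of that approach, assuming the same simulated data as above, shows why three starting values are needed:

```matlab
% Simulated data, as in the accepted answer
x = (0:100)';
y = exp(-0.1*x) + 0.1*randn(101,1);

% Direct 3-parameter fit of y = a*exp(-k*x) + b: fminsearch now searches
% a 3-d parameter space, so it needs guesses for all of a, b, and k.
obj3 = @(p) norm(p(1)*exp(-p(3)*x) + p(2) - y);   % p = [a; b; k]
pest = fminsearch(obj3, [1; 0; 0.25]);
```

With a poor choice of the three starting values, this search can stall or wander, which is exactly the robustness problem the partitioned formulation avoids.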
More Answers (1)
Milan Bansal
2024-5-3
Edited: Milan Bansal
2024-5-3
I understand that you are facing some issues when using "fminsearch" for optimizing the "sum-squared-error" cost function to find the best fit for the exponential model y = A*exp(-k*x) + B, where A is a constant, B is noise, and "k" is the exponential decay rate you are trying to estimate.
Please refer to the following points to estimate the value of "k":
- Define an objective function that calculates the sum-squared-error between the model predictions and the actual data for a given "k". This function will be minimized to find the best "k".
- Use the fminsearch function to find the "k" that minimizes the objective function. Start with an initial guess for "k" and let fminsearch adjust "k" to minimize the sum-squared-error.
Here is how you can modify your code to resolve the issue:
x = 0:100;
y = exp(-0.1*x)' + 0.1*randn(101,1);
plot(x, y, 'b.'); % Plotting the original data
hold on;
initialGuess = 0.5; % Initial guess for k
[kOpt, fval] = optimizeK(initialGuess, x, y); % Optimization to find the best k
% Generating the fitted curve with the optimized k
yFitted = exp(-kOpt*x);
plot(x, yFitted, 'r-'); % Plotting the fitted curve
title(sprintf('Fitted Curve with k = %f', kOpt));
hold off;
function [kOpt, fval] = optimizeK(kInitial, x, y)
objectiveFunction = @(k) sum((y - exp(-k*x)').^2); % Objective function to minimize sum-squared-error
% Using fminsearch to find the optimal k
options = optimset('PlotFcns',@optimplotfval, 'Display', 'iter');
[kOpt, fval] = fminsearch(objectiveFunction, kInitial, options);
end
The value of "k" estimated by the above code is about 0.094, which is close to the desired value of 0.1. The difference is due to the added noise; if the noise were zero, the resulting "k" would be 0.1.
Please refer to the documentation for fminsearch to learn more.
Hope this helps!
2 comments
John D'Errico
2024-5-3
Edited: John D'Errico
2024-5-3
Argh. NO!!!! This answer was completely wrong! It did not use a partitioned least squares estimator! It just estimated k, given the model
y= exp(-k*x)
You need to see the difference. Here, @Milan Bansal did nothing more than assume the above model, even though it was initially stated that the model would be
y = A*exp(-k*x) + B
I did say before that very few people seem to appreciate what a partitioned least squares estimation is and how it works. This is a good example.
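To see the difference concretely, one can fit both objectives to the same noisy data. A sketch, assuming data generated as in the original question (rand, so a built-in mean-0.05 offset):

```matlab
x = (0:100)';
y = exp(-0.1*x) + 0.1*rand(101,1);   % offset of roughly 0.05 baked in

% Reduced model y = exp(-k*x): no amplitude a, no offset b, so the
% estimate of k has to soak up the bias from the offset.
kReduced = fminsearch(@(k) norm(exp(-k*x) - y), 0.25);

% Full model y = a*exp(-k*x) + b via partitioned least squares: solve
% for [a; b] linearly inside the objective, search only over k.
lin = @(k) [exp(-k*x), ones(size(y))] \ y;
kFull = fminsearch(@(k) norm([exp(-k*x), ones(size(y))]*lin(k) - y), 0.25);
```

The reduced-model estimate kReduced will drift away from 0.1 because it ignores the offset, while kFull recovers it as the accepted answer shows.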