Minimize error between data distribution and expected distribution
Hi all,
I have 3 data-blocks which are expected to behave as follows:
1) the 1st data-block should approach a Gaussian distribution with mu = 0 and sigma = 1;
2) the 2nd data-block should approach a Gaussian distribution with mu = 0 and sigma = 0.8;
3) the 3rd data-block should approach a Gaussian distribution with mu = 0 and sigma = 0.5;
Each data-block has only a limited number of representations (generally between 2048 and 8192), and because of some filter effects introduced by the specific code I use, the blocks will not exactly match their expected distributions.
The point is that, whatever manipulation this implies, I want to minimize the discrepancy between each data-block's actual and expected distribution. Note that I cannot increase the number of representations, for reasons I will not detail here.
Generally, the first data-block, compared to the standard Gaussian distribution, looks like the following figure:
[Figure: histogram of the data-block (blue bars) with the expected normal density overlaid (red line)]
I was thinking of using lsqcurvefit for this purpose.
What would you suggest?
Answer (1)
Wouter
2013-3-20
Do you know this function:
histfit
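For example, a minimal sketch (here x is only a stand-in for your first data-block, and normfit is shown just to read off the fitted parameters; both functions are in the Statistics Toolbox):
% Compare a data-block against a fitted normal density
x = randn(4096, 1);             % placeholder for your actual data-block
histfit(x)                      % histogram (blue bars) + fitted normal (red line)
[muhat, sigmahat] = normfit(x)  % estimated mean and standard deviation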
Wouter
2013-3-21
Edited: Wouter, 2013-3-21
You could try to change individual data points after your filtering step; this will change the blue bars. For example: find a blue bar that is too high, and change one of its data points into a value that lies in a blue bar that is too low (compared to the red line). This does, however, change your data and will render step 2) treat_with_piece_of_code useless. A rough sketch of that idea is below.
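This is a hedged sketch of one rebalancing step, assuming an N(0,1) target for this block; the variable names and bin choices are illustrative, not from the thread:
x = randn(4096, 1);                          % stand-in for the filtered data-block
edges = linspace(-4, 4, 41);                 % histogram bin edges
counts = histcounts(x, edges);               % actual bin counts (blue bars)
expected = numel(x) * diff(normcdf(edges));  % expected counts under N(0,1) (red line)
[~, hi] = max(counts - expected);            % most over-full bin
[~, lo] = min(counts - expected);            % most under-full bin
idx = find(x >= edges(hi) & x < edges(hi+1), 1);  % one point in the over-full bin
x(idx) = (edges(lo) + edges(lo+1)) / 2;      % move it to the centre of the under-full bin
Repeating this step in a loop would progressively flatten the discrepancy, at the cost of altering the data, as noted above.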
However, it makes more sense to find a better fit to the histogram, i.e. to change the red line. lsqcurvefit would only be useful if you want to update the red line (the fit).
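A minimal lsqcurvefit sketch for refitting the red line to the histogram (requires the Optimization Toolbox; all names here are illustrative assumptions):
x = randn(4096, 1);                                % stand-in for a data-block
edges = linspace(-4, 4, 41);
counts = histcounts(x, edges);                     % histogram bin counts
centers = (edges(1:end-1) + edges(2:end)) / 2;     % bin centres
binw = edges(2) - edges(1);                        % bin width
% model: Gaussian pdf with parameters p = [mu, sigma], scaled to counts
model = @(p, t) numel(x) * binw * exp(-(t - p(1)).^2 ./ (2*p(2)^2)) / (p(2)*sqrt(2*pi));
p0 = [mean(x), std(x)];                            % initial guess
pfit = lsqcurvefit(model, p0, centers, counts)     % least-squares [mu, sigma]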