Sorry, but this method, of simply stepping through lots of possible values of a parameter, is a terrible way to solve the problem. It is why optimization tools were created: so that you need not resort to an artifice like the one you have chosen. (Yes, it is how nearly every novice programmer first tries to solve the problem. That does not make it a good scheme, as you have discovered. In fact, some 40+ years ago, that is how I might have tried to solve this problem myself.)
So, for the simple case of fitting a single-parameter model, you can use fminbnd. Minimize the norm of the residual errors as the objective function.
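To make that concrete, here is a minimal sketch. The exponential model, the data vectors xdata and ydata, and the bracket [0, 10] are all assumptions for illustration; substitute your own model and a bracket you know contains the solution.

```matlab
% Hypothetical single-parameter model: y = exp(-k*x), fitting k.
% xdata and ydata are your measured data, as column vectors.
model  = @(k) exp(-k*xdata);
objfun = @(k) norm(ydata - model(k));   % norm of the residual errors

k_lo = 0;  k_hi = 10;                   % a bracket assumed to contain the solution
k_best = fminbnd(objfun, k_lo, k_hi);
```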
Note that fminbnd will be far more efficient than a tool like fminsearch, but fminbnd requires a bracket around the solution. So if you cannot choose two values that bound the solution, use fminsearch instead. fminsearch can also solve for multiple parameters at once, whereas fminbnd is limited to one parameter.
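The multi-parameter case with fminsearch looks much the same, except that you supply a starting guess rather than a bracket. Again, the two-parameter exponential model here is a made-up stand-in for whatever model you are actually fitting.

```matlab
% Hypothetical two-parameter model: y = a*exp(-b*x), with p = [a, b].
objfun = @(p) norm(ydata - p(1)*exp(-p(2)*xdata));

p0 = [1, 1];                    % starting guess; no bracket needed
p_best = fminsearch(objfun, p0);
```

A decent starting guess matters here: fminsearch is a local Nelder-Mead search, so a poor p0 can leave it stuck far from the fit you want.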
Better yet though is to use the Curve Fitting Toolbox, which has no limit on the number of parameters. Or, if you have the Statistics Toolbox, then use nlinfit. Or, if you have the Optimization Toolbox, then lsqnonlin and lsqcurvefit are two great choices.
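If you do have the Optimization Toolbox, a sketch with lsqcurvefit might look like this (same hypothetical two-parameter model as above). Note that lsqcurvefit forms the residual vector for you, so you pass the model itself rather than a norm of residuals.

```matlab
% lsqcurvefit takes the model as a function of (p, xdata) and the data,
% and minimizes the sum of squared residuals internally.
model = @(p, x) p(1)*exp(-p(2)*x);

p0 = [1, 1];                                  % starting guess
p_best = lsqcurvefit(model, p0, xdata, ydata);
```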
All of the above toolboxes are terribly useful things to have around. Personally, I would not go without any of them. If you have none of them, there are many tools on the File Exchange too. Or just use fminsearch. But do not write your own optimization code. That is never a good idea unless you really know what you are doing. And if you really do know what you are doing, then you also know why it is better to use the tools already in existence, written by professionals in the art.
