First, you can deal with the problem of ‘running out of iterations’ by increasing 'MaxIter' and 'MaxFunEvals' in optimset, perhaps to 1E+5 each. You might also experiment with decreasing 'TolFun' and 'TolX' to 1E-10 or so. It takes longer, but it increases your probability of getting a good fit. You can speed that process up by providing an analytic Jacobian, which is also easy if you have the Symbolic Math Toolbox.
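A minimal sketch of those option settings (here `fcn`, `b0`, `xdata`, and `ydata` are placeholders for your model function, initial parameter guess, and data):

```matlab
% Loosen the iteration limits and tighten the tolerances
opts = optimset('MaxIter',1E+5, 'MaxFunEvals',1E+5, ...
                'TolFun',1E-10, 'TolX',1E-10);

% 'fcn', 'b0', 'xdata', 'ydata' stand in for your model,
% starting guess, and data
[b, resnorm] = lsqcurvefit(fcn, b0, xdata, ydata, [], [], opts);

% If your model function also returns the Jacobian as a second
% output, tell the solver to use it:
% opts = optimset(opts, 'Jacobian','on');
```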
Second, I've found that the trust-region-reflective algorithm lsqcurvefit uses can occasionally get stuck in local minima rather than finding the global minimum. I suggest you experiment with 'Algorithm','levenberg-marquardt' in optimset to see if that avoids the local minima. The only drawback is that the Levenberg-Marquardt algorithm doesn't allow you to put bound constraints on the parameters. Also, if you have the Statistics Toolbox, it's always a good idea to check your parameters for statistical significance with nlparci. (There is a workaround if you don't have it.)
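Something like this, again with `fcn`, `b0`, `xdata`, and `ydata` as placeholders; the residual and Jacobian outputs of lsqcurvefit feed straight into nlparci:

```matlab
% Switch to Levenberg-Marquardt (note: it ignores bound
% constraints, so pass [] for lb and ub)
opts = optimset('Algorithm','levenberg-marquardt');
[b,resnorm,residual,exitflag,output,lambda,J] = ...
    lsqcurvefit(fcn, b0, xdata, ydata, [], [], opts);

% 95% confidence intervals on the parameters (Statistics Toolbox)
ci = nlparci(b, residual, 'jacobian', J);
```

If a parameter's confidence interval includes zero, that parameter is not statistically significant at that level.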
Third, patternsearch in the Global Optimization Toolbox can be very helpful in situations where an objective function has many local minima. I definitely suggest it, especially if the Levenberg-Marquardt option for lsqcurvefit still gives you problems.
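Since patternsearch minimizes a scalar objective rather than a vector of residuals, wrap the sum of squares yourself (a sketch, with the same placeholder names as above):

```matlab
% Scalar sum-of-squares objective for patternsearch
ssq = @(b) sum((fcn(b, xdata) - ydata).^2);
b_ps = patternsearch(ssq, b0);

% The patternsearch result can then seed lsqcurvefit
% for a polished local fit
```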
Fourth, objective functions can be difficult to fit if the data vary significantly in magnitude (on the order of 1E±6 or so). The Excel Solver scales automatically (not something I would recommend), but scaling can be helpful in some situations, provided the parameters affected by the scaling are linearly related to the scaled data.
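One hedged illustration of manual scaling, assuming a model where the dependent data enter linearly through a coefficient (all names are placeholders):

```matlab
% Scale the data to O(1) before fitting, then undo the scaling.
% This only works cleanly when the affected parameters are
% linearly related to the scaled data.
s = max(abs(ydata));                 % scale factor
[bs, resnorm] = lsqcurvefit(fcn, b0, xdata, ydata/s, [], [], opts);

% For a model of the form y = a*f(x,other_params), only the
% linear coefficient needs rescaling afterwards: a = s*bs(1)
```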
Fifth, nonlinear parameter estimation with complicated problems is invariably trial-and-error. The patternsearch results illustrate that vividly.
Sixth, the paper was interesting. I learned something, even though I didn't take the time to read it exhaustively. (I saved it to read later.)