[SciPy-User] Alternatives to scipy.optimize
Charles R Harris
Sat Feb 25 22:30:24 CST 2012
On Sat, Feb 25, 2012 at 12:34 PM, <firstname.lastname@example.org> wrote:
> On Sat, Feb 25, 2012 at 2:17 PM, Erik Petigura <email@example.com> wrote:
> > Dear Scipy,
> > Up until now, I've found the optimize module very useful. Now, I'm
> > finding that I need finer control. I am fitting a model to data that
> > is of the following form:
> > model = func1(p1) + func2(p2)
> > func1 is nonlinear in its parameters and func2 is linear in its parameters.
> > There are two things I am struggling with:
> > 1. I'd like to find the best fit parameters for func1 using an iterative
> > approach (e.g. a simplex algorithm that varies p1). At each iteration, I
> > want to compute the optimum p2 by linear least squares, in the interest
> > of speed and robustness.
> you can still do this with any regular optimizer like optimize.fmin,
> just calculate the linear solution inside the outer function that is
> optimized by fmin.
That's what I do: use leastsq and let it vary the p1 parameters, which are
passed to a function that uses linear least squares to solve the linear
problem func2(p2) = data - func1(p1). The residuals of that inner linear
fit are the values returned to leastsq. The func2 doesn't even have to be
linear if the solution can be computed easily for subsets of the data. I've
used this for fits involving hundreds of quaternions as the p2 and a far
smaller number of p1.
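A minimal sketch of the approach described above, with an invented toy model
(the decay/line functions, data, and starting value are illustrative
assumptions, not from the thread): leastsq varies only the nonlinear
parameter p1, and inside the residual function the linear parameters p2 are
recovered exactly with numpy.linalg.lstsq.

```python
import numpy as np
from scipy.optimize import leastsq

# Hypothetical data generated from the true model:
# func1(p1) = sin(p1 * t) with p1 = 1.5, func2(p2) = p2[0]*t + p2[1]
t = np.linspace(0.0, 2.0, 80)
y = np.sin(1.5 * t) + 0.7 * t - 0.2

def residuals(p1):
    f1 = np.sin(p1[0] * t)                      # nonlinear part func1(p1)
    A = np.column_stack([t, np.ones_like(t)])   # basis of the linear part func2
    rhs = y - f1
    # Optimal p2 for this p1, by linear least squares
    p2, _, _, _ = np.linalg.lstsq(A, rhs, rcond=None)
    # Return the residual *vector* of the inner linear fit to leastsq
    return rhs - A @ p2

p1_fit, ier = leastsq(residuals, x0=[1.0])
```

Because the inner solve is exact for every trial p1, the outer problem is
only one-dimensional here, which is the speed and robustness gain the
original poster was after.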
> > 2. I'd also like the ability to hold certain parameters fixed in the
> > optimization without redefining my objective function each time.
This is trickier, but leastsq will generally work if you just ignore some
of the p1 parameters. Better would be to adjust the number of parameters
passed to the inner function, but that is more complicated.
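One hedged way to adjust the number of parameters passed to the optimizer,
without rewriting the objective each time (the helper name, toy objective,
and index convention are invented for illustration): wrap the full objective
in a closure that re-inserts the fixed values before calling it.

```python
import numpy as np
from scipy.optimize import fmin

def objective(p):
    # Full objective over all three parameters (a toy quadratic here,
    # standing in for the real sum-of-squares objective).
    return (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2 + (p[2] - 3.0) ** 2

def fix_params(full_objective, n, fixed):
    """fixed: dict {parameter index: fixed value}; n: total parameter count.
    Returns a wrapped objective over the free parameters only, plus the
    indices of those free parameters."""
    free_idx = [i for i in range(n) if i not in fixed]
    def wrapped(free):
        p = np.empty(n)
        p[free_idx] = free
        for i, v in fixed.items():
            p[i] = v
        return full_objective(p)
    return wrapped, free_idx

# Hold parameter 1 fixed at 5.0; optimize only parameters 0 and 2.
wrapped, free_idx = fix_params(objective, 3, {1: 5.0})
best_free = fmin(wrapped, x0=np.zeros(len(free_idx)), disp=False)
```

The same wrapper works unchanged with leastsq if full_objective returns a
residual vector instead of a scalar.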