[SciPy-User] Using leastsq(), fmin(), anneal() to do a least squares fit
Wed Jun 30 14:11:08 CDT 2010
Perhaps your minimum is numerically unstable, or the cost function is
more or less constant around the global minimum. With 12 variables, you
may also have several local minima in which the solver can get trapped.
If you want more advanced optimization tools, you may try OpenOpt,
among others.
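A minimal sketch (not from the original thread) of one way to deal with local minima: restart leastsq() from several random starting points and keep the best solution. The residual function below is a toy stand-in for the poster's real 12-variable toBeMinimized():

```python
import numpy as np
from scipy.optimize import leastsq

def to_be_minimized(params):
    # Toy stand-in: an elementwise residual with several local minima.
    return np.sin(3 * params) + 0.1 * params**2

rng = np.random.default_rng(0)
best_cost = np.inf
best_params = None

# Multi-start: run leastsq from 20 random initial guesses and keep
# the solution with the smallest sum of squared residuals.
for _ in range(20):
    x0 = rng.uniform(-5, 5, size=12)
    sol, ier = leastsq(to_be_minimized, x0)
    cost = np.sum(to_be_minimized(sol) ** 2)
    if ier in (1, 2, 3, 4) and cost < best_cost:  # ier 1-4 = success
        best_cost, best_params = cost, sol

print(best_cost)
```

If the best cost is roughly the same across many restarts but the parameters differ, that suggests a flat or degenerate minimum rather than a solver problem.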
2010/6/30 Andreas <firstname.lastname@example.org>:
> Hi there,
> I have an optimization problem in 12 variables.
> I first wrote a function toBeMinimized(), which takes these 12
> variables as one array. Trying to solve this problem with leastsq(), I
> noticed that however I play around with the parameters, the function
> does not seem to find the global optimum.
> So I figured I'd try some other functions from scipy.optimize, starting
> with anneal(). I wrote a wrapper function around my original
> toBeMinimized(), doing nothing but call
> np.sum(toBeMinimized(params)**2). Now, however, the results I get from
> anneal vary widely, and don't seem to have anything in common with the
> results from leastsq().
> Basically the same happens when I use fmin() instead of anneal().
> I'm somewhat at a loss here. leastsq() seems to give the most consistent
> results, but still they vary too much to be too useful for me.
> Any ideas?
> Thanks for your insight,
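For reference, the scalar wrapper described in the quoted message can be sketched as follows; toBeMinimized is again a hypothetical stand-in, and fmin() is used here (anneal() would take the same scalar wrapper):

```python
import numpy as np
from scipy.optimize import leastsq, fmin

def to_be_minimized(params):
    # Stand-in residual vector for the poster's 12-variable function.
    return np.sin(3 * params) + 0.1 * params**2

def scalar_cost(params):
    # Wrapper: leastsq() wants the residual vector, but fmin() and
    # anneal() want a single scalar, here the sum of squares.
    return np.sum(to_be_minimized(params) ** 2)

x0 = np.full(12, 1.0)
sol_lsq, ier = leastsq(to_be_minimized, x0)
sol_fmin = fmin(scalar_cost, x0, disp=False)

# Compare the achieved costs from the same starting point.
print(scalar_cost(sol_lsq), scalar_cost(sol_fmin))
```

Comparing the two costs (rather than the parameter vectors) from the same starting point is a quick way to see whether the solvers disagree about the minimum itself or are merely landing in different basins.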