[SciPy-Dev] Minimizer in scipy.optimize
Fri Mar 16 15:19:52 CDT 2012
16.03.2012 18:33, Martin Teichmann wrote:
> I had been working on a mostly-python implementation of the
> Levenberg-Marquardt algorithm for data fitting, which I put here:
> one of my main goals was to make it more flexible and usable
> than the FORTRAN version we have in scipy right now.
Your contribution splits into two independent parts:
(a) a new implementation of Levenberg-Marquardt in Python/Cython
(b) a new API for fitting
My comments on (a): from what I've seen of this pull request, you
have done a careful job, and this would be a useful addition to Scipy,
IMHO. As you say, it offers a better base for further work (e.g. adding
bounds) than the MINPACK Fortran routines.
However, I feel uncomfortable replacing the MINPACK minimization routine
with a new implementation, which has not yet stood the test of time. So
merging this part gets +1 from me, but it should not override `leastsq`,
at least not yet.
Comments on (b).
This requires reaching a decision on what sort of convenience
optimization interface(s) Scipy should have, which is the main sticking
point with the PR. Currently, there's curve_fit, and you can use the
function-based routines directly.
There could be room for more. However, what it should look like is
partly a matter of taste. We could of course add all of the suggested
interfaces. So far, this has IIRC not been discussed extensively, and it
is unclear to me what would be useful.
> So I took an object-oriented approach, where you inherit from a
> fitter class and reimplement the function to fit. Some convenience
> functions around it make this approach very simple; the
> simplest version uses a decorator. Say you want
> to fit your data to a Gaussian, you would write:
> @fitfunction(width=3, height=2, position=4)
> def gaussian(x, width, height, position):
> # some code to calculate gaussian here
> gaussian.fit(xdata, ydata, width=2, height=1)
> that's it! I would like to get some comments on it.
This looks nice and concise.
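For concreteness, here is a minimal sketch of how such a decorator could be wired up on top of `leastsq`. The `fitfunction` decorator and its keyword handling below are my own guesses at the proposed interface, not the code from the pull request:

```python
import numpy as np
from scipy.optimize import leastsq

def fitfunction(**defaults):
    # Hypothetical decorator (a sketch of the proposed interface, not the
    # pull-request code): attach a .fit() method to a model function.
    # Parameter names and starting values come from the keyword arguments.
    def decorate(func):
        names = list(defaults)

        def fit(xdata, ydata, **start):
            p0 = [start.get(n, defaults[n]) for n in names]

            def residuals(p):
                return func(xdata, **dict(zip(names, p))) - ydata

            popt, ier = leastsq(residuals, p0)
            return dict(zip(names, popt))

        func.fit = fit
        return func
    return decorate

@fitfunction(width=3.0, height=2.0, position=4.0)
def gaussian(x, width, height, position):
    return height * np.exp(-((x - position) / width) ** 2)

xdata = np.linspace(-5.0, 15.0, 401)
ydata = gaussian(xdata, width=2.5, height=1.8, position=4.2)
# Starting values for width/height are overridden; position uses its default.
params = gaussian.fit(xdata, ydata, width=2.0, height=1.0)
```

Unspecified parameters fall back to the defaults given to the decorator, which is what makes the one-line `gaussian.fit(xdata, ydata, width=2, height=1)` call from the message possible.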
Another option is to just extend the `curve_fit` interface, e.g., allow
specifying fixed parameter values and bounds:
result = curve_fit(gaussian, xdata, ydata,
bounds=dict(width=(0, 1), height=(5, 6)))
Or, offer both.
There's probably no reason to couple this sort of thing very tightly
with a given optimization algorithm.
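To illustrate how loose that coupling could be: a name-based bounds dict can be translated into positional bound arrays in a thin wrapper, independent of which solver runs underneath. The `curve_fit_named` helper below is an invented name, and it assumes a curve_fit that accepts a `bounds=(lower, upper)` pair (newer SciPy versions do):

```python
import inspect
import numpy as np
from scipy.optimize import curve_fit

def curve_fit_named(f, xdata, ydata, p0=None, bounds=None):
    # Hypothetical wrapper (an assumption, not an existing scipy API):
    # map per-parameter (low, high) pairs onto positional bound arrays.
    names = list(inspect.signature(f).parameters)[1:]  # drop the x argument
    lower = [-np.inf] * len(names)
    upper = [np.inf] * len(names)
    for name, (lo, hi) in (bounds or {}).items():
        i = names.index(name)
        lower[i], upper[i] = lo, hi
    popt, pcov = curve_fit(f, xdata, ydata, p0=p0, bounds=(lower, upper))
    return dict(zip(names, popt))

def gaussian(x, width, height, position):
    return height * np.exp(-((x - position) / width) ** 2)

xdata = np.linspace(-5.0, 5.0, 201)
ydata = gaussian(xdata, 1.2, 2.0, 0.3)
params = curve_fit_named(gaussian, xdata, ydata,
                         p0=[1.0, 1.5, 0.0],
                         bounds=dict(width=(0.5, 3.0)))
```

The wrapper only rearranges arguments; any bound-constrained least-squares backend could sit behind it.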
> There also was the effort of Denis Laxalde,
> where he tries to unify minimization algorithms.
The minimize() work is more of an internal refactoring of the
optimization routines, to specify a more uniform interface that can be
used in higher-level solutions. That is, the '_minimize_*' functions.
The interface these routines provide this way is pretty much the raw
function-based interface they had previously (apart maybe from ...)
> This was actually a very unfortunate approach: he
> unifies many algorithms into one function. This function
> is then nothing but a big if-else statement that
> dispatches to the algorithm in question. This
> is a non-extensible approach: I cannot simply add
> my super-cool minimizer, give it a new name and
> drop it in alongside the scipy ones, but I have to change
> the scipy function to incorporate my new function.
I don't see why this would be a problem: minimize() only calls routines
within Scipy, and for those you can just add the couple of lines to the
entry point function. This is a less complex approach than using a
registry pattern, and IMHO at the moment the situation is under control
(the entry point function body is 71 lines) and there's not much need
for more machinery.
Changing this to use a registry pattern or something similar would of
course be possible, so that users and other parties could hook new
optimization algorithms up to the same interface. However, I'm not sure
if there is a need for this.
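The registry pattern under discussion is small to sketch. The toy version below is illustrative only, not scipy.optimize.minimize's actual design; the "super-cool" solver name is borrowed from the message, and the solver itself is an invented stand-in:

```python
_MINIMIZERS = {}

def register_minimizer(name):
    # Register a solver under a method name so the dispatcher below can
    # look it up, instead of growing an if-else chain per algorithm.
    def decorate(solver):
        _MINIMIZERS[name.lower()] = solver
        return solver
    return decorate

def minimize(fun, x0, method="nelder-mead", **options):
    # Dispatcher: third parties extend it by registering, not by editing it.
    try:
        solver = _MINIMIZERS[method.lower()]
    except KeyError:
        raise ValueError("unknown method %r" % method)
    return solver(fun, x0, **options)

@register_minimizer("super-cool")
def _minimize_super_cool(fun, x0, maxiter=200, step=0.1, h=1e-6):
    # Toy finite-difference gradient descent, standing in for the
    # hypothetical "super-cool minimizer" from the message.
    x = [float(v) for v in x0]
    for _ in range(maxiter):
        grad = []
        for i in range(len(x)):
            xp = list(x)
            xp[i] += h
            grad.append((fun(xp) - fun(x)) / h)
        x = [xi - step * gi for xi, gi in zip(x, grad)]
    return x

result = minimize(lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2,
                  [0.0, 0.0], method="super-cool")
```

Whether this indirection pays for itself over a 71-line if-else body is exactly the trade-off discussed above.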