[SciPy-User] scipy.optimize named argument inconsistency

josef.pktd@gmai...
Tue Sep 6 10:23:19 CDT 2011


On Tue, Sep 6, 2011 at 10:53 AM, Matt Newville
<newville@cars.uchicago.edu> wrote:
>>> Well, I understand that this adds an implicit bias for least-squares, but
>>> if the algorithms receive an array from the user-function, taking
>>> (value*value).sum() might be preferred over raising an exception like
>>>     ValueError: setting an array element with a sequence
>>
>> I'd rather have an exception, but maybe one that is more explicit.
>> leastsq is efficient for least-squares problems. If I switch to fmin or
>> similar, it's usually because I have a different objective function,
>> and I want a reminder that I need to specify what my objective
>> function is (cut-and-paste errors are pretty common).
>
> The present situation makes it more challenging to try out different
> minimization procedures, as the objective functions *must* be
> different, and in a way that is poorly (ok, un-) documented.  If the
> objective functions had consistent signatures, it would make it much
> easier to (as was suggested) write a wrapper that allowed selection of
> the algorithm.
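Such a wrapper is easy to sketch. Something like the following (the
name `as_scalar` is purely illustrative, not an existing scipy
function) would let one residual function serve both leastsq and the
scalar minimizers:

```python
import numpy as np

def as_scalar(residual_func):
    # Illustrative adapter (not part of scipy): wraps a leastsq-style
    # objective that returns an array of residuals so that scalar
    # minimizers such as fmin can use it.
    def objective(params, *args):
        r = np.asarray(residual_func(params, *args))
        # sum of squared residuals -- the implicit least-squares cost
        return (r * r).sum()
    return objective
```

With that, the same residual function could be passed to leastsq
directly and to fmin via the adapter, instead of maintaining two
copies of the objective.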
>
>>> which is apparently meant to be read as "change your function to return a
>>> scalar". I don't see in the docs where it actually specifies what the user
>>> functions *should* return.
>>>
>>> I also agree with Charles' suggestion. Unraveling multi-dimensional arrays
>>> for leastsq (and others) would be convenient.
>>
>> I'm not quite sure what that means.
>
> leastsq (and the underlying lmdif) require a 1-d residual array.  The
> suggestion is to not fail if an n-d array is passed, but to unravel it.
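A minimal sketch of that behavior (the wrapper name is hypothetical;
leastsq itself does not currently do this):

```python
import numpy as np

def flat_residuals(residual_func):
    # Hypothetical convenience wrapper: leastsq (lmdif) needs a 1-d
    # residual array, so ravel whatever shape the user function returns.
    def wrapped(params, *args):
        return np.ravel(residual_func(params, *args))
    return wrapped
```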
>
>> I think there is a difference between low level wrappers for the
>> optimization algorithms (leastsq) and "convenience" functions like
>> curve_fit.
>
> I agree.
>
>> I'm in favor of standardizing names (original topic), but I don't
>> think it is useful to "enhance" general-purpose optimizers with lots
>> of problem-specific features and increase the list of optional
>> arguments.
>
> OK.  I was not suggesting adding problem-specific features, just
> suggesting that standardizing the behavior of the objective functions
> might be helpful too.

What I'm trying to argue is that there are inherent differences
between optimizers that cannot or should not be unified in the
interface to the optimizers. This is different from removing naming
inconsistencies.

But I also think it would be useful to add a dispatch function with a
unified interface, as Skipper proposed earlier in the thread, and as we
use in statsmodels for some of the optimizers (although we don't wrap
leastsq, and we haven't gotten around yet to wrapping the constrained
optimizers). The fit method of the scipy.stats distributions now also
allows a choice of optimizer, if I remember correctly.
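For illustration, such a dispatcher might look roughly like this (the
name `minimize_any`, the method strings, and the solver table are all
made up for the example; this is not an existing scipy API):

```python
from scipy import optimize

def minimize_any(func, x0, method='fmin', **kwargs):
    # Hypothetical dispatcher with a single entry point; the method
    # names and the mapping below are illustrative only.
    solvers = {
        'fmin': optimize.fmin,          # Nelder-Mead simplex
        'powell': optimize.fmin_powell,
        'bfgs': optimize.fmin_bfgs,
    }
    return solvers[method](func, x0, **kwargs)
```

The point is only that the caller picks the algorithm by name while
passing one objective function with one signature.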

Josef

>
> --Matt
> _______________________________________________
> SciPy-User mailing list
> SciPy-User@scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user
>

