[SciPy-User] tip (maybe): scaling and optimizers
Fri Oct 26 12:18:19 CDT 2012
On Fri, Oct 26, 2012 at 3:20 AM, Paweł Kwaśniewski <firstname.lastname@example.org> wrote:
> Hi Josef,
> I also noticed that fmin_slsqp is highly scale-sensitive, and I had the
> same impression using leastsq. Can you tell me where I can find some
> more information on how to deal with this?
For fmin_slsqp there is only the mailing list thread and my adjustments
in a pull request for statsmodels. I don't have any other information.
What I did was to replace the sum of the loglikelihood terms by the mean,
that is, divide the objective function (and gradient and Hessian) by the
number of terms. Since then fmin_slsqp seems to work pretty well.
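A minimal sketch of the kind of rescaling meant here, on a toy normal
loglikelihood (my own example, not the actual statsmodels code): minimize
the mean of the negative loglikelihood terms instead of the sum, so the
scale of the objective does not grow with the number of observations.

import numpy as np
from scipy.optimize import fmin_slsqp

np.random.seed(0)
x = np.random.normal(loc=2.0, scale=3.0, size=1000)

def neg_loglike_sum(params):
    # summed negative normal loglikelihood; its scale grows with len(x)
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    return np.sum(0.5 * ((x - mu) / sigma) ** 2 + log_sigma
                  + 0.5 * np.log(2 * np.pi))

def neg_loglike_mean(params):
    # the same objective divided by the number of terms
    return neg_loglike_sum(params) / len(x)

start = np.array([0.0, 0.0])
print(fmin_slsqp(neg_loglike_sum, start, disp=0))   # summed objective
print(fmin_slsqp(neg_loglike_mean, start, disp=0))  # averaged objective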
I never ran into serious problems with leastsq, but there might be a
problem with the numerical derivatives (finite differences), which in my
impression are not always very good.
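In case it helps, one way to sidestep the finite-difference derivatives
in leastsq is to supply an analytic Jacobian through the Dfun argument.
A small sketch with a made-up exponential-decay fit (my own toy example,
not related to the statsmodels code):

import numpy as np
from scipy.optimize import leastsq

np.random.seed(1)
t = np.linspace(0, 5, 50)
y = 3.0 * np.exp(-1.5 * t) + 0.01 * np.random.normal(size=t.size)

def residuals(p):
    a, b = p
    return y - a * np.exp(-b * t)

def jacobian(p):
    # analytic derivatives of the residuals with respect to (a, b)
    a, b = p
    e = np.exp(-b * t)
    return np.column_stack((-e, a * t * e))

p0 = [1.0, 1.0]
p_fd, _ = leastsq(residuals, p0)                 # finite-difference derivatives
p_an, _ = leastsq(residuals, p0, Dfun=jacobian)  # analytic Jacobian
print(p_fd, p_an)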
> 2012/10/26 <email@example.com>:
>> mainly an observation:
>> After figuring out that fmin_slsqp is scale sensitive, I switched to
>> normalizing (rescaling) the loglikelihood functions in statsmodels.
>> Loglikelihood functions are our main functions for nonlinear optimization.
>> Today I was working by accident on an older branch of statsmodels, and
>> the results I got with fmin_bfgs were awful.
>> After switching to statsmodels master, the results I get with
>> fmin_bfgs are much better (very good: robust and accurate).
>> The impression I got from this and from a discussion with Ian Langmore
>> (on an L1 penalized optimization pull request) is that many scipy
>> optimizers might be scale sensitive in the default settings.
>> Watch the scale of your objective function!?
>> (qualifier: I don't remember whether statsmodels master has other
>> changes, missing from my old branch, that make optimization more robust.)
>> "anecdotal evidence ain't proof"
>> ( http://www.unilang.org/viewtopic.php?f=11&t=38585&start=0&st=0&sk=t&sd=a )
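To illustrate the point about default settings, a toy comparison with
fmin_bfgs (my own example, not the statsmodels code): fmin_bfgs stops when
the gradient norm falls below gtol=1e-5, and its first search direction is
the raw gradient, so a summed loglikelihood whose gradient is nobs times
larger than the averaged one behaves quite differently, and the large-scale
version may be numerically touchier during the line search.

import numpy as np
from scipy.optimize import fmin_bfgs

np.random.seed(2)
x = np.random.normal(loc=5.0, scale=2.0, size=10000)

def nll(params, scale=1.0):
    # negative normal loglikelihood (up to a constant), optionally rescaled
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    return scale * np.sum(0.5 * ((x - mu) / sigma) ** 2 + log_sigma)

start = np.array([0.0, 0.0])
# summed objective: the gradient is ~10000 times larger, so the default
# gtol=1e-5 stopping rule is effectively much stricter here
xopt_sum = fmin_bfgs(nll, start, args=(1.0,), disp=0)
# averaged objective: the same problem on a friendlier scale
xopt_mean = fmin_bfgs(nll, start, args=(1.0 / len(x),), disp=0)
print(xopt_sum, xopt_mean)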