[SciPy-User] scipy.optimize.leastsq question

Joshua Holbrook josh.holbrook@gmail....
Tue Jul 6 13:54:12 CDT 2010


On Tue, Jul 6, 2010 at 10:46 AM, ms <devicerandom@gmail.com> wrote:
>
> On 06/07/10 19:19, Joshua Holbrook wrote:
> > On Tue, Jul 6, 2010 at 10:02 AM, ms<devicerandom@gmail.com>  wrote:
> >
> >> On 29/06/10 09:59, Sebastian Walter wrote:
> >>>>> Only use derivative free optimization methods if your problem is
> >>>>> not continuous.
> >>>>> If your problem is differentiable, you should compute the Jacobian
> >>>>> yourself, e.g. with
> >>>>>
> >>>>> def myJacobian(x):
> >>>>>     h = 10**-3
> >>>>>     # do finite differences approximation
> >>>>>     return ....
> >>>>>
> >>>>> and provide the Jacobian to
> >>>>> scipy.optimize.leastsq(..., Dfun = myJacobian)
> >>
> >> Uh, I am a real newbie in this field, but I expected that the Jacobian
> >> was needed only if there was an analytical expression for the
> >> derivatives; I thought the leastsq routine calculated the finite
> >> difference approximation by itself otherwise. So I never bothered
> >> providing an "approximate" Jacobian. Or maybe I do not get what you
> >> mean by finite difference.
>
> > I say this not as someone intimately familiar with scipy.optimize, but as
> > someone who has implemented a least-squares-ish algorithm himself.
> >
> > You are almost certainly correct that leastsq calculates an approximate
> > Jacobian using a finite difference method on its own. However, if you can
> > symbolically differentiate your problem without too much heartache, then
> > supplying an exact Jacobian is probably preferable due to higher precision
> > and fewer function evaluations (computing f(x) and f(x+h), then differencing
> > and normalizing, vs. simply evaluating f'(x)).
> >
> > On the other hand: when I implemented my algorithm (nearly two years ago),
> > my equations were pretty nasty, and my derivatives just happened to be
> > much, much worse (as can be seen at
> > http://modzer0.cs.uaf.edu/~jesusabdullah/gradients.html, at least for a
> > little while). At the time sympy honestly wasn't production-ready, so I
> > ended up using a finite difference method to calculate them (I believe I
> > used scipy's derivative function), and I did have to tweak the step
> > sizes.
>
> Thank you. What you tell me is very similar to what I have always
> understood. But I was confused because of the pseudocode that Sebastian
> Walter provided:
>
>  >> On 29/06/10 09:59, Sebastian Walter wrote:
>  >>>>> If your problem is differentiable, you should compute the Jacobian
>  >>>>> yourself, e.g. with
>  >>>>>
>  >>>>> def myJacobian(x):
>  >>>>>     h = 10**-3
>  >>>>>     # do finite differences approximation
>  >>>>>     return ....
>  >>>>>
>  >>>>> and provide the Jacobian to
>  >>>>> scipy.optimize.leastsq(..., Dfun = myJacobian)
>
> which explicitly says you can provide one built from a finite-difference
> approximation, so I am unsure.
>
> thanks,
> M.


Huh! I didn't notice that interesting comment there :) It could be
that there's something I don't know. On the other hand, it could also
just be that computing your own Jacobian allows for fine-tuning: for
example, different step sizes (dx) depending on the parameter,
higher-precision finite-difference formulas for the more complex
equations, and maybe even mixing in the simple derivatives you do know
analytically. In my case, while most of my Jacobian's entries were
pretty obnoxious, I did have a few simple ones (some zeros, and
possibly some basic trig functions here and there; I don't remember
exactly). Also, some of my functions took degrees/radians as inputs
while others took distances, and since the shape of the function
depended heavily on a relative scale rather than an absolute one, the
step size dx was parameterized per input.
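
To make that concrete, here is a rough sketch of the kind of hand-rolled
Jacobian I mean. The model function, data, and step sizes below are made up
purely for illustration (they are not from this thread); the point is just
that Dfun is called with the same arguments as the residual function, that
you can pick a different dx per parameter, and that you can overwrite any
columns whose derivatives you know exactly:

import numpy as np
from scipy.optimize import leastsq

# Hypothetical model, for illustration only: y = a * exp(-b * t) + c
def residuals(p, t, y):
    a, b, c = p
    return a * np.exp(-b * t) + c - y

def my_jacobian(p, t, y):
    # Forward-difference Jacobian with a different (relative) step size
    # per parameter; the step sizes here are arbitrary placeholders.
    rel_steps = np.array([1e-6, 1e-4, 1e-6])
    steps = rel_steps * np.maximum(np.abs(p), 1.0)
    r0 = residuals(p, t, y)
    J = np.empty((t.size, p.size))
    for j in range(p.size):
        ph = p.copy()
        ph[j] += steps[j]
        J[:, j] = (residuals(ph, t, y) - r0) / steps[j]
    # The derivative of the residual w.r.t. c is exactly 1, so mix in
    # that analytic column instead of differencing it.
    J[:, 2] = 1.0
    return J  # shape (m, n), as expected with the default col_deriv=0

t = np.linspace(0.0, 10.0, 50)
y = 2.0 * np.exp(-0.5 * t) + 1.0
p0 = np.array([1.0, 1.0, 0.0])
fit, ier = leastsq(residuals, p0, args=(t, y), Dfun=my_jacobian)

For a model this simple you would of course just supply the fully analytic
Jacobian, but for the nasty ones this is the kind of tuning knob you get
that leastsq's built-in differencing doesn't really expose (as far as I
know it only takes a single epsfcn value for all parameters).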

--Josh

