[SciPy-User] optimize.fmin_cg terminates when w - grad*1e-10 yields lower obj & grad
Mon Oct 5 20:50:20 CDT 2009
On Mon, Oct 5, 2009 at 6:32 PM, Jason Rennie <email@example.com> wrote:
> Setting amin in linesearch.line_search() to a smaller value (I tried 1e-12)
> seems to be a hacky workaround for the zoom() issue (since it almost never
> falls through to the (old?) optimize.line_search() method). But I'm
> wondering: has Jonathan Shewchuk's "Preconditioned Nonlinear Conjugate
> Gradients with Secant and Polak-Ribiere" (pg. 53/59 of his tutorial) been
> considered? I used it as the basis for a CG Matlab implementation for my
> thesis work, and it handled a very large problem (million+ params) nicely
> once I worked out the numerical issues (it's amazing how much work is
> involved in turning such detailed pseudocode into a solid implementation!).
> Link to the paper:
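[For readers without the tutorial at hand, the scheme Shewchuk describes can be sketched roughly as follows. This is an unpreconditioned paraphrase; the function and parameter names (cg_pr_secant, sigma0, ls_iter) are mine for illustration, not scipy's or Shewchuk's:]

```python
import numpy as np

def cg_pr_secant(grad, x, sigma0=1e-3, max_iter=200, ls_iter=20, tol=1e-10):
    """Nonlinear CG with Polak-Ribiere updates and a secant line search,
    loosely after Shewchuk's tutorial (unpreconditioned sketch)."""
    x = np.asarray(x, dtype=float).copy()
    n = x.size
    r = -grad(x)                      # residual = steepest-descent direction
    d = r.copy()
    delta_new = r @ r
    delta0 = delta_new
    k = 0
    for _ in range(max_iter):
        if delta_new <= tol * delta0:
            break
        # Secant line search: estimate the step from two directional
        # derivatives along d instead of evaluating the objective itself.
        delta_d = d @ d
        alpha = -sigma0
        eta_prev = grad(x + sigma0 * d) @ d
        for _ in range(ls_iter):
            eta = grad(x) @ d
            denom = eta_prev - eta
            if denom == 0.0:
                break
            alpha *= eta / denom
            x = x + alpha * d
            eta_prev = eta
            if alpha * alpha * delta_d <= tol:
                break
        r_old = r
        r = -grad(x)
        delta_old = delta_new
        delta_mid = r @ r_old
        delta_new = r @ r
        beta = (delta_new - delta_mid) / delta_old   # Polak-Ribiere
        k += 1
        if k == n or beta <= 0.0:     # periodic restart / descent guard
            d = r.copy()
            k = 0
        else:
            d = r + beta * d
    return x

# Quadratic sanity check: minimize 0.5 x'Ax - b'x, whose gradient is Ax - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
xmin = cg_pr_secant(lambda x: A @ x - b, np.zeros(2))
```

[Part of the appeal for large problems is that the secant line search needs only gradient evaluations, never the objective value itself.]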
> On Mon, Oct 5, 2009 at 4:55 PM, Jason Rennie <firstname.lastname@example.org> wrote:
>> Looks like optimize.zoom() is also buggy in that it will return a step
>> size corresponding to an increased objective if it can't find a step in
>> maxiter iterations.
Do you have a test case? The only case I see in the optimize tests for
fmin_cg looks similar to yours:
log_pdot = dot(self.F, x)
logZ = log(sum(exp(log_pdot)))
f = logZ - dot(self.K, x)
but it might have a well-behaved parameterization.
If you can write a test case that works at the limit of the current precision,
we could include it in the test suite. The same optimization problem is
used to test several minimizers, so this could also check whether any of
the others can handle this problem.
If zoom is also buggy, more work and a failing test case will be required to
find and correct the bug.
For your other comments, I don't know enough about fmin_cg.
Regarding amin=1e-12: could this be a problem if the numerical precision of
the objective function and the gradient is not high enough?
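[One way the precision could bite, as an illustration: in double precision, a step so small that it changes the objective by less than about eps * |f| (eps is roughly 2.2e-16) is invisible, so the sufficient-decrease test sees only rounding noise:]

```python
import numpy as np

eps = np.finfo(np.float64).eps   # about 2.22e-16
f0 = 1.0
# A decrease below eps * |f0| is lost to rounding entirely:
assert f0 - 1e-17 == f0
# ...while a decrease safely above that threshold is still representable:
assert f0 - 1e-15 != f0
```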
If you have a better cg algorithm or one that works better for some
cases, you could propose it for inclusion in scipy.
Thanks for filing the ticket.
>> On Mon, Oct 5, 2009 at 4:33 PM, <email@example.com> wrote:
>>> On Mon, Oct 5, 2009 at 4:11 PM, Jason Rennie <firstname.lastname@example.org> wrote:
>>> > The bug seems to be that scipy.optimize.linesearch.line_search can
>>> > return a step size which increases the objective. Later line searches
>>> > are then broken because the (phi0 - old_old_fval)/derphi0 calculation
>>> > yields a negative value.
>>> > Would someone mind sanity-checking this assertion? Is it possible
>>> > for minpack2.dcsrch to return a step which yields a negative objective?
>>> > I'm seeing it when the amin value is hit, i.e. it's returning a step
>>> > size of 1e-8.
>>> > Thanks,
>>> > Jason
>>> Bug candidate: linesearch doesn't honor the warning. Does the Fortran
>>> code use 0- or 1-based indexing? I think
>>> task[1:4] == 'WARN':
>>> should instead be
>>> task[:4] == 'WARN':
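[The suggested fix checks out on slicing grounds alone: a three-character slice can never equal the four-character string 'WARN', so the off-by-one slice silently disables the check. Demonstrated on an example status string of the kind dcsrch produces:]

```python
# Hypothetical example of a dcsrch-style status string.
task = 'WARNING: ROUNDING ERRORS PREVENT PROGRESS'

# The buggy slice drops the leading 'W' and yields only 3 characters,
# so the comparison can never be True:
assert task[1:4] == 'ARN'
assert task[1:4] != 'WARN'

# The intended check matches the first four characters:
assert task[:4] == 'WARN'
```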
>>> > --
>>> > Jason Rennie
>>> > Research Scientist, ITA Software
>>> > 617-714-2645
>>> > http://www.itasoftware.com/
>>> > _______________________________________________
>>> > SciPy-User mailing list
>>> > SciPy-User@scipy.org
>>> > http://mail.scipy.org/mailman/listinfo/scipy-user