[SciPy-user] [SciPy-dev] question about scipy.optimize.line_search

Dominique Orban dominique.orban@gmail....
Thu Jun 28 16:00:30 CDT 2007

Alan G Isaac wrote:
> On Thu, 28 Jun 2007, Dmitrey apparently wrote: 
>>help(line_search) yields 
>>line_search(f, myfprime, xk, pk, gfk, old_fval, old_old_fval, args=(), 
>>c1=0.0001, c2=0.90000000000000002, amax=50) 
>>    Find alpha that satisfies strong Wolfe conditions. 
>>    Uses the line search algorithm to enforce strong Wolfe conditions 
>>    Wright and Nocedal, 'Numerical Optimization', 1999, pg. 59-60 
>>    For the zoom phase it uses an algorithm by 
>>    Outputs: (alpha0, gc, fc) 
>>So I need to know what the other args are, especially gfk (is it the 
>>gradient at point xk?), old_fval, old_old_fval (I think I know what c1 & c2 mean) 

> This is certainly lacking documentation!  A little is here:
> http://docs.neuroinf.de/api/scipy/scipy.optimize.optimize-pysrc.html#line_search
> Can anyone help Dmitrey more?

Each iteration of a linesearch procedure that satisfies the strong Wolfe 
conditions requires an evaluation of f and of its gradient. I have no 
idea who coded this and I don't have the book handy at the moment, but I 
would guess gfk is the gradient of the objective at the current iterate 
xk. No clue about old_fval and old_old_fval (it doesn't look like my 
dream programming style).
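As a concrete illustration, here is a minimal sketch of how these arguments fit together, written against the present-day scipy.optimize.line_search API (keyword names and the six-element return tuple are from the current SciPy documentation and may differ from the 2007 version discussed above):

```python
import numpy as np
from scipy.optimize import line_search

# A simple quadratic objective and its gradient.
def f(x):
    return np.dot(x, x)

def fprime(x):
    return 2.0 * x

xk = np.array([1.0, 1.0])   # current iterate
gfk = fprime(xk)            # gradient at xk -- the gfk argument
pk = -gfk                   # a descent direction (steepest descent here)

# old_fval / old_old_fval are the objective values at the current and
# previous iterates; they seed the initial step-length guess and may be
# omitted (they default to None).
alpha, fc, gc, new_fval, old_fval, new_slope = line_search(
    f, fprime, xk, pk, gfk=gfk, old_fval=f(xk))

print(alpha, new_fval)
```

The returned alpha satisfies the strong Wolfe conditions with the given c1 and c2, so f(xk + alpha*pk) shows at least the sufficient decrease f(xk) + c1*alpha*dot(gfk, pk).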

Enforcing the strong Wolfe conditions is not an easy task; it is a 
sensitive process, and the algorithm presented in the book has certainly 
been simplified as much as possible for clarity of exposition. For more 
robust software, you would be better off using the implementation of 
Moré and Thuente:

Moré, J. J. and Thuente, D. J. 1994. Line search algorithms with 
guaranteed sufficient decrease. ACM Trans. Math. Softw. 20, 3 (Sep. 
1994), 286-307. DOI= http://doi.acm.org/10.1145/192115.192132

This is Fortran software which you could interface. I did the job in 
NLPy (http://nlpy.sf.net). You should be able to reuse my interface.

