[SciPy-dev] Fwd: from GSoC student Dmitrey Kroshko

Matthieu Brucher matthieu.brucher@gmail....
Mon Aug 13 13:44:55 CDT 2007


2007/8/13, Dmitrey Kroshko <openopt@ukr.net>:
>
>  Hi Matthieu,
> I just implemented a very simple algorithm for the line-search subproblem,
> described in B. Pshenichniy's book "Linearisation method", page 46, and
> lincher is almost the same as the NLP solver from that same page 46. You can
> check it in lincher.py, line 296 (as you see, it doesn't use gradient
> info, unlike yours). But since the book is dated 1983, I guess there are
> better algorithms by now. I would be happy if you could provide one (that
> can be used with non-smooth funcs).
>


It's logical that 'my' line search uses the gradient information: that's what
the Wolfe-Powell rules require. As I stated before, this is only one of several
searches that are available. Another available one is the backtracking search,
one of those mentioned in our discussion, which applies only the Armijo rule
(actually the first of the two Wolfe-Powell rules).


> So I have updated svn,
>


Thanks :)
The code at
http://projects.scipy.org/scipy/scikits/browser/trunk/openopt/scikits/openopt/solvers/optimizers/line_search/backtracking_search.py
does almost the same thing, but with a custom factor instead of only the 0.5 value.
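
For what it's worth, here is a minimal sketch of such a backtracking (Armijo)
search, assuming a plain f/gradient interface; the names are illustrative, not
the actual scikit API, and `factor` plays the role of the custom shrink factor
instead of a hard-coded 0.5:

    import numpy as np

    def backtracking_search(f, x, direction, gradient, alpha=1.0,
                            factor=0.5, c1=1e-4, max_iter=50):
        """Shrink the step by `factor` until the Armijo (sufficient decrease)
        condition f(x + alpha*d) <= f(x) + c1*alpha*grad(x).d holds."""
        fx = f(x)
        slope = np.dot(gradient, direction)  # directional derivative at x
        for _ in range(max_iter):
            if f(x + alpha * direction) <= fx + c1 * alpha * slope:
                break
            alpha *= factor  # the "custom factor" instead of only 0.5
        return x + alpha * direction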


> and if you are interested, here are the results of my and your funcs:
>
> #my Armijo implementation
> itn 66 : Fk= 85.4300044557 maxResidual= 8.77891444873e-07
> istop:  4 (|| F[k] - F[k-1] || < funtol)
> Solver:   Time elapsed = 7.41     CPU Time Elapsed = 7.28
> objFunValue: 85.4300044557 (feasible)
>
> #Matthieu:
>         state = {'direction' : direction, 'gradient': lsF.gradient(x0)}
>         mylinesearch = line_search.StrongWolfePowellRule(sigma=5)
>         destination = mylinesearch(function = lsF, origin = x0, step = direction, state = state)
>
> itn 78 : Fk= 85.4300278074 maxResidual= 6.97178904829e-07
> istop:  4 (|| F[k] - F[k-1] || < funtol)
> Solver:   Time elapsed = 8.58     CPU Time Elapsed = 8.46
> objFunValue: 85.4300278074 (feasible)
>
> if I use
> line_search.StrongWolfePowellRule() (i.e. with the default sigma parameter),
> it yields (for the example from the lincher.py head):
> itn 0: Fk= 1428.11851019 maxResidual= 2242631.78131
> itn 10 : Fk= 86.6072664467 maxResidual= 0.466521056114
> N= 5336.61507377
> and then the CPU hangs (at least, I didn't observe anything for ~2 min and
> then stopped it; my iprint = 10).
>

Well, with this value, outside the (0, 1) interval, you are using only the
Armijo rule ;)
I'm happy you found a correct line search for your optimizer :)
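
Just to illustrate the point, here is a rough sketch of the two strong
Wolfe-Powell tests (illustrative only, not the scikit's StrongWolfePowellRule
implementation; rho and sigma stand for the sufficient-decrease and curvature
parameters, with the usual requirement 0 < rho < sigma < 1):

    import numpy as np

    def strong_wolfe_holds(f, grad, x, d, alpha, rho=1e-4, sigma=0.9):
        slope0 = np.dot(grad(x), d)              # directional derivative at x
        slope1 = np.dot(grad(x + alpha * d), d)  # same, at the trial point
        armijo = f(x + alpha * d) <= f(x) + rho * alpha * slope0   # sufficient decrease
        curvature = abs(slope1) <= sigma * abs(slope0)             # strong curvature test
        # With sigma >= 1 the curvature test becomes very loose, so in practice
        # the Armijo condition is the only one that really constrains the step.
        return armijo and curvature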

Matthieu