[SciPy-dev] GSoC schedule question (letter for my mentors & Matthieu Brucher)
Wed Aug 1 09:18:41 CDT 2007
> So, as you see, the params are not described; I'm especially interested in
> old_fval and old_old_fval.
> Also, there was a letter from the NLPy developer about an alternative (a link
> to an article was provided), and Matthieu proposed using one of his own solvers.
> I think the importance of the auxiliary solver is very high, and it should be
> handled first. In my opinion, the next thing to do is to use an appropriate
> solver from Matthieu's package. However, Matthieu's syntax differs too much
> from openopt's. I think first of all there should be an openopt binding to
> Matthieu's package, which would allow oo users to use the same syntax:
> prob = NLP(...)
> r = prob.solve()
> I would implement the binding myself, but I lack well-described API
> documentation for Matthieu's code. First of all I'm interested in:
> 1) what solvers does it contain?
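The binding Dmitrey asks for could look roughly like this. This is a minimal, hypothetical sketch, not the real openopt or scikit API: the NLP class, the Result fields (xf, ff, istop) and the toy solver are all illustrative names standing in for whatever the binding would actually delegate to.

```python
class Result:
    """Container for the answer, mimicking an openopt-style result object."""
    def __init__(self, xf, ff, istop):
        self.xf = xf        # final point
        self.ff = ff        # objective value at the final point
        self.istop = istop  # stop reason code


class NLP:
    """openopt-style front end: prob = NLP(...); r = prob.solve()."""
    def __init__(self, f, x0, solver=None, **params):
        self.f = f
        self.x0 = x0
        self.solver = solver   # any callable solver can be plugged in here
        self.params = params   # xtol, maxIter, ... forwarded to the solver

    def solve(self):
        # Delegate to the wrapped solver and repackage its answer.
        x, fx = self.solver(self.f, self.x0, **self.params)
        return Result(x, fx, istop=1)


def toy_solver(f, x0, xtol=1e-6, maxIter=1000):
    """Naive shrinking-step search, just to make the sketch runnable."""
    x, step = x0, 1.0
    while step > xtol and maxIter > 0:
        maxIter -= 1
        for cand in (x - step, x + step):
            if f(cand) < f(x):
                x = cand
                break
        else:
            step *= 0.5  # neither neighbour improves: refine the step
    return x, f(x)


prob = NLP(lambda x: (x - 3.0) ** 2, x0=0.0, solver=toy_solver, xtol=1e-8)
r = prob.solve()
```

The point of the sketch is that the front end fixes the user-facing syntax while any of Matthieu's solvers could sit behind the `solver` slot.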
I put a small description here :
https://projects.scipy.org/scipy/scikits/wiki/Optimization (still in progress)
> 2) how can xtol, funtol, contol etc. be passed to the solvers?
Each of these parameters is either step information, line-search
information or criterion information. Each parameter must be given to the
corresponding object that will use it (I didn't want to centralize
everything, as some modules need pre-computation before they can be used in
the optimizer, like the Fibonacci section search).
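Concretely, the decentralized-parameter idea could look like the following sketch, where each brick owns its tolerance at construction time instead of the optimizer taking one big options dict. The class names are illustrative, not the actual scikit API.

```python
class RelativeValueCriterion:
    """Stop criterion: the funtol-like parameter lives here, not in the optimizer."""
    def __init__(self, ftol):
        self.ftol = ftol

    def __call__(self, state):
        # The criterion only ever sees the current optimizer state.
        if 'old_value' not in state:
            return False
        return abs(state['old_value'] - state['new_value']) <= self.ftol


class FixedLineSearch:
    """Line search: its single parameter, the step length alpha, lives here."""
    def __init__(self, alpha):
        self.alpha = alpha

    def __call__(self, origin, step, function, state):
        return origin + self.alpha * step


# The user configures each brick separately, then hands them to the optimizer.
criterion = RelativeValueCriterion(ftol=1e-6)
line_search = FixedLineSearch(alpha=0.1)
```

So "passing funtol" means constructing a criterion object with that tolerance and plugging it in, rather than passing a keyword through the optimizer.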
> then, secondarily (this can wait; maybe the default parameters would be enough):
> 3) which params can I modify (like the type of step (Armijo|Powell|etc), etc.)?
You can modify everything. The goal is to provide bricks that you can put
together so that the optimizer does what you need. If you want to provide
new modules, here are some "rules":
- the function should be provided as an object that defines the correct
methods, like __call__, gradient or hessian if needed (the case of a
finite-difference approximation of the gradient should be programmed
with a class from which the function derives, but we can speak of this in
another mail if you want details on this one)
- a criterion module takes only one argument, which is the current state of
the optimizer
- a step module takes three arguments: the function being optimized, the
point where to search for a step, and the state of the optimizer
- a line search module takes four arguments: the point, the computed
step, the function and the state (I suppose this should be refactored to be
more consistent with the step module...)
- the core optimizer uses these modules and dispatches the data
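A runnable sketch of that contract, assuming nothing beyond the signatures listed above (the concrete modules and the optimize() driver below are illustrative, not the actual scikit code): the function is an object with __call__ and gradient, the criterion sees only the state, and the core loop does nothing but dispatch.

```python
class Quadratic:
    """Function object: f(x) = (x - 2)^2, with an analytic gradient."""
    def __call__(self, x):
        return (x - 2.0) ** 2

    def gradient(self, x):
        return 2.0 * (x - 2.0)


def gradient_step(function, point, state):
    # Step module: (function, point, state) -> descent direction.
    return -function.gradient(point)


def fixed_line_search(origin, step, function, state):
    # Line search module: (point, step, function, state) -> new point.
    return origin + 0.25 * step


def small_step_criterion(state):
    # Criterion module: sees only the current state of the optimizer.
    return abs(state['new_x'] - state['old_x']) < 1e-10


def optimize(function, x0, step_module, line_search, criterion, max_iter=1000):
    """Core optimizer: dispatches data between the bricks, nothing more."""
    state = {'old_x': float('inf'), 'new_x': x0, 'iteration': 0}
    while state['iteration'] < max_iter and not criterion(state):
        x = state['new_x']
        direction = step_module(function, x, state)
        new_x = line_search(x, direction, function, state)
        state.update(old_x=x, new_x=new_x, iteration=state['iteration'] + 1)
    return state['new_x']


x_min = optimize(Quadratic(), 0.0, gradient_step, fixed_line_search,
                 small_step_criterion)
```

Swapping in an Armijo line search or a Newton step only means replacing one of the callables; the core loop never changes.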
> BTW, ralg is also missing a good line-search optimizer: it requires one that
> finds solutions with slope angle > pi/2. But that can wait; it already has a
> quite good one, and the problem with lincher is more pressing.
If you have an algorithm that can do this, you only have to program it and
everyone will be able to use it with the other modules.
> So I think the next GSoC schedule step should be connecting 1-2 of
> Matthieu's solvers (ones that take the slope angle into account, like the
> strong Wolfe conditions do) to the native openopt syntax.
If I understand this correctly, it is wrapping some usual combinations
together so that people can use them without knowing the details, like the
brent function and the Brent class? It should be easy to do by overriding the
optimizer constructor and adding a solve method that just calls optimize() (in
that case, I'll probably modify optimize() so that it does not return anything,
and the state of the optimizer will provide the answer).
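That wrapping could be sketched as follows. This is hypothetical code, not the scikit itself: a base optimizer whose optimize() fills the state instead of returning, and a subclass whose constructor pre-wires a usual combination of bricks and adds solve().

```python
class StandardOptimizer:
    """Core loop over pluggable bricks (illustrative, not the real class)."""
    def __init__(self, function, x0, step, line_search, criterion):
        self.function = function
        self.step = step
        self.line_search = line_search
        self.criterion = criterion
        self.state = {'old_x': float('inf'), 'new_x': x0, 'iteration': 0}

    def optimize(self):
        # Deliberately returns nothing: the optimizer state holds the answer.
        while self.state['iteration'] < 1000 and not self.criterion(self.state):
            x = self.state['new_x']
            direction = self.step(self.function, x, self.state)
            new_x = self.line_search(x, direction, self.function, self.state)
            self.state.update(old_x=x, new_x=new_x,
                              iteration=self.state['iteration'] + 1)


class SimpleGradientDescent(StandardOptimizer):
    """A usual combination pre-wired in the constructor, plus solve()."""
    def __init__(self, function, x0, alpha=0.25, xtol=1e-10):
        super().__init__(
            function, x0,
            step=lambda f, x, s: -f.gradient(x),
            line_search=lambda x, d, f, s: x + alpha * d,
            criterion=lambda s: abs(s['new_x'] - s['old_x']) < xtol)

    def solve(self):
        self.optimize()                # fills self.state as a side effect
        return self.state['new_x']    # the state provides the answer


class Quadratic:
    """f(x) = (x - 2)^2 with an analytic gradient, for demonstration."""
    def __call__(self, x):
        return (x - 2.0) ** 2

    def gradient(self, x):
        return 2.0 * (x - 2.0)


x_min = SimpleGradientDescent(Quadratic(), x0=0.0).solve()
```

Users who just want an answer call solve(); users who care about the bricks still reach them through the base constructor.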
> Do you agree?
> if yes, my question to Matthieu: can you provide a description of some
> appropriate solvers from your package? and/or an example of usage?
If you need more details, ask specific questions; I'll gladly answer them
(and add them to the wiki).