[SciPy-dev] GSoC schedule question (letter for my mentors & Matthieu Brucher)
Wed Aug 1 11:57:23 CDT 2007
Thank you for the answer.
Please let me know: does your package provide a function equivalent to
Can you supply an example of its usage?
Here's a link to the Wolfe conditions:
1) For lincher I need something like that (if you have a solver that
uses better conditions than the Wolfe ones, that's fine).
2) For ralg I need an auxiliary line-search solver that is guaranteed
to provide (pk, grad f(xk + alpha*pk)) <= 0
The 1st task is much more important.
The 2nd one already has a solver, though not the best one; in the future
I intend to replace it with something better.
Do you have any ideas about the 1st, or maybe the 2nd?
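For reference, the two requirements above can be sketched as small checkers in plain NumPy (a minimal sketch; the function names and the conventional constants c1, c2 are illustrative, not taken from any package discussed here):

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    """Check the strong Wolfe conditions for a step length alpha
    along a descent direction p from the point x."""
    fx, gx = f(x), grad(x)
    x_new = x + alpha * p
    # sufficient-decrease (Armijo) condition
    armijo = f(x_new) <= fx + c1 * alpha * np.dot(gx, p)
    # strong curvature condition
    curvature = abs(np.dot(grad(x_new), p)) <= c2 * abs(np.dot(gx, p))
    return armijo and curvature

def slope_nonpositive(grad, x, p, alpha):
    """The ralg requirement: (p, grad f(x + alpha*p)) <= 0."""
    return np.dot(grad(x + alpha * p), p) <= 0.0
```

For f(x) = x^2 with x = 1 and p = -1, alpha = 0.5 satisfies both checks, while a tiny step like alpha = 0.01 fails the curvature condition.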
Matthieu Brucher wrote:
> So, as you see, the params are not described; I'm especially
> interested in old_fval and old_old_fval.
> So, there was a letter from the NLPy developer about an alternative
> (a link to an article was provided); also, Matthieu proposed using
> one of his own solvers.
> I think the auxiliary solver is very important and should be done
> first. In my opinion, the next thing to do is to use an appropriate
> solver from Matthieu's package. However, Matthieu's syntax differs
> too much from the openopt one. I think first of all there should be
> an openopt binding to Matthieu's package, which will allow oo users
> to use the same syntax:
> prob = NLP(...)
> r = prob.solve()
> I would implement the binding myself, but I lack well-described
> API documentation for Matthieu's code. First of all I'm interested:
> 1) what solvers does it contain?
> I put a small description here :
> https://projects.scipy.org/scipy/scikits/wiki/Optimization (still in
> 2) how xtol, funtol, contol etc can be passed to the solvers?
> Each of these parameters is either step information, line-search
> information, or criterion information. Each parameter must be given to
> the corresponding object that will use it (I didn't want to centralize
> everything, as some modules need pre-computation before they can be
> used in the optimizer, like the Fibonacci section search).
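A hypothetical illustration of this decentralized design (the class name and the dict-based state are invented for illustration and are not the scikit's real API): each tolerance lives inside the module that uses it, rather than in one central optimizer constructor.

```python
class RelativeValueCriterion:
    """A criterion module owning its own tolerance: stop when
    |f_new - f_old| <= ftol * |f_old|  (illustrative, not real API)."""
    def __init__(self, ftol):
        self.ftol = ftol  # the tolerance is passed here, not to the optimizer

    def __call__(self, state):
        # takes only one argument: the current state of the optimizer
        return (abs(state['new_value'] - state['old_value'])
                <= self.ftol * abs(state['old_value']))

# the criterion is built with its own parameter, then handed to the optimizer
criterion = RelativeValueCriterion(ftol=1e-6)
```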
> then, secondarily (it can wait; maybe the default parameters would be
> enough for now)
> 3) which params can I modify (like type of step
> (Armijo|Powell|etc), etc)
> You can modify everything. The goal is to provide bricks that you can
> put together so that the optimizer does what you need. If you want
> to provide new modules, here are some "rules":
> - the function should be provided as an object that defines the
> correct methods, like __call__, gradient or hessian if needed (a
> finite-difference approximation of the gradient should be
> programmed with a class from which the function derives, but we can
> discuss this in another mail if you want details on this one)
> - a criterion module takes only one argument, which is the current
> state of the optimizer
> - a step module takes three arguments: the function being optimized,
> the point where to search for a step, and the state of the optimizer
> - a line-search module takes four arguments: the point, the
> computed step, the function, and the state (I suppose this should be
> refactored to be more consistent with the step module...)
> - the core optimizer uses these modules and dispatches the data
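Under these "rules", a minimal function object and the two module signatures might look like this (a sketch under the stated rules only; the names Quadratic, fixed_step, and backtracking_line_search are assumptions, not the scikit's actual classes):

```python
import numpy as np

class Quadratic:
    """A function object following the rules: __call__ plus gradient."""
    def __call__(self, x):
        return float(np.dot(x, x))

    def gradient(self, x):
        return 2.0 * np.asarray(x)

def fixed_step(function, point, state):
    """A step module: three arguments, returns a descent direction."""
    return -function.gradient(point)

def backtracking_line_search(point, step, function, state):
    """A line-search module: four arguments, returns the new point."""
    alpha = 1.0
    # halve the step until the function value decreases
    while function(point + alpha * step) >= function(point) and alpha > 1e-10:
        alpha *= 0.5
    return point + alpha * step
```

Chaining them by hand shows the dispatching the core optimizer would do: build a step from the function and point, then hand point, step, function, and state to the line search.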
> BTW, ralg is also missing a good line-search optimizer. It requires
> one that finds solutions with slope angle > pi/2. But that can
> wait; it already has a reasonably good one, and the problem with
> lincher is more pressing.
> If you have an algorithm that can do this, you only have to program it
> and everyone will be able to use it with the other modules.
> So I think the next GSoC schedule step should be connecting 1-2
> of Matthieu's solvers (ones that take the slope angle into account,
> as the strong Wolfe conditions do) to native openopt syntax.
> If I understand this correctly, it is wrapping some usual combinations
> together so that people use them without knowing, like for the brent
> function and the Brent class? It should be easy by overriding the
> optimizer constructor and by adding a solve method that just calls
> optimize() (in this case, I'll probably modify optimize() so that it
> does not return something, and the state of the optimizer would
> provide the answer).
> Do you agree?
> If yes, my question to Matthieu: can you provide a description of
> some appropriate solvers from your package? And/or an example of
> If you need more details, ask specific questions, I'll gladly answer
> them (and add them to the wiki)