[SciPy-dev] Proposal for more generic optimizers (posted before on scipy-user)
Alan G Isaac
Thu Mar 8 07:35:34 CST 2007
On Thu, 8 Mar 2007, Matthieu Brucher apparently wrote:
> Here is a little proposal for a simple optimizer (I intend to make a
> damped one if the structure I propose is OK).
> What is in the package:
> - Rosenbrock is the Rosenbrock function, with gradient and Hessian
> methods; it serves as the example
> - Optimizer is the core optimizer, the skeleton
> - StandardOptimizer is the standard optimizer (not very complicated, in
> fact), with six optimization examples
> - Criterions is a file with three simple convergence criteria: monotony,
> relative error, and absolute error. More complex ones can be created.
> - GradientStep is a class that computes the gradient step of a function
> at a specific point
> - NewtonStep is the same, but computes a Newton step.
> - NoAppendList is a do-nothing list, not derived from list, but it could
> be if needed. The goal was to be able to save every set of parameters if
> needed, by passing a list or another container to Optimizer
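For readers without the attachment, here is a minimal sketch of how the pieces described above might fit together. The class names mirror the description (Rosenbrock, GradientStep, StandardOptimizer), but every signature below is a guess, not the proposal's actual interface:

```python
import numpy as np

# Illustrative sketch only: names follow the proposal's description,
# but the signatures are assumptions, not the real proposed API.

class Rosenbrock:
    """The Rosenbrock test function, with its gradient."""
    def __call__(self, x):
        return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

    def gradient(self, x):
        return np.array([
            -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
            200 * (x[1] - x[0] ** 2),
        ])

class GradientStep:
    """Step direction: the negative gradient at the current point."""
    def __call__(self, function, point):
        return -function.gradient(point)

class AbsoluteError:
    """Convergence test: successive iterates closer than a tolerance."""
    def __init__(self, tol=1e-6):
        self.tol = tol

    def __call__(self, old, new):
        return np.linalg.norm(new - old) < self.tol

class StandardOptimizer:
    """Core loop: fixed step size, pluggable step and criterion."""
    def __init__(self, function, step, criterion, x0,
                 step_size=1e-3, max_iter=50000):
        self.function, self.step, self.criterion = function, step, criterion
        self.x0 = np.asarray(x0, dtype=float)
        self.step_size, self.max_iter = step_size, max_iter

    def optimize(self):
        x = self.x0
        for _ in range(self.max_iter):
            new = x + self.step_size * self.step(self.function, x)
            if self.criterion(x, new):
                return new
            x = new
        return x
```

The point of the design is that swapping GradientStep for NewtonStep changes only the step object; the loop itself never changes.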
Hard to say since there is no interface description,
but here are a couple of reactions ...
- isolate the examples (probably you already did), perhaps in
their own module
- possibly package the step classes together
- don't introduce the NoAppendList class unless it is really
needed, and it doesn't seem to be. The Optimizer can just
create a standard container when needed to keep track of
as much of the optimization history as might be desired.
(Thinking about an interface to express various desires
might be useful.)
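Concretely, that suggestion might look something like this (the `record_history` flag and `_record` method are invented names for illustration, not an existing API):

```python
class Optimizer:
    """Sketch: keep optimization history in a plain list, created only
    when the caller asks for it, so no NoAppendList class is needed.
    Names here are illustrative, not an existing interface."""
    def __init__(self, record_history=False):
        # A standard list when recording is wanted, None otherwise.
        self.history = [] if record_history else None

    def _record(self, point):
        # The hot loop pays only a None check when recording is off.
        if self.history is not None:
            self.history.append(point)
```

A caller who wants the full iterate trail passes `record_history=True`; everyone else pays essentially nothing.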
- rename Criterions, perhaps to Criteria or ConvergeTest
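Whatever name wins, one nice property of criteria-as-callables is that the "more complex" tests the proposal mentions can be built by composition. A sketch under that assumption (the `Monotony`, `MaxIterations`, and `Any` names are invented here, not part of the proposal):

```python
class Monotony:
    """Stop as soon as the objective stops decreasing."""
    def __call__(self, old_value, new_value):
        return new_value >= old_value

class MaxIterations:
    """Stop after a fixed number of calls."""
    def __init__(self, n):
        self.n, self.count = n, 0

    def __call__(self, old_value, new_value):
        self.count += 1
        return self.count >= self.n

class Any:
    """Composite test: fires when any sub-test fires."""
    def __init__(self, *tests):
        self.tests = tests

    def __call__(self, old_value, new_value):
        # Evaluate all sub-tests (no short-circuit) so that stateful
        # ones like MaxIterations keep their counters up to date.
        return any([t(old_value, new_value) for t in self.tests])
```

An optimizer then takes a single criterion object and never needs to know whether it is simple or composite.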