# [SciPy-dev] Proposal for more generic optimizers (posted before on scipy-user)

Matthieu Brucher matthieu.brucher@gmail....
Fri Mar 9 03:58:52 CST 2007

Here is my new proposal.

So, the interface is separated into three modules:
- Criteria contains the convergence criteria:
  - MonotonyCriterion, constructed with an iteration limit and an error level
  - RelativeValueCriterion, constructed with an iteration limit and an error level
  - AbsoluteValueCriterion, constructed with an iteration limit and an error level
  I think the names are self-explanatory. The interface of these criteria is
  a simple __call__ method that takes the current number of iterations, the
  last values of the cost function and the corresponding points.
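As a sketch, one of these criteria could look like the following (the class and parameter names follow the description above; the actual implementation may differ):

```python
class AbsoluteValueCriterion:
    """Stops when the cost value changes by less than error_level,
    or when the iteration limit is reached (a sketch, names assumed)."""
    def __init__(self, iteration_limit, error_level):
        self.iteration_limit = iteration_limit
        self.error_level = error_level

    def __call__(self, iteration, values, points):
        # Stop when the iteration budget is exhausted
        if iteration >= self.iteration_limit:
            return True
        # At least two values are needed to measure progress
        if len(values) < 2:
            return False
        return abs(values[-1] - values[-2]) < self.error_level
```

The RelativeValueCriterion would divide the difference by the last value, and the MonotonyCriterion would stop as soon as the values stop decreasing.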
- Step contains the steps the optimizer can take in the process. Their
  interface is simple: a __call__ method taking the cost function as an
  argument, as well as the point at which the step must be computed.
  - GradientStep requires the cost function to implement the gradient method
  - NewtonStep requires the cost function to implement the gradient and
    hessian methods
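A minimal sketch of these two steps, assuming the interface just described (the real code may differ):

```python
import numpy as np

class GradientStep:
    """Steepest-descent direction; requires the cost function to
    implement a gradient method."""
    def __call__(self, function, point):
        return -function.gradient(point)

class NewtonStep:
    """Newton direction; requires gradient and hessian methods.
    Solves H * d = g instead of inverting the hessian."""
    def __call__(self, function, point):
        return -np.linalg.solve(function.hessian(point),
                                function.gradient(point))
```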
- Optimizer contains the optimizer skeleton as well as a standard optimizer.
  - Optimizer implements the optimize method, which calls iterate until the
    criterion is satisfied. Parameters are the cost function and the
    criterion; it can take a record argument that will be called on each
    iteration with the information of the step (point, value, iteration,
    step, ...), and it can take a stepSize parameter, a factor that
    multiplies the step, useful for taking small steps in a steepest/gradient
    descent.
  - StandardOptimizer implements the standard optimizer, that is, the new
    point is the last point plus the step. The additional arguments are an
    instance of a step (GradientStep or NewtonStep at this point) and... the
    starting point. Perhaps these arguments should be moved into Optimizer.
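The loop described above could be sketched like this (the constructor signature and attribute names are assumptions, not the actual proposal code):

```python
class StandardOptimizer:
    """Iterates new_point = point + step_size * step(function, point)
    until the criterion says stop (a sketch of the described behaviour)."""
    def __init__(self, function, criterion, step, starting_point,
                 step_size=1.0, record=None):
        self.function = function
        self.criterion = criterion
        self.step = step
        self.point = starting_point
        self.step_size = step_size
        self.record = record

    def optimize(self):
        iteration = 0
        values = [self.function(self.point)]
        points = [self.point]
        while not self.criterion(iteration, values, points):
            # Standard update: add the (scaled) step to the last point
            self.point = self.point + self.step_size * self.step(
                self.function, self.point)
            values.append(self.function(self.point))
            points.append(self.point)
            iteration += 1
            if self.record is not None:
                # Pass the per-iteration information to the record callable
                self.record(iteration=iteration, point=self.point,
                            value=values[-1])
        return self.point
```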

A cost function to be optimized must/can have:
- a __call__ method with a point as argument
- a gradient method with the same argument, if needed
- a hessian method with the same argument, if needed

Other steps I use in my research need additional methods, but for a simple
proposal there is no need for them.
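For illustration, such a cost function might look like the 2-D Rosenbrock class below (a sketch in the spirit of Rosenbrock.py; the file's actual contents may differ):

```python
import numpy as np

class Rosenbrock:
    """2-D Rosenbrock function following the protocol above:
    __call__, gradient and hessian all take the point as argument."""
    def __call__(self, x):
        return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

    def gradient(self, x):
        return np.array([
            -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
            200 * (x[1] - x[0]**2),
        ])

    def hessian(self, x):
        return np.array([
            [2 - 400 * x[1] + 1200 * x[0]**2, -400 * x[0]],
            [-400 * x[0], 200.0],
        ])
```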

Some examples are provided in the Rosenbrock.py file: it implements the
Rosenbrock function, with __call__, gradient and hessian methods. Then six
optimizations are run; some converge to the real minimum, others don't
because of the choice of criterion (the MonotonyCriterion is not very useful
here, but for a damped or a stochastic optimizer it is a pertinent choice).
AppendList.py is an example for the "record" parameter; it saves only the
points used in the optimization.
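A record callable in the spirit of AppendList.py might be as simple as the following (a sketch; the names are assumed):

```python
class AppendList:
    """Record callable that keeps only the points visited during
    the optimization, discarding the rest of the step information."""
    def __init__(self):
        self.points = []

    def __call__(self, **state):
        # The optimizer passes the per-iteration information as keyword
        # arguments; only the current point is retained here.
        self.points.append(state['point'])
```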

2007/3/8, Matthieu Brucher <matthieu.brucher@gmail.com>:
>
>
> > For the etcetera argument, I suppose a **parameter is
> > > a good choice
> >
> > It is the obvious choice, but I am not sure what the best
> > approach will be.  Presumably implementation will be
> > revealing.
>
>
>
> Some of the arguments cannot be decided beforehand. For instance, for a
> damped optimizer, which sets of parameters should be saved? Every one,
> including each that is tested during minimization of the function, or only
> the one at the end of the loop?
>
> I'll make a simple proposal for the interface with the modifications I
>
> Matthieu
>