[SciPy-dev] Proposal for more generic optimizers (posted before on scipy-user)

Matthieu Brucher matthieu.brucher@gmail....
Fri Apr 13 11:14:55 CDT 2007


A new proposal...
I refactored the code for the line search part; it is now a separate module.
The damped optimizer from the last proposal is now a damped line search; by
default, no line search is performed at all.
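To make the refactoring concrete, here is a minimal sketch of what such a line-search module might look like. The class and parameter names are my own illustration, not the actual API from the attached proposal:

```python
# Hypothetical sketch of the refactored line-search module; names are
# illustrative, not taken from optimizerProposal_03.zip.

class NoLineSearch:
    """Default behaviour: accept the full step with no search at all."""
    def __call__(self, function, point, step):
        return point + step


class DampedLineSearch:
    """Damp the step (halve it by default) until the cost stops increasing."""
    def __init__(self, damping=0.5, max_damping_steps=20):
        self.damping = damping
        self.max_damping_steps = max_damping_steps

    def __call__(self, function, point, step):
        current_cost = function(point)
        candidate = point + step
        for _ in range(self.max_damping_steps):
            if function(candidate) <= current_cost:
                break
            step = step * self.damping      # try half a step, and so on
            candidate = point + step
        return candidate
```

An optimizer would simply hold one of these objects and call it with the candidate step, so swapping search strategies needs no change to the optimizer itself.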

Matthieu

2007/4/13, Matthieu Brucher <matthieu.brucher@gmail.com>:
>
> A little update of my proposal :
>
> - each step can be updated after each iteration; this will be enhanced so
> that everything computed in the iteration is passed on, in case it is
> needed to update the step. That could be useful for approximated steps.
> - added a simple damped optimizer: it tries to take a step, and if the cost
> is higher than before, half a step is tested, and so on.
> - a function object is created if the function argument is not passed
> (it takes the arg 'fun' as the cost function, 'gradient' for the gradient, ...).
> Some safeguards must still be implemented.
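The function-object creation mentioned in the last point could be sketched roughly as follows. The keyword names 'fun' and 'gradient' follow the mail; the wrapper class itself and its methods are assumptions:

```python
# Hypothetical sketch of the automatic function-object creation; only
# the 'fun'/'gradient' keyword names come from the mail.

class FunctionWrapper:
    """Bundle a cost function and its gradient into a single object.
    The promised safeguards (e.g. a missing 'fun') are not shown here."""
    def __init__(self, **kwargs):
        self._fun = kwargs.get('fun')            # the cost function
        self._gradient = kwargs.get('gradient')  # its gradient

    def __call__(self, x):
        return self._fun(x)

    def gradient(self, x):
        return self._gradient(x)


def make_function(function=None, **kwargs):
    """Use a ready-made function object if given, else build one."""
    if function is not None:
        return function
    return FunctionWrapper(**kwargs)
```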
>
> I was thinking about the limits of this architecture:
> - definitely, all quasi-Newton optimizers can be ported to this framework,
> as well as all semi-quadratic ones
> - constrained optimization will not fit unless the framework is modified so
> that it can, but as I do not use such optimizers in my PhD thesis, I do not
> know them well enough
>
> But even the simplex/polytope optimizer (fmin) can be expressed in the
> framework - it would be pointless though, as it would be slower - and it can
> take advantage of the different stopping criteria. BTW, I used some parts of
> this framework in an EM algorithm with an AIC-based optimizer on top.
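Shared stopping criteria are one of the main payoffs of the fine-grained design. A sketch of two such criteria, with class names and call signatures that are my assumptions rather than the proposal's API:

```python
# Illustrative stopping criteria that any optimizer in the framework
# could share; names and signatures are assumptions, not the real API.

class IterationCriterion:
    """Stop after a fixed number of iterations."""
    def __init__(self, max_iterations):
        self.max_iterations = max_iterations

    def __call__(self, iteration, old_value, new_value):
        return iteration >= self.max_iterations


class RelativeValueCriterion:
    """Stop when the cost barely changes between two iterations."""
    def __init__(self, tolerance=1e-6):
        self.tolerance = tolerance

    def __call__(self, iteration, old_value, new_value):
        return abs(new_value - old_value) <= self.tolerance * abs(old_value)
```

Because the criteria are plain callables, the same objects could serve a gradient optimizer, a simplex optimizer, or the AIC-driven loop mentioned above.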
>
> As I said in another thread, I'm in favour of fine-grained modules, even
> if some wrapper can provide simple optimization procedures.
>
> Matthieu
>
> 2007/3/26, Matthieu Brucher < matthieu.brucher@gmail.com>:
> >
> > > OK, I see why you want that approach.
> > > (So that you can still pass a single object around in your
> > > optimizer module.)  Yes, that seems right...
> >
> >
> >
> > Exactly :)
> >
> >
> > This seems to bundle naturally with a specific optimizer?
> >
> >
> >
> > I'm not an expert in optimization, but I attended several classes/seminars
> > on the subject, and at least the usual simple optimizers - the standard
> > optimizer, all damped approaches, and all the others that use a step and a
> > criterion test - use this interface, with a lot of usual steps - gradient,
> > every conjugate gradient variant, (quasi-)Newton - or criteria.
> > I even suppose it can do very well in semi-quadratic optimization, with
> > very little change, but I have to finish some work before I can read some
> > books on the subject and begin implementing it in Python.
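The step-plus-criterion interface described here can be sketched as follows. Everything below is illustrative (a steepest-descent step and a bare driver loop); the real code is in the attached zip:

```python
# Minimal sketch of the composition described above: a standard
# optimizer driven by interchangeable step and criterion objects.
# All names are illustrative, not the proposal's actual API.

class GradientStep:
    """Plain steepest-descent step with a fixed step size."""
    def __init__(self, step_size=0.1):
        self.step_size = step_size

    def __call__(self, function, point):
        return -self.step_size * function.gradient(point)


class StandardOptimizer:
    """Iterate point <- point + step(point) until the criterion fires."""
    def __init__(self, function, step, criterion, x0):
        self.function = function
        self.step = step
        self.criterion = criterion
        self.point = x0

    def optimize(self):
        iteration = 0
        old_value = self.function(self.point)
        while True:
            self.point = self.point + self.step(self.function, self.point)
            new_value = self.function(self.point)
            iteration += 1
            if self.criterion(iteration, old_value, new_value):
                return self.point
            old_value = new_value
```

Replacing `GradientStep` with a conjugate gradient or (quasi-)Newton step, or the criterion with another callable, leaves the driver loop untouched, which is exactly the modularity argued for here.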
> >
> >
> > If so, the class definition should reside in the StandardOptimizer
> > > module.
> > >
> > > Cheers,
> > > Alan Isaac
> > >
> > > PS For readability, I think Optimizer should define
> > > a "virtual" iterate method. E.g.,
> > > def iterate(self):
> > >     return NotImplemented
> >
> >
> > Yes, it seems better.
> >
> > Thanks for the opinion !
> >
> > Matthieu
> >
>
>
>
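The "virtual" iterate method suggested in the quoted exchange could be fleshed out as below. The subclass is a purely illustrative toy, not part of the proposal:

```python
# Sketch of the "virtual" iterate method Alan suggests; the subclass
# is a toy added for illustration only.

class Optimizer:
    """Base class of the framework; concrete optimizers override iterate()."""
    def iterate(self):
        return NotImplemented


class ToyOptimizer(Optimizer):
    """Moves the current point halfway towards zero on each iteration."""
    def __init__(self, x0):
        self.point = x0

    def iterate(self):
        self.point = self.point * 0.5
        return self.point
```

Returning `NotImplemented` keeps the base class usable as documentation of the interface; raising `NotImplementedError` instead would fail louder if a subclass forgets to override the method.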
-------------- next part --------------
A non-text attachment was scrubbed...
Name: optimizerProposal_03.zip
Type: application/zip
Size: 7587 bytes
Desc: not available
Url : http://projects.scipy.org/pipermail/scipy-dev/attachments/20070413/f88d298a/attachment.zip 
