[SciPy-dev] Updated generic optimizers proposal
william ratcliff
william.ratcliff@gmail....
Wed Apr 25 11:04:47 CDT 2007
The 'simple' way applies only to the anneal algorithm in scipy. When one
chooses steps in a simulated annealing algorithm, there is always the
question of how to step from the current point. For anneal, it is currently
done based on an upper bound and lower bound (in one option). However,
there is nothing to prevent the searcher from crawling its way out of the
box. When most people imagine themselves searching in a bounded parameter
space, that is not the expected behavior. Now, it is possible that the
optimum solution is truly outside of the box and that the searcher is doing
the right thing. However, if that is not the case, then there is a
problem. So, what is one to do? The first obvious thing to try is to say,
if you reach the edge of a bounded parameter, stay there. However, that is
not ideal as you get stuck and can't explore the rest of the phase space.
So, I use the simple heuristic that if a trial move is to take you outside
of the box, simply stay where you are. In the next cycle, try to move
again. This will keep you in the box, and if there is truly a solution
outside of the box, will still move you towards the walls and let you know
that maybe you've set your bounds improperly. Now, there are questions of
efficiency. For example, how easy is it to get out of corners? Should one
do reflections?  However, I believe that my rather simple heuristic will
preserve detailed balance and result in an algorithm that has the expected
behavior, and that is better than having no option ;>
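The heuristic above can be sketched in a few lines. This is a minimal illustration with hypothetical names, not the actual scipy.optimize.anneal code:

```python
import random

random.seed(0)

def bounded_trial(x, propose, lower, upper):
    """Propose a new point; if any coordinate would leave [lower, upper],
    reject the move and stay at the current point (try again next cycle)."""
    candidate = propose(x)
    inside = all(lo <= c <= up
                 for c, lo, up in zip(candidate, lower, upper))
    return candidate if inside else x

# Example: random-walk proposals in a 2-D box [0, 1] x [0, 1]
def propose(x, scale=0.5):
    return [xi + random.uniform(-scale, scale) for xi in x]

x = [0.5, 0.5]
for _ in range(1000):
    x = bounded_trial(x, propose, [0.0, 0.0], [1.0, 1.0])
# By construction, x never leaves the box, yet rejected moves still let the
# walker drift toward a wall when the minimum lies outside the bounds.
```

Because a rejected move simply repeats the current state, the proposal distribution stays symmetric inside the box, which is the detailed-balance argument made above.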
As for deprecation--is it really true that
scipy.optimize.anneal is deprecated?
As for issues of this global optimizer or that global optimizer, why not let
the user decide based on their expectations of their fitting surface? For
some truly glassy surfaces, one is forced into techniques like simulated
annealing, parallel tempering, genetic algorithms, etc., and I imagine that
their relative performance depends strongly on the particular problem that
they are trying to solve.
Cheers,
William Ratcliff
On 4/25/07, dmitrey <openopt@ukr.net> wrote:
>
> Hello!
> I have been accepted for participating in the GSoC program with project
> related to scipy and optimization.
> If no one else here is able or has the time, I could take a look
> (however, I can't spend much time before the summer starts because of my
> exams; and anneal (as well as other global solvers) is not my specialty).
>
> I think that lb-ub bounds can hardly be implemented in a simple way,
> because the result depends very much on the quality of the random point
> generator, and the latter should be much better than the simple
> lb+rand*(ub-lb); otherwise all points will be located in a thin area near
> their average value (the same problem is present in the integration of
> functions f: R^n->R in high dimensions (n>>1) ).
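The concentration effect described above can be seen numerically: with the simple lb + rand*(ub-lb) generator, high-dimensional samples all sit at nearly the same distance from the box centre. A small self-contained demonstration (not from the original thread):

```python
import random
import math

random.seed(0)
n, trials = 1000, 200     # dimension and number of sample points
lb, ub = 0.0, 1.0
centre = (lb + ub) / 2.0

# Distance of each uniformly sampled point from the centre of the box
dists = []
for _ in range(trials):
    x = [lb + random.random() * (ub - lb) for _ in range(n)]
    dists.append(math.sqrt(sum((xi - centre) ** 2 for xi in x)))

mean_dist = sum(dists) / trials
spread = max(dists) - min(dists)
# For n = 1000 the distances concentrate near sqrt(n/12) ~ 9.13, and the
# relative spread (spread / mean_dist) is small: the points form a thin shell.
```

This is the same measure-concentration phenomenon that makes naive Monte Carlo integration of f: R^n -> R hard for n >> 1.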
> I took a look at the generators by Joachim Vandekerckhove in his anneal
> (connected to my openopt for MATLAB/Octave); they seem to be too
> primitive.
> BTW, AFAIK anneal is currently considered deprecated (I don't know a
> better English word; not "up-to-date", an old one); there are better
> global solvers, for example GRASP-based ones.
> WBR, D.
>
> william ratcliff wrote:
> > say, who's responsible for the anneal portion of optimize?  I'd like
> > to check in a minor tweak which implements simple upper and lower
> > bounds on the fit parameters.
> >
> > Thanks,
> > William
> >
> > On 4/18/07, *Matthieu Brucher* <matthieu.brucher@gmail.com
> > <mailto:matthieu.brucher@gmail.com>> wrote:
> >
> > Hi,
> >
> >     I'm launching a new thread; the last one was getting pretty big,
> >     and as I have put almost every piece of advice into this proposal,
> >     I thought it would be better.
> >     First, I used the scipy coding standard; I hope I didn't forget
> >     anything.
> >
> >     I do not know where it would go in the scipy tree at the moment,
> >     and the tests are visual for now; I have to make them automatic,
> >     but I do not know the framework used by scipy, so I have to check
> >     it first.
> >
> > So, the proposal :
> > - combining several objects to make an optimizer
> >     - a function should be an object defining the __call__ method and
> >     gradient, hessian, ... if needed. It can be passed as several
> >     separate functions, as Alan suggested; a new object is then
> >     created
> > - an optimizer is a combination of a function, a step_kind, a
> > line_search, a criterion and a starting point x0.
> >     - the result of the optimization is returned after a call to the
> >     optimize() method
> >     - every object (step or line_search) saves its modifications in a
> >     state variable in the optimizer. This variable can be accessed if
> >     needed after the optimization.
> >     - after each iteration, a record function is called with this
> >     state variable - it is a dict, BTW - if you want to save the
> >     whole dict, don't forget to copy it, as it is modified during the
> >     optimization
> >
> >     For the moment, the following are implemented:
> >     - a standard algorithm, which only calls step_kind then
> >     line_search for a new candidate - the next optimizer would be one
> >     that calls a modifying function on the computed result, which can
> >     be useful in some cases -
> > - criteria :
> >       - monotonicity criterion : the cost is decreasing - a factor
> >       can be used to allow some error -
> > - relative value criterion : the relative value error is higher
> > than a fixed error
> > - absolute value criterion : the same with the absolute error
> > - step :
> > - gradient step
> > - Newton step
> >       - Fletcher-Reeves conjugate gradient step - other conjugate
> >       gradient steps will be available -
> > - line search :
> > - no line search, just take the step
> >       - damped search : an inexact line search that searches along
> >       the step direction for a point that decreases the cost, halving
> >       the step size while the cost is not decreasing
> > - Golden section search
> > - Fibonacci search
> >
> >     I'm not adding other criteria, steps, or line searches yet, as my
> >     time is finite while doing a structural change.
> >
> >     There are 3 classic optimization test functions in the package -
> >     Rosenbrock, Powell, and a quadratic function - feel free to try
> >     them. Sometimes the optimizer converges to the true minimum,
> >     sometimes it does not; I tried to propose several solutions to
> >     show that not every combination manages to find the minimum.
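To make the proposed composition concrete, here is a minimal sketch of how a function, a step_kind, a line_search, and a criterion could fit together. Every class name below is a hypothetical illustration of the proposal, not Matthieu's actual code:

```python
import copy

class Quadratic:
    """Function object defining __call__ and gradient, as the proposal describes."""
    def __call__(self, x):
        return sum(xi ** 2 for xi in x)
    def gradient(self, x):
        return [2.0 * xi for xi in x]

class GradientStep:
    """step_kind: propose a move opposite the gradient."""
    def __call__(self, function, x, state):
        return [-g for g in function.gradient(x)]

class DampedLineSearch:
    """line_search: halve the step size while the cost is not decreasing."""
    def __call__(self, function, x, direction, state):
        alpha, fx = 1.0, function(x)
        while alpha > 1e-12:
            candidate = [xi + alpha * d for xi, d in zip(x, direction)]
            if function(candidate) < fx:
                return candidate
            alpha /= 2.0
        return x  # no decreasing point found; stay put

class RelativeValueCriterion:
    """criterion: keep iterating while the relative change stays large."""
    def __init__(self, tol):
        self.tol = tol
    def __call__(self, old_value, new_value):
        return abs(new_value - old_value) > self.tol * max(abs(old_value), 1e-12)

class StandardOptimizer:
    """Combines a function, a step_kind, a line_search, a criterion and x0."""
    def __init__(self, function, step_kind, line_search, criterion, x0,
                 record=None):
        self.function, self.step_kind = function, step_kind
        self.line_search, self.criterion = line_search, criterion
        self.x0, self.record = x0, record
    def optimize(self):
        x, state = self.x0, {'iteration': 0}
        value = self.function(x)
        while True:
            direction = self.step_kind(self.function, x, state)
            x = self.line_search(self.function, x, direction, state)
            new_value = self.function(x)
            state.update(x=x, value=new_value)
            state['iteration'] += 1
            if self.record is not None:
                self.record(state)  # state is a mutable dict: copy to snapshot
            if not self.criterion(value, new_value):
                break
            value = new_value
        return x

history = []
opt = StandardOptimizer(Quadratic(), GradientStep(), DampedLineSearch(),
                        RelativeValueCriterion(1e-9), [3.0, -2.0],
                        record=lambda state: history.append(copy.copy(state)))
minimum = opt.optimize()
```

The record callback shows why the proposal warns about copying the state dict: the optimizer mutates it in place on every iteration.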
> >
> > Matthieu
> >
> > _______________________________________________
> > Scipy-dev mailing list
> > Scipy-dev@scipy.org <mailto:Scipy-dev@scipy.org>
> > http://projects.scipy.org/mailman/listinfo/scipy-dev
> >
> >
> >
> >
>
>