[SciPy-dev] Updated generic optimizers proposal

dmitrey openopt@ukr....
Wed Apr 25 06:33:20 CDT 2007

I have been accepted to the GSoC program with a project related to 
scipy and optimization.
If no one else here is able or has the time, I could take a look 
(however, I can't spend much time before summer starts because of my 
exams; and anneal, as well as other global solvers, is not my specialty).

I think that lb-ub bounds can hardly be implemented in a simple way, 
because the result depends very much on the quality of the random point 
generator, and that generator should be much better than the simple 
lb + rand*(ub - lb); otherwise all points will be located in a thin area 
near their average value (the same problem appears when integrating 
functions f: R^n -> R in high dimensions, n >> 1).
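A quick numerical sketch of that concentration effect (assuming numpy is 
available; the dimension and sample count are arbitrary): with the naive 
generator, in high dimension almost all sampled points sit at nearly the 
same distance from the centre of the box.

```python
import numpy as np

rng = np.random.default_rng(0)
n, npts = 1000, 2000                 # dimension and number of samples
lb, ub = -1.0, 1.0

# the naive generator: lb + rand * (ub - lb), independently per coordinate
pts = lb + rng.random((npts, n)) * (ub - lb)

# distance of every point from the centre of the [lb, ub]^n box
d = np.linalg.norm(pts - 0.5 * (lb + ub), axis=1)

# the relative spread of the distances is tiny for large n: the sample
# effectively lives in a thin spherical shell, not the whole box
print(d.std() / d.mean())
```

For n = 1000 the printed ratio is on the order of a percent, which is the 
"thin area" effect described above.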
I took a look at the generators by Joachim Vandekerckhove in his anneal 
(connected to my openopt for MATLAB/Octave); they seem to be too primitive.
BTW, afaik anneal is currently considered outdated; there are better 
global solvers, for example GRASP-based ones.

william ratcliff wrote:
> say, who's responsible for the anneal portion of optimize?  I'd like 
> to check in a minor tweak which implements simple upper and lower 
> bounds on the fit parameters.
> Thanks,
> William
> On 4/18/07, *Matthieu Brucher* <matthieu.brucher@gmail.com 
> <mailto:matthieu.brucher@gmail.com>> wrote:
>     Hi,
>     I'm launching a new thread; the last one was pretty big, and as I
>     put almost every piece of advice into this proposal, I thought it
>     would be better.
>     First, I used the scipy coding standard; I hope I didn't forget
>     anything.
>     I do not know where it should go in my scipy tree at the moment,
>     and the tests are visual for now; I have to make them automatic,
>     but I do not know the test framework used by scipy, so I have to
>     check that first.
>     So, the proposal :
>     - combining several objects to make an optimizer
>     - a function should be an object defining the __call__ method and
>     gradient, hessian, ... if needed. It can be passed as several
>     separate functions, as Alan suggested; a new object is then created
>     - an optimizer is a combination of a function, a step_kind, a
>     line_search, a criterion and a starting point x0.
>     - the result of the optimization is returned after a call to the
>     optimize() method
>     - every object (step or line_search) saves its modification in a
>     state variable in the optimizer. This variable can be accessed if
>     needed after the optimization.
>     - after each iteration, a record function is called with this
>     state variable (it is a dict, BTW); if you want to save the
>     whole dict, don't forget to copy it, as it is modified during the
>     optimization
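A minimal sketch of how the pieces described above could fit together. All 
class names and signatures here are hypothetical illustrations, not the 
proposal's actual code: a callable function object with a gradient, a step 
kind, a line search and a criterion, combined by an optimizer that keeps a 
shared state dict and calls a record function after each iteration.

```python
import numpy as np

class Quadratic:
    """f(x) = x.x as a callable object that also exposes its gradient."""
    def __call__(self, x):
        return float(np.dot(x, x))
    def gradient(self, x):
        return 2.0 * x

class GradientStep:
    """Step kind: follow the negative gradient."""
    def __call__(self, function, x, state):
        return -function.gradient(x)

class FixedStepSearch:
    """'Line search' that just takes a fixed fraction of the step."""
    def __init__(self, alpha=0.1):
        self.alpha = alpha
    def __call__(self, function, x, direction, state):
        return x + self.alpha * direction

class AbsoluteValueCriterion:
    """Stop when the absolute change in cost falls below a tolerance."""
    def __init__(self, tol=1e-6):
        self.tol = tol
    def __call__(self, old_value, new_value, state):
        return abs(old_value - new_value) < self.tol

class StandardOptimizer:
    """Combines a function, a step kind, a line search, a criterion and x0."""
    def __init__(self, function, step_kind, line_search, criterion, x0,
                 record=lambda state: None):
        self.function, self.step_kind = function, step_kind
        self.line_search, self.criterion = line_search, criterion
        self.x = np.asarray(x0, dtype=float)
        self.record = record
        self.state = {}   # shared dict, mutated at every iteration

    def optimize(self):
        value = self.function(self.x)
        while True:
            direction = self.step_kind(self.function, self.x, self.state)
            self.x = self.line_search(self.function, self.x, direction,
                                      self.state)
            new_value = self.function(self.x)
            self.state.update(x=self.x, value=new_value)
            self.record(self.state)   # copy the dict here if you keep it
            if self.criterion(value, new_value, self.state):
                return self.x
            value = new_value

opt = StandardOptimizer(Quadratic(), GradientStep(), FixedStepSearch(0.25),
                        AbsoluteValueCriterion(1e-12), x0=[3.0, -2.0])
print(opt.optimize())   # converges towards [0, 0]
```

Swapping any one component (say, the criterion) leaves the rest untouched, 
which is the point of the combinatorial design.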
>     For the moment, the following are implemented :
>     - a standard algorithm, which only calls step_kind then line_search
>     for a new candidate - the next optimizer would be one that calls a
>     modifying function on the computed result, which can be useful in
>     some cases -
>     - criteria :
>      - monotony criterion : the cost is decreasing - a factor can be
>     used to allow an error -
>      - relative value criterion : the relative value error is higher
>     than a fixed error
>      - absolute value criterion : the same with the absolute error
>     - step :
>      - gradient step
>      - Newton step
>      - Fletcher-Reeves conjugate gradient step - other conjugate
>     gradient will be available -
>     - line search :
>      - no line search, just take the step
>      - damped search: an inexact line search that looks along the step
>      direction for a set of parameters that decreases the cost, halving
>      the step size while the cost is not decreasing
>      - Golden section search
>      - Fibonacci search
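The damped search in the list above can be sketched in a few lines 
(hypothetical signature, plain Python): try the full step, and halve the 
step size while the cost does not decrease.

```python
def damped_search(f, x, direction, initial_step=1.0, max_halvings=30):
    """Inexact line search: halve the step while the cost does not decrease."""
    step = initial_step
    fx = f(x)
    for _ in range(max_halvings):
        candidate = [xi + step * di for xi, di in zip(x, direction)]
        if f(candidate) < fx:      # cost decreased: accept this step size
            return candidate
        step /= 2.0                # otherwise damp the step and retry
    return x                       # no decrease found: stay put

# minimise f(x, y) = x^2 + y^2 one damped gradient step at a time
f = lambda p: p[0] ** 2 + p[1] ** 2
x = [3.0, 4.0]
for _ in range(50):
    grad = [2 * x[0], 2 * x[1]]
    x = damped_search(f, x, [-g for g in grad])
print(x)
```

Because the search only demands a decrease rather than a minimum along the 
direction, it is cheap but inexact, unlike the golden section or Fibonacci 
searches listed above.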
>     I'm not adding other criteria, steps or line searches yet, as my
>     time is finite while doing a structural change.
>     There are 3 classic optimization test functions in the package -
>     Rosenbrock, Powell and a quadratic function - feel free to try
>     them. Sometimes the optimizer converges to the true minimum,
>     sometimes it does not; I tried to propose several solutions to
>     show that not every combination manages to find the minimum.
>     Matthieu
>     _______________________________________________
>     Scipy-dev mailing list
>     Scipy-dev@scipy.org <mailto:Scipy-dev@scipy.org>
>     http://projects.scipy.org/mailman/listinfo/scipy-dev
