[SciPy-dev] Updated generic optimizers proposal
Matthieu Brucher
matthieu.brucher@gmail....
Wed Apr 18 03:35:09 CDT 2007
Hi,
I'm launching a new thread, as the last one was getting pretty big and I have
incorporated almost every piece of advice into this proposal, so I thought a
fresh start would be better.
First, I followed the SciPy coding standard; I hope I didn't forget anything.
I do not know yet where it should go in my SciPy tree. The tests are visual
for the moment; I still have to make them automatic, but I do not know the
test framework used by SciPy, so I have to check it first.
So, the proposal:
- combining several objects to make an optimizer
- a function should be an object defining the __call__ method and gradient,
hessian, ... if needed. It can also be passed as several separate functions,
as Alan suggested; a new object is then created from them
- an optimizer is a combination of a function, a step_kind, a line_search, a
criterion and a starting point x0.
- the result of the optimization is returned after a call to the optimize()
method
- every object (step or line_search) saves its modifications in a state
variable in the optimizer. This variable can be accessed if needed after the
optimization
- after each iteration, a record function is called with this state variable
(it is a dict, by the way); if you want to save the whole dict, don't forget
to copy it, as it is modified during the optimization
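To make the composition concrete, here is a minimal sketch of how these pieces could fit together. All class and parameter names below are illustrative guesses based on the description above, not the actual code in the attached tarball:

```python
# Illustrative sketch of the proposed design: an optimizer composed from a
# function object, a step kind, a line search, a criterion and x0, sharing a
# state dict and calling a record function after each iteration.
# Every name here is hypothetical, inferred from the proposal text.
import copy

class Quadratic:
    """Cost function as an object: __call__ plus gradient (hessian optional)."""
    def __call__(self, x):
        return sum(xi * xi for xi in x)
    def gradient(self, x):
        return [2.0 * xi for xi in x]

class GradientStep:
    """Steepest-descent step; stores its result in the shared state dict."""
    def __call__(self, function, state):
        state['direction'] = [-g for g in function.gradient(state['x'])]

class FixedLineSearch:
    """'No line search': just take the step, scaled by a constant."""
    def __init__(self, alpha=0.1):
        self.alpha = alpha
    def __call__(self, function, state):
        return [xi + self.alpha * di
                for xi, di in zip(state['x'], state['direction'])]

class IterationCriterion:
    """Simplest possible criterion: stop after a fixed number of iterations."""
    def __init__(self, max_iter=100):
        self.max_iter = max_iter
    def __call__(self, state):
        return state['iteration'] < self.max_iter

class StandardOptimizer:
    """Calls step_kind, then line_search, until the criterion says stop."""
    def __init__(self, function, step_kind, line_search, criterion, x0,
                 record=None):
        self.function = function
        self.step_kind = step_kind
        self.line_search = line_search
        self.criterion = criterion
        self.state = {'x': list(x0), 'iteration': 0}
        self.record = record
    def optimize(self):
        while self.criterion(self.state):
            self.step_kind(self.function, self.state)
            self.state['x'] = self.line_search(self.function, self.state)
            self.state['iteration'] += 1
            if self.record is not None:
                # The state dict is reused, so a recorder must copy it.
                self.record(copy.copy(self.state))
        return self.state['x']

history = []
optimizer = StandardOptimizer(Quadratic(), GradientStep(), FixedLineSearch(),
                              IterationCriterion(50), x0=[1.0, -2.0],
                              record=history.append)
x_min = optimizer.optimize()
```

Swapping in another step, line search or criterion only means passing a different object to the constructor, which is the point of the design.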
For the moment, the following are implemented:
- a standard algorithm, which only calls step_kind and then line_search to
get a new candidate (the next optimizer would be one that calls a modifying
function on the computed result, which can be useful in some cases)
- criteria:
  - monotony criterion: the cost must be decreasing (a factor can be used to
allow some error)
  - relative value criterion: the relative change in value is higher than a
fixed error
  - absolute value criterion: the same, with the absolute error
- steps:
  - gradient step
  - Newton step
  - Fletcher-Reeves conjugate gradient step (other conjugate gradient
variants will be available)
- line searches:
  - no line search, just take the step
  - damped search: an inexact line search that looks along the step direction
for a candidate that decreases the cost, halving the step size as long as the
cost does not decrease
  - golden section search
  - Fibonacci search
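The damped search in the list above can be sketched in a few lines. This is only an illustration of the halving rule described in the text; the function name and signature are hypothetical, not taken from the attached code:

```python
# Sketch of the damped (inexact) line search: halve the step size until the
# candidate point decreases the cost. Names are illustrative.
def damped_line_search(function, x, direction, min_alpha=1e-10):
    """Return a candidate along `direction` that decreases function(x)."""
    current_cost = function(x)
    alpha = 1.0
    while alpha > min_alpha:
        candidate = [xi + alpha * di for xi, di in zip(x, direction)]
        if function(candidate) < current_cost:
            return candidate
        alpha /= 2.0  # the cost did not decrease: damp the step
    return x  # no decreasing step found; keep the current point

# A full gradient step overshoots on this narrow quadratic, so the damping
# has to halve the step several times before the cost decreases.
cost = lambda x: 10.0 * x[0] ** 2 + x[1] ** 2
x = [1.0, 1.0]
gradient = [20.0 * x[0], 2.0 * x[1]]
new_x = damped_line_search(cost, x, [-g for g in gradient])
```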
I'm not adding other criteria, steps or line searches for now, as my time is
finite while making a structural change.
There are three classic optimization test functions in the package:
Rosenbrock, Powell and a quadratic function; feel free to try them. Sometimes
the optimizer converges to the true minimum, sometimes it does not; I tried
to provide several examples to show that not every combination manages to
find the minimum.
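For reference, here is the classic 2-D Rosenbrock function written in the function-object style of the proposal (this is the standard textbook definition, not necessarily the exact code in the tarball). Its narrow curved valley is precisely what makes some step/line-search combinations fail to reach the minimum at (1, 1):

```python
# The 2-D Rosenbrock test function, minimum at (1, 1), as a function object
# with __call__ and gradient in the style described in the proposal.
class Rosenbrock:
    def __call__(self, x):
        return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2
    def gradient(self, x):
        return [-2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
                200.0 * (x[1] - x[0] ** 2)]

f = Rosenbrock()
print(f([1.0, 1.0]))  # cost at the true minimum is 0
```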
Matthieu
-------------- next part --------------
A non-text attachment was scrubbed...
Name: optimizer_proposal_04.tar.gz
Type: application/x-gzip
Size: 5903 bytes
Desc: not available
Url : http://projects.scipy.org/pipermail/scipy-dev/attachments/20070418/ed29963e/attachment.gz