[SciPy-user] Proposal for more generic optimizers
Matthieu Brucher
matthieu.brucher@gmail....
Tue Feb 27 01:36:05 CST 2007
Hi,
I have been migrating to Python for a few weeks, but I have not yet found
in SciPy the tools I had to develop for my PhD. For instance, I cannot find
an elegant way to save the sequence of parameter sets tested during an
optimization with the standard algorithms. What is more, I think the
optimizers could be more generic.
What I did in C++, and I would like your opinion about porting it to
Python, was to define a generic optimizer with no built-in iteration loop:
iterate() was a pure virtual method called by an optimize() method. The
iteration loop was then defined in a standard optimizer or a damped
optimizer, and each one could save the parameters tested at every step. The
step to take was an instance of a class implementing a gradient step, a
Newton step, and so on, and the same approach was used for the stopping
criterion. The function itself was a class defining value, gradient,
Hessian, ... as needed.
For instance, a simplified call could have looked like:

Optimizer* optimizer = new StandardOptimizer</* some more parameters not
    relevant in Python */>(function, GradientStep(),
    SimpleCriterion(nbMaxIterations), step, saveParameters);
optimizer->optimize();
optimizer->getOptimalParameters();
The "step" argument was a constant by which the computed step was
multiplied; by default it was 1.
I know that this style is not as concise and lightweight as the current
one, which Matlab also uses. But perhaps this system could give the user
more latitude. If people are interested, I can try writing a real Python
example...
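To make the idea concrete, here is a minimal Python sketch of what such a design might look like. All class and parameter names (StandardOptimizer, GradientStep, SimpleCriterion, save_parameters) are illustrative placeholders echoing the C++ snippet above, not an existing SciPy API; the quadratic objective is just a toy example.

```python
import numpy as np

class Function:
    """Toy objective f(x) = x^T x, with its analytic gradient."""
    def value(self, x):
        return float(np.dot(x, x))
    def gradient(self, x):
        return 2.0 * x

class GradientStep:
    """One possible step policy: the negative gradient direction."""
    def __call__(self, function, parameters):
        return -function.gradient(parameters)

class SimpleCriterion:
    """One possible stopping criterion: a fixed iteration budget."""
    def __init__(self, nb_max_iterations):
        self.nb_max_iterations = nb_max_iterations
    def __call__(self, iteration, parameters):
        return iteration >= self.nb_max_iterations

class StandardOptimizer:
    """Generic optimizer: optimize() owns the loop, iterate() does one
    step, and every parameter set tested can be recorded."""
    def __init__(self, function, step, criterion, x0,
                 step_size=1.0, save_parameters=False):
        self.function = function
        self.step = step
        self.criterion = criterion
        self.parameters = np.asarray(x0, dtype=float)
        self.step_size = step_size
        # History of all tested parameter sets, if requested.
        self.history = [self.parameters.copy()] if save_parameters else None

    def iterate(self):
        # Ask the step object for a direction, scale it, and move.
        direction = self.step(self.function, self.parameters)
        self.parameters = self.parameters + self.step_size * direction
        if self.history is not None:
            self.history.append(self.parameters.copy())

    def optimize(self):
        iteration = 0
        while not self.criterion(iteration, self.parameters):
            self.iterate()
            iteration += 1
        return self.parameters

# Usage: swap GradientStep or SimpleCriterion for other policies
# without touching the optimizer itself.
optimizer = StandardOptimizer(Function(), GradientStep(),
                              SimpleCriterion(50), x0=[3.0, -2.0],
                              step_size=0.1, save_parameters=True)
optimal = optimizer.optimize()
```

With this decomposition, a damped optimizer would only override iterate(), and a Newton step would only replace the step object.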
Matthieu