[SciPy-dev] Proposal for more generic optimizers (posted before on scipy-user)
Alan G Isaac
Sun Mar 11 13:26:44 CDT 2007
I don't really have time to look at this for the
next week, but a couple quick comments.
1. Instead of::
    if 'stepSize' in kwargs:
        self.stepSize = kwargs['stepSize']
    else:
        self.stepSize = 1.
I prefer this idiom::
    self.stepSize = kwargs.get('stepSize', 1.)
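The same idiom stays readable when an ``__init__`` takes several
optional settings; a sketch (only ``stepSize`` is from the proposal,
the class and the other attribute names are hypothetical)::

    class GradientDescent(object):
        def __init__(self, **kwargs):
            # one line per optional setting: look it up, fall back to a default
            self.stepSize = kwargs.get('stepSize', 1.)
            self.verbose = kwargs.get('verbose', False)
            self.record = kwargs.get('record', None)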
2. All optimizers should have a maxiter attribute,
even if you wish to set a large default. This needs
corresponding changes in ``optimize``.
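A rough sketch of the kind of thing I have in mind (``step`` and
``converged`` are hypothetical hooks, not part of the proposal)::

    class Optimizer(object):
        def __init__(self, **kwargs):
            # the default can be large, but the attribute should always exist
            self.maxiter = kwargs.get('maxiter', 10000)

        def optimize(self, x0):
            x = x0
            for iteration in range(self.maxiter):
                x = self.step(x)          # hypothetical: one iteration
                if self.converged(x):     # hypothetical: convergence test
                    break
            return x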
3. It seems like ``AppendList`` is an odd and specific
object. I'd stick it in the example file.
4. I understand that you want an object that provides
the function, gradient, and Hessian. But when you
make a class for these, it ends up full of (effectively)
class functions, which suggests just using a module.
I suspect there is a design issue to think about here.
This might (??) go so far as to raise questions about
the usefulness of the bundling.
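To make that concrete, here is a toy sketch of my own using the
Rosenbrock function (not the proposed interface): the class version
carries no instance state, so the same bundle could just as well be
a plain module::

    import numpy

    class Rosenbrock(object):
        # every method ignores instance state: effectively a class function
        def value(self, x):
            return (1. - x[0])**2 + 100.*(x[1] - x[0]**2)**2
        def gradient(self, x):
            return numpy.array([-2.*(1. - x[0]) - 400.*x[0]*(x[1] - x[0]**2),
                                200.*(x[1] - x[0]**2)])

    # the same bundle as a module (say rosenbrock.py) needs no class at all
    def value(x):
        return (1. - x[0])**2 + 100.*(x[1] - x[0]**2)**2
    def gradient(x):
        return numpy.array([-2.*(1. - x[0]) - 400.*x[0]*(x[1] - x[0]**2),
                            200.*(x[1] - x[0]**2)])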