[SciPy-User] OT: global optimization, hybrid global local search

denis denis-bz-gg@t-online...
Sun Apr 22 12:04:14 CDT 2012


On 22/04/2012 17:00, josef.pktd@gmail.com wrote:
> I'm looking at nonlinear regression, function estimation again.
> I'm interested in combining global search with local optimizers, which
> I think should be much faster in many of our problems.
>
> Anyone with ideas, experience, code?
...
Hi Josef,

I agree that hybrid methods have potential,
but I'd like clearer goals before rushing in to code -- de gustibus.
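
For concreteness, a minimal multistart sketch of the hybrid idea --
sample start points over a box, polish each with a local optimizer,
keep the best.  The rosen objective and the box are just placeholders:

    import numpy as np
    from scipy.optimize import minimize, rosen

    lo, hi = -2.0, 2.0            # sampling box, placeholder
    dim, nstart = 4, 20           # problem size / number of random starts

    rng = np.random.default_rng(0)
    best = None
    for x0 in rng.uniform(lo, hi, size=(nstart, dim)):
        # local polish from each global sample; Nelder-Mead as a stand-in
        res = minimize(rosen, x0, method="Nelder-Mead")
        if best is None or res.fun < best.fun:
            best = res
    print(best.fun, best.x)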

"Optimization" covers a HUGE range --
     interactive, e.g. run 10 Nelder-Mead iterations, look +- .1, 10 more ...
          vs fully-automatic (see the sketch after this list)
     dimension: 2d / 3d visualizable, 4d .. say 10d, higher
     smooth / Gaussian noise / noisy but no noise model
     user gradients / finite-difference gradient estimates (low-hanging fruit) / no-deriv
     convex / not
     many application areas
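
A sketch of the interactive loop in the first item -- run 10 Nelder-Mead
iterations, stop and look, nudge the point by +- .1, repeat.  This is one
reading of "look +- .1", and all the numbers are arbitrary:

    import numpy as np
    from scipy.optimize import minimize, rosen

    rng = np.random.default_rng(0)
    x = np.full(4, 1.5)                         # arbitrary start
    for k in range(3):
        # 10 Nelder-Mead iterations, then stop and look
        res = minimize(rosen, x, method="Nelder-Mead",
                       options={"maxiter": 10})
        print(k, res.fun, res.x)
        x = res.x + rng.uniform(-.1, .1, size=x.shape)   # the "+- .1" nudge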

with a correspondingly huge range of optimizers, frameworks, and plotters / visualizers
(scipy.optimize, scikit-learn, nlopt, cvxopt, packages in R ...),
and a huge range of users, from curve_fit callers
to people who want feature X (but don't use it even when it's already there) --
not to mention *lots* of papers on methods with a single user.

There are more optimizers than test functions --
https://github.com/denis-bz/opt/{scopt,nlopt}/test/*sum
show how noisy some no-deriv optimizers are on Powell's difficult sin-cos function.
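
I won't reproduce those test sums here, but you can see the scatter
yourself by running a couple of no-deriv methods from random starts on
any wiggly function.  The objective below is a generic sin-cos stand-in,
not Powell's actual test function:

    import numpy as np
    from scipy.optimize import minimize

    def wiggly(x):
        # quadratic bowl plus sin-cos ripple: many shallow local minima
        return np.dot(x, x) + np.sum(np.sin(3*x) * np.cos(5*x))

    rng = np.random.default_rng(1)
    for method in ["Nelder-Mead", "Powell"]:
        fbest = [minimize(wiggly, x0, method=method).fun
                 for x0 in rng.uniform(-2, 2, size=(10, 4))]
        print(method, min(fbest), max(fbest))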


Do you use leastsq -- any comments on it ?
cf. Martin Teichmann's rewrite, https://github.com/scipy/scipy/pull/90
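
For anyone following along, the bare leastsq calling pattern, on a
made-up exponential fit (nothing to do with the PR itself):

    import numpy as np
    from scipy.optimize import leastsq

    xdata = np.linspace(0, 4, 50)
    ydata = 2.5 * np.exp(-1.3 * xdata) + 0.01 * np.random.randn(50)

    def residuals(p):
        a, b = p
        return ydata - a * np.exp(-b * xdata)

    popt, ier = leastsq(residuals, [1.0, 1.0])
    print(popt)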


In short, we have to narrow the focus; suggestions ?

cheers
   -- denis


> Josef
> (Why does scipy not have any good global optimizers?)

Examples, please ?
Deja vu: the late-March rant, "why doesn't leastsq do bounds" ?
Well, it can, easily.
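
A sketch of the usual trick: transform each bounded parameter through a
sin, so leastsq optimizes an unconstrained variable that always maps
into [lo, hi] -- leastsq itself never sees the bounds:

    import numpy as np
    from scipy.optimize import leastsq

    lo, hi = 0.0, 3.0                       # box for the one parameter

    def to_box(u):
        # unconstrained u -> [lo, hi]
        return lo + (hi - lo) * (np.sin(u) + 1) / 2

    xdata = np.linspace(0, 4, 50)
    ydata = 2.0 * np.exp(-xdata)            # toy data, true a = 2 in the box

    def residuals(u):
        return ydata - to_box(u[0]) * np.exp(-xdata)

    ubest, ier = leastsq(residuals, [0.0])
    print(to_box(ubest[0]))                 # fitted parameter inside the box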


