[SciPy-User] OT: global optimization, hybrid global local search

josef.pktd@gmai...
Sun Apr 22 12:31:57 CDT 2012


On Sun, Apr 22, 2012 at 1:04 PM, denis <denis-bz-gg@t-online.de> wrote:
> On 22/04/2012 17:00, josef.pktd@gmail.com wrote:
>> I'm looking at nonlinear regression, function estimation again.
>> I'm interested in combining global search with local optimizers, which
>> I think should be much faster in many of our problems.
>>
>> Anyone with ideas, experience, code?
> ...
> Hi Josef,
>
> agree that hybrid methods have potential
> but would like clearer goals before rushing in to code -- de gustibus.
>
> "Optimization" covers a HUGE range --
>     interactive, e.g. run 10 NM then look +- .1 then 10 NM more ...
>          vs fully-automatic
>     dimension: 2d / 3d visualizable, 4d .. say 10d, higher
>     smooth / Gaussian noise / noisy but no noise model
>     user gradients / finite-difference gradient est (low fruit) / no-deriv
>     convex / not
>     many application areas
>
> with a correspondingly huge range of optimizers, frameworks, plot / visualizers
> scipy.optimize, scikit-learn stuff, nlopt, cvxopt, stuff in R ...
> and a huge range of users from curve_fit
> to people who want X (but don't use it if it's there already)
> not to mention *lots* of papers on 1-user methods.
>
> There are more optimizers than test functions --
> https://github.com/denis-bz/opt/{scopt,nlopt}/test/*sum
> show how noisy some no-deriv optimizers are on Powell's difficult sin-cos function.
>
>
> Do you use leastsq, any comments on that ?
> cf. Martin Teichmann rewrite https://github.com/scipy/scipy/pull/90
>
>
> In short, we have to concentrate; suggestions ?

http://itl.nist.gov/div898/strd/nls/nls_main.shtml

5 to 20 parameters, smooth objective
least-squares estimation or log-likelihood maximization
ideally with minimal user intervention
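To make the problem class concrete, here is a sketch of such a fit with scipy.optimize.curve_fit on a Misra1a-style exponential saturation model; the data below are synthetic (not the actual NIST dataset) and the parameter values are my own choices:

```python
import numpy as np
from scipy.optimize import curve_fit

# Misra1a-style saturation model: y = b1 * (1 - exp(-b2 * x))
def model(x, b1, b2):
    return b1 * (1.0 - np.exp(-b2 * x))

# Synthetic data (NOT the NIST dataset), generated from known parameters
rng = np.random.default_rng(0)
x = np.linspace(50.0, 700.0, 14)
b_true = (240.0, 5.5e-4)
y = model(x, *b_true) + rng.normal(scale=0.5, size=x.size)

# A sensible starting guess matters; a far-off p0 can strand the
# solver -- hence the interest in a global search for starting points.
popt, pcov = curve_fit(model, x, y, p0=(200.0, 1e-3))
```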

What I sometimes do: choose random starting parameters, then run
leastsq, fmin or fmin_bfgs.
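That random-restart pattern can be sketched in a few lines (keep the best local result over several random starts); the Rosenbrock test function and the restart box here are my own choices for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative objective: the 2-d Rosenbrock function,
# global minimum 0 at (1, 1)
def f(p):
    x, y = p
    return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

rng = np.random.default_rng(42)
best = None
for _ in range(20):
    x0 = rng.uniform(-5.0, 5.0, size=2)   # random restart
    res = minimize(f, x0, method="BFGS")  # local search from x0
    if best is None or res.fun < best.fun:
        best = res                        # keep the best local result
```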

Instead of random starting parameters, I would like a directed global
search that is not home-made.

Simulated annealing, differential evolution, ... sound too random to
me (and like a waste of computer time) -- though I haven't verified that.
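For what it's worth, SciPy later grew scipy.optimize.differential_evolution (in 0.15, well after this thread), and its default polish=True step is exactly the global-then-local hybrid: it finishes with an L-BFGS-B run from the best population member. A sketch on the Rastrigin test function (my choice of test problem):

```python
import numpy as np
from scipy.optimize import differential_evolution

# Rastrigin: highly multimodal, global minimum 0 at the origin
def rastrigin(p):
    x = np.asarray(p)
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x)) + 10.0 * x.size

bounds = [(-5.12, 5.12)] * 2
# polish=True (the default) refines the best member with L-BFGS-B,
# i.e. a global search handing off to a local optimizer
res = differential_evolution(rastrigin, bounds, seed=1, polish=True)
```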

Tabu search (I don't yet know what that is) combined with leastsq or fmin?
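Not tabu search, but SciPy did later gain a hybrid of this flavour: scipy.optimize.basinhopping (added in 0.12) alternates random hops with full local minimizations, which is the global-plus-local combination under discussion. A sketch on the 1-d multimodal function used in its documentation:

```python
import numpy as np
from scipy.optimize import basinhopping

# 1-d multimodal function (the basinhopping docs example);
# global minimum near x = -0.195 with f(x) roughly -1.001
def f(p):
    x = p[0]
    return np.cos(14.5 * x - 0.3) + (x + 0.2) * x

# Random hop, then a full L-BFGS-B minimization, repeated niter times
res = basinhopping(f, x0=[1.0], niter=200, seed=7,
                   minimizer_kwargs={"method": "L-BFGS-B"})
```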

I'm a user of optimization algorithms and avoid looking at their
insides if I don't have to.

Josef

>
> cheers
>   -- denis
>
>
>> Josef
>> (Why does scipy not have any good global optimizers?)
>
> Examples please ?
> deja vu: late March rant on "why doesn't leastsq do bounds" ?
> Well it can, easily
>
> _______________________________________________
> SciPy-User mailing list
> SciPy-User@scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user

