[SciPy-User] Optimization Test Cases
Sat Sep 8 09:22:26 CDT 2012
On Sat, Sep 8, 2012 at 8:59 AM, The Helmbolds <firstname.lastname@example.org> wrote:
> On Mon, Sep 3, 2012 at 6:59 PM, denis <email@example.com> wrote:
>> I'm looking for real or realistic test cases for Nelder-Mead
>> minimization of noisy functions, 2d to 10d or so, unconstrained
>> or box constraints, preferably not sum-of-squares and not Rosenbrock etc.,
>> to wring out a new implementation that has restarts and verbose output.
>> (Would like to discuss ways to restart too,
>> but more ideas than test functions => never converge.)
> Try some maximum likelihood fitting problems, where parameters are chosen to
> maximize the likelihood function of some statistical distribution.
> All you need for the Weibull case is in the attachment (in Microsoft Word format).
> Whatever statistical distribution you use, I suggest you begin by picking
> your own values for the parameters (then you'll know what the right answer
> is). Then generate a sample of values from that distribution/parameter
> combination. Feed that sample into your optimization program, and see if it
> gives results close to the parameter values you used to generate the sample.
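That recipe, sketched minimally (the Weibull attachment isn't reproduced here; the parameter values, sample size, and starting point below are my own choices):

```python
import numpy as np
from scipy.optimize import fmin

# Pick the "true" parameters first, so the right answer is known in advance.
true_shape, true_scale = 1.5, 2.0
rng = np.random.RandomState(0)
sample = true_scale * rng.weibull(true_shape, size=500)

def negloglike(params, data):
    """Negative log-likelihood of a 2-parameter Weibull distribution."""
    shape, scale = params
    if shape <= 0 or scale <= 0:   # keep the simplex out of the invalid region
        return np.inf
    z = data / scale
    return -np.sum(np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape)

# Nelder-Mead from a deliberately rough start; the estimate should land
# near the true values (1.5, 2.0) with a sample this large.
est = fmin(negloglike, x0=[1.0, 1.0], args=(sample,), disp=False)
```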
I started my reply in a similar direction:
statsmodels has many cases that involve minimizing a log-likelihood.
We don't keep a list of where we ran into problems, but the Negative
Binomial model that Vincent recently coded up is a case where
Nelder-Mead wasn't good, and Powell often went way off (IIRC).
(fmin_ncg with numerical derivatives works well.)
I have an example with a 3-component mixture distribution (univariate,
with 8 parameters) with lots of local minima; Nelder-Mead is sensitive
to the starting values there, but it will take a bit more time to
clean up my code.
But I got distracted before checking whether I remember the details correctly.
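Not my actual code, but a self-contained sketch of the same kind of problem: a 3-component normal mixture (3 means, 3 scales, 2 free weights = 8 parameters), where Nelder-Mead run from different starting values can settle on different local optima. All data and starting values here are made up for illustration:

```python
import numpy as np
from scipy.optimize import fmin
from scipy.stats import norm

# Synthetic data from a 3-component normal mixture (hypothetical example).
rng = np.random.RandomState(2)
data = np.concatenate([rng.normal(-3.0, 0.5, 200),
                       rng.normal(0.0, 1.0, 300),
                       rng.normal(4.0, 0.8, 100)])

def negloglike(params, x):
    """Negative log-likelihood; params = 3 means, 3 scales, 2 free weights."""
    m, s, (w1, w2) = params[:3], params[3:6], params[6:]
    w3 = 1.0 - w1 - w2
    if np.any(np.asarray(s) <= 0) or min(w1, w2, w3) <= 0:
        return np.inf   # penalize invalid parameter regions
    dens = (w1 * norm.pdf(x, m[0], s[0]) +
            w2 * norm.pdf(x, m[1], s[1]) +
            w3 * norm.pdf(x, m[2], s[2]))
    return -np.sum(np.log(dens))

# Two starting points: one near the truth, one deliberately poor.
results = []
for start in ([-3, 0, 4, 1, 1, 1, 0.3, 0.3],
              [0, 1, 2, 1, 1, 1, 0.3, 0.3]):
    est = fmin(negloglike, start, args=(data,),
               maxiter=5000, maxfun=10000, disp=False)
    results.append((negloglike(est, data), est))
# The two runs need not agree: compare the attained objective values.
```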
We have two kinds of problems with Nelder-Mead in maximum likelihood:
one is the usual getting stuck in local minima;
the other is that Nelder-Mead stops at a point where the gradient is
not very close to zero, which then might not even be a real local
minimum.
(On the other hand, it's more robust when the starting values are not
good, though it's slow.)
> Bob and Paula H