[SciPy-User] SciPy-User Digest, Vol 109, Issue 20
Sat Sep 8 12:31:12 CDT 2012
Right you are! But if the test cases are created from distribution parameters chosen for the test, that will at least supply a check on the result.
Bob and Paula H
> From: "email@example.com" <firstname.lastname@example.org>
>Sent: Saturday, September 8, 2012 10:00 AM
>Subject: SciPy-User Digest, Vol 109, Issue 20
> 1. Re: Optimization Test Cases (email@example.com)
>Date: Sat, 8 Sep 2012 10:22:26 -0400
>Subject: Re: [SciPy-User] Optimization Test Cases
>To: SciPy Users List <firstname.lastname@example.org>
>On Sat, Sep 8, 2012 at 8:59 AM, The Helmbolds <email@example.com> wrote:
>> On Mon, Sep 3, 2012 at 6:59 PM, denis <firstname.lastname@example.org> wrote:
>>> I'm looking for real or realistic test cases for Nelder-Mead
>>> minimization of noisy functions, 2d to 10d or so, unconstrained
>>> or with box constraints, preferably not sum-of-squares and not Rosenbrock et al.,
>>> to wring out a new implementation that has restarts and verbose output.
>>> (Would like to discuss ways to restart too,
>>> but more ideas than test functions => never converge.)
>> Try some maximum likelihood fitting problems, where parameters are chosen to
>> maximize the likelihood function of some statistical distribution.
>> All you need for the Weibull case is in the attachment (in Microsoft Word
>> format). Whatever statistical distribution you use, I suggest you begin by
>> picking your own values for the parameters (then you'll know what the right
>> answer is). Then generate a sample of values from that distribution/parameter
>> combination. Feed that sample into your optimization program, and see if it
>> gives results close to the parameter values you used to generate the sample.
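Bob's recipe can be sketched in a few lines with SciPy for the Weibull case (a minimal illustration only; the shape/scale values, sample size, and starting point below are made up for the sketch, not taken from the attachment):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
true_shape, true_scale = 2.0, 3.0          # parameters we pick, so we know the right answer
sample = rng.weibull(true_shape, size=500) * true_scale

def neg_log_likelihood(params):
    shape, scale = params
    if shape <= 0 or scale <= 0:           # keep the simplex out of invalid territory
        return np.inf
    return -np.sum(stats.weibull_min.logpdf(sample, shape, scale=scale))

res = optimize.minimize(neg_log_likelihood, x0=[1.0, 1.0], method='Nelder-Mead')
print(res.x)   # should land close to (2.0, 3.0)
```

If the fitted shape and scale come back close to the values used to generate the data, the optimizer passes the check; if not, it is the optimizer and not the data that is suspect.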
>I started my reply in a similar direction:
>statsmodels has many cases with minimizing log likelihood.
>We don't keep a list of where we ran into problems, but the Negative
>Binomial that Vincent recently coded up is a case where
>Nelder-Mead wasn't good, and Powell often went way off. (IIRC)
>(fmin_ncg with numerical derivatives works well.)
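That kind of optimizer comparison can be sketched on a negative binomial likelihood like so (a hedged illustration using the generic (n, p) parameterization of scipy.stats.nbinom, not statsmodels' model, with simulated data):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
true_n, true_p = 5.0, 0.4
sample = rng.negative_binomial(true_n, true_p, size=1000)

def nll(params):
    n, p = params
    if n <= 0 or not (0 < p < 1):          # reject invalid parameter values
        return np.inf
    return -np.sum(stats.nbinom.logpmf(sample, n, p))

results = {}
for method in ['Nelder-Mead', 'Powell']:
    results[method] = optimize.minimize(nll, x0=[1.0, 0.5], method=method)
    print(method, results[method].x, results[method].fun)
```

Running several methods from the same starting point and comparing the final objective values is a cheap way to spot a method that "went way off" on a given problem.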
>I have an example with a 3-component mixture distribution (univariate
>with 8 parameters) with lots of local minima; Nelder-Mead is
>sensitive to starting values, but it will take a bit more time to
>clean up my code.
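A smaller stand-in for that kind of problem: even a two-component normal mixture shows the sensitivity to starting values (an illustrative sketch with made-up parameters, not the 8-parameter example from the post):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)
# two-component normal mixture: weight 0.3 at N(-2, 1), weight 0.7 at N(3, 0.5)
sample = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 700)])

def nll(params):
    w, m1, s1, m2, s2 = params
    if not (0 < w < 1) or s1 <= 0 or s2 <= 0:
        return np.inf
    pdf = w * stats.norm.pdf(sample, m1, s1) + (1 - w) * stats.norm.pdf(sample, m2, s2)
    return -np.sum(np.log(pdf))

# different starting points can land in different local minima
funs = []
for x0 in ([0.5, -1.0, 1.0, 1.0, 1.0], [0.5, 0.0, 1.0, 5.0, 1.0]):
    res = optimize.minimize(nll, x0, method='Nelder-Mead',
                            options={'maxiter': 10000, 'maxfev': 10000})
    funs.append(res.fun)
    print(x0, '->', np.round(res.x, 2), res.fun)
```

Comparing the final objective values across starting points is the quickest way to see whether a run got stuck in an inferior local minimum.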
>But I got distracted before checking whether I remember the details correctly.
>We have two kinds of problems with Nelder-Mead in maximum likelihood:
>one is the usual getting stuck in local minima;
>the other is that Nelder-Mead stops at a point where the
>gradient is not very close to zero, which then might not even be a
>real local minimum.
>(On the other hand, it's more robust when the starting values are not
>good, and it's slow.)
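One cheap guard against that second failure mode is to restart Nelder-Mead from its own answer and then check the numerical gradient at the result (sketched here on Himmelblau's function as a stand-in for a real likelihood):

```python
import numpy as np
from scipy import optimize

def himmelblau(x):
    # Himmelblau's function: four local minima, all with f = 0
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

x0 = np.array([0.0, 0.0])
res = optimize.minimize(himmelblau, x0, method='Nelder-Mead')

# restart from the returned point with a fresh simplex
res2 = optimize.minimize(himmelblau, res.x, method='Nelder-Mead')

# check that the numerical gradient really is near zero at the final point
grad = optimize.approx_fprime(res2.x, himmelblau, 1e-8)
print(res2.fun, np.linalg.norm(grad))
```

If the gradient norm after the restart is still far from zero, the "solution" is a premature stop rather than a local minimum, and another restart (or a gradient-based polish) is warranted.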
>> Bob and Paula H
>> SciPy-User mailing list
>End of SciPy-User Digest, Vol 109, Issue 20