[SciPy-User] Confidence interval for bounded minimization
Wed Feb 22 20:06:45 CST 2012
On Wed, Feb 22, 2012 at 5:02 PM, Nathaniel Smith <email@example.com> wrote:
> On Wed, Feb 22, 2012 at 8:48 PM, <firstname.lastname@example.org> wrote:
>> On Wed, Feb 22, 2012 at 3:26 PM, Greg Friedland
>> <email@example.com> wrote:
>>> Is it possible to calculate asymptotic confidence intervals for any of
>>> the bounded minimization algorithms? As far as I can tell they don't
>>> return the Hessian; that's including the new 'minimize' function which
>>> seemed like it might.
>> If the parameter ends up at the bounds, then the standard asymptotic
>> statistics do not apply. The Hessian is based on a local quadratic
>> approximation, which breaks down if part of the local neighborhood is
>> out of bounds.
>> There are specialized statistics for this case, but so far I have
>> only seen the description of how GAUSS handles it.
>> In statsmodels we use the bounds, or a transformation, in some cases
>> just to keep the optimizer in the required range, and we assume we get
>> an interior solution. In that case it is possible to use the standard
>> calculations. The easiest approach is to take the local minimum that
>> the constrained or transformed optimizer found and use it as the
>> starting value for an unconstrained optimization where we can get the
>> Hessian (or just calculate the Hessian directly from the original
>> objective function).
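Josef's restart trick can be sketched with scipy.optimize. The objective and bounds below are made up for illustration (standing in for a negative log-likelihood), and `hess_inv` returned by BFGS is only an iterative approximation to the true inverse Hessian, not an exact value:

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective standing in for a negative log-likelihood; its
# minimum (1.0, 0.5) is interior to the bounds below.
def nll(x):
    return (x[0] - 1.0) ** 2 + 2.0 * (x[1] - 0.5) ** 2

# Step 1: bounded minimization just to stay in the required range
# (a loose gtol so it stops before fully converging).
res_b = minimize(nll, x0=[0.2, 0.2], method="L-BFGS-B",
                 bounds=[(0.0, 10.0), (0.0, 10.0)],
                 options={"gtol": 1e-2})

# Step 2: restart unconstrained from the bounded solution; BFGS
# exposes its inverse-Hessian approximation as res.hess_inv.
res = minimize(nll, x0=res_b.x, method="BFGS")

# Asymptotic 95% intervals from the inverse Hessian.  Only valid for
# an interior solution and a log-likelihood-type objective.
se = np.sqrt(np.diag(res.hess_inv))
ci = [(p - 1.96 * s, p + 1.96 * s) for p, s in zip(res.x, se)]
```

Note that if the bounded step already satisfies BFGS's gradient tolerance, BFGS takes no steps and `hess_inv` stays at its identity initialization, so in practice one would compute the Hessian from the objective directly instead.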
> Some optimizers compute the Hessian internally. In those cases, it
> would be nice to have a way to ask them to somehow return that value
> instead of throwing it away. I haven't used Matlab in a while, but I
> remember running into this as a standard feature at some point, and it
> was quite nice. Especially when working with a problem where each
> computation of the Hessian requires an hour or so of computing time.
If it takes an hour to compute the Hessian, then don't compute it :)
My guess, without checking, is that very few optimizers calculate the
full Hessian. But for those that do calculate an approximation of the
Hessian, it might be useful to be able to get it back.
There was a discussion once about getting the Lagrange multipliers out
of some of the optimizers, but the person (?) who asked for it found
that the numbers were so inaccurate that they were not usable.
I think in statsmodels Skipper uses almost only analytical derivatives
or our own finite-difference Hessian/derivatives, where it would be
possible to store the last results, but we don't do that (yet).
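A finite-difference Hessian like the one mentioned above is straightforward to compute from the objective after the optimizer returns. This is a generic central-difference sketch, not statsmodels' actual implementation:

```python
import numpy as np

def central_diff_hessian(f, x, eps=1e-5):
    """Central-difference approximation to the Hessian of scalar f at x."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # Evaluate f at the four corners of a small square in
            # the (i, j) coordinate plane.
            xpp = x.copy(); xpp[i] += eps; xpp[j] += eps
            xpm = x.copy(); xpm[i] += eps; xpm[j] -= eps
            xmp = x.copy(); xmp[i] -= eps; xmp[j] += eps
            xmm = x.copy(); xmm[i] -= eps; xmm[j] -= eps
            H[i, j] = (f(xpp) - f(xpm) - f(xmp) + f(xmm)) / (4.0 * eps ** 2)
    return H

# Example: for f(x) = (x0 - 1)^2 + 2*(x1 - 0.5)^2 the Hessian is
# diag(2, 4) everywhere, which makes the approximation easy to check.
f = lambda x: (x[0] - 1.0) ** 2 + 2.0 * (x[1] - 0.5) ** 2
H = central_diff_hessian(f, [1.0, 0.5])
```

For a negative log-likelihood objective, inverting this Hessian at an interior optimum gives the usual asymptotic covariance estimate for the parameters.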
> -- Nathaniel