[SciPy-User] optimization routines can not handle infinity values
Tue Sep 14 10:51:24 CDT 2010
There are two categories of constrained optimization problems:
- you can evaluate the function outside the constraints
- you cannot evaluate the function outside the constraints.
While the first category can be handled by more general algorithms with some
tricks, those cannot be used for the second one. Your problem is clearly a
second-category problem, so you must use appropriate algorithms (which may
not be available in scipy directly; you may want to check OpenOpt).
It's not a problem with the routines; it's a problem of choosing appropriate algorithms.
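For the second category, one standard approach (not mentioned in the thread, but a common choice) is an interior-point/log-barrier method: every iterate stays strictly feasible, so the objective is never evaluated outside the constraints. A minimal sketch on a toy problem, where a diagonal M(x) stands in for the linear map in the question and a derivative-free method handles the barrier:

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem: minimize f(x) = (x0-1)^2 + (x1-2)^2
# subject to M(x) = diag(x0, x1) positive semidefinite, i.e. x >= 0.
# M here is a hypothetical stand-in for the linear map in the post.

def M(x):
    return np.diag(x)

def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def barrier_objective(x, t):
    """f(x) plus a log-det barrier that blows up at the boundary."""
    # Cholesky fails iff M(x) is not positive definite; treat that as +inf,
    # which a derivative-free method like Nelder-Mead simply ranks as worst.
    try:
        L = np.linalg.cholesky(M(x))
    except np.linalg.LinAlgError:
        return np.inf
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    return f(x) - logdet / t

# Classic barrier loop: increase t, warm-start from the previous solution.
x = np.array([0.5, 0.5])  # strictly feasible starting point
for t in [1.0, 10.0, 100.0, 1000.0]:
    res = minimize(barrier_objective, x, args=(t,), method="Nelder-Mead")
    x = res.x

print(x)  # close to (1, 2), the (feasible) unconstrained minimizer
```

The key point is that infeasible trial points are assigned +inf but never differentiated, which is what makes this work where gradient-based routines fail.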
2010/9/14 enrico avventi <email@example.com>
> hello all,
> I am trying out some of the optimization routines for a problem of mine
> that is of the form:
> min f(x)
> s.t. M(x) is positive semidefinite
> where f is strictly convex in the feasible region with compact sublevel
> sets, M is linear and takes value in some subspace of hermitian matrices.
> the problem is convex but the costraint can not be handled directly by any
> of the optimization routines in scipy. So i choose to change it to an
> uncostrained problem with objective function:
> f1(x) = f(x) for M(x) pos semi def
> f1(x) = Inf otherwise
> the problem is that the routines seemingly cannot handle the infinity
> values correctly.
> Some of the routines (fmin_cg comes to mind) want to check the gradient at
> points where the objective function is infinite. Clearly in such cases the
> gradient is not defined - i.e. the calculations fail - and the algorithm
> breaks down.
> Others (like fmin_bfgs) strangely converge to a point where the objective
> is infinite despite the fact that the initial point was not.
> Do you have any suggestion to fix this problem?
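The gradient failure described above is easy to reproduce. In this sketch, a hypothetical one-dimensional feasible set {x <= 0} stands in for {M(x) pos semi def}; the mechanism is the same: a finite-difference gradient estimate steps across the boundary and picks up the infinity.

```python
import numpy as np
from scipy.optimize import approx_fprime

# Toy version of the extended-value objective f1 from the post, with a
# hypothetical feasible set {x <= 0}:
def f1(x):
    if x[0] <= 0:
        return (x[0] + 1.0) ** 2  # smooth inside the feasible set
    return np.inf                 # +inf outside, as in the post

# At the boundary point x = 0, a forward-difference gradient estimate
# evaluates f1 at an infeasible point and returns infinity, which is
# exactly what derails gradient-based routines such as fmin_cg:
g = approx_fprime(np.array([0.0]), f1, 1e-6)
print(g)  # [inf]
```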
Information System Engineer, Ph.D.