[SciPy-user] Questions about scipy.optimize.fmin_cobyla
dmitrey
openopt@ukr....
Mon Jul 16 07:30:39 CDT 2007
fdu.xiaojf@gmail.com wrote:
> Hi all,
>
> I'm trying to minimize a function f(X) = f(x1,x2,...,xn) using cobyla.
>
> In my minimization, all variables should be larger than 0, so I define
> constraint functions like this:
>
> cons = []
> for i in range(n):  # n is the number of variables
>     c = lambda x: x[i] - 1e-10
>     cons.append(c)
>
> And then I use fmin_cobyla(func=f, x0=[0.1]*n, cons=cons) to minimize the
> function.
>
> There are two problems:
>
> 1) In the constraint functions, I expected the value of i to be bound
> to the specific constraint function, but it doesn't work as I expected.
>
> Here is a demonstration:
> In [14]: a = []
>
> In [15]: for i in range(4):
>    ....:     b = lambda: i**2
>    ....:     a.append(b)
>    ....:
>
> In [18]: for f in a:
>    ....:     print f()
>    ....:
> 9
> 9
> 9
> 9
>
> What I want is for every function in list a to print a different
> value: a[0](), a[1](), a[2](), a[3]() should print 0, 1, 4, 9.
>
> How to achieve this?
>
> 2) In the function f(X) there are terms of the form log(xi). Although I
> have constrained xi to be positive in the constraint functions, xi may
> still become zero or negative during minimization, and then a "math
> domain error" occurs.
>
> What should I do ?
>
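The closure pitfall in (1) has a standard fix: bind the loop variable as a
default argument, so each lambda captures its own value at definition time.
A minimal sketch:

```python
# Each lambda freezes its own i via the default-argument idiom;
# without "i=i", all of them would see the final loop value i == 3.
a = [lambda i=i: i ** 2 for i in range(4)]
print([f() for f in a])  # [0, 1, 4, 9]

# The same idiom applied to the positivity constraints for cobyla:
n = 4
cons = []
for i in range(n):
    c = lambda x, i=i: x[i] - 1e-10  # constraint i uses index i
    cons.append(c)

print(cons[2]([0.0, 1.0, 2.0, 3.0]))  # evaluates x[2] - 1e-10
```

An equivalent alternative is functools.partial or a small factory function
that returns the lambda.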
I would replace xi and log(xi) by exp(xi) and xi.
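That substitution can be sketched as follows; f here is a hypothetical
objective with a log term, not the poster's actual function:

```python
import math

# Hypothetical objective with a log term; x must stay positive.
def f(x):
    return x - math.log(x)

# Reparameterize with x = exp(y): x > 0 holds for any real y, and
# log(x) simplifies to y, so no positivity constraint is needed.
def g(y):
    return math.exp(y) - y

# g(y) == f(exp(y)); the minimum of f at x = 1 maps to y = 0.
print(g(0.0))  # 1.0, same as f(1.0)
```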
You could also try the free (BSD-licensed) OpenOpt solver lincher; it
should handle your problem well. Something like:
from numpy import zeros
from scikits.openopt import NLP
p = NLP(f, x0, lb=zeros(len(x0)))
r = p.solve('lincher')
However, lincher currently requires a QP solver, and the only one
connected is the CVXOPT one, so you need cvxopt installed (GPL-licensed).
BTW, cvxopt itself is capable of solving your problem; it has an
appropriate NLP solver.
Also, take a look at tnc:
http://www.scipy.org/doc/api_docs/scipy.optimize.tnc.html
AFAIK it can handle lb/ub box bounds directly.
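With tnc the positivity constraints become simple box bounds. A minimal
sketch using a toy quadratic objective (not the poster's f), assuming the
scipy.optimize.fmin_tnc interface:

```python
import numpy as np
from scipy.optimize import fmin_tnc

# Toy objective with its minimum at x = (1, 1); approx_grad=True lets
# tnc estimate the gradient by finite differences.
def f(x):
    return np.sum((np.asarray(x) - 1.0) ** 2)

n = 2
x0 = [0.1] * n
# Lower bound keeps every variable strictly positive; None = no upper bound.
bounds = [(1e-10, None)] * n
x, nfeval, rc = fmin_tnc(f, x0, approx_grad=True, bounds=bounds)
print(x)  # close to [1. 1.]
```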
> My method is when "math domain error" occurred, catch it and set the
> return value of f to a very large number. Should this work or not?
>
This will not work with any NLP or nonsmooth solver that relies on
gradients/subgradients.
It can work with some global solvers like anneal, but those can handle
only small-scale problems (nVars up to ~10).
HTH, D.
>
>
> Thanks a lot!
>
> Xiao Jianfeng
> _______________________________________________
> SciPy-user mailing list
> SciPy-user@scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-user
>