[SciPy-user] Questions about scipy.optimize.fmin_cobyla

dmitrey openopt@ukr....
Mon Jul 16 08:35:00 CDT 2007


fdu.xiaojf@gmail.com wrote:
> dmitrey wrote:
>  > fdu.xiaojf@gmail.com wrote:
>
>  >>
>  >> 2) In the function f(X) there are terms of log(xi). Although I have
>  >> constrained the value of xi in the constraint functions, xi may still
>  >> become equal to or less than 0 during minimization, and then a "math
>  >> domain error" occurs.
>  >>
>  >> What should I do ?
>  >>
>  > I would replace xi and log(xi) with exp(xi) and xi.
>  > You can also try the free (BSD-licensed) OpenOpt solver lincher; it
>  > should handle your problem well. Something like:
>  > from numpy import zeros
>  > from scikits.openopt import NLP
>  > p = NLP(f, x0, lb = zeros(x0.size))
>  > r = p.solve('lincher')
>  > However, lincher currently requires a QP solver, and the only one
>  > connected is CVXOPT's, so you need cvxopt installed (GPL). BTW, cvxopt
>  > itself is capable of solving your problem; it has an appropriate NLP
>  > solver.
>  > Also, take a look at tnc:
>  > http://www.scipy.org/doc/api_docs/scipy.optimize.tnc.html
>  > - AFAIK it can handle lb - ub bounds.
>  >
> cvxopt can only handle convex functions, but my function is too
> complicated to derive an expression for its derivative easily, and
> according to a previous post from Joachim Dahl (dahl.joachim@gmail.com)
> on this list, it is probably non-convex.
>
> Can openopt handle non-convex functions?
>   
That is not quite the right question. For any NLP solver (excluding 
global ones, of course) I can construct a non-convex function that it 
will fail to solve. Moreover, I can construct a *convex* function that 
will be very, very hard to solve (for the chosen solver). So I can only 
say "this solver is more or less suitable for non-convex functions". 
Fortunately, lincher is rather suitable for non-convex functions, but 
let me remind you once again: it still requires the cvxopt QP solver; 
no other ones are connected or written yet (I hope that by the end of 
the summer openopt will have its own QP/QPCP solver).
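
As for the exp substitution suggested above, it looks like this (the 
quadratic toy objective below is just my own example, not your f):

import numpy as np

# original objective: defined only for x > 0 because of the log terms
def f(x):
    return np.sum(x**2) - np.sum(np.log(x))

# substitute x = exp(y): log(x) becomes y, any real y is feasible,
# and the positivity constraint disappears entirely
def f_of_y(y):
    x = np.exp(y)
    return np.sum(x**2) - np.sum(y)

# minimize f_of_y over y without bounds, then recover x = exp(y)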
> My function is too complex to compute the derivative of, so I can't use
> tnc to do the job.
>   

If derivatives are not provided to tnc, it will compute them itself via 
finite differences, as any other NLP solver does.
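
For example (a sketch; the toy objective and the small lower bound 
1e-8 are my own choices):

import numpy as np
from scipy.optimize import fmin_tnc

def f(x):
    return np.sum(x**2) - np.sum(np.log(x))

x0 = np.ones(15)
# approx_grad=True makes tnc estimate the gradient via finite
# differences; the bounds keep every xi strictly positive so that
# log() never sees a non-positive argument
bounds = [(1e-8, None)] * x0.size
x, nfeval, rc = fmin_tnc(f, x0, approx_grad=True, bounds=bounds)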
>  >
>  >> My method is: when a "math domain error" occurs, catch it and set the
>  >> return value of f to a very large number. Should this work or not?
>  >>
>  > This will not work with any NLP or nonsmooth solver that calls for a
>  > gradient/subgradient. It will work with some global solvers like
>  > anneal, but those are capable of small-scale problems only (nVars up
>  > to 10). HTH, D.
>
> The number of variables is less than 15; does this make any sense?
>   
Very unlikely.
Ten variables is already a big problem for those solvers.
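
For the record, the workaround you describe would look roughly like 
this (a sketch; 1e300 is an arbitrary large penalty):

import math

def f(x):
    # some objective with log terms; math.log raises ValueError
    # ("math domain error") whenever some xi <= 0
    return sum(xi**2 for xi in x) - sum(math.log(xi) for xi in x)

def f_penalized(x):
    try:
        return f(x)
    except ValueError:
        # returning a huge value only makes sense for global solvers
        # such as anneal; gradient-based solvers will choke on the jump
        return 1e300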

>The reason why I chose cobyla is that cobyla can handle inequality and equality constraints, and it doesn't require derivative information.

lincher is capable of handling equality constraints. However, there can 
be some difficult cases (for example, when the constraint gradients 
form a linearly dependent system at xk, the point from iteration k).
But for a rather small nVars = 15 I expect no problems.
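A sketch of passing an equality constraint to lincher (I assume here 
the OpenOpt conventions h(x) = 0 for equalities and lb for lower 
bounds; check them against your OpenOpt version, and the constraint 
itself is hypothetical):

from numpy import zeros, ones, log
from scikits.openopt import NLP

def f(x):
    return (x**2).sum() - log(x).sum()

# hypothetical equality constraint: the variables must sum to 1
def h(x):
    return x.sum() - 1.0

x0 = ones(15) / 15.0
p = NLP(f, x0, h=h, lb=zeros(x0.size))
r = p.solve('lincher')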
HTH, D.
>
> Regards,
>
>   


