[SciPy-dev] Trouble with optimize.fmin_ncg
Wed Jul 25 04:22:13 CDT 2007
> Nils, are you sure the trouble appeared after the last svn changes?
> All my changes are in the function _cubicmin in optimize.py,
> but when I placed a breakpoint there, the hanging loop never
> reached it.
> Could you try the same trick?
> line 309,
> d1 = empty((2,2))
> I have found the hanging loop (optimize.py, line 1030,
> while numpy.add.reduce(abs(ri)) > termcond: ),
> but numpy.add.reduce(abs(ri)) keeps growing there.
> Maybe you changed x0 and now it's too far from x_opt?
> BTW, if 2nd derivatives are not supplied, another loop hangs,
> line 1013:
> while (numpy.add.reduce(abs(update)) > xtol) and (k < maxiter):
> I don't know how to fix the problem.
> Please let me know what you find at the breakpoint.
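A residual sum that grows instead of shrinking is what plain CG does when the (approximate) Hessian is indefinite. A minimal sketch with a hypothetical 2x2 indefinite matrix (my own example, not the actual Hessian from Nils's problem) shows the effect; in exact arithmetic a 2x2 system still terminates after two steps, but in higher dimensions the inner loop can wander for a long time:

```python
import numpy as np

# Plain CG applied to an indefinite symmetric system (eigenvalues 2 and -1).
# When the curvature p @ A @ p goes negative, the step is no longer a
# descent step and the residual sum -- the quantity in fmin_ncg's inner
# termination test -- can grow.
A = np.diag([2.0, -1.0])
b = np.array([1.0, 1.0])

x = np.zeros(2)
r = b - A @ x            # initial residual, sum(|r|) == 2
p = r.copy()
norms = []
for _ in range(2):
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)   # p @ Ap may be negative here
    x = x + alpha * p
    r_new = r - alpha * Ap
    beta = (r_new @ r_new) / (r @ r)
    p = r_new + beta * p
    r = r_new
    norms.append(np.sum(np.abs(r)))
print(norms)   # the residual sum jumps to 6.0 after the first step
```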
> BTW, your function looks suspicious to me:
> def R(v):
>     rq = dot(v.T, A*v) / dot(v.T, B*v)
>     res = (A*v - rq*B*v) / linalg.norm(B*v)
>     return rq
> Are you sure that func(v) = dot(v.T, A*v) / dot(v.T, B*v) is convex?
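It is not: the Rayleigh quotient is scale-invariant (R(t*v) = R(v) for t > 0), so it cannot be convex unless it is constant. A quick numerical check with an arbitrary 2x2 pair (my own illustrative matrices) exhibits a midpoint that violates the convexity inequality:

```python
import numpy as np

# Rayleigh quotient of a pair (A, B); the matrices below are arbitrary
# illustrative choices, not the actual pair from the problem.
A = np.diag([1.0, 3.0])   # eigenvalues of (A, I) are 1 and 3
B = np.eye(2)

def rq(v):
    return (v @ A @ v) / (v @ B @ v)

u = np.array([1.0, 0.1])
w = np.array([-1.0, 0.1])
m = 0.5 * (u + w)              # midpoint [0, 0.1]
lhs = rq(m)                    # = 3.0, the LARGEST eigenvalue
rhs = 0.5 * (rq(u) + rq(w))    # about 1.02
print(lhs > rhs)               # prints True: convexity fails at the midpoint
```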
The function is the well-known Rayleigh quotient of the symmetric
definite matrix pair (A, B).
The theory of the unconstrained optimization problem is described in
G. Auchmuty, Globally and rapidly convergent algorithms for symmetric
eigenproblems, SIAM J. Matrix Anal. Appl., Vol. 12, Issue 4, pp. 690-706 (1991).
Another useful paper in this context is
M. Mongeau and M. Torki, Computing eigenelements of real symmetric
matrices via optimization, Computational Optimization and Applications,
Vol. 29, pp. 263-287 (2004).
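As a sanity check of the formulation itself (with a small symmetric/SPD pair of my own choosing, not the actual matrices), minimizing the Rayleigh quotient with the first-order fmin_cg does recover the smallest generalized eigenvalue reported by scipy.linalg.eigh:

```python
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import fmin_cg

# Small symmetric A and SPD B; illustrative stand-ins for the real pair.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
B = np.diag([1.0, 2.0, 3.0])

def rq(v):
    return (v @ A @ v) / (v @ B @ v)

def rq_grad(v):
    # Gradient of the Rayleigh quotient: 2*(A v - rq(v) B v) / (v' B v)
    bv = B @ v
    return 2.0 * (A @ v - rq(v) * bv) / (v @ bv)

vmin = fmin_cg(rq, np.ones(3), fprime=rq_grad, gtol=1e-10, disp=False)
lam_min = eigh(A, B, eigvals_only=True)[0]   # smallest generalized eigenvalue
print(rq(vmin) - lam_min)                    # near zero
```

The smallest eigenvalue is the only local minimum of the quotient (the other stationary points are saddles), which is why a descent method from a generic starting point finds it.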
> I'm not.
> So using 2nd derivatives (or the approximation fmin_ncg builds when
> the user doesn't supply them, line 1033:
> Ap = approx_fhess_p(xk,psupi,fprime,epsilon) )
> will handle non-convex funcs much worse than 1st-order methods do.
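For reference, that fallback Hessian-vector product is just a forward difference of the gradient along the search direction; a minimal re-implementation of the idea (my own sketch, not the exact SciPy source) reproduces H @ p on a quadratic up to rounding:

```python
import numpy as np

def approx_hess_p(xk, p, fprime, epsilon=1.5e-8):
    """Forward-difference approximation of the Hessian-vector product H(xk) @ p."""
    return (fprime(xk + epsilon * p) - fprime(xk)) / epsilon

# On a quadratic f(x) = 0.5 * x @ H @ x the gradient is H @ x, so the
# forward difference recovers H @ p up to rounding error.
H = np.array([[2.0, 0.5],
              [0.5, 1.0]])
grad = lambda x: H @ x
xk = np.array([1.0, -1.0])
p = np.array([0.3, 0.7])
print(approx_hess_p(xk, p, grad))   # close to H @ p = [0.95, 0.85]
```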
> HTH, D.