[SciPy-user] Question about scipy.optimize

Gísli Óttarsson gislio@gmail....
Fri Nov 21 06:10:41 CST 2008


Hello all.

I am a relatively new user of Python and SciPy, and I have been
trying out SciPy's optimization facilities.  I am using SciPy version
0.6.0, as distributed with Ubuntu 8.04.

My exploration has centered on minimizing x*x*y subject to the
equality constraint 2*x*x + y*y = 3.  In my experience, this kind of
problem is solved by introducing a Lagrange multiplier and minimizing
the Lagrangian:

L = x*x*y - lambda * ( 2*x*x+y*y-3 )
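
Setting the gradient of L to zero gives the stationarity conditions
(these are what the gradient function g below encodes):

dL/dx      = 2*x*y - 4*lambda*x     = 0
dL/dy      = x*x - 2*lambda*y       = 0
dL/dlambda = -(2*x*x + y*y - 3)     = 0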

I have had no problem finding the desired solution via Newton-Raphson
using the function and its first and second derivatives:

import scipy.optimize as opt
import numpy
import numpy.linalg as l

def f(r):
    "The Lagrangian L(x, y, lambda)"
    x, y, lam = r
    return x*x*y - lam*(2*x*x + y*y - 3)

def g(r):
    "Gradient of the Lagrangian"
    x, y, lam = r
    return numpy.array([2*x*y - 4*lam*x, x*x - 2*lam*y, -(2*x*x + y*y - 3)])

def h(r):
    "Hessian of the Lagrangian (a plain array, so l.solve returns a 1-D step)"
    x, y, lam = r
    return numpy.array([[2.*y - 4.*lam,  2.*x,    -4.*x],
                        [2.*x,          -2.*lam,  -2.*y],
                        [-4.*x,         -2.*y,     0.  ]])

def NR(f, g, h, x0, tol=1e-5, maxit=100):
    "Find a local extremum of f (a root of g) using Newton-Raphson"
    x1 = numpy.asarray(x0, dtype=float)
    f1 = f(x1)
    for i in range(maxit):
        dx = l.solve(h(x1), g(x1))           # Newton step: solve H*dx = g
        ldx = numpy.sqrt(numpy.dot(dx, dx))  # length of the step
        x2 = x1 - dx
        f2 = f(x2)
        df = numpy.abs(f1 - f2)
        if ldx < tol and df < tol:           # both the step and the change in f are small
            return x2, f2, df, ldx, i
        x1 = x2
        f1 = f2
    return x2, f2, df, ldx, i                # maxit reached without convergence

print NR(f,g,h,[-2.,2.,3.],tol=1e-10)
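
For comparison (and assuming I am reading the docs correctly), the
same root of g could presumably also be handed to scipy.optimize.fsolve,
with h serving as the Jacobian of g.  This is only a sketch, not
something I have pursued further:

print opt.fsolve(g, [-2., 2., 3.], fprime=h)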

My Newton-Raphson implementation converges in 5 iterations, but I
have had no success with any of the functions in scipy.optimize, for
example:

print opt.fmin_bfgs(f=f, x0=[-2.,2.,3.], fprime=g)
print opt.fmin_ncg(f=f, x0=[-2.,2.,3.], fprime=g, fhess=h)

neither of which converges.
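
One check I suppose could be made is to look at the eigenvalues of the
Hessian at the point Newton-Raphson returns, to see what kind of
stationary point of L it actually is (again only a sketch, reusing the
NR, h and l defined above):

sol = NR(f, g, h, [-2., 2., 3.], tol=1e-10)[0]   # the converged (x, y, lambda)
print l.eigvals(h(sol))   # the signs say whether L has a minimum, maximum or saddle there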

I am beginning to suspect some fundamental misunderstanding on my
part.  Could someone throw me a bone?

Best regards

   Gísli

