[SciPy-User] optimize.leastsq does not converge with full Jacobian?

Sturla Molden sturla@molden...
Thu Oct 1 02:12:19 CDT 2009


I was trying to fit Michaelis-Menten data from the Bates and Watts 
(1988) puromycin experiment to test non-linear regression in SciPy. I 
am getting some very strange results that I don't understand.

First, the non-linear model to be fitted is:

y = Vmax * x / (x + Km)

Not using the Jacobian works:

import numpy as np
from scipy.linalg import qr, solve
from scipy.optimize import leastsq


# Bates and Watts (1988) puromycin data

data = ((0.02, 47, 76),
        (0.06, 97, 107),
        (0.11, 123, 139),
        (0.22, 152, 159),
        (0.56, 191, 201),
        (1.10, 200, 207))

data = np.array(data)

# each substrate concentration has two replicate rate measurements:
# duplicate each x once and flatten y to match
X = data[:,0:1].repeat(2,axis=1).flatten()
Y = data[:,1:].flatten()

# initial estimates from a Lineweaver-Burk plot:
# 1/y = 1/Vmax + (Km/Vmax) * (1/x)
y = Y**-1
x = np.vstack((np.ones(X.shape), X**-1)).T
q, r = qr(x, mode='economic')
b = solve(r, np.dot(q.T, y))

# Michaelis-Menten parameters from the Lineweaver-Burk coefficients:
# b[0] is the intercept 1/Vmax, b[1] the slope Km/Vmax
Vmax = 1.0 / b[0]
Km = b[1] * Vmax
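
As a cross-check on the QR route (an added sketch, not part of the
original fit), np.polyfit on the reciprocal data should reproduce the
same straight-line coefficients:

# cross-check: fit 1/y = 1/Vmax + (Km/Vmax) * (1/x) with polyfit
slope, intercept = np.polyfit(X**-1, Y**-1, 1)
assert np.allclose((intercept, slope), b)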

# refit with the Levenberg-Marquardt method (leastsq)

def michaelis_menten(t, x):
    Vmax, Km = t
    return Vmax*x/(x + Km)

def residuals(t, x, y):
    return y - michaelis_menten(t, x)

(Vmax, Km), ierr = leastsq(residuals, (Vmax, Km), args=(X, Y))

This gives (Vmax, Km) = (2.12683559e+02, 6.41209954e-02), which is the 
"correct" answer.

However, when I use the full Jacobian,

def jacobian(t, x, y):
    # analytic Jacobian, one row per parameter (col_deriv layout)
    j = np.zeros((2, x.shape[0]))
    Vmax, Km = t
    j[0,:] = x/(Km + x)
    j[1,:] = -Vmax*x/((Km + x)**2)
    return j

(Vmax, Km), ierr = leastsq(residuals, (Vmax, Km), args=(X, Y),
                           Dfun=jacobian, col_deriv=1)

I always get the starting values of (Vmax, Km) back, which are 
(195.8027, 0.0484065).
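
One diagnostic worth running (a sketch; fd_jacobian is a helper
introduced here for illustration and eps is an arbitrary step size) is
to compare the analytic Jacobian against forward differences of the
residuals:

def fd_jacobian(t, x, y, eps=1e-7):
    # forward-difference Jacobian of residuals, same (2, n) layout
    t = np.asarray(t, dtype=float)
    j = np.zeros((2, x.shape[0]))
    r0 = residuals(t, x, y)
    for i in range(2):
        dt = np.zeros(2)
        dt[i] = eps
        j[i,:] = (residuals(t + dt, x, y) - r0) / eps
    return j

# a large discrepancy here would point at the analytic derivatives
print(np.abs(jacobian((Vmax, Km), X, Y) - fd_jacobian((Vmax, Km), X, Y)).max())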

What on earth is going on?

Is there a bug in SciPy or am I being incredibly stupid?
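
One possibility worth checking: leastsq documents Dfun as the Jacobian
of func itself, and func here is residuals = y - model, so its
derivatives carry the opposite sign of the model's. A sketch of the
sign-flipped variant (untested here; residuals_jacobian is a name
introduced for illustration):

def residuals_jacobian(t, x, y):
    # Jacobian of the residuals y - model: the negated model Jacobian
    j = np.zeros((2, x.shape[0]))
    Vmax, Km = t
    j[0,:] = -x/(Km + x)
    j[1,:] = Vmax*x/((Km + x)**2)
    return j

(Vmax, Km), ierr = leastsq(residuals, (Vmax, Km), args=(X, Y),
                           Dfun=residuals_jacobian, col_deriv=1)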


Sturla Molden

