[SciPy-User] Nonlinear fit to multiple data sets with a shared parameter, and three variable parameters.

Matt Newville newville@cars.uchicago....
Wed Apr 3 13:58:49 CDT 2013

Hi Troels,

On Wed, Apr 3, 2013 at 11:09 AM, Troels Emtekær Linnet
<tlinnet@gmail.com> wrote:
> Dear Scipy users.
> I am having trouble to implement what is probably known as:
> Nonlinear fit to multiple data sets with shared parameters
> I haven't been able to find a solution for this in scipy, and I would be
> happy to hear if someone could guide me in how to fix this.
> I have a set of measured NMR peaks.
> Each peak has two experiment x values, x1, x2, which I can fit to a
> measured Y value.
> I have used lmfit, which extends scipy leastsq with some boundary options.
> For each peak, I can fit the following function:
> --------------------------------------
> def R1r_exch(pars,inp,data=None,eps=None):
>     tiltAngle,omega1=inp
>     R1 = pars['R1'].value
>     R2 = pars['R2'].value
>     kEX = pars['kEX'].value
>     phi = pars['phi'].value
>     model = R1*cos(tiltAngle*pi/180)**2 + (R2 + phi*kEX/((2*pi*omega1/tan(tiltAngle*pi/180))**2 + (2*pi*omega1)**2 + kEX**2))*sin(tiltAngle*pi/180)**2
>     if data is None:
>         return model
>     if eps is None:
>         return (model - data)
>     return (model-data)/eps
> calling with
> datX = [tilt,om1]
> par = lmfit.Parameters()
> par.add('R1', value=1.0, vary=True)
> par.add('R2', value=40.0, vary=True)
> par.add('kEX', value=10000.0, vary=False, min=0.0)
> par.add('phi', value=100000.0, vary=True, min=0.0)
> lmf = lmfit.minimize(R1r_exch, par, args=(datX, R1rex,
> R1rex_err),method='leastsq')
> print lmf.success, lmf.nfev
> print par['R1'].value, par['R2'].value, par['kEX'].value, par['phi'].value
> fig = figure('R1r %s'%NI)
> ax = fig.add_subplot(111)
> calcR1r = R1r_exch(par,datX)
> tilt_s, om1_s = zip(*sorted(zip(datX[0], datX[1])))
> datXs = [array(tilt_s), array(om1_s)]
> calcR1rs = R1r_exch(par, datXs)
> -----------------------------------------------------------
> That goes fine for each single peak.
> But now I want to do global fitting.
> http://www.originlab.com/index.aspx?go=Products/Origin/DataAnalysis/CurveFitting/GlobalFitting
> http://www.wavemetrics.com/products/igorpro/dataanalysis/curvefitting/globalfitting.htm
> I would like to fit the nonlinear model to several peak data sets
> simultaneously.
> The parameters "R1,R2 and phi" should be allowed to vary for each NMR peak,
> while kEX should be global and shared for all NMR peaks.
> Is there anybody who would be able to help find a solution or guide me
> to a package?
> Best
> Troels Emtekær Linnet
> _______________________________________________
> SciPy-User mailing list
> SciPy-User@scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user

I think you want to create a set of Parameters that includes the peak
parameters for each dataset, perhaps as:

   par = lmfit.Parameters()
   par.add('peak1_R1', value=1.0, vary=True)
   par.add('peak1_R2', value=40.0, vary=True)
   par.add('peak1_phi', value=100000.0, vary=True, min=0.0)
   par.add('peak1_kEX', value=10000.0, vary=False, min=0.0)

   par.add('peak2_R1', value=1.0, vary=True)
   par.add('peak2_R2', value=40.0, vary=True)
   par.add('peak2_phi', value=100000.0, vary=True, min=0.0)

To make sure that 'peak2_kEX' has the same value as 'peak1_kEX', you
can add it with a constraint expression:
   par.add('peak2_kEX', expr='peak1_kEX')

For this case, in which you want the value to be identical for all
datasets (but possibly still varying in the fit), you could just make a
single 'kEX' parameter and use it in your R1r_exch() function.  The
advantage of a constraint expression as above is that it is more
flexible and easier to turn off.  As an example, you could have two
peaks that compete, with fractional weights constrained to add to 1:
   par['peak2_kEX'].expr = '1 - peak1_kEX'
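To illustrate the idea behind such a tied parameter (a plain-Python toy, not lmfit's constraint machinery): only one fraction is ever fit directly, and the other is derived from it, so the pair always sums to 1:

```python
# Toy illustration of a derived ("constrained") parameter: peak2's
# fractional weight is computed from peak1's, never fit directly.
def competing_weights(peak1_frac):
    peak2_frac = 1.0 - peak1_frac   # mirrors expr = '1 - peak1_kEX'
    return peak1_frac, peak2_frac
```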

Anyway, to continue extending your example to multiple datasets, I
would alter R1r_exch() to take plain values, not Parameters (useful for
testing the model on its own):
    def R1r_exch(tiltAngle, omega1, R1, R2, kEX, phi, data=None, eps=None):
        model = R1*cos(tiltAngle*pi/180)**2 + \
                (R2 + phi*kEX/((2*pi*omega1/tan(tiltAngle*pi/180))**2 +
                               (2*pi*omega1)**2 + kEX**2))*sin(tiltAngle*pi/180)**2
        if data is None:
            return model
        if eps is None:
            return model - data
        return (model - data)/eps

and create an objective function that considers one dataset at a time
and then concatenates the per-dataset residuals, perhaps:

    def objective(pars, tiltAngle, omega, datasets, eps):

        R1 = pars['peak1_R1'].value
        R2 = pars['peak1_R2'].value
        kEX = pars['peak1_kEX'].value
        phi = pars['peak1_phi'].value

        resid1 = R1r_exch(tiltAngle, omega, R1, R2, kEX, phi,
                          data=datasets[0], eps=eps[0])

        R1 = pars['peak2_R1'].value
        R2 = pars['peak2_R2'].value
        kEX = pars['peak2_kEX'].value
        phi = pars['peak2_phi'].value

        resid2 = R1r_exch(tiltAngle, omega, R1, R2, kEX, phi,
                          data=datasets[1], eps=eps[1])

        return np.concatenate((resid1, resid2))

Obviously, you might want to do something fancier (say, auto-generating
parameter names and auto-extracting them) to extend to many more
datasets.  But I think it's really the concatenation of residuals that
you're looking for.
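One hypothetical sketch of that auto-generation (the 'peakN_<param>' naming scheme is just the convention used above, not anything lmfit requires):

```python
# Hypothetical helper: build per-peak parameter names for n datasets,
# with 'kEX' kept as the single shared (global) parameter.
def peak_par_names(n_peaks, per_peak=('R1', 'R2', 'phi')):
    names = ['kEX']
    for i in range(1, n_peaks + 1):
        names.extend('peak%d_%s' % (i, p) for p in per_peak)
    return names
```

The objective function can then loop over peaks, pulling each peak's values out of the Parameters by name and concatenating the per-peak residuals.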

To do the fit, something like this should work:

    result = lmfit.minimize(objective, par, args=(tiltAngle, omega,
                            [R1rex, R2rex], [R1rex_eps, R2rex_eps]))

This is off the top of my head, untested, and likely to have typos or
worse.  I think the main point is to abstract and parameterize each
data set well and write an objective function to do the concatenation
of residuals for each dataset.  After that, you can play with bounds
and constraints of the parameters all you want.
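Since this came up on scipy-user: the same residual-concatenation idea works with plain scipy.optimize.leastsq, without lmfit, if you put the shared parameter once in the parameter vector.  A minimal, self-contained sketch using a simpler stand-in model (shared decay rate, per-dataset amplitudes, synthetic data -- not the R1rho equation):

```python
import numpy as np
from scipy.optimize import leastsq

# Stand-in model: y = A * exp(-k * x), with the rate k shared across
# datasets and the amplitude A varying per dataset.
def model(x, A, k):
    return A * np.exp(-k * x)

def objective(p, x, datasets):
    # p = [k, A1, A2, ...]: the first entry is the shared parameter
    k = p[0]
    resids = [model(x, A, k) - y for A, y in zip(p[1:], datasets)]
    return np.concatenate(resids)

# Synthetic data: two datasets sharing k = 1.5
x = np.linspace(0.0, 4.0, 50)
rng = np.random.default_rng(0)
true_k = 1.5
datasets = [A * np.exp(-true_k * x) + rng.normal(0.0, 0.01, x.size)
            for A in (2.0, 5.0)]

p_fit, ier = leastsq(objective, x0=[1.0, 1.0, 1.0], args=(x, datasets))
k_fit = p_fit[0]
```

Here the shared parameter is simply stored once in the parameter vector; lmfit's named Parameters, bounds, and expr constraints give the same effect with less bookkeeping.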


--Matt Newville
