[SciPy-User] simplex algorithm and curve fitting

josef.pktd@gmai...
Tue Feb 21 06:07:33 CST 2012


On Tue, Feb 21, 2012 at 6:30 AM, servant mathieu
<servant.mathieu@gmail.com> wrote:
> Ok, so as far as I understand, curve fitting is not possible in scipy... Many
> researchers use the fminsearch function in matlab, which is based on standard
> simplex routines. Is it a global optimizer?

fminsearch in matlab uses 'Nelder-Mead simplex direct search', which is
the same algorithm as scipy.optimize.fmin.
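
For reference, a minimal sketch of how a fminsearch-style call maps onto
fmin (the quadratic objective below is just a stand-in, not your model):

import numpy as np
from scipy.optimize import fmin

def objective(params):
    # any scalar-valued function of the parameter vector, as with fminsearch
    return ((np.asarray(params) - np.array([1.0, 2.0]))**2).sum()

xopt = fmin(objective, x0=[0.0, 0.0])   # Nelder-Mead simplex, derivative-free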

My impression is that your function does not have a solution with
np.abs(tanh(A*k*x)) not equal to one; I didn't find one that would
come close to the solution you get when the tanh part is one.

Curve fitting is possible in scipy and works very well in many cases.
Here it finds a solution driven by the A/(k*x) part, but it cannot do
something impossible (although I didn't try 10000 starting values).

In general, for a least-squares problem like this one, leastsq works
very well, better than fmin or the other fmin_xxx routines for
unconstrained problems.
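
For this problem a leastsq call would look roughly like the sketch below
(the model and data are copied from the code quoted further down; the
starting values p0 are only an illustration):

import numpy as np
from scipy.optimize import leastsq

def residuals(params, x, y):
    # leastsq wants the vector of residuals, not their sum of squares
    A, k, r = params
    return y - (r + (A / (k * x)) * np.tanh(A * k * x))

xdata = np.array([0.15, 0.25, 0.35, 0.45, 0.55, 0.75])
datacomp = np.array([344.3276300, 324.0051063, 314.2693475,
                     309.9906375, 309.9251162, 307.3955800])

p0 = [1.0, 1.0, 300.0]   # illustrative starting values only
popt, ier = leastsq(residuals, p0, args=(xdata, datacomp))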

Josef

> Mat
>
> 2012/2/21 <josef.pktd@gmail.com>
>>
>> On Tue, Feb 21, 2012 at 5:19 AM, servant mathieu
>> <servant.mathieu@gmail.com> wrote:
>> > Thanks a lot Josef. I've got a last question, which concerns initial
>> > guesses for the parameters. In fact, in many papers I read "we fitted
>> > both functions using standard simplex optimization routines. This was
>> > repeated 10000 times with randomized initial values to avoid local
>> > minima." The problem is the following: which range of values should we
>> > use for this randomization?
>>
>> I wanted to mention this before: all the fmin routines are local
>> optimizers; anneal is the only global optimizer in scipy, but it is a
>> bit tricky to use. There are other global optimizers written in python
>> that might work better, but I never tried any of those packages.
>>
>> Choosing (random) starting values depends completely on the function,
>> and there is no function-independent recipe, since the
>> parameterization of a function is pretty arbitrary. So you need an
>> "educated" guess of the possible range, given the specific function
>> and problem.
>>
>> For specific classes of functions in not too many dimensions, it would
>> be possible to find (and code) starting values, or ranges for a random
>> search.
>>
>> I haven't checked in detail what your function looks like, but I would
>> guess that there are at least some sign restrictions. I usually try to
>> see whether I can guess starting values, and ranges for randomization,
>> based on the min, max and mean of the observations.
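>>
>> A rough multi-start sketch along those lines (the ranges, the starting
>> guesses and the number of restarts below are only an illustration, not
>> a recommendation):
>>
>> import numpy as np
>> from scipy.optimize import fmin
>>
>> x = np.array([0.15, 0.25, 0.35, 0.45, 0.55, 0.75])
>> y = np.array([344.3276300, 324.0051063, 314.2693475,
>>               309.9906375, 309.9251162, 307.3955800])
>>
>> def loss(params, x, y):
>>     A, k, r = params
>>     return ((y - (r + (A / (k * x)) * np.tanh(A * k * x)))**2).sum()
>>
>> # guessed ranges tied loosely to the data: r near the observed level,
>> # A and k positive and of moderate size (pure assumption)
>> lo = np.array([0.1, 0.1, y.min() - 50.0])
>> hi = np.array([10.0, 10.0, y.max() + 50.0])
>>
>> best_val, best_params = np.inf, None
>> for _ in range(1000):
>>     p0 = lo + np.random.rand(3) * (hi - lo)
>>     popt = fmin(loss, p0, args=(x, y), disp=False)
>>     val = loss(popt, x, y)
>>     if val < best_val:
>>         best_val, best_params = val, popt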
>>
>> Josef
>>
>> >
>> > Best,
>> > Mat
>> >
>> >
>> > 2012/2/21 <josef.pktd@gmail.com>
>> >
>> >> On Tue, Feb 21, 2012 at 4:38 AM, servant mathieu
>> >> <servant.mathieu@gmail.com> wrote:
>> >> > Dear Scipy users,
>> >> >
>> >> > I've got some trouble with the scipy.optimize.curve_fit function.
>> >> > This function is based on the Levenberg-Marquardt algorithm, which
>> >> > is extremely fast but usually finds a local minimum, not a global
>> >> > one (and thus often returns abnormal parameter values). In my
>> >> > research field, we usually use Nelder-Mead simplex routines to
>> >> > avoid this problem. However, I don't know if it is possible to
>> >> > perform curve fitting in scipy using simplex; the fmin function
>> >> > doesn't seem to fit a model to data on its own.
>> >> >
>> >> > Here is my code for fitting a three-parameter hyperbolic cotangent
>> >> > function using curve_fit:
>> >> >
>> >> > from scipy.optimize import curve_fit
>> >> > import numpy as np
>> >> >
>> >> > def func(x, A, k, r):
>> >> >     return r + (A / (k * x)) * np.tanh(A * k * x)
>> >> >
>> >> > xdata = np.array([0.15, 0.25, 0.35, 0.45, 0.55, 0.75])
>> >> >
>> >> > datacomp = np.array([344.3276300, 324.0051063, 314.2693475,
>> >> >                      309.9906375, 309.9251162, 307.3955800])
>> >> > dataincomp = np.array([363.3839888, 343.5735787, 334.6013375,
>> >> >                        327.7868238, 329.4642550, 328.0667050])
>> >> >
>> >> > poptcomp, pcovcomp = curve_fit(func, xdata, datacomp, maxfev=10000)
>> >> > poptincomp, pcovincomp = curve_fit(func, xdata, dataincomp, maxfev=10000)
>> >> >
>> >> > How could I proceed to perform the fitting using simplex?
>> >>
>> >> You need to define your own loss function for optimizers like
>> >> fmin (or, in future versions, minimize):
>> >>
>> >> http://docs.scipy.org/doc/scipy-0.10.0/reference/tutorial/optimize.html#nelder-mead-simplex-algorithm-fmin
>> >>
>> >> something like this:
>> >>
>> >> def loss(params, args):
>> >>     A, k, r = params
>> >>     y, x = args
>> >>     return ((y - func(x, A, k, r))**2).sum()
>> >>
>> >> and use loss in the call to fmin, for example:
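>> >>
>> >> (a sketch only, reusing func, xdata and datacomp from the code above;
>> >> fmin unpacks args, so the (y, x) pair has to be passed as a single
>> >> tuple, and p0 here is just an illustrative starting value)
>> >>
>> >> from scipy.optimize import fmin
>> >>
>> >> p0 = [1.0, 1.0, 300.0]
>> >> popt = fmin(loss, p0, args=((datacomp, xdata),), maxfun=10000,
>> >>             maxiter=10000)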
>> >>
>> >> Josef
>> >>
>> >> >
>> >> > Best,
>> >> >
>> >> > Mat

