# [SciPy-User] simplex algorithm and curve fitting

josef.pktd@gmai... josef.pktd@gmai...
Tue Feb 21 05:29:04 CST 2012

```
On Tue, Feb 21, 2012 at 5:47 AM, <josef.pktd@gmail.com> wrote:
> On Tue, Feb 21, 2012 at 5:19 AM, servant mathieu
> <servant.mathieu@gmail.com> wrote:
>> Thanks a lot Josef. I've got a last question, which concerns initial guesses
>> for parameters. In fact, in many papers, I read "we fitted both functions
>> using standard simplex optimization routines. This was repeated 10000 times
>> with randomized initial values to avoid local minima." The problem is the
>> following: which range of values should we use for this randomization?
>
> I wanted to mention this before: all the fmin* functions are local
> optimizers; anneal is the only global optimizer in scipy, but it is a
> bit tricky to use. There are other global optimizers written in Python
> that might work better, but I have never tried any of those packages.
>
> Choosing (random) starting values depends completely on the function,
> and there is no function-independent recipe, since the
> parameterization of a function is pretty arbitrary. So you need an
> "educated" guess over the possible range, given the specific function
> and problem.
>
> For specific classes of functions, in not too high a dimension, it
> would be possible to find (and code) starting values, or ranges for a
> random search.
>
> I haven't looked closely at your function, but I would guess
> that there are at least some sign restrictions. I usually try to see
> if I can guess starting values, and ranges for randomization, based on
> the min, max and mean of the observations.

(can you please bottom post in this mailing list, it's difficult to
find the thread)

tanh looks bad for optimization: it is essentially flat at -1 or 1
outside of roughly (-4, 4).

Playing a bit, fmin and curve_fit just find solutions with the tanh
part equal to 1 or -1; fmin finds -1, curve_fit picks +1 (with the
same starting values).

If you want any action from the tanh part then, it looks to me, A*k*x
would need to be restricted to stay mostly in the (-4, 4) range; maybe
a reparameterization (with b = A*k) would help.

Josef

>
> Josef
>
>>
>> Best,
>> Mat
>>
>>
>> 2012/2/21 <josef.pktd@gmail.com>
>>
>>> On Tue, Feb 21, 2012 at 4:38 AM, servant mathieu
>>> <servant.mathieu@gmail.com> wrote:
>>> > Dear SciPy users,
>>> >
>>> > I've got some trouble with the scipy.optimize.curve_fit function. This
>>> > function is based on the Levenberg-Marquardt algorithm, which is
>>> > extremely
>>> > rapid but usually finds a local minimum, not a global one (and thus
>>> > often
>>> > returns abnormal parameter values). In my research field, we usually use
>>> > Nelder-Mead simplex routines to avoid this problem. However, I don't
>>> > know
>>> > if it is possible to perform curve fitting in scipy using simplex; the
>>> > fmin
>>> > function doesn't seem to fit a model to data.
>>> >
>>> > Here is my code for fitting a three parameters hyperbolic cotangent
>>> > function
>>> > using curve_fit:
>>> >
>>> > from scipy.optimize import curve_fit
>>> >
>>> > import numpy as np
>>> >
>>> >
>>> >
>>> > def func(x, A, k, r):
>>> >
>>> >     return r + (A / (k * x)) * np.tanh(A * k * x)
>>> >
>>> >
>>> >
>>> > xdata = np.array([0.15, 0.25, 0.35, 0.45, 0.55, 0.75])
>>> >
>>> >
>>> >
>>> > datacomp = np.array([344.3276300, 324.0051063, 314.2693475,
>>> > 309.9906375, 309.9251162, 307.3955800])
>>> >
>>> > dataincomp = np.array([363.3839888, 343.5735787, 334.6013375,
>>> > 327.7868238,
>>> > 329.4642550, 328.0667050])
>>> >
>>> >
>>> >
>>> > poptcomp, pcovcomp = curve_fit(func, xdata, datacomp, maxfev=10000)
>>> >
>>> > poptincomp, pcovincomp = curve_fit(func, xdata, dataincomp, maxfev=10000)
>>> >
>>> >
>>> >
>>> >
>>> >
>>> > How could I proceed to perform the fitting using simplex?
>>>
>>> You need to define your own loss function for the optimizers like
>>> fmin, or, in a future version, minimize:
>>>
>>> http://docs.scipy.org/doc/scipy-0.10.0/reference/tutorial/optimize.html#nelder-mead-simplex-algorithm-fmin
>>>
>>> something like this
>>>
>>> def loss(params, args):
>>>     A, k, r = params
>>>     y, x = args
>>>     return ((y - func(x, A, k, r))**2).sum()
>>>
>>> and use loss in the call to fmin, e.g.
>>> fmin(loss, x0, args=((ydata, xdata),))
>>>
>>> Josef
>>>
>>
>>
>>
>>
>>
>>>
>>> >
>>> >
>>> >
>>> > Best,
>>> >
>>> > Mat
>>> >
>>> >
>>> > _______________________________________________
>>> > SciPy-User mailing list
>>> > SciPy-User@scipy.org
>>> > http://mail.scipy.org/mailman/listinfo/scipy-user
>>> >
>>> _______________________________________________
>>> SciPy-User mailing list
>>> SciPy-User@scipy.org
>>> http://mail.scipy.org/mailman/listinfo/scipy-user
>>
>>
>>
>> _______________________________________________
>> SciPy-User mailing list
>> SciPy-User@scipy.org
>> http://mail.scipy.org/mailman/listinfo/scipy-user
>>
```
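The random-restart strategy discussed at the top of the thread can be sketched as follows, reusing the loss-plus-fmin approach from the reply above. This is only an illustration: the sampling ranges are educated guesses based on the data, not values from the post.

```python
import numpy as np
from scipy.optimize import fmin

def func(x, A, k, r):
    # the three-parameter model from the thread
    return r + (A / (k * x)) * np.tanh(A * k * x)

xdata = np.array([0.15, 0.25, 0.35, 0.45, 0.55, 0.75])
ydata = np.array([344.3276300, 324.0051063, 314.2693475,
                  309.9906375, 309.9251162, 307.3955800])

def sse(params):
    A, k, r = params
    with np.errstate(all='ignore'):  # guard against k passing through 0
        resid = ydata - func(xdata, A, k, r)
    val = (resid**2).sum()
    return val if np.isfinite(val) else np.inf

rng = np.random.default_rng(0)
best_p, best_val = None, np.inf
for _ in range(100):  # far fewer than the 10000 restarts cited in the papers
    # illustrative ranges: r near the observed data, A and k of modest
    # magnitude so A*k*x does not always sit in tanh's flat region
    x0 = [rng.uniform(0.5, 10.0),
          rng.uniform(0.1, 5.0),
          rng.uniform(ydata.min(), ydata.max())]
    p = fmin(sse, x0, disp=False)
    if sse(p) < best_val:
        best_p, best_val = p, sse(p)
```

Keeping the best of many local fits is a crude but common global strategy; later SciPy releases also added dedicated global optimizers such as `scipy.optimize.basinhopping` and `scipy.optimize.differential_evolution`.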

More information about the SciPy-User mailing list