[SciPy-User] Global Curve Fitting of 2 functions to 2 sets of data-curves
Thu Jun 10 14:34:42 CDT 2010
On Thu, Jun 10, 2010 at 2:58 PM, Sebastian Haase <firstname.lastname@example.org> wrote:
> On Thu, Jun 10, 2010 at 8:27 PM, <email@example.com> wrote:
>> On Thu, Jun 10, 2010 at 4:05 AM, Sebastian Haase <firstname.lastname@example.org> wrote:
>>> so far I have been using scipy.optimize.leastsq to satisfy all my
>>> curve fitting needs.
>>> But now I am thinking about "global fitting" - i.e. fitting multiple
>>> datasets with shared parameters.
>>> I have looked here (http://www.scipy.org/Cookbook/FittingData) and here.
>>> Can someone provide an example? Which of the routines in
>>> scipy.optimize are "easiest" to use?
>>> Finally, I'm thinking about a "much more" complicated fitting task:
>>> fitting two sets of datasets with two types of functions.
>>> In total I have 10 datasets to be fit with a function f1, and 10 more
>>> to be fit with a function f2. Each function depends on 6 parameters:
>>> A1, A2, A3, r1, r2, r3.
>>> A1,A2,A3 should be identical ("shared") between all 20 sets, while
>>> r1,r2,r3 should be shared between the i-th set of type f1 and the i-th
>>> set of f2.
>>> Last but not least, it would be nice if one could specify constraints
>>> such that r1, r2, r3 > 0, A1 + A2 + A3 == 1, and 0 <= Ai <= 1.
>>> ;-) Is this too much?
>>> Thanks for any help or hints,
>> Assuming your noise or error terms are uncorrelated, I would still use
>> optimize.leastsq or optimize.curve_fit using a function that stacks
>> all the errors in one 1-d array. If there are differences in the noise
>> variance, then weights/sigma per function as in curve_fit can be used.
>> Common parameter restrictions across functions can be encoded by using
>> the same parameter in several (sub-)functions.
>> In this case, I would impose the constraints through reparameterization, e.g.
>> r1 = exp(r1_), ...
>> A1 = exp(A1_)/(exp(A1_) + exp(A2_) + 1)
>> A2 = exp(A2_)/(exp(A1_) + exp(A2_) + 1)
>> A3 = 1/(exp(A1_) + exp(A2_) + 1)
>> (though it may be trickier to get the standard deviations of the
>> original parameter estimates back)
>> Or, as an alternative, calculate the total weighted sum of squared
>> errors and use one of the constrained fmin solvers in optimize.
> Thanks for the reply,
> I will have to think about implementing my constraints by redefining
> variables using those kinds of tricks with exp -- are you sure they
> don't mess up convergence? I'm just thinking of the optimization steps
> being so different depending on the current parameter value during the
> iteration (i.e. the derivative of exp is very non-linear).
> What are those other functions in
> http://docs.scipy.org/doc/scipy/reference/optimize.html for ?
> (Once, a long time ago, I used fmin_cobyla ... but I don't remember why
> I chose it. Maybe something like one-sided constraints!?)
fmin_slsqp is the most flexible for constraints, but so far I have
used the constrained optimizers only for toy examples, and I don't know
how robust they are.
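For example, a sketch with one hypothetical tri-exponential dataset, just
to show the fmin_slsqp calling convention -- eqcons and bounds encode
A1 + A2 + A3 == 1, 0 <= Ai <= 1, and ri > 0:

import numpy as np
from scipy.optimize import fmin_slsqp

# Total sum of squared errors for one hypothetical tri-exponential
# dataset; p = [A1, A2, A3, r1, r2, r3].
def sse(p, t, y):
    A1, A2, A3, r1, r2, r3 = p
    model = A1*np.exp(-r1*t) + A2*np.exp(-r2*t) + A3*np.exp(-r3*t)
    return ((y - model)**2).sum()

t = np.linspace(0, 5, 100)
y = 0.5*np.exp(-1.0*t) + 0.3*np.exp(-0.3*t) + 0.2*np.exp(-3.0*t)

pfit = fmin_slsqp(
    sse, [0.4, 0.3, 0.3, 1.0, 1.0, 1.0], args=(t, y),
    eqcons=[lambda p, t, y: p[0] + p[1] + p[2] - 1.0],  # A1+A2+A3 == 1
    bounds=[(0.0, 1.0)]*3 + [(1e-8, 1e3)]*3,            # 0<=Ai<=1, ri>0
)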
Imposing constraints by reparameterization or penalization is, in my
experience, not much of a problem, apart from getting a slightly
interior solution instead of an exact boundary value.
The multinomial logit parameterization for A1, A2, A3 is pretty common
in econometrics; I'm not sure what's most common in other fields.
If you have analytical gradients, then one of the other optimizers
might be better than leastsq.
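For instance, a sketch with a hypothetical single-exponential model,
minimizing the summed squared errors with fmin_bfgs and an analytical
gradient passed via fprime:

import numpy as np
from scipy.optimize import fmin_bfgs

# Hypothetical single-exponential fit; p = [a, r].
def sse(p, t, y):
    a, r = p
    return ((y - a*np.exp(-r*t))**2).sum()

def grad(p, t, y):
    a, r = p
    e = np.exp(-r*t)
    resid = y - a*e
    # d(sse)/da = -2*sum(resid*e); d(sse)/dr = 2*a*sum(resid*t*e)
    return np.array([-2.0*(resid*e).sum(), 2.0*a*(resid*t*e).sum()])

t = np.linspace(0, 4, 50)
y = 2.0*np.exp(-0.7*t)
pfit = fmin_bfgs(sse, [1.0, 1.0], fprime=grad, args=(t, y))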
(These are just my impressions from my use cases.)