[SciPy-User] How to estimate error in polynomial coefficients from scipy.polyfit?
Anne Archibald
peridot.faceted@gmail....
Mon Mar 29 11:24:33 CDT 2010
On 29 March 2010 11:08, Jeremy Conlin <jlconlin@gmail.com> wrote:
> On Thu, Mar 25, 2010 at 9:40 PM, David Goldsmith
> <d.l.goldsmith@gmail.com> wrote:
>> On Thu, Mar 25, 2010 at 3:40 PM, Jeremy Conlin <jlconlin@gmail.com> wrote:
>>>
>>> Yikes! This sounds like it may be more trouble than it's worth. I
>>> have a few sets of statistical data that each need to have curves fit
>>> to them.
>>
>> That's an awfully generic need - it may be obvious from examination of the
>> data that a line is inappropriate, but besides polynomials there are many
>> other non-linear models (which can be linearly fit to data by means of data
>> transformation) which possess fewer parameters (and thus are simpler from a
>> parameter analysis perspective). So, the question is: why are you fitting
>> to polynomials? If it's just to get a good fit to the data, you might be
>> getting "more fit" than your data warrants (and even if that isn't a
>> problem, you probably want to use a polynomial form different from "standard
>> form," e.g., Lagrange interpolators). Are you sure something like an
>> exponential growth/decay or power law model (both of which are "more
>> natural," linearizable, two-parameter models) wouldn't be more appropriate -
>> it would almost certainly be simpler to analyze (and perhaps easier to
>> justify to a referee).
>>
>> On this note, perhaps some of our experts might care to comment: what
>> "physics" (in a generalized sense) gives rise to a polynomial dependency of
>> degree higher than two? The only generic thing I can think of is something
>> where third or higher order derivatives proportional to the independent
>> variable are important, and those are pretty uncommon.
>
> I will only be fitting data to a first or second degree polynomial.
> Eventually I would like to fit my data to an exponential or a power
> law, just to see how it compares to a low-order polynomial. Choosing
> these functions was based on qualitative analysis (i.e. "it looks
> quadratic").
I should say, then, that most of my dire-sounding comments can be
ignored in this case as long as you rescale the data to (-1,1) before
fitting your quadratic. (Which, come to think of it, gives you very
nearly the Chebyshev basis, but never mind.)
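To make the rescaling concrete, here is a small sketch (with made-up data, just for illustration) of mapping the independent variable onto (-1, 1) before calling numpy's polyfit, which keeps the fitting problem well conditioned:

```python
import numpy as np

# Hypothetical data for illustration: roughly quadratic with noise
rng = np.random.default_rng(0)
x = np.linspace(400.0, 500.0, 50)
y = 3.0 + 0.02 * (x - 450.0) ** 2 + rng.normal(0.0, 1.0, x.size)

# Map x linearly onto (-1, 1) before fitting; this keeps the
# Vandermonde matrix well conditioned for the quadratic fit
xs = 2.0 * (x - x.min()) / (x.max() - x.min()) - 1.0

coeffs = np.polyfit(xs, y, 2)   # quadratic fit in the rescaled variable
yfit = np.polyval(coeffs, xs)   # fitted values, for residual checks
```

Remember that the returned coefficients are in the rescaled variable, so you have to apply the same mapping to any new x values before evaluating the polynomial.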
Exponentials and power laws are even easier, since they're linear when
you take logs of the appropriate coordinates. (Though here too, it
helps to center your data before fitting.)
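As a sketch of the log-linearization (again with invented data): a power law y = a*x**b becomes a straight line in log-log coordinates, log y = log a + b*log x, so an ordinary linear fit recovers the parameters:

```python
import numpy as np

# Hypothetical power-law data with multiplicative noise
rng = np.random.default_rng(1)
x = np.linspace(1.0, 10.0, 40)
y = 2.5 * x ** 1.7 * rng.lognormal(0.0, 0.05, x.size)

# In log-log space the power law is linear:
#   log y = log a + b * log x
b, log_a = np.polyfit(np.log(x), np.log(y), 1)
a = np.exp(log_a)
```

For an exponential y = a*exp(b*x), take the log of y only and fit against x itself. Note that fitting in log space implicitly assumes the noise is multiplicative; if your errors are additive, the log transform reweights them, which may or may not be what you want.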
> The best case scenario would be that I take what I learn from this
> "simple" example and apply it to more difficult problems as they come
> along down the road. It appears, however, that it's not so simple to
> apply it to other problems. I wish I had more time to learn about
> fitting data to curves. I'm sure there are a lot of powerful tools
> that can help.
This is a fine approach; in fact even if you end up doing non-linear
fitting (i.e. fits where the function is not a linear combination of
the parameters), the standard approach to getting errors is to just
use the derivatives of the fitting function at the best fit to treat
the problem as locally linear. (There are more elaborate alternatives
as well, including Monte Carlo approaches, which may be worth looking
into.)
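One standard way to get those locally-linear error estimates in scipy is curve_fit, which returns the covariance matrix of the parameters computed from the derivatives at the best fit; the 1-sigma uncertainties are the square roots of its diagonal. A sketch, with invented data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical quadratic data with Gaussian noise
rng = np.random.default_rng(2)
x = np.linspace(-1.0, 1.0, 60)
y = 1.0 - 0.5 * x + 2.0 * x ** 2 + rng.normal(0.0, 0.1, x.size)

def quadratic(x, c0, c1, c2):
    return c0 + c1 * x + c2 * x ** 2

# curve_fit treats the problem as locally linear about the best fit
# and returns the parameter covariance matrix
popt, pcov = curve_fit(quadratic, x, y)
perr = np.sqrt(np.diag(pcov))  # 1-sigma errors on c0, c1, c2
```

The same pattern carries over unchanged to genuinely non-linear models, which is why it is a reasonable thing to learn on this "simple" example first.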
Anne