[SciPy-User] How to estimate error in polynomial coefficients from scipy.polyfit?
Thu Mar 25 16:20:24 CDT 2010
On Thu, Mar 25, 2010 at 2:52 PM, Charles R Harris wrote:
> On Thu, Mar 25, 2010 at 2:32 PM, Jeremy Conlin <email@example.com> wrote:
>> I am using scipy.polyfit to fit a curve to my data. Members of this
>> list have been integral in my understanding of how to use this
>> function. Now I would like to know how I can get the uncertainties
>> (standard deviations) of polynomial coefficients from the returned
>> values from scipy.polyfit. If I understand correctly, the residuals
>> are sometimes called the R^2 error, right? That gives an estimate of
>> how well we have fit the data. I don't know how to use rank or any of
>> the other returned values to get the uncertainties.
>> Can someone please help?
> You want the covariance of the coefficients, var * (A.T A)^-1, where A is
> the design matrix and var is the residual variance. I'd have to see what
> the scipy fit returns to tell you more. In any case, from that you can
> plot curves at +/- sigma to show the error bounds on the result. I can be
> more explicit if you want.
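The recipe above can be sketched as follows. This is an illustration with made-up example data, not code from the thread; it builds the Vandermonde design matrix by hand and applies the formula var * (A.T A)^-1:

```python
# Sketch (hypothetical data): coefficient standard deviations from the
# design (Vandermonde) matrix of a polynomial least-squares fit.
import numpy as np

x = np.linspace(0.0, 10.0, 50)
y = 3.0 * x**2 - 2.0 * x + 1.0 + np.random.default_rng(0).normal(0.0, 5.0, x.size)

deg = 2
coeffs = np.polyfit(x, y, deg)

# Design matrix A: columns x^deg, ..., x, 1 (matches polyfit's ordering).
A = np.vander(x, deg + 1)

# Residual variance, with n - (deg + 1) degrees of freedom.
resid = y - A @ coeffs
sigma2 = resid @ resid / (x.size - (deg + 1))

# Covariance of the coefficients: var * (A^T A)^-1.
cov = sigma2 * np.linalg.inv(A.T @ A)
std = np.sqrt(np.diag(cov))  # one-sigma uncertainty of each coefficient
```

The diagonal of `cov` gives the variances Jeremy asked for; the off-diagonal entries show how strongly the coefficient estimates are correlated.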
Thanks Chuck. That seems to get closer to what I need. I am just
fitting data to a polynomial, nothing too fancy. I would like the
variance (not the covariance) of the coefficients. As for what
scipy.polyfit returns, this is what the documentation says:
residuals, rank, singular_values, rcond : present only if `full` = True
Residuals of the least-squares fit, the effective rank of the scaled
Vandermonde coefficient matrix, its singular values, and the specified
value of `rcond`. For more details, see `linalg.lstsq`.
I don't think any of these things is the "design matrix" you have
indicated I need. Unfortunately, the documentation for linalg.lstsq
does not say what rcond is. Any ideas?
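For readers finding this thread later: newer versions of numpy.polyfit grew a `cov=True` keyword that returns the coefficient covariance directly, so none of the `full=True` outputs are needed for this. A minimal sketch with hypothetical data:

```python
# Sketch: later NumPy versions of polyfit return the covariance matrix
# directly via cov=True (this option did not exist at the time of the thread).
import numpy as np

x = np.linspace(0.0, 10.0, 50)
y = 3.0 * x**2 - 2.0 * x + 1.0 + np.random.default_rng(1).normal(0.0, 5.0, x.size)

coeffs, cov = np.polyfit(x, y, 2, cov=True)
std = np.sqrt(np.diag(cov))  # standard deviations of the fitted coefficients
```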