[SciPy-user] linear regression
Wed May 27 14:03:54 CDT 2009
On Wed, May 27, 2009 at 13:28, <email@example.com> wrote:
> On Wed, May 27, 2009 at 12:35 PM, ms <firstname.lastname@example.org> wrote:
>> email@example.com ha scritto:
>>>> Have a look here <http://www.scipy.org/Cookbook/LinearRegression>
>>> y = Beta0 + Beta1 * x + Beta2 * x**2 is the second order polynomial.
>>> I also should have looked, polyfit returns the polynomial coefficients
>>> but doesn't calculate the variance-covariance matrix or standard
>>> errors of the OLS estimate.
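A minimal sketch of getting those by hand from the same design matrix
polyfit uses (the data and names here are made up for illustration):

```python
import numpy as np

# Synthetic data for y = 1 + 2*x + 3*x**2 plus noise (hypothetical example)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x + 3.0 * x**2 + rng.normal(scale=0.1, size=x.size)

# Design matrix with columns [1, x, x**2]; OLS via least squares
X = np.vander(x, 3, increasing=True)
beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)

# Residual variance, variance-covariance matrix, and standard errors
n, k = X.shape
resid = y - X @ beta
sigma2 = resid @ resid / (n - k)          # unbiased estimate of error variance
cov = sigma2 * np.linalg.inv(X.T @ X)     # var-cov matrix of the OLS estimate
se = np.sqrt(np.diag(cov))                # standard errors of the coefficients
```

(Recent numpy versions can also return the covariance directly via
np.polyfit(x, y, 2, cov=True).)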
>> AFAIK, the ODR fitting routines return all these parameters, so one can
>> maybe use that for linear fitting too.
> you mean scipy.odr?
> I never looked at it in detail. Conceptually it is very similar to
> standard regression, but I've never seen an application of it, nor do
> I know its probability-theoretic or econometric background.
ODR is nonlinear least-squares with errors in both variables (e.g.
minimizing the weighted sum of squared distances from each point to
the corresponding closest point on the curve rather than "straight
down" as in OLS). scipy.odr implements both ODR and OLS. It also
implements implicit regression, where the relationship between
variables is not expressed as "y=f(x)" but as "f(x,y)=0", such as
fitting an ellipse to a cloud of points.
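For the explicit "y = f(x)" case, a minimal scipy.odr sketch looks like
this (synthetic data and the 0.2 noise levels are my own assumptions):

```python
import numpy as np
from scipy import odr

# Synthetic line with noise in both x and y (hypothetical example)
rng = np.random.default_rng(1)
x_true = np.linspace(0.0, 10.0, 40)
x_obs = x_true + rng.normal(scale=0.2, size=x_true.size)
y_obs = 0.5 * x_true + 2.0 + rng.normal(scale=0.2, size=x_true.size)

def linear(beta, x):
    # Model f(x; beta) = beta[0]*x + beta[1]
    return beta[0] * x + beta[1]

model = odr.Model(linear)
# RealData takes the standard deviations of the x and y measurements
data = odr.RealData(x_obs, y_obs, sx=0.2, sy=0.2)
fit = odr.ODR(data, model, beta0=[1.0, 0.0]).run()

# fit.beta: estimates; fit.sd_beta: standard errors; fit.cov_beta: covariance
```

Passing implicit=True to Model (with the target written as f(x, y) = 0)
gives the implicit-regression variant mentioned above.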
> results for many cases will be relatively close to standard least squares.
> A google search shows links to curve fitting but not to any
> econometric theory. On the other hand, there is a very large
> literature on how to treat measurement errors and endogeneity of
> regressors for (standard) least squares and maximum likelihood.
The extension is straightforward. ODR is really just a generalization
of least-squares. Unfortunately, the links to the relevant papers seem
to have died. I've put them up here:
"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
-- Umberto Eco