[SciPy-User] Generalized least square on large dataset
Charles R Harris
Wed Mar 7 22:04:04 CST 2012
On Wed, Mar 7, 2012 at 8:58 PM, <email@example.com> wrote:
> On Wed, Mar 7, 2012 at 10:46 PM, Charles R Harris
> <firstname.lastname@example.org> wrote:
> > On Wed, Mar 7, 2012 at 7:39 PM, Peter Cimermančič
> > <email@example.com> wrote:
> >> Hi,
> >> I'd like to linearly fit data that were NOT sampled independently. I
> >> came across the generalized least squares method:
> >> b = (X'*V^(-1)*X)^(-1) * X'*V^(-1)*Y
> >> X and Y are coordinates of the data points, and V is a "variance
> >> matrix". The equation is in Matlab format - I've tried solving the
> >> problem there too, and it didn't work - but eventually I'd like to be
> >> able to solve problems like that in Python. The problem is that due to
> >> its size (1000 rows and columns), the V matrix becomes singular, thus
> >> un-invertible. Any suggestions for how to get around this problem?
> >> Maybe using a way of solving the generalized regression problem other
> >> than GLS?
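One standard way around forming V^(-1) explicitly is to whiten the problem with a Cholesky factor of V and solve an ordinary least-squares problem instead; when V is singular, a pseudo-inverse fallback is one common (if crude) workaround. A minimal sketch, assuming V is symmetric and the function name `gls_fit` is ours, not from any library:

```python
import numpy as np
from scipy import linalg

def gls_fit(X, y, V):
    """GLS estimate b = (X' V^-1 X)^-1 X' V^-1 y, without forming V^-1.

    Whitens with a Cholesky factor of V (V = L L'), reducing GLS to
    ordinary least squares on L^-1 X and L^-1 y.  If V is singular and
    Cholesky fails, falls back to whitening with a symmetric square
    root of the pseudo-inverse of V.
    """
    try:
        L = linalg.cholesky(V, lower=True)               # V = L L'
        Xw = linalg.solve_triangular(L, X, lower=True)   # L^-1 X
        yw = linalg.solve_triangular(L, y, lower=True)   # L^-1 y
    except linalg.LinAlgError:
        # Singular V: use pinv(V) and its symmetric square root.
        w, U = linalg.eigh(linalg.pinvh(V))
        W = (U * np.sqrt(np.clip(w, 0.0, None))) @ U.T
        Xw, yw = W @ X, W @ y
    b, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
    return b
```

With V equal to the identity this reduces to plain OLS, and with a diagonal V it reduces to weighted least squares, which gives a quick sanity check.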
> > Plain old least squares will probably do a decent job for the fit;
> > where you will run into trouble is if you want to estimate the
> > covariance.
> side question:
> Are heteroscedasticity- and (auto)correlation-robust standard errors
> popular in any field outside of economics/econometrics - the so-called
> sandwich estimators of the covariance matrix?
> (estimate with OLS, ignoring the non-independent and non-identical
> noise, but correct the covariance matrix)
> I recently expanded this in statsmodels, and would like to start some
> advertising in favor of sandwiches soon.
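The sandwich idea josef describes can be sketched with NumPy alone: fit by plain OLS, then build the heteroscedasticity-robust (HC0, "White") covariance from the bread (X'X)^-1 and the meat X' diag(e^2) X. This is an illustrative sketch, not the statsmodels implementation, and the helper name `ols_with_hc0` is ours:

```python
import numpy as np

def ols_with_hc0(X, y):
    """OLS fit plus the HC0 (White) sandwich covariance estimate.

    The 'bread' is (X'X)^-1, the 'meat' is X' diag(e^2) X with e the
    OLS residuals; the sandwich bread @ meat @ bread corrects the
    covariance for heteroscedastic noise while the point estimate
    itself is plain OLS.
    """
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b                                  # OLS residuals
    bread = np.linalg.inv(X.T @ X)
    meat = X.T @ (e[:, None] ** 2 * X)             # X' diag(e^2) X
    cov_hc0 = bread @ meat @ bread                 # the sandwich
    return b, cov_hc0
```

In statsmodels itself this correction is available directly on the results of an OLS fit via the robust covariance options, so the sketch above is mainly to show where the "sandwich" name comes from.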
I'm not familiar with them, but I can't speak for many fields. Indeed,
there seems to be only the most rudimentary understanding of statistics
in many fields, basically reducible to root sum of squares for the more
sophisticated ;) But I think I was contemplating something similar to
what you mention.