[SciPy-user] OLS matrix-f(x) = 0 problem (Was: linear regression)

Gael Varoquaux gael.varoquaux@normalesup....
Wed May 27 16:53:40 CDT 2009

On Wed, May 27, 2009 at 03:55:18PM -0500, Robert Kern wrote:
> On Wed, May 27, 2009 at 15:50, Gael Varoquaux
> <gael.varoquaux@normalesup.org> wrote:
> > I have been fighting a bit with an OLS regression problem (my ignorance
> > of regression is wide), and a remark by Robert just prompted me to ask
> > the list:

> > On Wed, May 27, 2009 at 02:37:14PM -0500, Robert Kern wrote:
> >> "f(x)=0" models can express covariances between all dimensions of x.

> > Sorry for asking you about my 'homework', but people seem so
> > knowledgeable...

> > I have a multivariate dataset X, and a given sparse, lower-triangular,
> > boolean matrix T with an empty diagonal. I am interested in finding the
> > matrix R for which support(R) == support(T), that is, the OLS solution to:

> > Y = np.dot(R, Y)

> Where did Y come from? And where did X and T go?

Darn, sorry. Y and X are the same thing: my data. T is only there to
specify the support of R. Another way to put it is that I know that a
large fraction of the coefficients of R are zeros.

I have a hunch that I need to 'unroll' the non-zero coefficients and get
back to a simpler, well-known OLS estimation problem, but I haven't
managed to do it.
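One possible reading of that 'unrolling' (a sketch, not necessarily what Gael had in mind): because the diagonal of T is empty, each row of the constraint X = np.dot(R, X) decouples, and row i is just an ordinary least-squares regression of variable i on the variables selected by T's row i. The helper name `masked_ols` and the (p variables, n samples) layout of X are assumptions for illustration:

```python
import numpy as np

def masked_ols(X, T):
    """Row-wise OLS for X ~= R @ X with support(R) == support(T).

    X : (p, n) array -- p variables, n samples (layout assumed here).
    T : (p, p) boolean mask, lower triangular with an empty diagonal.
    Returns R with zeros everywhere T is False.
    """
    p = X.shape[0]
    R = np.zeros((p, p))
    for i in range(p):
        idx = np.flatnonzero(T[i])  # predictors allowed for row i
        if idx.size == 0:
            continue  # row i has no free coefficients; leave it zero
        # Solve X[idx].T @ coef ~= X[i] in the least-squares sense.
        coef, *_ = np.linalg.lstsq(X[idx].T, X[i], rcond=None)
        R[i, idx] = coef
    return R
```

Each `lstsq` call is a standard dense OLS fit on only the columns that T permits, so the sparse constrained problem reduces to p small unconstrained ones.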
