[Numpy-discussion] inversion of large matrices
Thu Sep 9 11:18:29 CDT 2010
> Yes, this is what I am computing. I am computing the pdf of a very high-
> dimensional multivariate normal. Is there a specialized method to compute
If you use cho_solve and cho_factor from scipy.linalg, you can proceed
like this:

    from scipy.linalg import cho_factor, cho_solve

    cx = X - m
    sqmahal = (cx * cho_solve(cho_factor(S), cx.T).T).sum(axis=1)

where X is the data (n x p), m is the mean (1 x p) and S is the covariance
(p x p).
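For concreteness, here is a self-contained sketch of that recipe on random
data (the names X, m, S match the above; the random test matrix is mine, not
from the original question):

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)
n, p = 1000, 5
X = rng.standard_normal((n, p))      # data, n x p
m = X.mean(axis=0)                   # mean, 1 x p
A = rng.standard_normal((p, p))
S = A @ A.T + p * np.eye(p)          # symmetric positive definite "covariance", p x p

cx = X - m
# Squared Mahalanobis distances via a Cholesky solve -- no explicit inverse.
sqmahal = (cx * cho_solve(cho_factor(S), cx.T).T).sum(axis=1)

# Sanity check against the naive formulation with an explicit inverse.
ref = np.einsum('ij,jk,ik->i', cx, np.linalg.inv(S), cx)
assert np.allclose(sqmahal, ref)
```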
We could do this twice as fast using a hypothetical method tri_solve that
calls the LAPACK subroutine DTRTRS:

    cx = X - m
    sqmahal = (tri_solve(cho_factor(S), cx.T).T**2).sum(axis=1)
It would be twice as fast because cho_solve does 50% redundant work in the
calculation of Mahalanobis distances, as we can skip the backsubstitution.
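SciPy's solve_triangular (scipy.linalg) can play the role of the
hypothetical tri_solve, provided your SciPy is recent enough to ship it — a
sketch under that assumption, with S, X, m as above:

```python
import numpy as np
from scipy.linalg import cho_factor, solve_triangular

rng = np.random.default_rng(1)
n, p = 500, 4
X = rng.standard_normal((n, p))
m = X.mean(axis=0)
A = rng.standard_normal((p, p))
S = A @ A.T + p * np.eye(p)

cx = X - m
# Factor S = L L^T (lower triangular L).
c, low = cho_factor(S, lower=True)
# One triangular solve: z = L^{-1} cx^T, so the squared distance of row i
# is just ||z[:, i]||^2 -- the backsubstitution step is skipped entirely.
z = solve_triangular(c, cx.T, lower=True)
sqmahal = (z**2).sum(axis=0)

# Same result as the cho_solve-based formulation.
ref = (cx * np.linalg.solve(S, cx.T).T).sum(axis=1)
assert np.allclose(sqmahal, ref)
```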
You can e.g. get DTRTRS from Intel MKL, ACML (free download from AMD) or
GotoBLAS. Just downloading ACML and using ctypes (or f2py or Cython) to
get DTRTRS is easy. If you do this you probably also want to expose
LAPACK's Cholesky factorization subroutine DPOTRF from ACML or Intel MKL,
to make sure you get everything as fast as possible (that would also avoid
the dependency on SciPy).
I.e. factor S with DPOTRF, compute CX, call DTRTRS, then take the sum of
squares along the first axis (of CX.T, which DTRTRS modified in place).
That is the fastest way to compute Mahalanobis distances I can think of...
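Modern SciPy actually exposes these two routines in scipy.linalg.lapack
(dpotrf, dtrtrs) — whether your installed version has them is an assumption
on my part. A sketch of the pipeline just described, using those wrappers:

```python
import numpy as np
from scipy.linalg.lapack import dpotrf, dtrtrs

rng = np.random.default_rng(2)
n, p = 300, 3
X = rng.standard_normal((n, p))
m = X.mean(axis=0)
A = rng.standard_normal((p, p))
S = A @ A.T + p * np.eye(p)

cx = X - m
# Step 1: factor S = L L^T with DPOTRF.
L, info = dpotrf(S, lower=1)
assert info == 0                     # info != 0 means S is not SPD
# Step 2: one triangular solve with DTRTRS: z = L^{-1} cx^T.
z, info = dtrtrs(L, cx.T, lower=1, trans=0)
assert info == 0
# Step 3: sum of squares along the first axis of the solved CX.T.
sqmahal = (z**2).sum(axis=0)

ref = (cx * np.linalg.solve(S, cx.T).T).sum(axis=1)
assert np.allclose(sqmahal, ref)
```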
Make sure you use an optimized LAPACK library: ACML, Intel MKL and
GotoBLAS are the only ones worth looking at. ATLAS is much slower, and
forget about Netlib's reference implementations of BLAS and LAPACK.
I hope the SciPy dev team can be persuaded to include a wrapper for DTRTRS
in the future. It is after all extremely useful for Mahalanobis distances,
and thus for any use of linear models in statistics.