[Numpy-discussion] doing zillions of 3x3 determinants fast
Sun Aug 24 22:48:54 CDT 2008
I need to take the determinants of a large number of 3x3 matrices, in
order to determine, for each of N points, which of M tetrahedral cells
it lies in. I arrange the matrices in an ndarray of shape (N,M,5,3,3).
As far as I can tell, Numpy doesn't have a function to do determinants
over a specific set of axes... so I can't say:
dets = np.linalg.det(a, axis1=-2, axis2=-1)
Using an explicit Python loop is very slow... doing the determinants of
100,000 random 3x3 matrices in a loop and inserting the results into a
preallocated results array takes about 5.1 seconds.
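For reference, the loop-based baseline I'm describing looks roughly like this (the array sizes and variable names here are illustrative, not from my actual code):

```python
import numpy as np

# 100,000 random 3x3 matrices (illustrative size)
mats = np.random.rand(100000, 3, 3)

# Baseline: one linalg.det call per matrix, writing each
# result into a preallocated output array.
dets = np.empty(len(mats))
for k in range(len(mats)):
    dets[k] = np.linalg.det(mats[k])
```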
So instead I coded up my own routine to do the determinants over the last
2 axes, based on the naive textbook formula for a 3x3 determinant:
a=ar[...,0,0]; b=ar[...,0,1]; c=ar[...,0,2]
d=ar[...,1,0]; e=ar[...,1,1]; f=ar[...,1,2]
g=ar[...,2,0]; h=ar[...,2,1]; i=ar[...,2,2]
dets = a*e*i + b*f*g + c*d*h - a*f*h - b*d*i - c*e*g
This is *much much* faster... it does the same 100,000 determinants in
0.07 seconds. And the results differ from linalg.det's results by less
than 1 part in 10^15 on average (w/ a std dev of about 1 part in 10^13).
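Wrapped up as a function, the cofactor-expansion approach above might look like this (the function name and the comparison harness are mine, just to show how I'm checking it against linalg.det):

```python
import numpy as np

def det3x3(ar):
    """Determinants over the last two axes of an (..., 3, 3) array,
    using the textbook cofactor expansion of a 3x3 determinant."""
    a = ar[..., 0, 0]; b = ar[..., 0, 1]; c = ar[..., 0, 2]
    d = ar[..., 1, 0]; e = ar[..., 1, 1]; f = ar[..., 1, 2]
    g = ar[..., 2, 0]; h = ar[..., 2, 1]; i = ar[..., 2, 2]
    return a*e*i + b*f*g + c*d*h - a*f*h - b*d*i - c*e*g

# Compare against one-at-a-time linalg.det on random matrices
mats = np.random.rand(1000, 3, 3)
fast = det3x3(mats)
slow = np.array([np.linalg.det(m) for m in mats])
```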
Does anyone know of any well-written routines to do lots and lots of
determinants of a fixed size? My routine has a couple problems I can see:
* it's fast enough for 100,000 determinants, but it bogs down due to
all the temporary arrays when I try to do 1,000,000 determinants
(a 72 MB input array)
* I've heard the numerical stability of the naive determinant
algorithm is fairly poor, so I'm reluctant to rely on this function
for real data
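One way I can imagine bounding the temporaries is to apply the same formula in fixed-size chunks along a flattened leading axis (a sketch of that idea; the function name and chunk size are arbitrary, not something I've benchmarked):

```python
import numpy as np

def det3x3_chunked(ar, chunk=100000):
    """3x3 determinants over the last two axes of an (..., 3, 3)
    array, computed chunk-by-chunk so each of the nine temporary
    arrays never exceeds `chunk` elements."""
    ar = np.asarray(ar)
    flat = ar.reshape(-1, 3, 3)          # flatten all leading axes
    out = np.empty(flat.shape[0])
    for start in range(0, flat.shape[0], chunk):
        m = flat[start:start + chunk]
        a = m[:, 0, 0]; b = m[:, 0, 1]; c = m[:, 0, 2]
        d = m[:, 1, 0]; e = m[:, 1, 1]; f = m[:, 1, 2]
        g = m[:, 2, 0]; h = m[:, 2, 1]; i = m[:, 2, 2]
        out[start:start + chunk] = (a*e*i + b*f*g + c*d*h
                                    - a*f*h - b*d*i - c*e*g)
    return out.reshape(ar.shape[:-2])

mats = np.random.rand(2500, 3, 3)
dets = det3x3_chunked(mats, chunk=1000)
```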
Any advice will be appreciated!