[Numpy-discussion] doing zillions of 3x3 determinants fast

Daniel Lenski dlenski@gmail....
Sun Aug 24 23:01:18 CDT 2008

On Mon, 25 Aug 2008 03:48:54 +0000, Daniel Lenski wrote:
>   * it's fast enough for 100,000 determinants, but it bogs due to
>     all the temporary arrays when I try to do 1,000,000 determinants
>     (=72 MB array)

I've managed to reduce the memory usage significantly by keeping at most
two temporary arrays alive at any one time:

def det3(ar):
    """Determinants of a stack of 3x3 matrices, ar[..., 3, 3],
    computed by the rule of Sarrus with in-place arithmetic to
    limit temporaries."""
    a=ar[...,0,0]; b=ar[...,0,1]; c=ar[...,0,2]
    d=ar[...,1,0]; e=ar[...,1,1]; f=ar[...,1,2]
    g=ar[...,2,0]; h=ar[...,2,1]; i=ar[...,2,2]

    # det = aei + bfg + cdh - ceg - afh - bdi:
    # t is the scratch array, tot the running total
    t=a.copy(); t*=e; t*=i; tot =t
    t=b.copy(); t*=f; t*=g; tot+=t
    t=c.copy(); t*=d; t*=h; tot+=t
    t=g.copy(); t*=e; t*=c; tot-=t
    t=h.copy(); t*=f; t*=a; tot-=t
    t=i.copy(); t*=d; t*=b; tot-=t

    return tot
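A quick sanity check of the function above, comparing against NumPy's LU-based np.linalg.det one matrix at a time (a plain loop works on any NumPy version; recent versions also accept the stacked array directly). The array sizes and seed here are just for illustration:

```python
import numpy as np

def det3(ar):
    # in-place Sarrus expansion from the post above
    a=ar[...,0,0]; b=ar[...,0,1]; c=ar[...,0,2]
    d=ar[...,1,0]; e=ar[...,1,1]; f=ar[...,1,2]
    g=ar[...,2,0]; h=ar[...,2,1]; i=ar[...,2,2]
    t=a.copy(); t*=e; t*=i; tot =t
    t=b.copy(); t*=f; t*=g; tot+=t
    t=c.copy(); t*=d; t*=h; tot+=t
    t=g.copy(); t*=e; t*=c; tot-=t
    t=h.copy(); t*=f; t*=a; tot-=t
    t=i.copy(); t*=d; t*=b; tot-=t
    return tot

rng = np.random.RandomState(0)
ar = rng.rand(1000, 3, 3)
# reference values, one LU factorization per matrix
ref = np.array([np.linalg.det(m) for m in ar])
print(np.allclose(det3(ar), ref))  # → True
```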

Now it runs fast even with 1,000,000 determinants to do (less than 10X
the time required for 100,000), but I'm still worried about numerical
stability with real-world data.
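One rough way to probe that worry: build matrices that are deliberately close to singular (here, row 1 made nearly parallel to row 0, so the true determinant is tiny and cancellation dominates), and compare against np.linalg.det. This is only a sketch with an arbitrary construction, not a proper error analysis:

```python
import numpy as np

def det3(ar):
    # in-place Sarrus expansion from the post above
    a=ar[...,0,0]; b=ar[...,0,1]; c=ar[...,0,2]
    d=ar[...,1,0]; e=ar[...,1,1]; f=ar[...,1,2]
    g=ar[...,2,0]; h=ar[...,2,1]; i=ar[...,2,2]
    t=a.copy(); t*=e; t*=i; tot =t
    t=b.copy(); t*=f; t*=g; tot+=t
    t=c.copy(); t*=d; t*=h; tot+=t
    t=g.copy(); t*=e; t*=c; tot-=t
    t=h.copy(); t*=f; t*=a; tot-=t
    t=i.copy(); t*=d; t*=b; tot-=t
    return tot

rng = np.random.RandomState(1)
base = rng.rand(1000, 3, 3)
near = base.copy()
# row 1 ~ row 0, so det(near) ~ 1e-6 * det(base): heavy cancellation
near[:, 1, :] = near[:, 0, :] + 1e-6 * base[:, 1, :]

ref = np.array([np.linalg.det(m) for m in near])
rel_err = np.abs(det3(near) - ref) / np.abs(ref)
print(np.median(rel_err))  # typically tiny; the worst cases are what matter
```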

