[SciPy-User] segfault

Paul Anton Letnes paul.anton.letnes@gmail....
Sun Aug 14 15:45:30 CDT 2011


(I am cross posting this to the 

This code (see bottom of email) crashed with a segfault at the scipy.linalg.eigvals line:
% time python iterative-test.py
File read
zsh: segmentation fault  python iterative-test.py
python iterative-test.py  536.82s user 2.90s system 96% cpu 9:18.75 total

Numpy version: 1.6.1
Scipy version: 0.9.0
Python version: 2.7.2 (default, Jun 25 2011, 09:29:54) 
[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)]
Mac OS X 10.6.8

I have 4 GB of memory (and I believe Mac OS X will use as much free hard drive space for swap as it needs), and the matrix A has dtype numpy.complex64 and shape (4608, 4608). In Activity Monitor the process claims to use just shy of 370 MB of memory, and that does not grow over time. By my calculations the A matrix should be about 162 MB (ignoring object overhead, which should be small).
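The 162 MB figure checks out: a complex64 element is 8 bytes (two float32s), so the size works out as below (a quick sanity check, not part of the original script):

```python
# Size of a (4608, 4608) array of numpy.complex64 elements.
n = 4608
itemsize = 8  # bytes per complex64 element: 2 x 4-byte float32
nbytes = n * n * itemsize
mib = nbytes / float(2 ** 20)
print(nbytes, mib)  # 169869312 bytes = 162.0 MiB
```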

Anything I can do to help? I'd be happy to upload my matrix on my webpage, if someone wants to use it as test data. The hdf5 file is 162 MB so too big for the mailing list I suppose.


import time

import h5py
import numpy
import scipy.linalg


def main():
    f = h5py.File('A2.h5', 'r')
    A = f['A'][:]  # read the whole dataset into memory
    b = numpy.loadtxt('RHS_p_formatted_copy', dtype=numpy.float32)
    b = b[:, 0] + 1.0j * b[:, 1]  # combine the two columns into a complex vector
    print 'File read'

    t0 = time.time()
    print 'Eigvals:'
    w = scipy.linalg.eigvals(A, overwrite_a=True)  # <- segfaults here
    w = numpy.sort(w)
    t_eig = time.time()
    print 'eig time:', t_eig - t0
    print w


if __name__ == '__main__':
    main()
