[Numpy-discussion] weird error message in scipy.sparse.eigen function: Segmentation fault

David Cournapeau david@silveregg.co...
Thu Jan 28 00:11:25 CST 2010


Jankins wrote:
> Yes. I am using scipy.sparse.linalg.eigen.arpack.
> 
> The exact output is:
> 
> /usr/local/lib/python2.6/dist-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so

What I actually need is the output of ldd on this file, i.e. the output of "ldd 
/usr/local/lib/python2.6/dist-packages/scipy/sparse/linalg/eigen/arpack/_arpack.so". 
That should list the shared libraries the OS actually loads for this extension.
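If it is easier, something like the following rough sketch (assuming the module
layout shown in the path above) locates the extension from Python and runs ldd
on it:

    # Locate the compiled ARPACK wrapper and run ldd on it.  The interesting
    # lines in the output are the BLAS/LAPACK/ATLAS and libgfortran entries.
    import subprocess
    import scipy.sparse.linalg.eigen.arpack._arpack as _arpack

    so_path = _arpack.__file__   # .../scipy/sparse/linalg/eigen/arpack/_arpack.so
    subprocess.call(["ldd", so_path])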

> In fact, the matrix is from a directed graph with about 18,000 nodes and 
> 41,000 edges. Actually, this matrix is the smallest one I used.

Is it available somewhere? 41,000 edges should make the matrix very 
sparse. I first thought your problem might be a buggy ATLAS, but the 
current arpack interface (the one used by sparse.linalg.eigen) is also 
quite buggy in my experience, although I have not been able to reproduce 
this particular crash myself. Having a matrix that consistently 
reproduces the bug would be very useful.

In the short term, you may want to do without arpack support in scipy. 
In the longer term, I intend to improve support for sparse matrix 
linear algebra, as it is needed for my new job.
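Something along these lines (a rough sketch, with a small stand-in matrix
instead of your actual graph) avoids the arpack wrapper entirely, at the cost
of densifying the matrix, which is only reasonable while the dense form still
fits in memory:

    # Work around the arpack wrapper: densify the sparse matrix and use the
    # LAPACK-backed dense solver in scipy.linalg instead.
    import numpy as np
    import scipy.sparse as sp
    import scipy.linalg

    A = sp.csr_matrix(np.array([[2., 1., 0.],
                                [0., 3., 1.],
                                [0., 0., 4.]]))   # stand-in for the real matrix
    w = scipy.linalg.eigvals(A.toarray())         # full (dense) spectrum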

> Now I have switched to numpy.linalg.eigvals, but it is slower than the 
> scipy.sparse.linalg.eigen.arpack module.

If you have a reasonable ATLAS install, scipy.linalg.eigvals should 
actually be quite fast. Sparse eigenvalue solvers are in general much 
slower than dense ones (see the short sketch after this list) as long as:
	- your matrices are tiny (with tiny defined here as the dense matrix 
requiring at least one order of magnitude less memory than the total 
available memory, so something like matrices with ~ 1e7/1e8 entries on 
current desktop computers)
	- you need more than a few eigenvalues, or not just the 
biggest/smallest ones
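Conversely, when the matrix is genuinely large and you only need a few
extreme eigenvalues, the ARPACK route is the right tool. A small illustration
with a random stand-in matrix (written against the modern
scipy.sparse.linalg.eigs name; in the version discussed above this
functionality goes through sparse.linalg.eigen):

    # Ask ARPACK for only the 6 eigenvalues of largest magnitude of a large,
    # very sparse random matrix; a dense solver would need the full n x n array.
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 5000
    A = sp.rand(n, n, density=1e-3, format='csr')   # random stand-in matrix
    w = spla.eigs(A, k=6, which='LM', return_eigenvectors=False)
    print(w)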

cheers,

David

