[SciPy-User] [SciPy-user] Problem with np.load() on Huge Sparse Matrix
Ryan R. Rosario
Fri Jun 4 01:39:28 CDT 2010
Is this a bug? Has anybody else experienced this?
Not being able to load a matrix from disk is a huge limitation for me. I
would appreciate any help anyone can provide with this.
Ryan R. Rosario wrote:
> I have a huge sparse (395000 x 395000) CSC matrix that I cannot
> save in one pass, so I saved the data, indices, indptr and shape arrays
> in separate files, as suggested by Dave Wade-Farley a few years back.
> When I try to read the indices array back, what np.load returns does not
> match what is in memory:
> >>> np.save("indices.pickle", mymatrix.indices)
> >>> indices = np.load("indices.pickle.npy")
> >>> indices
> array([394852, 394649, 394533, ..., 0, 0, 0], dtype=int32)
> >>> mymatrix.indices
> array([394852, 394649, 394533, ..., 1557, 1223, 285], dtype=int32)
> Why is this happening? My only workaround is to print all of the entries
> of intersection_matrix.indices to a text file and read them back in,
> which takes up to 2 hours. It would be great if I could get np.load to
> work, because it is much faster.
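For reference, the component-wise save/reload round trip being described can be sketched as follows. This is a small stand-in (a 1000 x 1000 random matrix rather than the 395000 x 395000 one, written to a temporary directory), and on a current NumPy/SciPy the round trip reproduces the indices exactly:

```python
import os
import tempfile

import numpy as np
import scipy.sparse as sp

# Small stand-in for the 395000 x 395000 matrix in the post.
m = sp.random(1000, 1000, density=0.01, format="csc", dtype=np.float64)

tmp = tempfile.mkdtemp()

# Save each CSC component to its own file, as described above.
# Note: np.save appends ".npy" to the filename automatically.
np.save(os.path.join(tmp, "data"), m.data)
np.save(os.path.join(tmp, "indices"), m.indices)
np.save(os.path.join(tmp, "indptr"), m.indptr)
np.save(os.path.join(tmp, "shape"), np.array(m.shape))

# Reload the components and rebuild the matrix from them.
data = np.load(os.path.join(tmp, "data.npy"))
indices = np.load(os.path.join(tmp, "indices.npy"))
indptr = np.load(os.path.join(tmp, "indptr.npy"))
shape = tuple(np.load(os.path.join(tmp, "shape.npy")))
m2 = sp.csc_matrix((data, indices, indptr), shape=shape)

# The round trip should reproduce the index array exactly.
assert np.array_equal(m.indices, m2.indices)
```

If the loaded array differs from the in-memory one, as in the report above, that points at a save/load problem rather than at the component-wise scheme itself.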