[SciPy-User] Sparse Matrices and NNLS

Michael Morrison mwmorrison93@gmail....
Thu Mar 28 15:05:35 CDT 2013


Hey Calvin, I was just looking into this same issue recently. Here's the
solution someone recommended on Stack Overflow:

http://stackoverflow.com/questions/1053928/python-numpy-very-large-matrices

I haven't actually used it yet, but it seems PyTables is the way to go.
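The basic idea can be sketched even without PyTables: keep the matrix in a file on disk and page in only the slices you touch. Here is a minimal, illustrative example using numpy.memmap (the shape and file name are made up); PyTables gives you the same out-of-core access, with HDF5 compression and richer indexing on top:

```python
import os
import tempfile

import numpy as np

# Stand-in for the real, much larger matrix shape.
shape = (1000, 200)
fname = os.path.join(tempfile.mkdtemp(), "big_matrix.dat")

# Write phase: fill the file in chunks, so the whole array
# never has to live in RAM at once.
mm = np.memmap(fname, dtype=np.float64, mode="w+", shape=shape)
for i in range(0, shape[0], 100):
    mm[i:i + 100] = np.random.rand(100, shape[1])
mm.flush()

# Read phase: opening the memmap is cheap; only the rows you
# actually access are read from disk.
mm2 = np.memmap(fname, dtype=np.float64, mode="r", shape=shape)
row_sum = mm2[:10].sum()
```

The same chunked write / sliced read pattern carries over directly to a PyTables CArray or EArray.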

Good Luck,
Mike


On Thu, Mar 28, 2013 at 12:46 PM, Calvin Morrison <mutantturkey@gmail.com> wrote:

> Hi!
>
> I have a very large matrix that I am using with the
> scipy.optimize.nnls function; however, the matrix is so large that it
> takes 30 GB of memory to load with Python!
>
> I was thinking about using a sparse matrix since it is relatively
> sparse, but the problem is that even if I were to use a sparse
> matrix, the nnls function only accepts an ndarray, not a sparse
> matrix. (When I try to pass it a sparse matrix, I get an error.)
>
> So of course I'd need to convert it to a dense array before passing it
> into nnls, but that would defeat the whole point of a sparse
> matrix, because Python would still have to allocate the full dense
> matrix anyway.
>
> Does anyone have an idea about how to efficiently pass a matrix and
> store it without blowing up the memory usage?
>
> Thank you,
>
> Calvin
> _______________________________________________
> SciPy-User mailing list
> SciPy-User@scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-user
>

