[SciPy-User] Sparse Matrices and NNLS
Thu Mar 28 15:05:35 CDT 2013
Hey Calvin, I was just looking into this same issue recently. Here's the
solution someone recommended on Stack Overflow:
I haven't actually used it yet, but it seems PyTables is the way to go.
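A minimal sketch of the out-of-core idea with PyTables, assuming it is installed; the file name, sizes, and block size below are illustrative, not from the thread:

```python
# Store a large matrix on disk and touch it in row blocks, so the whole
# array never has to sit in RAM at once. Sizes here are stand-ins; the
# real matrix from the thread is far larger.
import os
import tempfile

import numpy as np
import tables

n_rows, n_cols = 1000, 50
path = os.path.join(tempfile.mkdtemp(), "big_matrix.h5")

# Write the matrix to disk in blocks of 100 rows.
with tables.open_file(path, mode="w") as f:
    atom = tables.Float64Atom()
    carr = f.create_carray(f.root, "A", atom, shape=(n_rows, n_cols),
                           filters=tables.Filters(complevel=5))
    for start in range(0, n_rows, 100):
        carr[start:start + 100, :] = np.random.rand(100, n_cols)

# Read back a single block; only that slice is loaded into memory.
with tables.open_file(path, mode="r") as f:
    block = f.root.A[:100, :]
```

The chunked CArray lets you slice arbitrary row ranges off disk without ever materializing the full array, which is the part that was blowing up memory.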
On Thu, Mar 28, 2013 at 12:46 PM, Calvin Morrison <firstname.lastname@example.org> wrote:
> I have a very large matrix that I am using with the
> scipy.optimize.nnls function; however, the matrix is so large that it
> takes 30 GB of memory to load in Python!
> I was thinking about using a sparse matrix, since it is relatively
> sparse, but the problem is that even if I were to use a sparse
> matrix, the nnls function only accepts an ndarray, not a sparse
> matrix. (When I try to pass a sparse matrix to it, I get an error.)
> So of course I'd need to convert it to a dense array before passing it
> into nnls, but that would defeat the whole point of a sparse
> matrix, because then Python would still have to allocate the full
> dense array anyway.
> Does anyone have an idea about how to efficiently pass and store the
> matrix without blowing up the memory usage?
> Thank you,
> SciPy-User mailing list
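For completeness: the memory wall described above is real for scipy.optimize.nnls, which requires a dense array. A later SciPy release (0.17) added scipy.optimize.lsq_linear, which can solve the same nonnegative least-squares problem while accepting a sparse matrix directly; a small sketch with made-up sizes and density:

```python
# Nonnegative least squares on a sparse matrix without densifying it,
# using scipy.optimize.lsq_linear (added after this thread was written).
# The problem sizes here are illustrative only.
import numpy as np
from scipy.optimize import lsq_linear
from scipy.sparse import random as sparse_random

A = sparse_random(300, 80, density=0.02, format="csr", random_state=0)
b = np.random.default_rng(0).standard_normal(300)

# bounds=(0, inf) turns bounded least squares into an NNLS problem;
# the sparse A is passed as-is and never converted to a dense array.
res = lsq_linear(A, b, bounds=(0, np.inf))
x = res.x  # nonnegative solution vector
```

Internally the sparse path uses an iterative solver (LSMR), so memory stays proportional to the number of nonzeros rather than the full dense shape.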