[SciPy-User] Sparse Matrices and NNLS
Thu Mar 28 14:46:39 CDT 2013
I have a very large matrix that I am using with the
scipy.optimize.nnls function; however, the matrix is so large that it
takes 30 GB of memory to load in Python!
I was thinking about using a sparse matrix, since the data is relatively
sparse, but the problem is that even if I were to use sparse
matrices, the nnls function only accepts an ndarray, not a sparse
matrix. (When I try to pass it a sparse matrix I get an error.)
So of course I'd need to convert it to a dense array before passing it
into nnls, but that would defeat the whole point of a sparse
matrix, because then Python would still have to allocate the full dense array.
Does anyone have an idea about how to efficiently pass a matrix and
store it without blowing up the memory usage?
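One possible workaround (my own sketch, not something from this thread): instead of scipy.optimize.nnls, use scipy.optimize.lsq_linear, which accepts scipy.sparse matrices directly and can emulate NNLS by imposing a non-negativity bound on the solution. The small system below is an invented example just to show the call pattern:

```python
import numpy as np
from scipy import sparse
from scipy.optimize import lsq_linear

# A small sparse system A x = b with a known non-negative solution
# x = [1, 2, 2]; in practice A would be built sparsely (e.g. from
# COO triplets) so the dense form is never materialized.
A = sparse.csr_matrix(np.array([[1.0, 0.0, 2.0],
                                [0.0, 3.0, 0.0],
                                [4.0, 0.0, 5.0]]))
b = np.array([5.0, 6.0, 14.0])

# bounds=(0, inf) enforces x >= 0, mimicking NNLS while keeping A sparse.
res = lsq_linear(A, b, bounds=(0, np.inf))
x = res.x
```

lsq_linear solves the same non-negative least-squares problem iteratively, so it never needs the dense matrix in memory, though for very ill-conditioned systems its answer may differ slightly from nnls's active-set solution.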