[SciPy-user] more help with sparse matrix: creating

Robin robince@gmail....
Mon Oct 8 13:26:03 CDT 2007

On 10/8/07, Dominique Orban <dominique.orban@gmail.com> wrote:
> On 10/8/07, Robin <robince@gmail.com> wrote:
> > Unfortunately it seems I still can't do the MATLAB style indexing I
> > wanted,
> > something like
> > M[2, [1,2,3]] = 1
> > but since I will be updating row by row anyway I can get around this by
> > creating a temporary row vector to insert with slicing.
> I am not sure what the above operation does in Matlab.

I guess in this example it's the same as M[2,1:4]=1, so it updates the
(2,1), (2,2) and (2,3) positions, but the advantage of this way is that they
don't have to be contiguous. I.e. in my case I calculated the positions I
want to set to 1 for each row, and then want to do
M[2, positions2update] = 1
Anyway it should be easy for me to work around this for my case.
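For what it's worth, here is a minimal sketch of that workaround with
scipy.sparse (not pysparse) - the matrix size and the positions list are just
made up for illustration:

```python
import numpy as np
from scipy.sparse import lil_matrix

M = lil_matrix((5, 5))
positions = [1, 2, 4]  # non-contiguous column indices for row 2

# Workaround: set each entry individually instead of relying on
# MATLAB-style fancy indexing like M[2, positions] = 1
for j in positions:
    M[2, j] = 1

print(M.toarray()[2])  # row 2 now has ones at columns 1, 2 and 4
```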

> > Also, if I understand correctly the conversion to csr necessary for
> > matrix-vector multiplication cannot be done in place - this is a real
> > shame
> That's unfortunately right. It would be a nice addition to PySparse.
> However in PySparse, once converted, you cannot modify a csr matrix
> anymore.
> > as it doubles the memory requirements for this over the MATLAB version
> > (I really want to make this array as big as possible)
> You don't HAVE to convert it to do matrix-vector product. Objects of
> type ll_mat also have a matvec() method. However, it is true that the
> csr format is more compact and speeds up products.

While generating the matrix takes a long time, it only has to be done once -
the products are going to be performed inside an optimisation loop, so they
have to be as fast as possible...
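For the record, the build-in-lil, convert-once-to-csr pattern I mean looks
roughly like this in scipy.sparse (sizes and values are just illustrative):

```python
import numpy as np
from scipy.sparse import lil_matrix

# Build in lil format, which is cheap to update row by row...
M = lil_matrix((4, 4))
M[0, 0] = 1
M[0, 2] = 1
M[3, 1] = 1
M[3, 3] = 1

# ...then convert once to csr before the loop, since csr
# matrix-vector products are much faster than lil's.
A = M.tocsr()

x = np.ones(4)
for _ in range(3):       # stand-in for the optimisation loop
    y = A.dot(x)
print(y)
```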

Also, I guess the pysparse objects can't be saved with io.savemat. I thought
it would be better to use a binary format, but I will try MatrixMarket.
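In scipy the MatrixMarket round-trip would look something like this (the
file path here is just a temp file for illustration):

```python
import os
import tempfile

import numpy as np
from scipy import io
from scipy.sparse import lil_matrix

M = lil_matrix((3, 3), dtype=np.int8)
M[0, 1] = 1
M[2, 0] = 1

# MatrixMarket is a portable (text) sparse format; mmwrite/mmread
# round-trip it. mmread returns a coo_matrix.
path = os.path.join(tempfile.mkdtemp(), "M.mtx")
io.mmwrite(path, M)
M2 = io.mmread(path)
print((M2.toarray() == M.toarray()).all())
```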

Actually, the other issue is that you can't specify a dtype with pysparse as
far as I can tell. My matrix contains only integer values (0 or 1), so I was
hoping to use dtype=byte. I think it might use less RAM to do it the
scipy.sparse way: allocating a csr with a small dtype, converting to lil to
fill it, then converting back.
Is it correct that if I create an empty csr from scipy.sparse with a
specified nnz and convert it to lil, the lil will have memory reserved for
the appropriate number of entries? (Or does lil not have a notion of nnz
allocation and just expand dynamically as needed?)
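As far as I can tell, lil does just grow dynamically (it keeps per-row
Python lists, with no preallocated nnz like csr's fixed data/index arrays),
but the dtype at least carries through the conversions - a small sketch,
with made-up sizes:

```python
import numpy as np
from scipy.sparse import lil_matrix

# lil_matrix stores per-row Python lists and grows as entries are
# assigned; the dtype can be set to a 1-byte type for 0/1 data.
M = lil_matrix((1000, 1000), dtype=np.int8)
for j in (1, 5, 9):
    M[2, j] = 1

# Converting to csr preserves the small dtype, so the data array
# costs 1 byte per stored entry rather than 8.
A = M.tocsr()
print(A.dtype, A.nnz)
```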


