[SciPy-dev] Sparse matrix module
Ed Schofield schofield at ftw.at
Mon Oct 24 07:27:29 CDT 2005
Robert Cimrman wrote:
> Ed Schofield wrote:
>> I wrote in the SVN change log:
>> This is the beginning of a merge of Roman Geus's PySparse into scipy.
>> The goals are to make it more 'Pythonic', integrate sparse matrix types
>> into scipy so they act similarly to dense arrays, and to support more
a few more sparse data types, particularly CSC, in a nice OO hierarchy.
> Does this basically mean to make Roman's spmatrix extension type also
> a subclass of the original Travis' spmatrix? Or would you prefer to
> keep scipy-ized PySparse separately and ultimately (merge &) replace
> the original?
> Here is a little summary of the situation as I see it now:
> existing module (TravisSparse):
> - Python spmatrix class + a number of Python subclasses with a
> possible fortran/C underlying implementation
> - pros: trivial addition of new methods / attributes, speeding up
> could be done "later"; handles (or will handle :-) all numeric scipy
> types (ints, floats, doubles, ...) automagically
> - cons: speed in some situations?
> experimental module (RomanSparse) (random remarks, since I know nuts
> about it):
> - C extension class
> - pros: constructor speed? and speed in general?
> - cons: not so easy to change the low-level implementation, 'cause
> it's on that level already?; the solvers (umfpack etc.) are
> hand-wrapped - I would certainly prefer a generated interface (swig?)
> which would be AFAIK much more flexible.
> I am personally in favor of the TravisSparse approach as the base with
> RomanSparse subclass of spmatrix with a key feature "*speed* *speed*
> *speed*". Also the _solvers_ should be split as much as practical from
> the sparse matrix _type_. Of course, having some 'recommended format
> hinting' (e.g. CSR for umfpack) that would tell the user "use this if
> you want speed and to avoid implicit matrix conversion" would be useful.
Yes, I agree that a Python implementation would be simpler and more
flexible than one in C. Perhaps our goal should be to build on the
existing TravisSparse module but replace the sparsetools Fortran code
with code from PySparse, which is probably better tested and debugged.
I've had a look at how objects with both C and Python member functions
are possible in the Python standard library. An example is the random
module: there's a randommodule.c file that is imported in the Python
random.py file and used as a base class for Python objects:

    class Random(_random.Random):
        ...

Then the functions exported by the C module are wrapped like this:

    def seed(self, a=None):
        ...
        super(Random, self).seed(a)
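This wrapping pattern can be tried directly against the `_random` C
extension that ships with CPython; here is a minimal sketch (the class
name MyRandom is illustrative, not part of the standard library):

```python
import time
import _random  # the C extension module behind the stdlib random module

class MyRandom(_random.Random):
    """Python class using the C extension type as its base,
    in the style of the stdlib's random.Random."""

    def seed(self, a=None):
        # wrap the C-level seed(), supplying a default when none is given
        if a is None:
            a = int(time.time() * 256)
        super().seed(a)

r = MyRandom()
r.seed(42)
x = r.random()          # random() is inherited straight from the C base
r.seed(42)
assert x == r.random()  # reseeding reproduces the C-level state
assert 0.0 <= x < 1.0
```

The Python subclass only overrides the methods it wants to dress up;
everything else falls through to the fast C implementation.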
Why don't we adopt a similar structure? Then we'd use Travis's spmatrix
as the base class, and derive csr_matrix from both spmatrix and an
object we import from C. Or is it possible for a C module to derive
directly from a Python class?!