[SciPy-dev] Splitting sparsetools_wrap source file ?
Thu Mar 13 10:15:29 CDT 2008
On Mon, Mar 10, 2008 at 8:14 AM, David Cournapeau wrote:
> and a nightmare to get right. I was thinking more about splitting the
> interface file, so that only a couple of functions are generated by
> each: that should be doable, no? I can do it, if there is a chance of
> a patch being included. There would be, say, N SWIG interface files
> (one for _diagonal, one for _scale, etc.), and sparsetools.py itself
> would be written manually, but it would just import the Python
> functions from each generated module, so it would be only a few lines
> (I bet this Python module could easily be generated too, if wanted).
While better than manually splitting the _wrap file, this approach is
still cumbersome. There are ~35 functions in sparsetools, so a
one-function-per-file policy does not really scale.
I tried lumping all the CSR functions together and found only modest savings.
Disabling the templates that unroll dense loops for the BSR matrix
(see bsr_matvec) produced a measurable improvement in memory usage, so
I've committed this version of sparsetools.h to SVN.
> I am not suggesting giving up SWIG or C++ templates. But the problem
> is not the cost of memory per se: once virtual machines enter the
> game, you hit the 32-bit limits very quickly (or more exactly, the
> fact that most computers cannot physically handle more than 4 GB of
> memory). For example, when I test numscons on Solaris, I use Indiana,
> a freely available binary distribution of OpenSolaris, and the VM
> takes more than 1 GB of RAM just compiling sparsetools. Even on my
> recent MacBook with 2 GB of RAM, I am at the limit. And virtual
> machines are the only way for me to test many platforms (build bots
> too often run on VMware as well).
Are you saying that g++ fails to compile on the VM, or that it starts
swapping to disk?
Nathan Bell email@example.com