[SciPy-dev] Splitting sparsetools_wrap source file ?

David Cournapeau david@ar.media.kyoto-u.ac...
Mon Mar 10 08:14:59 CDT 2008


Nathan Bell wrote:
>
> Splitting the file into multiple parts does reduce the memory usage,
> but not by the expected fraction.  Aside from manually splitting the
> SWIG output into multiple files (which would be tedious, time
> consuming, and error-prone), I'm not sure how to remedy the situation.
This is obviously a bad solution (splitting the generated swig file), 
and a nightmare to get right. I was thinking more of splitting the 
interface file, so that each one generates only a couple of functions: 
this should be doable, no? I can do it, if there is a chance of a 
patch being included. There would be, say, N swig interface files (one 
for _diagonal, one for _scale, etc.), and sparsetools.py itself would 
be written manually, but it would just import the python functions from 
each generated python module, so it would only be a few lines (I bet 
this python module could easily be generated too, if wanted).
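To make the idea concrete, here is a minimal sketch of what the hand-written sparsetools.py aggregator could look like. The module and function names (_sparsetools_diagonal, csr_diagonal, etc.) are hypothetical stand-ins for the SWIG-generated extension modules; to keep the sketch self-contained, the submodules are stubbed in-process rather than built from C++.

```python
# Sketch of the proposed hand-written sparsetools.py (hypothetical names).
# In the real scipy tree, each _sparsetools_* module would be a small
# SWIG-generated extension; here we register stub modules so the
# aggregation pattern can be demonstrated without compiling anything.
import sys
import types

for name, funcs in {
    "_sparsetools_diagonal": ["csr_diagonal"],
    "_sparsetools_scale": ["csr_scale_rows"],
}.items():
    mod = types.ModuleType(name)
    for f in funcs:
        # stand-in for the wrapped C++ function; just returns its own name
        setattr(mod, f, lambda *args, _f=f: _f)
    sys.modules[name] = mod

# The real sparsetools.py would contain only import lines like these:
from _sparsetools_diagonal import csr_diagonal
from _sparsetools_scale import csr_scale_rows

print(csr_diagonal())  # -> "csr_diagonal"
```

Since the aggregator is just a list of imports, it would indeed be trivial to generate automatically from the list of interface files.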
> In the era of $25/GB RAM, is it not more expedient to simply increase
> your memory capacity?  Using SWIG and C++ templates is a major
> convenience in sparsetools since adding new dtypes becomes trivial.
I am not suggesting giving up swig or C++ templates. But the problem is 
not the cost of memory: once virtual machines come into the game, you 
hit the 32-bit limits really quickly (or more exactly, the fact that 
most computers cannot physically handle more than 4 GB of memory). For 
example, when I test numscons on Solaris, I use Indiana, a free binary 
distribution of OpenSolaris, and the VM takes more than 1 GB of RAM 
when compiling sparsetools. Even on my recent macbook with 2 GB of RAM, 
I am at the limit. And virtual machines are the only way for me to test 
many platforms (build bots, too, often run on vmware).

cheers,

David
