[SciPy-user] shared memory machines

Gael Varoquaux gael.varoquaux@normalesup....
Mon Feb 2 00:38:33 CST 2009

On Mon, Feb 02, 2009 at 01:51:08AM +0100, Sturla Molden wrote:
> I have been working on a multiprocessing + NumPy cookbook tutorial. For
> now the unfinished draft is here:

> http://folk.uio.no/sturlamo/python/multiprocessing-tutorial.pdf

Hey, it's a very interesting document. It seems that you have quite a lot
of insight into these problems.

I hadn't realized that a numpy array with the memory allocated as shared
memory would be automatically shared by multiprocessing (I tried, and to
my surprise, it works).

So it seems that shmem_as_ndarray (the implementation of which is fairly
similar in your code and in mine), and probably some array creation
helper like empty_shmem, is all we need to use multiprocessing with
numpy. Do you concur?
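For concreteness, here is a hypothetical sketch of the pair of helpers I
have in mind; the names shmem_as_ndarray and empty_shmem come from this
thread, but this particular implementation is my assumption, not either
of our actual codes:

```python
import multiprocessing as mp
import numpy as np

def shmem_as_ndarray(raw, shape, dtype=np.float64):
    """Wrap an existing shared-memory buffer as an ndarray view (no copy)."""
    return np.frombuffer(raw, dtype=dtype).reshape(shape)

def empty_shmem(shape, dtype=np.float64):
    """Allocate an uninitialized ndarray backed by anonymous shared memory."""
    dtype = np.dtype(dtype)
    size = int(np.prod(shape))
    # RawArray: shared memory without a synchronization lock.
    raw = mp.RawArray('b', size * dtype.itemsize)
    return shmem_as_ndarray(raw, shape, dtype)

a = empty_shmem((2, 3))
a[:] = 1.0
```

The point of empty_shmem is that user code never sees the underlying
multiprocessing object at all, only an ordinary-looking ndarray.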

I also like your code to figure out the number of processors a lot. It
is very useful in a multiprocessing scientific computing package. In my
case, however, the limitation is more often memory than CPU. Do you have
cross-platform code to analyse the percentage of memory used, and the
absolute amount of memory available?
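For reference, the processor count has a standard-library one-liner, and
total physical memory can at least be queried on POSIX systems; this is
just my sketch (Windows would need something else, e.g.
GlobalMemoryStatusEx via ctypes):

```python
import multiprocessing
import os

def cpu_count():
    # Portable processor count from the standard library.
    return multiprocessing.cpu_count()

def total_memory_bytes():
    # POSIX-only: page size times number of physical pages.
    return os.sysconf('SC_PAGE_SIZE') * os.sysconf('SC_PHYS_PAGES')
```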

I think I should write empty_shmem to completely hide the multiprocessing
Array, delete my useless SharedMemArray class, integrate your
processor-count function, and recirculate my code, if that is OK with
you. In a few iterations we can propose this for integration in numpy.


