[SciPy-User] numpy shared memory api/tests
Wed Mar 17 15:51:14 CDT 2010
The shared ctypes arrays in multiprocessing get pickled by a handler attached to multiprocessing.ForkingPickler. This means that they (and consequently also shmarrays) will only pickle properly when pickled through the ForkingPickler, i.e. when passed as arguments to a new process, or e.g. as init arguments to a Pool. This makes shmarray multiprocessing-specific, and also means that you can't send them through multiprocessing queues or as arguments to e.g. Pool.map.
I suspect that this behaviour of sharedctypes arrays is a design feature: since multiprocessing queues etc. also support distribution over a network, the only time you can be sure the recipient will have access to the shared memory is when you're forking or spawning the recipients on the same machine.
----- Original Message ----
From: Christopher Lee-Messer <firstname.lastname@example.org>
To: Gael Varoquaux <email@example.com>
Cc: SciPy Users List <firstname.lastname@example.org>
Sent: Thu, 18 March, 2010 5:23:14 AM
Subject: [SciPy-User] numpy shared memory api/tests
I've added David B.'s shmarray along with some tests and packaging to my repo.
It seems useful to have both a sysv-style shared memory and David's
implementation which is mmap based. I've tested that both work on
winxp, mac OS 10.6 32bit and linux kernel 2.6.x 32bit.
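For reference, the mmap-based idea can be sketched in a few lines: an anonymous shared map created before fork() is visible to both parent and child. This is a hypothetical sketch of the general technique, not shmarray's actual code, and os.fork() is POSIX-only (so not the winxp case above):

```python
import mmap
import os
import numpy as np

# Anonymous shared mapping, wrapped as a numpy array (no copy).
n = 4
buf = mmap.mmap(-1, n * np.dtype(np.float64).itemsize)
arr = np.frombuffer(buf, dtype=np.float64, count=n)
arr[:] = 0.0

pid = os.fork()
if pid == 0:
    arr[:] = 7.0        # child writes into the shared map
    os._exit(0)
os.waitpid(pid, 0)
print(arr)              # parent sees the child's writes: [7. 7. 7. 7.]
```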
However, what should the api be?
If anyone wants to look at the tests and make suggestions, I will
try to add them as I get time.
The current tests try creating small shared arrays using a
non-exhaustive list of dtypes. shmarray.py fails when trying to
pickle a c_double_Array_4 because multiprocessing.sharedctypes doesn't
support pickling it. I would guess that support would be
straightforward to add.
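Something along these lines might do it: register a reducer with the ForkingPickler so a bare ctypes array pickles by value (copying the data rather than sharing it). This is a hypothetical sketch with names of my own invention, using the modern Python 3 import path (in the Python of 2010 the class lived in multiprocessing.forking instead):

```python
import ctypes
import pickle
from multiprocessing.reduction import ForkingPickler

def _rebuild_ctypes_array(element_type, length, raw):
    # Reconstruct the array type and copy the raw bytes back in.
    arr = (element_type * length)()
    ctypes.memmove(arr, raw, len(raw))
    return arr

def _reduce_ctypes_array(a):
    # Pickle by value: element type, length, and the raw buffer.
    return _rebuild_ctypes_array, (a._type_, len(a), bytes(a))

# Register for the exact type the tests trip over (c_double_Array_4).
ForkingPickler.register(ctypes.c_double * 4, _reduce_ctypes_array)

# Round-trip check.
a = (ctypes.c_double * 4)(1.0, 2.0, 3.0, 4.0)
b = pickle.loads(ForkingPickler.dumps(a))
print(list(b))  # [1.0, 2.0, 3.0, 4.0]
```

Note this only copies the data; sharing the underlying memory across processes would need the fancier treatment sharedctypes already applies to its own array wrappers.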
On Sat, Mar 13, 2010 at 10:54 AM, Gael Varoquaux wrote:
> On Sat, Mar 13, 2010 at 08:45:08AM -0800, Christopher Lee-Messer wrote:
>> I don't know if Sturla, Gael Varoquaux, or Robert Kern are continuing
>> to work on these, but I plan to add testing and save results on
>> different platforms as my app gets used in different computers in the
> I am not. I have been wanting to propose this for integration in numpy
> for a long while, but I really haven't found the time, and I will not any
> time soon.
> Before any patch can be written to push in numpy, it does need tests
> and packaging, so what you are proposing to do is really great and very
> helpful. Also, using it on many platforms would help iron out details.
> Thanks for stepping up,