[SciPy-User] Multiprocessing and shared memory

Felix Schlesinger schlesin@cshl....
Sun Oct 18 23:10:03 CDT 2009


Sturla Molden <sturla <at> molden.no> writes:

> 
> Felix Schlesinger wrote:
> > That does work if one is careful never to create any new reference to
> > the shared array or to modify it in any other implicit way in the
> > worker process.
> No no no...
> 
> Only the pages (blocks of 4096 bytes) written to are copied. If you
> don't write to the buffer, nothing is copied.
> 
> You don't write to the buffer of an ndarray by creating new references 
> to it.

Now that you say it, I agree, that really should be true (unless the kernel does
some magic 'optimization' here and gets it very wrong).
However, my program definitely made a copy of the whole array at some (at first
glance unpredictable) point. I'll go back and check. Maybe it had to do with the
fact that I was passing the array as a parameter to a subfunction within the
worker, but I don't see how right now.
Thanks for the hint.
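A minimal sketch (not from the thread) of the read-only case described above: a
child created by fork() can read a buffer inherited from the parent without the
kernel copying the data pages. (CPython's reference counting does dirty the page
holding an object's header, but an ndarray's data buffer is a separate
allocation, so merely reading it copies nothing.) This assumes a Unix system
with fork available:

```python
import os

# A buffer inherited by the child via fork. Reading it in the child
# writes to no page of it, so the kernel keeps every data page shared;
# only a write would trigger a private per-page copy.
data = bytes(range(256)) * 4096   # 1 MiB, read-only by construction

def child_checksum(fd):
    # Runs in the child: read the inherited buffer, report a checksum.
    total = sum(data) % (1 << 32)
    os.write(fd, total.to_bytes(4, "little"))
    os._exit(0)

def fork_and_check():
    r, w = os.pipe()
    pid = os.fork()
    if pid == 0:
        # Child process: never returns.
        os.close(r)
        child_checksum(w)
    os.close(w)
    checksum = int.from_bytes(os.read(r, 4), "little")
    os.close(r)
    os.waitpid(pid, 0)
    return checksum

if __name__ == "__main__":
    # The child saw the same bytes the parent holds, without any
    # copy of the buffer being made.
    print(fork_and_check() == sum(data) % (1 << 32))
```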

> >  The problem is that a modification will not cause an
> > error, but simply a copy (i.e. silent memory leak).
> >
> >   
> There is no memory leak here.

Memory leak is the wrong word, since there is still a valid reference to the
memory, but the effect was the same for me. The program gradually consumed more
and more memory until it ran out (I think due to pages being copied on write).
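For completeness, one common way to sidestep copy-on-write growth entirely (a
standard remedy, not code from this thread) is to put the data in an explicitly
shared segment, e.g. multiprocessing.Array, so that writes from any process go
to the one shared mapping rather than triggering private page copies. A sketch,
assuming a Unix system so the fork start method is available:

```python
import multiprocessing as mp

# COW is a property of fork(), so use the fork start method explicitly
# (assumes a Unix system, which is what the thread is discussing).
ctx = mp.get_context("fork")

def worker(shared, idx, value):
    # Writes go through the one shared segment; no page of it is
    # privately duplicated into this process.
    shared[idx] = value

if __name__ == "__main__":
    # 'd' = C double; the buffer lives in an anonymous shared mapping,
    # so it is exempt from copy-on-write duplication entirely.
    shared = ctx.Array('d', [0.0] * 8, lock=False)
    p = ctx.Process(target=worker, args=(shared, 3, 42.0))
    p.start()
    p.join()
    print(shared[3])  # the parent sees the child's write: 42.0
```

With lock=False the array carries no synchronization, which is fine when each
worker writes a disjoint slice; otherwise leave the default lock in place.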

Felix



