[IPython-User] Multiprocessing with IPython

MinRK benjaminrk@gmail....
Mon Jul 23 13:22:09 CDT 2012


libzmq contexts/sockets are not fork-safe, so you must not use them
across the fork implied by multiprocessing.Process.

What is happening is that the zmq objects created as part of your
initial Client (the one you use for:

print "There are %d engines." % len(client.ids)

) are being passed across the fork, and are then cleaned up by garbage
collection in the subprocess, which crashes.  I believe this has been
fixed in pyzmq master, but you can work around it by making sure there
are no zmq objects alive in the parent process when you fork, either by
calling client.close() prior to calling proc.start(), or simply not
creating the Client in the parent in the first place, e.g.:

from multiprocessing import Process
from IPython import parallel

def print_engines():
    # the Client is created after the fork, so no zmq objects cross it
    print "There are %d engines." % len(parallel.Client())

p = Process(target=print_engines)
p.start()
p.join()
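The same rule applies to any resource that is not fork-safe: construct it
inside the child process, after the fork, rather than in the parent. A
minimal sketch of that pattern using only plain multiprocessing (no IPython
or zmq involved; the names here are illustrative, with a list standing in
for the real resource):

```python
from multiprocessing import Process, Queue

def child_task(q):
    # Create the resource *inside* the child, after the fork,
    # so nothing non-fork-safe crosses the process boundary.
    resource = list(range(5))  # stands in for e.g. parallel.Client()
    q.put(len(resource))

if __name__ == '__main__':
    q = Queue()
    p = Process(target=child_task, args=(q,))
    p.start()
    p.join()
    print("Child saw %d items" % q.get())
```

The parent only ever touches the Queue, which multiprocessing provides
precisely because it is safe to share across the fork.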

Does that help?


-MinRK


On Mon, Jul 23, 2012 at 3:11 AM, mgi <miles.izzo@dataprocessors.com.au> wrote:
>
> For what it's worth, I managed to get around this error by initialising the
> Client with a fresh zmq context. Basically, I replaced:
>
> client = Client()
>
> with:
>
> import zmq
> client = Client(context=zmq.Context())
>
> Not sure if this is a valid approach, but it seemed to work!
>
>
>
> _______________________________________________
> IPython-User mailing list
> IPython-User@scipy.org
> http://mail.scipy.org/mailman/listinfo/ipython-user
