[IPython-User] Multiprocessing with IPython

mgi <miles.izzo@dataprocessors.com>
Mon Jul 23 02:37:42 CDT 2012


I have seen multiple issues in the past about using IPython in multiple
processes, but I'm not sure that this exact issue has been addressed.
Perhaps I'm doing something fundamentally wrong, in which case it would also
be great to know!

I'm starting a controller on a machine using:
ipcontroller --ip= --port=1024

Then I connect a couple of engines on different machines.

The machines share their home directories over NFS (hence the engines can find
the JSON connection files from the default profile). The connections appear to
succeed.

Given the code pasted here: http://pastebin.com/3ezLnJZP

I get the following output:
There are 2 engines.
Assertion failed: ok (mailbox.cpp:84)

...followed by this traceback (after the process has already spat me back to the prompt):
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in
  File "test_multiproc.py", line 9, in run
    my_client = Client()
  File "/usr/lib/python2.7/dist-packages/IPython/parallel/client/client.py", line 387, in __init__
    self._connect(sshserver, ssh_kwargs, timeout)
  File "/usr/lib/python2.7/dist-packages/IPython/parallel/client/client.py", line 491, in _connect
    raise error.TimeoutError("Hub connection request timed out")
TimeoutError: Hub connection request timed out

What am I doing wrong? Note that there are no problems with the connection
if I do everything in sub-processes, or everything in the main process (as
in, I can execute code on the engines from that script).
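In case the pastebin goes away, the traceback suggests a structure roughly like the sketch below (a reconstruction, not my exact code: every name except `run` and `my_client` is a guess, and the IPython lines are commented out so the skeleton runs without a cluster). The key point is that a Client already exists in the parent before the fork, and each child then tries to create its own.

```python
# Reconstruction of the failing pattern, inferred from the traceback
# (test_multiproc.py line 9: ``my_client = Client()`` inside ``run``).
from multiprocessing import Process


class Worker(Process):  # hypothetical name
    def run(self):
        # In the real script the next two lines are live; commented
        # out here so the sketch is self-contained:
        # from IPython.parallel import Client
        # my_client = Client()  # <- "Hub connection request timed out"
        pass


if __name__ == '__main__':
    # The parent connects first -- this is the Client that successfully
    # prints "There are 2 engines." -- and then forks the workers:
    # from IPython.parallel import Client
    # print "There are %d engines." % len(Client().ids)
    workers = [Worker() for _ in range(2)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```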

Some other details: I'm running Ubuntu 12.04 (precise) amd64, Python 2.7.3,
IPython 0.12.1, 0MQ 2.2.0 (also tried 2.1.11).

Sent from the IPython - User mailing list archive at Nabble.com.
