[IPython-User] Parallel question: Sending data directly between engines
Fri Jan 6 22:18:37 CST 2012
On Fri, Jan 6, 2012 at 20:05, Matthew Rocklin <email@example.com> wrote:
> Looks like there is another example in that same directory that creates
> some convenience functions around inter-engine communication. I suspect it
> will solve my problem.
Yes, the interengine should be the simpler, more relevant example.
The principal reason we do not have direct inter-engine communication as
part of the IPython cluster is that we deliberately *do not* require
engines to be visible to each other, or even to the Controller.
The Controller is the only thing that listens, and thus the only one that
must be accessible. For instance, you can have engines connecting to the
same Controller from all kinds of different clusters / subnets. For this
reason, direct inter-engine communication will not work in nearly as many
environments as the current IPython cluster, although most simple
deployments would have no problem running things like the above example.
I might write some more examples of client-level proxy objects that
provide a better interface for inter-engine communication.
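To make the pattern concrete, here is a minimal sketch of what direct engine-to-engine transfer looks like at the zmq level. This uses plain pyzmq rather than the IPython API, and the `inproc://` transport stands in for the `tcp://` connection two real engines would use; the socket names are illustrative only.

```python
import zmq

ctx = zmq.Context.instance()

# "e1" side: bind a PUSH socket and send its data out directly
sender = ctx.socket(zmq.PUSH)
sender.bind("inproc://e1-to-e0")

# "e0" side: connect a PULL socket to e1 and receive
receiver = ctx.socket(zmq.PULL)
receiver.connect("inproc://e1-to-e0")

# data travels peer-to-peer, never passing through the client or hub
sender.send_pyobj({"data": [1, 2, 3]})
msg = receiver.recv_pyobj()
print(msg["data"])  # -> [1, 2, 3]
```

The catch, as noted above, is that this requires the receiving side to be reachable by the sender, which the standard cluster topology does not guarantee.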
> On Fri, Jan 6, 2012 at 5:30 PM, Fernando Perez <firstname.lastname@example.org>wrote:
>> Hi Matthew,
>> On Fri, Jan 6, 2012 at 1:59 PM, Matthew Rocklin <email@example.com>
>> > Hello,
>> > What is the easiest way for me to send data directly between two ipython
>> > engines?
>> > I.e. if I'm running a "master" script on a particular machine and type
>> > something like the following
>> > rc = p.Client()
>> > e0 = rc[0]
>> > e1 = rc[1]
>> > e0['data'] = e1['data']
>> > Then I suspect the data will be sent from e1 up to the master engine and
>> > then down to e0. How can I skip the intermediate step?
>> Yup, that's what happens now. I know Min a little while ago whipped
>> up something for engine-to-engine communication in zmq, but there's no
>> high-level interface for that yet. It implements a simple 2d
>> parallel wave solver, and it's the only example we have at this point
>> going in this direction.
>> Before we had punted on this problem altogether, b/c the overhead of
>> Twisted was so high that for all sensible use cases needing this, the
>> answer was always 'use mpi'. Now, with zmq that's not the case
>> anymore, so we'll probably add some support for mpi-like
>> communications using purely zmq and the ipython api, but that hasn't
>> happened yet.
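For contrast with the direct pattern, here is a sketch of the two-hop path the thread describes: `e0['data'] = e1['data']` pulls the data from e1 to the client and then pushes it back down to e0. Again this is plain pyzmq with `inproc://` standing in for the real network; the "hub" sockets are a stand-in for the client/Controller, not actual IPython internals.

```python
import zmq

ctx = zmq.Context.instance()

# e1 side: serves its 'data' upward
e1_out = ctx.socket(zmq.PUSH)
e1_out.bind("inproc://e1-out")

# client side: pulls from e1, then pushes down to e0 (the extra hop)
hub_in = ctx.socket(zmq.PULL)
hub_in.connect("inproc://e1-out")
hub_out = ctx.socket(zmq.PUSH)
hub_out.bind("inproc://to-e0")

# e0 side: receives the forwarded data
e0_in = ctx.socket(zmq.PULL)
e0_in.connect("inproc://to-e0")

e1_out.send_pyobj({"data": 42})
hub_out.send_pyobj(hub_in.recv_pyobj())  # relay through the client
result = e0_in.recv_pyobj()["data"]      # arrives at e0 after two hops
print(result)  # -> 42
```

The data is serialized and deserialized at each hop, which is exactly the overhead a direct engine-to-engine channel would avoid.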