[IPython-User] Parallel question: Sending data directly between engines

MinRK <benjaminrk@gmail.com>
Thu Jan 19 18:50:18 CST 2012


simple binary-tree engine interconnect example:
https://github.com/ipython/ipython/pull/1295

-MinRK

On Mon, Jan 16, 2012 at 22:34, MinRK <benjaminrk@gmail.com> wrote:
> I've written a minimal binary-tree connector that provides all-reduce
> using PUSH/PULL sockets for the tree and PUB/SUB for the broadcast.
> It's quite simple: almost all the logic is in establishing the
> connections in the first place, and it is very similar to the existing
> EngineConnector example, just with different socket types and a
> different connection pattern (a tree instead of all:all).
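
A minimal sketch (not MinRK's actual connector, which is in the PR above)
of one node's part in such a tree with pyzmq -- a sum all-reduce where the
addresses and tree wiring are hypothetical stand-ins for the
connection-setup phase described here:

    import zmq

    def tree_allreduce(value, my_addr, parent_addr, n_children,
                       root_pub_addr):
        """One engine's role in a binary-tree sum-allreduce:
        PUSH/PULL up the tree, PUB/SUB to broadcast the result."""
        ctx = zmq.Context.instance()
        is_root = parent_addr is None

        # Subscribe before any data moves: PUB/SUB drops messages for
        # late joiners, which is one reason establishing the connections
        # up front is where most of the real logic lives.
        if not is_root:
            sub = ctx.socket(zmq.SUB)
            sub.connect(root_pub_addr)
            sub.setsockopt(zmq.SUBSCRIBE, b"")

        # Gather partial sums from our (up to two) children.
        total = value
        if n_children:
            pull = ctx.socket(zmq.PULL)
            pull.bind(my_addr)
            for _ in range(n_children):
                total += pull.recv_pyobj()

        if is_root:
            # The root now holds the full sum: broadcast it.  (A real
            # implementation would synchronize with subscribers before
            # publishing.)
            pub = ctx.socket(zmq.PUB)
            pub.bind(root_pub_addr)
            pub.send_pyobj(total)
            return total

        # Otherwise, push the partial sum to the parent and wait for
        # the root's broadcast.
        push = ctx.socket(zmq.PUSH)
        push.connect(parent_addr)
        push.send_pyobj(total)
        return sub.recv_pyobj()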
>
> I'll post it after I've cleaned it up a bit.
>
> I still think there's a more 'ØMQ-style' way to do this sort of thing,
> especially when ordering need not be enforced.
>
> -MinRK
>
> On Sun, Jan 8, 2012 at 12:39, Brian Granger <ellisonbg@gmail.com> wrote:
>> On Sun, Jan 8, 2012 at 12:30 PM, Olivier Grisel
>> <olivier.grisel@ensta.org> wrote:
>>> 2012/1/8 Brian Granger <ellisonbg@gmail.com>:
>>>> Don't forget, you can always just use MPI/mpi4py with IPython, which
>>>> has very efficient reduce/allreduce that use the spanning tree
>>>> approach.
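
For reference, a minimal sketch of the mpi4py route, run on every engine
(it assumes the engines were started under MPI; the local value is just a
placeholder):

    from mpi4py import MPI

    comm = MPI.COMM_WORLD

    # Each engine contributes its local value; allreduce performs the
    # spanning-tree reduction and broadcast internally.
    local = comm.Get_rank() + 1.0
    total = comm.allreduce(local, op=MPI.SUM)
    # every rank now holds the same total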
>>>
>>> Indeed, but the MPI abstraction might be too strong (it hides too
>>> much of the underlying computational runtime): in particular, it
>>> might prevent the algorithm implementer from leveraging data
>>> locality by scheduling tasks to run where the input data is already
>>> located (on the hard drive, or in shared memory as a memory-mapped
>>> file, for instance) rather than shipping it over the network over
>>> and over again.
>>>
>>> I like that the IPython engines, controller, and client can collect
>>> whatever metadata they want, pass it around using pyzmq, and make it
>>> possible to plug in your own smart scheduler based on that runtime
>>> metadata.
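
A sketch of that kind of locality-aware dispatch with the IPython.parallel
Client; the shard-to-engine map, file paths, and helper function below are
hypothetical:

    from IPython.parallel import Client

    rc = Client()

    # Hypothetical runtime metadata: which engine already holds which
    # memory-mapped shard on its local disk.
    shard_location = {'shard-00': 0, 'shard-01': 3}

    def process_shard(path):
        import numpy as np
        data = np.memmap(path, dtype='float64', mode='r')
        return float(data.sum())

    # Route each task to the engine where its data already lives,
    # rather than letting a blind scheduler ship the bytes around.
    async_results = {}
    for shard, engine_id in shard_location.items():
        view = rc[engine_id]  # DirectView on that one engine
        async_results[shard] = view.apply_async(
            process_shard, '/data/%s.bin' % shard)

    totals = dict((k, ar.get()) for k, ar in async_results.items())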
>>
>> Yes, IPython/pyzmq is definitely more flexible than MPI.
>>
>>> I have not checked mpi4py recently, though. The last time I
>>> experimented with MPI was more than 7 years ago, so things might
>>> have changed.
>>
>> It is truly an amazing package.  Some of the best code I know of.
>>
>> Cheers,
>>
>> Brian
>>
>>> --
>>> Olivier
>>> http://twitter.com/ogrisel - http://github.com/ogrisel
>>
>>
>>
>> --
>> Brian E. Granger
>> Cal Poly State University, San Luis Obispo
>> bgranger@calpoly.edu and ellisonbg@gmail.com
>> _______________________________________________
>> IPython-User mailing list
>> IPython-User@scipy.org
>> http://mail.scipy.org/mailman/listinfo/ipython-user

