[IPython-dev] Parallel map
Sat Mar 8 12:32:26 CST 2008
On Sat, Mar 8, 2008 at 2:39 AM, Gael Varoquaux wrote:
> On Sat, Mar 08, 2008 at 02:28:02AM -0800, Fernando Perez wrote:
> > We need to summarize the recent work done at the sage/scipy/ipython
> > sprint.
> I did see some recent commits on Trac :->.
> > In particular, Min did a lot of excellent work *precisely* on
> > this issue, most (if not all) of which is already committed, to
> > provide a full ipython daemon for process control. This allows you to
> > do exactly that, to create/control/destroy engines and/or controllers
> > from within python scripts.
> Great. Now a very important question: can the daemon be started from
> Python without an os.system call being visible to the user? I want to be
> able to log into a box and run a Python script that does parallel
> computing with only a few lines of Python code.
Yes, absolutely. That was a key requirement of the design, that we
could provide fully self-contained examples of ipython parallel code
that don't start by telling the user 'open 4 terminals and type this
in each'... The docs will be catching up soon, likely after the Paris
sprint.
> > The good thing is that you're already a bzr launchpad team member, so
> > I'm sure you'll soon be contributing this code and docs :)
> Hum. Currently I am physically killing myself doing way too many things.
> No, seriously, it's worse than it has ever been, and I have a dangerously
> off-balance sleep pattern. Maybe I'll get better next year when I start
> my new job. Maybe I'll even use ipython1 as a full part of my daily work.
Well, don't worry: I made you a team member just so we can work in
Paris with you fluidly, and to make it easy for you to prep anything
you may need. I am NOT trying to bury you with more things, and I
realize fully that ETS/M2 are higher priorities on this front than
ipython. Please keep sleep and sanity above open source, really (I
say this after having crashed hard with a migraine yesterday
afternoon, probably from overdosing on work, stress and low sleep
myself, so I really mean it when I say that I want your well-being to
come first).
> > We finally figured out a trick using the 'with' statement to allow you
> > to write code like
> > with all_engines:
> > do_in_remote_engines(..)
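For readers unfamiliar with the pattern, here is a minimal sketch of the context-manager protocol the trick builds on; `RemoteEngines` and its methods are hypothetical names for illustration, not the actual ipython1 implementation (the real trick is more involved, since it must intercept the block's code and ship it to the remote engines).

```python
# Hypothetical sketch of the context-manager pattern only.
class RemoteEngines:
    """Record what a 'with all_engines:' block would do."""
    def __init__(self):
        self.log = []

    def __enter__(self):
        self.log.append("enter: acquire engines")
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self.log.append("exit: release engines")
        return False  # do not swallow exceptions

    def run(self, cmd):
        self.log.append("run: " + cmd)

all_engines = RemoteEngines()
with all_engines as eng:
    eng.run("do_in_remote_engines(..)")
```

The `with` statement guarantees `__exit__` runs even if the block raises, which is what makes it a natural fit for acquiring and releasing remote resources.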
> That's really nice. Now a big question is, does that remove the need for
> all the objects in the parallel code to be picklable. Because that is
> what was limiting me here.
Nope, pickle remains a requirement, except when you're sending data
over MPI, in which case MPI handles the communication.
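To illustrate the constraint in plain Python (nothing ipython1-specific): any object sent to an engine must survive a pickle round trip, and things like lambdas or open file handles do not. The helper name `is_picklable` is mine, for illustration.

```python
import pickle

def is_picklable(obj):
    """Return True if obj survives a pickle round trip."""
    try:
        pickle.loads(pickle.dumps(obj))
        return True
    except (pickle.PicklingError, TypeError, AttributeError):
        return False

print(is_picklable({"data": [1, 2, 3]}))  # True: plain containers are fine
print(is_picklable(lambda x: x + 1))      # False: lambdas can't be pickled
```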