[IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control

Darren Dale dsdale24@gmail....
Fri Sep 4 14:17:21 CDT 2009


On Fri, Sep 4, 2009 at 2:41 PM, Fernando Perez<fperez.net@gmail.com> wrote:
> On Fri, Sep 4, 2009 at 11:01 AM, Fernando Perez <fperez.net@gmail.com> wrote:
>
>> But my main point was not about the parallelization of a loop, but
>> rather about the basic idea of using a decorator to swap the execution
>> context of a bit of code for another one, be it a thread, a remote
>> ipython engine, a GPU, a tracing utility, a profiler, a cython JIT
>> engine or anything else.  Perhaps I chose my example a little poorly
>> to get that point across, sorry if that was the case.  It would be
>> good to come up with more obviously useful and unambiguous examples of
>> this, I'd love it if we generate some interesting discussion here.
>> I'll continue playing with this idea in my copious spare time, until
>> Brian's patience with my lack of code review in the last few days runs
>> out ;)
>
> Here's another trivial example: suppose you'd like to trace some code.
> Again, starting from the simple loop from before:
>
> def loop_serial():
>    results = [None]*count
>
>    for i in range(count):
>        results[i] = do_work(data, i)
>
>    return summarize(results, count)
>
>
> you can then use this decorator:
>
> def traced(func):
>    import trace
>    t = trace.Trace()
>    # Note: this runs func immediately, at decoration time, and
>    # implicitly returns None instead of a wrapper -- the point is
>    # to execute the block under the tracer, not to wrap it.
>    t.runfunc(func)
>
> and a 2-line change of code:
>
> def loop_traced():
>    results = [None]*count
>
>    @traced  ### NEW
>    def func():  ### NEW, the name is irrelevant
>        for i in range(count):
>            results[i] = do_work(data, i)
>
>    return summarize(results, count)
>
> gives on execution:
>
> In [12]: run contexts.py
>  --- modulename: contexts, funcname: func
> contexts.py(64):     for i in range(count):
> contexts.py(65):         @traced
>  --- modulename: contexts, funcname: do_work
> contexts.py(10):     return data[i]/2
> contexts.py(64):     for i in range(count):
> contexts.py(65):         @traced
>
> ... etc.
>
> This shows how trivial, small decorators can be used to control code
> execution.  For example, if you are a fan of Robert's fabulous
> line_profiler (http://packages.python.org/line_profiler/), the same
> trick lets you profile arbitrarily small chunks of code inline:
>
> def profiled(func):
>    import line_profiler
>    prof = line_profiler.LineProfiler()
>    # Like traced above, this executes func immediately: wrap it,
>    # call it, and print the line-by-line stats.
>    f = prof(func)
>    f()
>    prof.print_stats()
>    prof.disable()
>
> def loop_profiled():
>    results = [None]*count
>
>    @profiled  # NEW
>    def block():  # NEW
>        for i in range(count):
>            results[i] = do_work(data, i)
>
>    return summarize(results, count)
>
> When run, you get:
>
> In [3]: run contexts.py
> Timer unit: 1e-06 s
>
> File: contexts.py
> Function: block at line 82
> Total time: 1.6e-05 s
>
> Line #      Hits         Time  Per Hit   % Time  Line Contents
> ==============================================================
>    82                                               @profiled
>    83                                               def block():
>    84         5            7      1.4     43.8          for i in range(count):
>    85         4            9      2.2     56.2              results[i] = do_work(data, i)
>
>
>
> Do these examples illustrate the idea better?

Your intent was clear, but the implementation still leaves me
wondering what is gained by using the @decorator syntax. Maybe I have
missed something, so please bear with me.

With @decorator, you are still passing a function to another function
to modify its execution; it's just a different syntax that achieves
the same result, isn't it? For example, using your for_each,
for_each(range(count))(loop) yields the same result without having to
redefine loop each time you call loop_deco.
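
A sketch of what I mean, assuming a for_each along these lines (I
don't have your exact definition in front of me, so the details may
differ):

def for_each(iterable):
    # a "decorator" that immediately maps func over the iterable
    def call(func):
        return [func(i) for i in iterable]
    return call

def loop(i):
    # do_work and data being the globals from your contexts.py
    return do_work(data, i)

# same effect as decorating loop with @for_each(range(count)), but
# loop is defined once and can be reused with other iterables
results = for_each(range(count))(loop)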

In your examples above, without the @decorator syntax, func/block
(let's call it func, since they are identical) can be defined once,
outside loop_profiled or loop_traced, and then the execution of that
func can be temporarily modified by calling traced(func) and
profiled(func) as you have written them. My point is that the
decorator syntax doesn't gain you anything you didn't already have,
and it even seems more limiting: @decorator is just a syntactic
nicety for function and class definitions, so that we don't have to
do things like:

def func():
    ...
func = modify(func)

Since @decorator rebinds the name func to whatever the decorator
returns, you have to keep redefining func inside each context, which
seems to defeat the purpose.
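
For instance, with your traced and profiled as written, something
like this runs the same block in both contexts without ever
redefining it (count, data and do_work again being the globals from
contexts.py):

results = [None]*count

def func():
    for i in range(count):
        results[i] = do_work(data, i)

traced(func)    # run the block under the tracer

results = [None]*count
profiled(func)  # run it again under the profiler, same definition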

Darren

