[Numpy-discussion] automatic differentiation with PyAutoDiff

Nathaniel Smith njs@pobox....
Thu Jun 14 14:38:30 CDT 2012


On Thu, Jun 14, 2012 at 7:53 PM, James Bergstra
<bergstrj@iro.umontreal.ca> wrote:
> On Thu, Jun 14, 2012 at 11:01 AM, Nathaniel Smith <njs@pobox.com> wrote:
>
>>> Indeed that would be great, as sympy already has excellent math
>>> expression rendering.
>>>
>>> An alternative would be to output mathml or something similar that
>>> could be understood by the mathjax rendering module of the IPython
>>> notebook.
>>
>> I'd find it quite useful if it could spit out the derivative as Python
>> code that I could check and integrate into my source. I often have a
>> particular function that I need to optimize in many different
>> situations, but would rather not pull in a whole (complex and perhaps
>> fragile) bytecode introspection library just to repeatedly recompute
>> the same function on every run...
>>
>> -N
>
> I was hoping to get by with a bytecode -> bytecode interface; are there
> bytecode -> source tools that could help here?

Not that I know of -- you might try googling "python reverse engineer"
or similar. Mostly people treat bytecode as the internal intermediate
format it is. I'm sort of confused at why people are suddenly excited
about using (some particular CPython release's version of) bytecode as
an input format when both the ast module and Cython are perfectly
capable of parsing real Python source into a nice abstract format, but
you all seem to be having fun so hey.
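[For context, the ast route mentioned above really is short. A sketch of pulling a function's actual source into an abstract syntax tree — the function and names here are just illustrative, not anything from PyAutoDiff:]

```python
import ast
import inspect
import textwrap

def f(x):
    return x * x + 3.0

# Parse the function's real Python source, not its bytecode:
source = textwrap.dedent(inspect.getsource(f))
tree = ast.parse(source)

# The body is now an abstract, version-stable structure one can walk:
func = tree.body[0]            # the FunctionDef node for f
ret = func.body[0]             # its single `return` statement
print(ast.dump(ret.value))     # a BinOp tree: Add(Mult(x, x), 3.0)
```
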

> Otherwise it might be possible to appeal to the symbolic intermediate
> representation to produce more legible source.
>
> With regards to "pulling in a whole bytecode introspection library" I
> don't really see what you mean. If the issue is that you want some way
> to verify that the output function is actually computing the right
thing, then I hear you - that's an issue. If the issue is that autodiff
> itself is slow, then I'd like to hear more about the application,
> because in minimization you usually have to call the function many
> times (hundreds) so the autodiff overhead should be relatively small
> (I'm not counting Theano's function compilation time here, which still
> can be significant... but that's a separate concern.)
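[On the point about verifying that the output function computes the right thing: a finite-difference spot check is a cheap, tool-independent way to do that. A minimal sketch, with a toy objective and a hand-written candidate gradient standing in for whatever an autodiff tool emits:]

```python
import numpy as np

def f(x):
    # toy objective standing in for the user's real function
    return np.sum(x ** 2) + np.sin(x[0])

def grad_f(x):
    # candidate gradient, e.g. produced by an autodiff tool
    g = 2 * x
    g[0] += np.cos(x[0])
    return g

def check_grad(f, grad, x, eps=1e-6, tol=1e-4):
    # central finite differences along each coordinate direction
    num = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                    for e in np.eye(x.size)])
    return np.allclose(num, grad(x), atol=tol)

x0 = np.array([0.3, -1.2, 2.0])
assert check_grad(f, grad_f, x0)
```
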

For example, I wrote a library routine for doing log-linear
regression. Doing this required computing the derivative of the
likelihood function, which was a huge nitpicky hassle; took me a few
hours to work out and debug. But it's still just 10 lines of Python
code that I needed to figure out once and they're done forever, now.
I'd have been perfectly happy if I could have gotten those ten lines
by asking a random unreleased library I pulled off github, which
depended on heavy libraries like Theano and relied on a mostly
untested emulator for some particular version of the CPython VM. But
I'd be less happy to ask everyone who uses my code to install that
library as well, just so I could avoid having to spend a few hours
doing math. This isn't a criticism of your library or anything, it's
just that I'm always going to be reluctant to rely on an automatic
differentiation tool that takes arbitrary code as input, because it
almost certainly cannot be made fully robust. So it'd be nice to have
the option to stick a human in the loop.
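[The human-in-the-loop workflow sketched here is roughly what sympy already supports: differentiate symbolically, then emit the derivative as plain Python source you can read, check, and paste into your module. The log-likelihood below is an illustrative stand-in — the post doesn't show the actual model — using a single Poisson/log-linear term:]

```python
import sympy

# Hypothetical per-observation log-likelihood term for a log-linear
# (Poisson) model: l(eta) = y*eta - exp(eta), with eta = x . beta
y, eta = sympy.symbols('y eta')
loglik = y * eta - sympy.exp(eta)

dl = sympy.diff(loglik, eta)
# Emit the derivative as checkable Python source:
print(sympy.pycode(dl))   # something like 'y - math.exp(eta)'
```
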

-N
