[Numpy-discussion] automatic differentiation with PyAutoDiff

James Bergstra bergstrj@iro.umontreal...
Wed Jun 13 12:42:22 CDT 2012


Further to the recent discussion on lazy evaluation & numba, I moved
what I was doing into a new project:

PyAutoDiff:
https://github.com/jaberg/pyautodiff

It currently works by executing CPython bytecode with a numpy-aware
engine that builds a symbolic expression graph with Theano... so you
can, for example, do:

>>> import autodiff, numpy as np
>>> autodiff.fmin_l_bfgs_b(lambda x: (x + 1) ** 2, [np.zeros(())])

... and you'll see `[array(-1.0)]` printed out.
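(As a sanity check on that result: the gradient of (x + 1) ** 2 is
2 * (x + 1), which vanishes at x = -1. A few steps of plain gradient
descent with that hand-derived gradient, no autodiff involved, converge
to the same answer. The step size and iteration count here are arbitrary
choices, not anything pyautodiff uses.)

```python
# Minimize f(x) = (x + 1) ** 2 by plain gradient descent,
# using the hand-derived gradient f'(x) = 2 * (x + 1).
# This only sanity-checks the [array(-1.0)] result above;
# pyautodiff derives the gradient automatically instead.
x = 0.0           # same starting point as np.zeros(())
lr = 0.1          # step size (arbitrary choice)
for _ in range(100):
    x -= lr * 2 * (x + 1)
print(round(x, 6))  # → -1.0
```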

In the future, I think it should be able to export the
gradient-computing function as bytecode, which could then be optimized
by e.g. numba or a theano bytecode front-end. For now it just compiles
and runs the Theano graph that it built.
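(For readers unfamiliar with the general idea: here is a minimal
forward-mode AD sketch using operator overloading, where a value carries
its derivative along with it. This is *not* how pyautodiff works -- it
traces CPython bytecode into a Theano graph -- but it illustrates the
kind of derivative information such a symbolic graph ultimately yields.
The `Dual` class and its methods are purely illustrative names.)

```python
# Minimal forward-mode AD via operator overloading: a "Dual"
# carries a value and its derivative together. Illustrative
# sketch only -- pyautodiff instead builds a Theano graph by
# executing CPython bytecode.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        o = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __pow__(self, n):  # power rule: d(x**n) = n * x**(n-1) * dx
        return Dual(self.val ** n, n * self.val ** (n - 1) * self.dot)

f = lambda x: (x + 1) ** 2   # same objective as the example above
y = f(Dual(0.0, 1.0))        # seed derivative dx/dx = 1 at x = 0
print(y.val, y.dot)          # → 1.0 2.0  (f(0) = 1, f'(0) = 2)
```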

It's still pretty rough (you'll see if you look at the code!) but I'm
excited about it.

- James
