[Numpy-discussion] automatic differentiation with PyAutoDiff
Olivier Grisel
olivier.grisel@ensta....
Thu Jun 14 03:00:07 CDT 2012
2012/6/13 James Bergstra <bergstrj@iro.umontreal.ca>:
> Further to the recent discussion on lazy evaluation & numba, I moved
> what I was doing into a new project:
>
> PyAutoDiff:
> https://github.com/jaberg/pyautodiff
>
> It currently works by executing CPython bytecode with a numpy-aware
> engine that builds a symbolic expression graph with Theano... so you
> can do for example:
>
>>>> import autodiff, numpy as np
>>>> autodiff.fmin_l_bfgs_b(lambda x: (x + 1) ** 2, [np.zeros(())])
>
> ... and you'll see `[array(-1.0)]` printed out.
>
> In the future, I think it should be able to export the
> gradient-computing function as bytecode, which could then be optimized
> by e.g. numba or a theano bytecode front-end. For now it just compiles
> and runs the Theano graph that it built.
>
> It's still pretty rough (you'll see if you look at the code!) but I'm
> excited about it.
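For comparison, here is roughly what the `autodiff.fmin_l_bfgs_b` call above automates, written against plain SciPy with a hand-derived gradient (a sketch for illustration only, not part of PyAutoDiff):

```python
# Minimizing f(x) = (x + 1)**2 with SciPy's L-BFGS-B requires supplying
# the gradient yourself: d/dx (x + 1)**2 = 2 * (x + 1).
# PyAutoDiff derives this step automatically from the bytecode.
import numpy as np
from scipy.optimize import fmin_l_bfgs_b

def f(x):
    return float(((x + 1) ** 2).sum())

def fprime(x):
    return 2.0 * (x + 1)  # hand-derived gradient

xmin, fval, info = fmin_l_bfgs_b(f, np.zeros(1), fprime=fprime)
print(xmin)  # approximately [-1.]
```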
Very interesting. Would it be possible to use bytecode introspection
to compute and print a symbolic representation of an arbitrary
Python + numpy expression?
E.g. something along the lines of:
>>> g = autodiff.gradient(lambda x: (x + 1) ** 2, [np.zeros(())])
>>> print g
f(x) = 2 * x + 2
>>> g(np.arange(3))
array([2, 4, 6])
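Something close to this hypothetical output can be obtained today by tracing the same lambda with a SymPy symbol instead of a NumPy array (a sketch under that assumption; PyAutoDiff itself works at the bytecode/Theano level, not via SymPy):

```python
# Trace lambda x: (x + 1)**2 symbolically, differentiate, and print the
# expanded derivative; lambdify turns it back into a numpy-callable function.
import numpy as np
import sympy

x = sympy.Symbol('x')
f = lambda x: (x + 1) ** 2
g_expr = sympy.expand(sympy.diff(f(x), x))
print(g_expr)           # 2*x + 2

g = sympy.lambdify(x, g_expr, 'numpy')
print(g(np.arange(3)))  # [2 4 6]
```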
--
Olivier
http://twitter.com/ogrisel - http://github.com/ogrisel