[Numpy-discussion] Automatic differentiation (was Re: second-order gradient)
Wed Mar 11 05:12:07 CDT 2009
There are several possibilities; some of them are listed below:
pycppad is a wrapper for the C++ library CppAD ( http://www.coin-or.org/CppAD/ ).
The wrapper can compute derivatives up to second order very efficiently in
the so-called reverse mode of AD.
There is also a wrapper for the C++ library ADOL-C. It can compute
derivatives of arbitrary degree and works quite well with numpy, i.e. you
can work directly with numpy arrays. It is also quite efficient in the
so-called reverse mode of AD.
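Reverse mode, mentioned above, propagates derivatives from the output back
to the inputs, which makes gradients of f: R^n -> R cheap. As a generic
illustration (this is a toy sketch, not the API of any of the packages
above), a minimal tape-based reverse mode in pure Python looks like this:

```python
import math

class Var:
    """A scalar node in the computation graph (toy reverse-mode AD)."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent_node, local_derivative)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def sin(x):
    return Var(math.sin(x.value), [(x, math.cos(x.value))])

def backward(out):
    """One reverse sweep: accumulate d(out)/d(node) into node.grad."""
    # Topological order, so each node's grad is complete before it is
    # propagated to its parents (handles variables used more than once).
    order, seen = [], set()
    def visit(n):
        if id(n) not in seen:
            seen.add(id(n))
            for p, _ in n.parents:
                visit(p)
            order.append(n)
    visit(out)
    out.grad = 1.0
    for node in reversed(order):
        for parent, local in node.parents:
            parent.grad += node.grad * local

# f(x, y) = x*y + sin(x);  df/dx = y + cos(x),  df/dy = x
x, y = Var(2.0), Var(3.0)
z = x * y + sin(x)
backward(z)
```

One forward evaluation plus one reverse sweep yields the whole gradient,
which is why reverse mode pays off when there are many inputs and one output.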
Another package can provide first order derivatives, but as far as I
understand only first order derivatives of functions
f: R -> R
and only in the usually less efficient forward mode of AD.
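Forward mode carries the derivative along with the value through the
computation. A common way to sketch it (again, a toy illustration, not the
API of the package in question) is with dual numbers:

```python
class Dual:
    """Dual number a + b*eps with eps**2 == 0; .dot carries the derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (a + a'eps)(b + b'eps) = ab + (a'b + ab')eps
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def d(f, x):
    """Derivative of f: R -> R at x via one forward-mode pass."""
    return f(Dual(x, 1.0)).dot

# f(x) = 3*x*x + 2*x  has  f'(x) = 6*x + 2
fprime = d(lambda x: 3 * x * x + 2 * x, 5.0)
```

For f: R^n -> R, forward mode needs one such pass per input variable, which
is why it is usually the less efficient choice for gradients.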
There is also a pure Python package: arbitrary-order derivatives in forward
and reverse mode, though still quite experimental. It also offers the
possibility to differentiate functions that make heavy use of matrix
operations.
Finally, there is symbolic differentiation: this is not automatic
differentiation, but it is sometimes useful.
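The difference is that symbolic differentiation rewrites an expression tree
into a new expression, rather than computing a numerical derivative value
alongside the evaluation. A toy sketch of the idea (the message does not say
which package it means; SymPy is one well-known option, and real packages do
far more, e.g. simplification):

```python
# Expressions as tuples: ('x',), ('const', c), ('add', a, b), ('mul', a, b).

def diff(e):
    """Return a new expression for d(e)/dx."""
    kind = e[0]
    if kind == 'x':
        return ('const', 1)
    if kind == 'const':
        return ('const', 0)
    if kind == 'add':
        return ('add', diff(e[1]), diff(e[2]))
    if kind == 'mul':  # product rule
        return ('add', ('mul', diff(e[1]), e[2]),
                       ('mul', e[1], diff(e[2])))
    raise ValueError(kind)

def evaluate(e, x):
    kind = e[0]
    if kind == 'x':
        return x
    if kind == 'const':
        return e[1]
    if kind == 'add':
        return evaluate(e[1], x) + evaluate(e[2], x)
    if kind == 'mul':
        return evaluate(e[1], x) * evaluate(e[2], x)
    raise ValueError(kind)

# d/dx (x*x + 3) = 2*x;  at x = 4 that is 8
expr = ('add', ('mul', ('x',), ('x',)), ('const', 3))
deriv = diff(expr)
```

The derivative expression can then be evaluated at any point, or inspected
and simplified, which is exactly what AD does not give you.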
hope that helps,
On Wed, Mar 11, 2009 at 4:13 AM, Osman <firstname.lastname@example.org> wrote:
> I just saw this python package : PyDX which may answer your needs.
> The original URL is not working, but the svn location exists.
> svn co http://gr.anu.edu.au/svn/people/sdburton/pydx