# [SciPy-user] Automatic Differentiation with PYADOLC and Removing Boost::Python dependency

Sebastian Walter sebastian.walter@gmail....
Thu Mar 26 04:50:24 CDT 2009

```Hello,

I have implemented a wrapper for the C++ Automatic Differentiation
library ADOL-C. You can use it to differentiate complex algorithms
to arbitrary order, and it works quite well with numpy.

You can have a look at it at http://github.com/b45ch1/pyadolc .

EXAMPLE USAGE:
==============
compute the Jacobian J of
f(x) = numpy.dot(A,x), where A is an (M,N) array

--------------- get_started.py ----------------------
import numpy
from adolc import *  # adouble, trace_on, independent, dependent, trace_off, jacobian, gradient

N = M = 10
A = numpy.zeros((M,N))
A[:] = [[ 1./N + (n==m) for n in range(N)] for m in range(M)]

def f(x):
    return numpy.dot(A,x)

# tape a function evaluation
ax = numpy.array([adouble(0) for n in range(N)])
trace_on(1)
independent(ax)
ay = f(ax)
dependent(ay)
trace_off()

x = numpy.array([n+1 for n in range(N)])

# compute Jacobian of f at x
J = jacobian(1,x)

# compute gradient of f at x (only defined when f is scalar-valued)
if M == 1:
    g = gradient(1,x)
--------------- end get_started.py ----------------------
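As a sanity check that needs only numpy (no pyadolc install), the Jacobian computed above can be verified independently: for the linear map f(x) = A.dot(x) the exact Jacobian is A itself, so a finite-difference approximation must reproduce A. The helper `fd_jacobian` below is a hypothetical name introduced here for illustration, not part of pyadolc:

```python
import numpy

# Same setup as in get_started.py
N = M = 10
A = numpy.array([[1./N + (n == m) for n in range(N)] for m in range(M)])

def f(x):
    return numpy.dot(A, x)

def fd_jacobian(f, x, h=1e-7):
    """Forward-difference Jacobian approximation, built column by column."""
    y0 = f(x)
    J = numpy.empty((y0.size, x.size))
    for j in range(x.size):
        xh = x.copy()
        xh[j] += h
        J[:, j] = (f(xh) - y0) / h
    return J

x = numpy.array([n + 1. for n in range(N)])
J = fd_jacobian(f, x)
print(numpy.allclose(J, A, atol=1e-5))  # True: the Jacobian of A.dot(x) is A
```

Since f is linear, the forward difference is exact up to floating-point rounding, so the comparison is tight.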

PERFORMANCE:
=============

It is fast compared to existing AD tools for Python, for example
Scientific.Functions.Derivatives.

Benchmark: compute the Hessian of

def f(x):
    return 0.5*numpy.dot(x, numpy.dot(A,x))

Runtime comparison:
adolc: elapsed time = 0.000411 sec
Scientific: elapsed time = 0.041264 sec

I.e. pyadolc is about a factor of 100 faster.
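The benchmark function has a closed-form Hessian that any implementation should reproduce: for f(x) = 0.5 * x.T A x the Hessian is 0.5*(A + A.T). A plain-numpy sketch (no AD library required; `fd_hessian` is a hypothetical helper introduced here) that checks this via central differences:

```python
import numpy

N = 10
rng = numpy.random.RandomState(0)
A = rng.rand(N, N)  # a generic, non-symmetric matrix

def f(x):
    return 0.5 * numpy.dot(x, numpy.dot(A, x))

def fd_hessian(f, x, h=1e-5):
    """Central-difference Hessian approximation, entry by entry."""
    n = x.size
    H = numpy.empty((n, n))
    for i in range(n):
        for j in range(n):
            xpp = x.copy(); xpp[i] += h; xpp[j] += h
            xpm = x.copy(); xpm[i] += h; xpm[j] -= h
            xmp = x.copy(); xmp[i] -= h; xmp[j] += h
            xmm = x.copy(); xmm[i] -= h; xmm[j] -= h
            H[i, j] = (f(xpp) - f(xpm) - f(xmp) + f(xmm)) / (4.0 * h * h)
    return H

x = rng.rand(N)
H = fd_hessian(f, x)
# For a quadratic form the exact Hessian is the symmetric part of A
print(numpy.allclose(H, 0.5 * (A + A.T), atol=1e-4))  # True
```

Because f is quadratic, the central difference is exact up to rounding error, which is why a modest tolerance suffices.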

Removing the Boost::Python dependency?
======================================

I have used Boost::Python to wrap it, but I am not happy with that