[SciPy-user] [ANN][Automatic Differentiation] Beta Version of PYADOLC
Wed May 13 02:51:26 CDT 2009
On Tue, May 12, 2009 at 10:25 PM, David Warde-Farley <email@example.com> wrote:
> On 12-May-09, at 12:41 PM, Pauli Virtanen wrote:
>> AD typically builds an "implicit" graph expression corresponding to
>> computation, and constructs the Jacobian based on that. So it's not
>> symbolic or numerical differentiation.
> I've never quite understood the difference between what AD does and
> the 'symbolic' way, but from what I'm reading on Wikipedia it's just a
> way of *implementing* the chain rule cleverly using graph operations.
> Is that what you mean Pauli?
Yes, this is basically it.
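To make the "chain rule via graph operations" idea concrete, here is a minimal forward-mode sketch using dual numbers (a hypothetical `Dual` class for illustration only, not part of PYADOLC or ADOL-C): every arithmetic operation carries a value and a derivative along together, applying the chain rule one operation at a time, so no symbolic expression is ever built.

```python
import math

class Dual:
    """Minimal dual number: carries (value, derivative) through each
    operation, applying the chain rule locally (forward-mode AD)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # chain rule: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# differentiate f(x) = x*sin(x) + 3x at x = 2
x = Dual(2.0, 1.0)   # seed the input derivative with 1
y = x * sin(x) + 3 * x
# y.dot now holds f'(2) = sin(2) + 2*cos(2) + 3, exact up to
# floating-point error
```

The result agrees with the hand-derived expression to machine precision, which is the sense in which AD is "exact" while still being purely numerical.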
> So it is exact differentiation (to the extent the floating point
> hardware can provide) rather than an approximation such as finite
> differences will yield, and thus the resulting code is equivalent in
> function to what you'd get if you symbolically differentiated and then
> coded it up, is that right?
Yes, it is exact up to the usual floating point error.
No, not necessarily. If you have a function
f: R^N --> R with N rather large and you want the gradient f' \in R^N,
then symbolic differentiation would give you one symbolic expression
for each element f'_n, i.e. N expressions in total.
That means that with symbolic differentiation the cost to compute the
gradient scales with N.
In AD it is possible to avoid this scaling with N. The gradient is
only as expensive as a small multiple of the function itself.
That means if N = 1000 and you need 1 second to evaluate the function,
you would need about 1000 seconds to compute the gradient if you use
symbolic differentiation, but only a couple of seconds with AD (in the
so-called reverse mode).
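The same linear scaling hits finite differences, where it is easy to see by counting function evaluations. A small sketch (the `fd_gradient` helper is hypothetical, written here just to count calls):

```python
import numpy as np

calls = 0

def f(x):
    """Example scalar function f: R^N -> R; every call is counted."""
    global calls
    calls += 1
    return float(np.sum(x**2))

def fd_gradient(f, x, h=1e-6):
    """One-sided finite differences: needs N + 1 evaluations of f."""
    f0 = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        xe = x.copy()
        xe[i] += h
        g[i] = (f(xe) - f0) / h
    return g

x = np.ones(1000)
g = fd_gradient(f, x)
# calls is now 1001: cost grows linearly with N, whereas one
# reverse-mode AD sweep yields the whole gradient at a cost that is
# a small constant multiple of a single evaluation of f
```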
Also, AD works on arbitrary algorithms, which means you can also
provide functions with loops in the body. Recursion is typically a
no-go for symbolic differentiation.
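Both points, reverse mode and support for loops, can be sketched in a few lines. The `Var` class and `backward` function below are a hypothetical toy, not PYADOLC's API: the function is run once while the operations record a graph ("tape"), and a single backward sweep then delivers the whole gradient regardless of N.

```python
class Var:
    """Minimal reverse-mode AD node: records parents and the local
    partial derivative of each operation as the function executes."""
    def __init__(self, val, parents=()):
        self.val, self.parents, self.grad = val, parents, 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.val + other.val, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.val * other.val,
                   [(self, other.val), (other, self.val)])

def backward(out):
    """One sweep from the output back to every input.
    Topological order guarantees each node's adjoint is complete
    before it is propagated to its parents."""
    order, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)
    visit(out)
    out.grad = 1.0
    for node in reversed(order):
        for parent, local in node.parents:
            parent.grad += local * node.grad

def f(xs):
    # an "arbitrary algorithm": a loop in the body, fine for AD
    total = Var(0.0)
    for x in xs:
        total = total + x * x
    return total

xs = [Var(float(i)) for i in range(1, 5)]   # N = 4 inputs
y = f(xs)
backward(y)
# xs[i].grad now holds d(sum x_j^2)/dx_i = 2*x_i for every input,
# from a single forward run plus a single backward sweep
```

PYADOLC/ADOL-C do essentially this at C++ speed, recording the operations on a tape; the toy above just shows why the gradient's cost does not scale with the number of inputs.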