Forward- or Reverse-Mode Automatic Differentiation: What's the Difference?

by Birthe van den Berg et al.

Automatic differentiation (AD) has been a topic of interest for researchers in many disciplines, with increased popularity since its application to machine learning and neural networks. Although many researchers appreciate and know how to apply AD, it remains a challenge to truly understand the underlying processes. From an algebraic point of view, however, AD appears surprisingly natural: it originates from the differentiation laws. In this work we use Algebra of Programming techniques to reason about different AD variants, leveraging Haskell to illustrate our observations. Our findings stem from three fundamental algebraic abstractions: (1) the notion of a module over a semiring, (2) Nagata's construction of the 'idealization of a module', and (3) Kronecker's delta function, which together allow us to write a single-line abstract definition of AD. From this single-line definition, and by instantiating our algebraic structures in various ways, we derive different AD variants that have the same extensional behaviour but differ in intensional properties, mainly in terms of (asymptotic) computational complexity. We prove the variants equivalent by means of Kronecker isomorphisms, a further elaboration of our Haskell infrastructure that guarantees correctness by construction. With this framework in place, this paper seeks to make AD variants more comprehensible, taking an algebraic perspective on the matter.
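The paper's own definitions are not reproduced here, but the algebraic ingredients the abstract names can be illustrated with a small, self-contained Haskell sketch: forward-mode AD via dual numbers, i.e. the Nagata idealization of a numeric semiring acting on itself as a module, with the tangent seeded by 1 (the one-variable case of the Kronecker delta). The names `Nagata` and `derive` are ours for illustration, not necessarily those used in the paper.

```haskell
{-# LANGUAGE RankNTypes #-}

-- A Nagata pair N f df carries a primal value f(x) together with a
-- tangent df = f'(x). This is the classic "dual number" construction.
data Nagata d = N { primal :: d, tangent :: d }

instance Num d => Num (Nagata d) where
  N f df + N g dg = N (f + g) (df + dg)
  N f df - N g dg = N (f - g) (df - dg)
  N f df * N g dg = N (f * g) (f * dg + df * g)  -- Leibniz rule
  fromInteger n   = N (fromInteger n) 0          -- constants have zero derivative
  negate (N f df) = N (negate f) (negate df)
  abs    (N f df) = N (abs f) (df * signum f)
  signum (N f _)  = N (signum f) 0

-- Differentiate a function that is polymorphic over Num at a point:
-- seed the tangent with 1 and read off the tangent of the result.
derive :: Num d => (forall a. Num a => a -> a) -> d -> d
derive f x = tangent (f (N x 1))
```

For example, `derive (\x -> x * x + 3 * x) 2` evaluates to `7`, matching the symbolic derivative 2x + 3 at x = 2. Making `Nagata` an instance of `Num` is what lets one unmodified polymorphic function be run either on plain numbers or on primal-tangent pairs.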




Related research:

- Correctness of Automatic Differentiation via Diffeologies and Categorical Gluing
- Automatic Differentiation in Prolog
- Higher Order Automatic Differentiation of Higher Order Functions
- Automatic Differentiation With Higher Infinitesimals, or Computational Smooth Infinitesimal Analysis in Weil Algebra
- Automatic Differentiation for Complex Valued SVD
- Denotational Correctness of Forward-Mode Automatic Differentiation for Iteration and Recursion
- Tricks from Deep Learning
