Operational calculus on programming spaces
In this paper we develop an operational calculus on programming spaces that generalizes existing approaches to automatic differentiation of computer programs and provides a rigorous framework for program analysis through calculus. We present an abstract computing machine that models automatically differentiable computer programs. Computer programs are viewed as maps on a finite-dimensional vector space, called the virtual memory space, which we extend by the tensor algebra of its dual to accommodate derivatives. The extended virtual memory is itself an algebra of programs, a data structure one can calculate with, and its elements give the expansion of the original program as an infinite tensor series at the program's input values. We define the operator of differentiation on programming spaces and implement a generalized shift operator in terms of its powers. Our approach offers a powerful tool for program analysis and approximation, and provides deep learning with a formal calculus. Such a calculus connects general programs with deep learning through operators that map both formulations to the same space. This equivalence enables a generalization of existing methods of neural network analysis to arbitrary computer programs, and vice versa. Several applications are presented, most notably a meaningful method of neural network initialization that leads to a process of program boosting.
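The central identity can be stated compactly. Writing \(\partial\) for the operator of differentiation on a programming space and \(e^{h\partial}\) for the generalized shift operator implemented through its powers, the tensor series expansion of a program \(P\) at an input \(x\) takes the following form (the notation is assumed here for illustration; the precise definitions of the programming space and of \(\partial\) are given in the full text):

\[
  \left(e^{h\partial} P\right)(x) \;=\; \sum_{n=0}^{\infty} \frac{1}{n!}\, \partial^{n} P(x)\,\big(h^{\otimes n}\big) \;=\; P(x + h).
\]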
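As a concrete illustration of how memory locations extended by derivative tensors form an algebra one can calculate with, the following minimal Python sketch (our own illustration under stated assumptions, not the authors' implementation) truncates the tensor series at second order: each extended memory cell carries a value together with its gradient vector and Hessian matrix, and ordinary program arithmetic propagates all three.

    # A minimal sketch of second-order forward-mode differentiation on an
    # extended memory: each cell stores (value, gradient, Hessian), and
    # arithmetic on cells propagates all three by the Leibniz rule.
    # This is an illustrative assumption, not the paper's implementation.
    import numpy as np

    class Dual:
        """A memory cell extended by first- and second-order derivative tensors."""
        def __init__(self, val, grad, hess):
            self.val = val
            self.grad = np.asarray(grad)
            self.hess = np.asarray(hess)

        def __add__(self, other):
            # Addition acts componentwise on all derivative orders.
            return Dual(self.val + other.val,
                        self.grad + other.grad,
                        self.hess + other.hess)

        def __mul__(self, other):
            # Leibniz rule at orders 0, 1, and 2.
            return Dual(self.val * other.val,
                        self.val * other.grad + other.val * self.grad,
                        self.val * other.hess + other.val * self.hess
                        + np.outer(self.grad, other.grad)
                        + np.outer(other.grad, self.grad))

    def variable(x, i, n):
        """Lift the i-th of n memory locations into the extended memory."""
        e = np.zeros(n)
        e[i] = 1.0
        return Dual(x, e, np.zeros((n, n)))

    # Example: the program P(x, y) = x*y + x*x, expanded at (2, 3).
    x, y = variable(2.0, 0, 2), variable(3.0, 1, 2)
    P = x * y + x * x
    print(P.val)   # 10.0
    print(P.grad)  # [7. 2.]              = (y + 2x, x)
    print(P.hess)  # [[2. 1.] [1. 0.]]    = second derivatives of P

Running an ordinary program on such cells yields its truncated expansion at the input, in the spirit of the infinite tensor series above; the paper's construction makes this precise to all orders.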