SC Seminar: Max Aehle

Max Aehle, Chair for Scientific Computing (SciComp), TU Kaiserslautern

Title: Forward-Mode Automatic Differentiation of Compiled Programs

Abstract:

Many optimization algorithms, including training methods for neural networks, rely on the gradient of the objective function. Often, the gradient can be computed by algorithmic differentiation (AD), a set of techniques that obtain accurate derivatives of computer-implemented “client” code by augmenting it with forward-mode or reverse-mode AD logic.
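As a brief illustration (not specific to any of the tools discussed below): for y = f(x) = sin(x²), forward-mode AD propagates a derivative value alongside every intermediate value, computing ẏ = cos(x²) · 2x · ẋ by the chain rule, so seeding ẋ = 1 yields dy/dx directly. Reverse mode instead propagates adjoints from the outputs back to the inputs, which makes gradients of functions with many inputs cheap to obtain.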

AD tools achieve this augmentation in different ways. Source transformation tools (such as TAPENADE) parse the source code and insert AD logic in the respective programming language. Operator overloading tools (such as ADOL-C, CoDiPack or autograd) introduce a new type whose operators and math functions contain AD logic in addition to their original purpose. Compiler-based tools (such as CLAD or Enzyme) insert AD logic during the build process, thus allowing a wider set of programming languages to be used in the client program, and possibly reducing the effort necessary to integrate AD into it.
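To illustrate the operator-overloading approach, here is a minimal, hypothetical C++ sketch of a forward-mode “dual number” type; real tools such as ADOL-C or CoDiPack are far more general and efficient:

    #include <cmath>
    #include <iostream>

    // A value together with its derivative w.r.t. the chosen input.
    struct Dual {
      double val;  // primal value
      double dot;  // derivative value
    };

    // Product rule: (ab)' = a'b + ab'
    Dual operator*(Dual a, Dual b) {
      return {a.val * b.val, a.dot * b.val + a.val * b.dot};
    }

    // Chain rule: (sin a)' = cos(a) * a'
    Dual sin(Dual a) {
      return {std::sin(a.val), std::cos(a.val) * a.dot};
    }

    int main() {
      Dual x{2.0, 1.0};    // seed dx/dx = 1
      Dual y = sin(x * x); // y = sin(x^2)
      std::cout << "y = " << y.val << ", dy/dx = " << y.dot << "\n";
    }

Running this prints the primal value sin(4) together with the derivative 2 · 2 · cos(4), matching the chain-rule result above.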

In this talk, we present the new AD tool Derivgrind, which operates on the machine code of the compiled client program. Derivgrind needs access to only small portions of the client source code in order to identify input and output variables in the compiled program. It therefore has an even wider scope of application, and the potential to demand even less effort from the AD tool user.

We furthermore present applications of Derivgrind’s forward mode to the Python interpreter CPython, and to the software package GATE/Geant4 for simulations in medical imaging and radiotherapy.

How to join online

You can join online via Zoom, using the following link:
https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09
