Algorithmic Differentiation

(Algorithmisches Differenzieren)

Lecturer: Prof. Nicolas R. Gauger, Dr. Max Sagebaum
Date and Place:

  • Lectures: Will be provided as weekly videos.
  • Flipped-classroom session: Wednesdays, 08:00 – 09:30 h (First meeting: 27.10.2021 via Jitsi)
  • Exercise session: Mondays, 14:00 – 15:30 h (First exercise session: 08.11.2021)

Flipped-classroom session zoom link:
(Please check this site for short-notice updates.)
Exercise session zoom link:

ECTS: 5 Credits
Language: English


  • We provide one lecture video and one exercise sheet per week.
  • Questions regarding the lecture videos will be clarified in the flipped classroom sessions.
  • Exercise sheets must be handed in and are discussed in the exercise class.
  • In the first flipped classroom session on 27.10. (online), we will fix the date of the exercise class.
  • Both the flipped-classroom sessions and the exercise sessions will start online via Jitsi. If all participants agree unanimously, we can switch to in-person sessions.
  • There will be an oral examination after the semester.
  • Students have to solve 50% of the exercises in order to qualify for the oral exam.


Master's students in mathematics, computer science, and engineering. An intermediate level of C/C++ programming is recommended.


Algorithmic Differentiation (AD), also called automatic differentiation, is the name given to a set of techniques for evaluating the derivatives of a function that is realized as a computer program. AD is based on the simple fact that every computer program, independent of its complexity, executes a sequence of elementary arithmetic operations (+, -, *, etc.) and elementary functions (sin, cos, exp, etc.). By applying the chain rule of calculus repeatedly to these operations, we can generate another computer program that computes derivatives of arbitrary order of any output with respect to any input.
In the lecture, the fundamentals of AD will be covered. These include the theoretical background as well as implementation aspects. Special emphasis will be given to AD techniques such as checkpointing and iterative differentiation, which facilitate the usage of AD packages especially for large-scale applications. In the exercises, the students are expected to gain more insight and best practices by applying AD techniques to simple examples selected from scientific computing. Tutorials for three established AD packages (ADOL-C, CoDiPack and TAPENADE) will also be provided.


  • Fundamentals of Forward and Reverse Modes of AD
  • Implementation & Software
  • Higher Order Derivatives
  • Reversal Schedules and Checkpointing Schemes
  • Implicit and Iterative Differentiation
  • Memory Issues and Complexity Bounds for AD


  • Griewank, A., Walther, A.: Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation, Second Edition, SIAM
  • Naumann, U.: The Art of Differentiating Computer Programs: An Introduction to Algorithmic Differentiation, SIAM

Lecture slides, lecture videos and exercise sheets will be available on the OLAT website of the course:
KIS link