Scientific Computing Seminar

Date and Place: Thursdays, hybrid (live in Room 32-349 and online via Zoom). For the detailed dates, see the list below!

Content

In the Scientific Computing Seminar we host talks by guests and members of the SciComp team as well as by students of mathematics, computer science, and engineering. Everybody interested in the topics is welcome.

List of Talks

  • Thu
    03
    Nov
    2022

    11:45 Hybrid (Room 32-349 and via Zoom)

    Max Aehle, Chair for Scientific Computing (SciComp), TU Kaiserslautern

    Title: Forward-Mode Automatic Differentiation of Compiled Programs

    Abstract:

    Many optimization algorithms, including training methods for neural networks, rely on the gradient of the objective function. Often, the gradient can be obtained by algorithmic differentiation (AD), a set of techniques to compute accurate derivatives of computer-implemented “client” code by augmenting it with forward-mode or reverse-mode AD logic.

    AD tools achieve this augmentation in different ways. Source transformation tools (such as TAPENADE) parse the source code and insert AD logic in the respective programming language. Operator overloading tools (such as ADOL-C, CoDiPack or autograd) introduce a new type whose operators and math functions contain AD logic in addition to their original purpose. Compiler-based tools (such as CLAD or Enzyme) insert AD logic during the build process, thus allowing a wider set of programming languages to be used in the client program, and possibly reducing the amount of effort necessary to integrate AD into it. (A minimal dual-number illustration of the operator-overloading idea is given at the end of this abstract.)

    In this talk, we present the new AD tool Derivgrind, which operates on the machine code of the compiled client program. Derivgrind needs access to only small portions of the client source code in order to identify input and output variables in the compiled program. It therefore has an even more general scope of application, and the potential to be even less time-consuming for the AD tool user.

    We furthermore present applications of Derivgrind’s forward mode to the Python interpreter CPython, and to the software package GATE/Geant4 for simulations in medical imaging and radiotherapy.
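
    As a rough illustration of the operator-overloading approach mentioned above, here is a minimal forward-mode sketch in Python built on a dual-number type. It is a generic toy example under names of our own choosing, not code from ADOL-C, CoDiPack, autograd, or Derivgrind.

        # A dual number carries a value and the derivative of that value
        # with respect to one chosen input ("forward mode").
        import math

        class Dual:
            def __init__(self, val, dot=0.0):
                self.val = val   # primal value
                self.dot = dot   # derivative carried alongside the value

            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.val + other.val, self.dot + other.dot)

            __radd__ = __add__

            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                # product rule
                return Dual(self.val * other.val,
                            self.dot * other.val + self.val * other.dot)

            __rmul__ = __mul__

        def sin(x):
            # chain rule for an elementary math function
            return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

        # f(x) = x*sin(x) + 2*x, evaluated together with df/dx at x = 1.5
        x = Dual(1.5, 1.0)    # seeding dot = 1 selects the derivative d/dx
        f = x * sin(x) + 2 * x
        print(f.val, f.dot)   # value and exact derivative sin(x) + x*cos(x) + 2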

    How to join online

    You can join online via Zoom, using the following link:
    https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09

  • Thu
    17
    Nov
    2022

    11:45 Hybrid (Room 32-349 and via Zoom)

    Guillermo Suarez, Chair for Scientific Computing (SciComp), TU Kaiserslautern

    Title: Non-Linear Surrogate Model Design for Aerodynamic Dataset Generation

    Abstract:

    One of the key components for designing an Air Vehicle is an Aerodynamic Dataset – a model representing the aircraft’s aerodynamic behavior throughout the entire flight envelope. One of the major benefits of Artificial Intelligence (AI) is the possibility of automating most parts of a design process within a highly reduced timeframe. It is therefore believed that one or several AI techniques show potential for automating parts of the current dataset generation process.

    In this talk we will present the latest work on the design and development of a non-linear surrogate model to adequately predict static stability and control parameters for the Unmanned Combat Air Vehicle DLR-F17 and the DLR-F19 aircraft. Within this scope, two different topologically optimized machine-learning-based surrogate modelling techniques are investigated in order to achieve the best generalization properties.

    The first architecture being investigated is Artificial Neural Networks (ANN), which have gained great popularity due to their ability to efficiently process large amounts of data. The second architecture relies on Ensemble Learning (EL) methods. For the current work, three different ensemble learning models have been assessed: Random Forests, Adaptive Boosting, and Extreme Gradient Boosting.

    Additional lines of research investigate machine learning algorithms with multiple outputs, in order to explore whether a multi-output learning framework brings benefits, in the form of increased predictive performance, compared to a framework based on training multiple disjoint models. Furthermore, because of their outstanding performance in feature extraction, one-dimensional Convolutional Neural Networks (CNN) will also be considered as a surrogate model.
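
    To make this concrete, the following is only a generic sketch using synthetic data and scikit-learn's stock regressors, not the DLR-F17/F19 dataset or the topology-optimized models from the talk; it shows how such ensemble surrogates can be set up and compared. For the multi-output setting, a wrapper such as sklearn.multioutput.MultiOutputRegressor fits one such model per output, whereas a single ANN can predict all outputs jointly.

        # Illustrative ensemble surrogates on synthetic "aerodynamic-style" data.
        import numpy as np
        from sklearn.ensemble import (RandomForestRegressor, AdaBoostRegressor,
                                      GradientBoostingRegressor)
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # Hypothetical inputs: Mach number, angle of attack, sideslip angle.
        X = rng.uniform([0.2, -10.0, -5.0], [0.9, 20.0, 5.0], size=(500, 3))
        # Hypothetical output: a single synthetic stability coefficient.
        y = np.sin(X[:, 1] / 10.0) + 0.3 * X[:, 0] * X[:, 2] \
            + 0.05 * rng.normal(size=500)

        surrogates = {
            "random_forest": RandomForestRegressor(n_estimators=200),
            "adaboost": AdaBoostRegressor(n_estimators=200),
            # Extreme Gradient Boosting would normally use the xgboost package;
            # scikit-learn's gradient boosting serves as a stand-in here.
            "gradient_boosting": GradientBoostingRegressor(n_estimators=200),
        }

        for name, model in surrogates.items():
            scores = cross_val_score(model, X, y, cv=5, scoring="r2")
            print(f"{name}: mean R^2 = {scores.mean():.3f}")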

    How to join online

    You can join online via Zoom, using the following link:
    https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09

  • Tue
    13
    Dec
    2022

    15:00 Room 32-349

    The goal of the workshop is to give an overview of recent research activities at SciComp. In addition, collaborators from HS Offenburg, KIT, MTU and BOSCH will give invited presentations for scientific exchange. Finally, we will brainstorm about future collaboration.

    Program

    15:00-15:50 Scientific Short Presentations – SciComp (5 minutes each)
    (Dr. E. Özkaya, Dr. M. Sagebaum, Dr. L. Kusch, O. Burghardt, R. Pochampalli, J. Blühdorn, G. Suarez, Dr. L. Chen, M. Aehle, J. Rottmayer)

    15:50-16:15 Coffee break

    Special Invited Talk
    16:15-17:15 Steffen Schotthöfer (KIT, Karlsruhe)

    17:15-18:00 Scientific Short Presentations – Guests (10+5 minutes each)
    (Prof. J. Keuper (HS Offenburg), T. Kattmann (BOSCH), C. Battistoni (MTU))

    18:00-18:30 Brainstorming/Thoughts on Future Collaboration (Prof. N. Gauger)

    18:30 Workshop Dinner

  • Tue
    13
    Dec
    2022

    16:15 Hybrid (Room 32-349 and via Zoom)

    Steffen Schotthöfer, Institute for Applied and Numerical Mathematics, Karlsruhe Institute of Technology (KIT)

    Title: Dynamical Low Rank Compression for Efficient Neural Network Training

    Abstract:

    Neural networks have achieved tremendous success in a large variety of applications. However, their memory footprint and computational demand can render them impractical in application settings with limited hardware or energy resources. In this work, we propose a novel algorithm to find efficient low-rank subnetworks. Remarkably, these subnetworks are determined and adapted already during the training phase and the overall time and memory resources required by both training and evaluating them are significantly reduced. The main idea is to restrict the weight matrices to a low-rank manifold and to update the low-rank factors rather than the full matrix during training. To derive training updates that are restricted to the prescribed manifold, we employ techniques from dynamic model order reduction for matrix differential equations. This allows us to provide approximation, stability, and descent guarantees. Moreover, our method automatically and dynamically adapts the ranks during training to achieve the desired approximation accuracy. The efficiency of the proposed method is demonstrated through a variety of numerical experiments on fully-connected and convolutional networks.
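
    As a heavily simplified sketch of the core idea only, the following PyTorch layer stores its weight as a fixed-rank product of two thin factors and trains those factors instead of the full matrix. The rank adaptation and the dynamical low-rank update steps with their approximation, stability, and descent guarantees, which are the actual subject of the talk, are not shown here; the layer and parameter names are our own.

        import torch
        import torch.nn as nn

        class LowRankLinear(nn.Module):
            """Linear layer with weight W factorized as U @ V (rank r)."""
            def __init__(self, in_features, out_features, rank):
                super().__init__()
                # W (out x in) is never formed; storage is rank*(in+out) numbers.
                self.U = nn.Parameter(torch.randn(out_features, rank) / rank**0.5)
                self.V = nn.Parameter(torch.randn(rank, in_features) / in_features**0.5)
                self.bias = nn.Parameter(torch.zeros(out_features))

            def forward(self, x):
                # Two thin matrix products instead of one full one.
                return (x @ self.V.T) @ self.U.T + self.bias

        # A 1024x1024 layer (~1M weights) replaced by a rank-32 factorization
        # (~65k weights); gradients flow only to the low-rank factors.
        layer = LowRankLinear(1024, 1024, rank=32)
        out = layer(torch.randn(8, 1024))
        loss = out.pow(2).mean()
        loss.backward()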

    How to join online

    You can join online via Zoom, using the following link:
    https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09

  • Thu
    05
    Jan
    2023

    11:45 Hybrid (Room 32-349 and via Zoom)

    Jan Rottmayer, Chair for Scientific Computing (SciComp), TU Kaiserslautern

    Title: Trailing Edge Noise Reduction by Porous Treatment using Derivative-Free Optimization

    Abstract:

    This talk focuses on the derivative-free optimization algorithm known as Efficient Global Optimization (EGO) and its application to reducing the noise generated by the blunt trailing edge of an airfoil. We compute noise levels over a range of frequencies and optimize by adjusting porosity and permeability as the design parameters.

    In general, lifting bodies are used across a wide range of applications, including transportation and energy generation, and they can produce unwanted noise in a variety of ways. For airfoils, the leading causes are leading-edge and trailing-edge noise. The latter occurs when the boundary layer convects turbulent sources over the trailing edge. Reference [1] demonstrated noise reduction via porous treatment at the trailing edge. "Porous edges and surfaces can act to reduce the correlation of a transitional or turbulent boundary layer, as well as reducing the convection velocity inside the boundary layer, which is an important determining factor for the magnitude of the scattered acoustic waves." [2]
    We explore this idea using the example of an airfoil with a blunt trailing edge and a partially porous geometry. The trailing-edge porosity is parametrized to reduce the dimension of the design space. Further, the computational costs are reduced by using an Amiet-based trailing-edge noise model and a surrogate-based optimization approach (an illustrative sketch of such a loop is given after the references).

    [1] T.A. Smith, C.A. Klettner. Airfoil trailing-edge noise and drag reduction at a moderate Reynolds number using wavy geometries; https://doi.org/10.1063/5.0120124; 2022
    [2] T. Geyer, E. Sarradj. Trailing edge noise of partially porous airfoils; https://doi.org/10.2514/6.2014-3039; 2014
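
    As an illustrative sketch of such a surrogate-based loop, the snippet below runs an EGO-style optimization on a made-up one-dimensional toy objective: a Gaussian-process surrogate is refit after every sample, and the next design point maximizes the expected improvement. The toy objective, bounds, and kernel are assumptions made for illustration, not details of the trailing-edge noise application.

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        def objective(x):
            # Stand-in for an expensive noise-level evaluation.
            return np.sin(3 * x) + 0.5 * x**2

        X = np.array([[0.2], [1.0], [2.5]])   # initial design points
        y = objective(X).ravel()

        for _ in range(10):
            gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
            gp.fit(X, y)

            candidates = np.linspace(-2.0, 3.0, 400).reshape(-1, 1)
            mu, sigma = gp.predict(candidates, return_std=True)

            # Expected improvement for minimization.
            best = y.min()
            z = (best - mu) / np.maximum(sigma, 1e-12)
            ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

            x_next = candidates[np.argmax(ei)].reshape(1, -1)
            X = np.vstack([X, x_next])
            y = np.append(y, objective(x_next).ravel())

        print("best design found:", X[np.argmin(y)], "value:", y.min())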

    How to join online

    You can join online via Zoom, using the following link:
    https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09

  • Thu
    12
    Jan
    2023

    11:45 Hybrid (Room 32-349 and via Zoom)

    Prof. Esteban Ferrer, Professor of Applied Mathematics at the School of Aeronautics, Technical University of Madrid (UPM), Madrid, Spain

    Title: New avenues in high order fluid dynamics

    Abstract:

    We present the latest developments of our High-Order Spectral Element Solver (HORSES3D), an open source high-order discontinuous Galerkin framework capable of solving a variety of flow applications, including compressible flows (with or without shocks), incompressible flows, various RANS and LES turbulence models, particle dynamics, multiphase flows, and aeroacoustics [1].

    Recent developments allow us to simulate challenging multiphysics problems, including turbulent flows, multiphase flows, and moving bodies, using local p-adaption and fast multigrid time advancement. In addition, we present recent work that couples machine learning techniques with high-order simulations.

    [1] E. Ferrer, G. Rubio, G. Ntoukas, W. Laskowski, O. Mariño, S. Colombo, A. Mateo-Gabín, F. Manrique de Lara, D. Huergo, J. Manzanero, A. M. Rueda-Ramírez, D. A. Kopriva, E. Valero. HORSES3D: a high-order discontinuous Galerkin solver for flow simulations and multi-physics applications. arXiv:2206.09733, 2022

    How to join online

    You can join online via Zoom, using the following link:
    https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09

  • Thu
    02
    Feb
    2023

    11:45 Hybrid (Room 32-349 and via Zoom)

    Max Aehle, Chair for Scientific Computing (SciComp), University of Kaiserslautern-Landau (RPTU)

    Title: Reverse-Mode Automatic Differentiation of Compiled Programs

    Abstract:

    Algorithmic differentiation (AD) is a set of techniques to obtain accurate derivatives of computer-implemented functions. For gradient-based numerical optimization purposes, the reverse mode of AD is especially suited — the run-time it needs to compute a gradient of the objective function is proportional to the run-time of the objective function, and independent of the number of design parameters.

    In practice, classical AD tools require that the source code of the computer-implemented function is available in a limited set of programming languages. As a step towards making AD applicable to cross-language or partially closed-source client programs, we developed the new AD tool Derivgrind [1]. Derivgrind leverages the dynamic binary instrumentation framework Valgrind to add forward-mode AD logic to the machine code of the compiled client program.

    In this talk, we present the new index-handling and tape-recording capabilities that we have added to Derivgrind over the last months [2]. In combination with a simple tape evaluator program, they enable operator-overloading-style reverse-mode AD for compiled programs (a toy sketch of the general tape principle is given after the references).

    [1] Max Aehle, Johannes Blühdorn, Max Sagebaum, Nicolas R. Gauger. Forward-Mode Automatic Differentiation of Compiled Programs. arXiv:2209.01895, 2022.
    [2] Max Aehle, Johannes Blühdorn, Max Sagebaum, Nicolas R. Gauger. Reverse-Mode Automatic Differentiation of Compiled Programs. arXiv:2212.13760, 2022.
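
    To illustrate the general tape principle in operator-overloading style (a toy Python sketch, not Derivgrind's machine-code-level implementation), each operation below records its inputs and local partial derivatives on a tape, and a single reverse sweep over the tape accumulates the full gradient.

        import math

        class Var:
            tape = []   # records (inputs, local partial derivatives, output)

            def __init__(self, val):
                self.val = val
                self.grad = 0.0

            def __add__(self, other):
                out = Var(self.val + other.val)
                Var.tape.append(((self, other), (1.0, 1.0), out))
                return out

            def __mul__(self, other):
                out = Var(self.val * other.val)
                Var.tape.append(((self, other), (other.val, self.val), out))
                return out

        def sin(x):
            out = Var(math.sin(x.val))
            Var.tape.append(((x,), (math.cos(x.val),), out))
            return out

        def backward(output):
            # Evaluate the tape in reverse, accumulating adjoints.
            output.grad = 1.0
            for inputs, partials, out in reversed(Var.tape):
                for inp, d in zip(inputs, partials):
                    inp.grad += d * out.grad

        # f(x, y) = x*y + sin(x): one reverse sweep yields both df/dx and
        # df/dy, independent of the number of inputs.
        x, y = Var(1.5), Var(2.0)
        f = x * y + sin(x)
        backward(f)
        print(f.val, x.grad, y.grad)   # x.grad = y + cos(x), y.grad = x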

    How to join online

    You can join online via Zoom, using the following link:
    https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09