Scientific Computing Seminar

Date and Place: Thursdays, hybrid (live in Room 32-349 / online via Zoom). For detailed dates, see below!

Content

In the Scientific Computing Seminar we host talks by guests and members of the SciComp team as well as by students of mathematics, computer science and engineering. Everybody interested in the topics is welcome.

List of Talks

  • Thu 27 Oct 2022 – Fri 10 Feb 2023

    Prof. Dr. Nicolas Gauger, Chair for Scientific Computing (SciComp), TU Kaiserslautern

    SciComp Seminar Series

    Please contact Prof. Gauger if you want to register for an online talk in our SciComp Seminar Series, or simply to register for the seminar.

    A list of the already scheduled talks can be found below.

  • Thu 03 Nov 2022

    11:45 Hybrid (Room 32-349 and via Zoom)

    Max Aehle, Chair for Scientific Computing (SciComp), TU Kaiserslautern

    Title: Forward-Mode Automatic Differentiation of Compiled Programs

    Abstract:

    Many optimization algorithms, including training methods for neural networks, rely on the gradient of the objective function. Often, the gradient can be formed by algorithmic differentiation (AD), which is a set of techniques to obtain accurate derivatives of computer-implemented “client” code, augmenting it with forward-mode or reverse-mode AD logic.

    AD tools achieve this augmentation in different ways. Source transformation tools (such as TAPENADE) parse the source code and insert AD logic in the respective programming language. Operator overloading tools (such as ADOL-C, CoDiPack or autograd) introduce a new type whose operators and math functions contain AD logic in addition to their original purpose (a minimal dual-number sketch of this idea follows at the end of this entry). Compiler-based tools (such as CLAD or Enzyme) insert AD logic during the build process, thus allowing a wider set of programming languages to be used in the client program and potentially reducing the effort needed to integrate AD into it.

    In this talk, we present the new AD tool Derivgrind, which operates on the machine code of the compiled client program. Derivgrind needs access only to small portions of the client source code in order to identify input and output variables in the compiled program. It therefore has an even more general scope of application and the potential to be even less time-consuming for the AD tool user.

    We furthermore present applications of Derivgrind’s forward mode to the Python interpreter CPython, and to the software package GATE/Geant4 for simulations in medical imaging and radiotherapy.

    How to join online

    You can join online via Zoom, using the following link:
    https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09
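
    For readers unfamiliar with forward-mode AD, the operator-overloading idea mentioned in the abstract (as used by ADOL-C, CoDiPack or autograd) can be illustrated with a minimal dual-number sketch in Python. This is a didactic illustration with invented names only; it is not Derivgrind, which works on the machine code of the compiled program rather than on overloaded types.

        # Didactic dual-number sketch (invented names); illustrates operator-overloading
        # forward-mode AD, not Derivgrind's machine-code approach.
        import math

        class Dual:
            """Minimal forward-mode AD value: carries the primal value and one tangent."""
            def __init__(self, val, dot=0.0):
                self.val = val   # primal value
                self.dot = dot   # directional derivative w.r.t. the seeded input

            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.val + other.val, self.dot + other.dot)
            __radd__ = __add__

            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                # product rule
                return Dual(self.val * other.val,
                            self.dot * other.val + self.val * other.dot)
            __rmul__ = __mul__

        def sin(x):
            # chain rule for an elementary function
            if isinstance(x, Dual):
                return Dual(math.sin(x.val), math.cos(x.val) * x.dot)
            return math.sin(x)

        # "Client" code, written without any knowledge of AD:
        def f(x, y):
            return x * y + sin(x)

        # Seed x with tangent 1.0 to obtain df/dx at (x, y) = (2, 3).
        x = Dual(2.0, 1.0)
        y = Dual(3.0, 0.0)
        out = f(x, y)
        print(out.val, out.dot)   # f(2, 3) and df/dx = y + cos(x)

    One forward sweep with a seeded tangent yields one directional derivative; reverse-mode AD, by contrast, obtains the full gradient in a single backward sweep at the cost of recording the computation.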

  • Thu 17 Nov 2022

    11:45 Hybrid (Room 32-349 and via Zoom)

    Guillermo Suarez, Chair for Scientific Computing (SciComp), TU Kaiserslautern

    Title: Non-Linear Surrogate Model Design for Aerodynamic Dataset Generation

    Abstract:

    One of the key components for designing an Air Vehicle is an Aerodynamic Dataset – a model representing the aircraft’s aerodynamic behavior throughout the entire flight envelope. One of the major benefits of Artificial Intelligence (AI) is the possibility to automate most parts of a design process in a highly reduced timeframe. It is therefore believed that one or several AI techniques show potential for automating parts of the current dataset generation process.

    In this talk, we will present the latest work regarding the design and development of a non-linear surrogate model to adequately predict static stability and control parameters for the Unmanned Combat Air Vehicle DLR-F17 and the DLR-F19 aircraft. Within this scope, two different topologically optimized, machine-learning-based surrogate modelling techniques are investigated in order to achieve the best generalization properties.

    The first architecture being investigated is Artificial Neural Networks (ANN), which have gained great popularity due to their ability to efficiently process large amounts of data. The second architecture relies on Ensemble Learning (EL) methods. For the current work, three different ensemble learning models have been assessed: Random Forests, Adaptive Boosting and Extreme Gradient Boosting.

    Additional lines of research also investigate machine learning algorithms with multiple outputs, in order to explore whether a multi-output learning framework brings benefits in the form of increased predictive performance compared to a framework based on training multiple disjoint models. Furthermore, because of their outstanding performance in feature extraction, 1-dimensional Convolutional Neural Networks (CNN) will also be considered as a surrogate model. (A minimal sketch of ensemble-based surrogate regression follows at the end of this entry.)

    How to join online

    You can join online via Zoom, using the following link:
    https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09
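
    As a rough illustration of the ensemble-based surrogates mentioned in the abstract, the following Python sketch fits two scikit-learn regressors to an invented, synthetic stand-in for an aerodynamic table (Mach number and angle of attack in, one coefficient out). The data, model choices and hyperparameters are placeholders only; they are not the DLR-F17/F19 dataset or the models investigated in the talk, and scikit-learn's GradientBoostingRegressor stands in for the boosting variants.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
        from sklearn.metrics import mean_absolute_error
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in for an aerodynamic dataset (invented): inputs are Mach
        # number and angle of attack, output is a pitching-moment-like coefficient.
        rng = np.random.default_rng(0)
        X = rng.uniform([0.2, -5.0], [0.9, 20.0], size=(2000, 2))   # [Mach, alpha_deg]
        cm = -0.01 * X[:, 1] + 0.05 * np.sin(3.0 * X[:, 0] * np.radians(X[:, 1]))
        y = cm + rng.normal(scale=1e-3, size=len(cm))               # add measurement noise

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

        surrogates = {
            "random forest": RandomForestRegressor(n_estimators=300, random_state=0),
            "gradient boosting": GradientBoostingRegressor(n_estimators=300, random_state=0),
        }
        for name, model in surrogates.items():
            model.fit(X_train, y_train)
            err = mean_absolute_error(y_test, model.predict(X_test))
            print(f"{name}: test MAE = {err:.4f}")

    A multi-output variant would predict several coefficients at once from the same inputs, which is the comparison raised in the last paragraph of the abstract.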

  • Tue 13 Dec 2022

    15:00 Room 32-349

    The goal of the workshop is to give an overview of recent research activities at SciComp. In addition, collaborators from HS Offenburg, KIT, MTU and BOSCH will give invited presentations for scientific exchange. Finally, we will brainstorm about future collaboration.

    Program

    15:00-15:50 Scientific Short Presentations – SciComp (5 minutes each)
    (Dr. E. Özkaya, Dr. M. Sagebaum, Dr. L. Kusch, O. Burghardt, R. Pochampalli, J. Blühdorn, G. Suarez, Dr. L. Chen, M. Aehle, J. Rottmayer)

    15:50-16:15 Coffee break

    Special Invited Talk
    16:15-17:15 Steffen Schotthöfer (KIT, Karlsruhe)

    17:15-18:00 Scientific Short Presentations – Guests (10+5 minutes each)
    (Prof. J. Keuper (HS Offenburg), T. Kattmann (BOSCH), C. Battistoni (MTU))

    18:00-18:30 Brainstorming/Thoughts on Future Collaboration (Prof. N. Gauger)

    18:30 Workshop Dinner

  • Tue 13 Dec 2022

    16:15 Hybrid (Room 32-349 and via Zoom)

    Steffen Schotthöfer, Institute for Applied and Numerical Mathematics, Karlsruhe Institute of Technology (KIT)

    Title: Dynamical Low Rank Compression for Efficient Neural Network Training

    Abstract:

    Neural networks have achieved tremendous success in a large variety of applications. However, their memory footprint and computational demand can render them impractical in application settings with limited hardware or energy resources. In this work, we propose a novel algorithm to find efficient low-rank subnetworks. Remarkably, these subnetworks are determined and adapted already during the training phase, and the overall time and memory resources required for both training and evaluating them are significantly reduced. The main idea is to restrict the weight matrices to a low-rank manifold and to update the low-rank factors rather than the full matrix during training. To derive training updates that are restricted to the prescribed manifold, we employ techniques from dynamic model order reduction for matrix differential equations. This allows us to provide approximation, stability, and descent guarantees. Moreover, our method automatically and dynamically adapts the ranks during training to achieve the desired approximation accuracy. The efficiency of the proposed method is demonstrated through a variety of numerical experiments on fully-connected and convolutional networks. (A minimal sketch of updating low-rank factors instead of full matrices follows at the end of this entry.)

    How to join online

    You can join online via Zoom, using the following link:
    https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09
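
    To make the idea of updating low-rank factors instead of full weight matrices concrete, here is a deliberately naive Python/NumPy sketch that fits a fixed-rank factorization U·V to a synthetic linear regression task by plain gradient descent on the factors. All sizes and learning rates are invented, and this is not the dynamical low-rank scheme of the talk, which additionally evolves the factors on the low-rank manifold with a robust integrator and adapts the rank during training.

        # Naive fixed-rank factorized training (invented toy setup), for illustration only.
        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic regression task whose true map is exactly rank 4.
        n, d_in, d_out, r = 1024, 64, 32, 4
        W_true = rng.normal(size=(d_in, r)) @ rng.normal(size=(r, d_out))
        X = rng.normal(size=(n, d_in))
        Y = X @ W_true

        # Store and update only the factors U (d_in x r) and V (r x d_out):
        # 64*4 + 4*32 = 384 parameters instead of 64*32 = 2048 for the full matrix.
        U = 0.1 * rng.normal(size=(d_in, r))
        V = 0.1 * rng.normal(size=(r, d_out))

        lr = 0.005
        for step in range(3000):
            R = X @ U @ V - Y                    # residual of the low-rank model
            G = X.T @ R / n                      # gradient w.r.t. the full product U @ V
            grad_U, grad_V = G @ V.T, U.T @ G    # chain rule onto the two factors
            U -= lr * grad_U
            V -= lr * grad_V

        print("relative error:", np.linalg.norm(U @ V - W_true) / np.linalg.norm(W_true))

    In a neural network, the same trick is applied per layer; the contribution described in the abstract is to perform such updates while staying on the rank-r manifold with stability and descent guarantees, and to adapt the rank on the fly.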

  • Thu 05 Jan 2023

    11:45 Hybrid (Room 32-349 and via Zoom)

    Jan Rottmayer, Chair for Scientific Computing (SciComp), TU Kaiserslautern

    Title: Trailing Edge Noise Reduction via Optimal Porous Design

    Abstract:

    Coming soon …

    How to join online

    You can join online via Zoom, using the following link:
    https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09

  • Thu 12 Jan 2023

    11:45 Hybrid (Room 32-349 and via Zoom)

    Prof. Esteban Ferrer, Professor in Applied Mathematics at the School of Aeronautics, Technical University of Madrid (UPM), Madrid, Spain

    Title: New avenues in high order fluid dynamics

    Abstract:

    We present the latest developments of our High-Order Spectral Element Solver (HORSES3D), an open source high-order discontinuous Galerkin framework capable of solving a variety of flow applications, including compressible flows (with or without shocks), incompressible flows, various RANS and LES turbulence models, particle dynamics, multiphase flows, and aeroacoustics [1].

    Recent developments allow us to simulate challenging multiphysics problems, including turbulent flows, multiphase flows and moving bodies, using local p-adaptation and fast multigrid time advancement. In addition, we present recent work that couples machine learning techniques with high-order simulations.

    [1] E. Ferrer, G. Rubio, G. Ntoukas, W. Laskowski, O. Mariño, S. Colombo, A. Mateo-Gabín, F. Manrique de Lara, D. Huergo, J. Manzanero, A. M. Rueda-Ramírez, D. A. Kopriva, E. Valero. HORSES3D: a high-order discontinuous Galerkin solver for flow simulations and multi-physics applications. arXiv:2206.09733, 2022.

    How to join online

    You can join online via Zoom, using the following link:
    https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09