Here you can find upcoming and past events organized by our group.
10:00 SC Seminar Room 32-349
Xinyu Hui, Northwestern Polytechnical University, Xi’an, China
Fast pressure distribution prediction and unsteady periodic flow field prediction method based on deep learning
In aerodynamic design, optimizing the pressure distribution of airfoils is crucial for aerodynamic components. Conventionally, the pressure distribution is obtained by computational fluid dynamics (CFD), which is a time-consuming task. Surrogate modeling can reduce this expense to some extent, but it requires careful shape-parameterization schemes for airfoils. As an alternative, deep learning approximates the input–output mapping without solving the computationally expensive physical equations and avoids the limitations of particular parameterization methods. I therefore present a data-driven approach for predicting the pressure distribution over airfoils based on a Convolutional Neural Network (CNN). Given the airfoil geometry, the prediction of aerodynamic performance is posed as a supervised learning problem. Furthermore, we use a universal and flexible parameterization method, the Signed Distance Function, to improve the performance of the CNN. Given unseen airfoils from the validation dataset, the trained model predicts the pressure coefficient in seconds, with a mean square error of less than 2%.
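The Signed Distance Function input representation mentioned above can be sketched in a few lines of numpy. The following is a minimal, illustrative example; the NACA 0012 sampling, grid extents, and 64×64 resolution are assumptions for the sketch, not details from the talk.

```python
import numpy as np

def naca0012(n=200):
    """Sample a NACA 0012 airfoil (unit chord) as a closed polygon."""
    x = 0.5 * (1 - np.cos(np.linspace(0.0, np.pi, n)))  # cosine clustering near LE/TE
    yt = 5 * 0.12 * (0.2969 * np.sqrt(x) - 0.1260 * x - 0.3516 * x**2
                     + 0.2843 * x**3 - 0.1036 * x**4)   # closed-trailing-edge variant
    xs = np.concatenate([x[::-1], x[1:]])               # upper TE->LE, then lower LE->TE
    ys = np.concatenate([yt[::-1], -yt[1:]])
    return np.column_stack([xs, ys])

def inside(p, poly):
    """Even-odd ray casting: is point p inside the closed polygon?"""
    x, y = p
    j = len(poly) - 1
    hit = False
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        if (yi > y) != (yj > y):
            if x < xi + (y - yi) * (xj - xi) / (yj - yi):
                hit = not hit
        j = i
    return hit

def signed_distance_field(poly, nx=64, ny=64):
    """SDF on a Cartesian grid: negative inside the airfoil, positive outside."""
    gx, gy = np.meshgrid(np.linspace(-0.2, 1.2, nx), np.linspace(-0.5, 0.5, ny))
    pts = np.column_stack([gx.ravel(), gy.ravel()])
    # unsigned distance to the nearest boundary sample
    d = np.min(np.linalg.norm(pts[:, None, :] - poly[None, :, :], axis=2), axis=1)
    sign = np.array([-1.0 if inside(p, poly) else 1.0 for p in pts])
    return (sign * d).reshape(ny, nx)

sdf = signed_distance_field(naca0012())
print(sdf.shape)                   # (64, 64): one input "channel" for a CNN
print(sdf.min() < 0 < sdf.max())   # True: negative inside, positive outside
```

The appeal of this representation is that any airfoil shape maps to the same fixed-size grid, so the CNN input does not depend on a particular shape-parameterization scheme.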
A deep-learning method for predicting periodic unsteady flow fields is also proposed, which can accurately predict complex vortex flow states at different moments in real time. By combining a conditional generative adversarial network with a convolutional neural network and improving the conditional-constraint mechanism of the former, we propose a deep learning framework with conditional constraints: the regression generative adversarial network. The two approaches, the conditional generative adversarial network and the regression generative adversarial network, are tested and compared by supplying different periodic moments and predicting the corresponding flow field variables at each moment. The final results demonstrate that the regression generative adversarial network can estimate complex flow fields, and far faster than CFD simulation.
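The key idea of conditioning a generator on the periodic moment can be illustrated with a toy forward pass. The two-layer network, its dimensions, and the (sin t, cos t) phase encoding below are invented for illustration; this is not the architecture from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy generator G(z, t): an MLP whose input concatenates a noise
# vector z with a periodic encoding (sin t, cos t) of the flow phase, so one
# network can be queried at any moment of the cycle.
D_Z, D_HID, D_FIELD = 8, 32, 16 * 16          # noise dim, hidden dim, 16x16 patch
W1 = rng.standard_normal((D_Z + 2, D_HID)) * 0.1
W2 = rng.standard_normal((D_HID, D_FIELD)) * 0.1

def generator(z, t):
    """Map (noise, periodic moment t in [0, 2*pi)) to a flow-field patch."""
    cond = np.array([np.sin(t), np.cos(t)])    # periodic phase encoding
    h = np.tanh(np.concatenate([z, cond]) @ W1)
    return (h @ W2).reshape(16, 16)

z = rng.standard_normal(D_Z)
snap_a = generator(z, 0.0)
snap_b = generator(z, np.pi)                   # same noise, different phase
snap_c = generator(z, 2 * np.pi)               # full period: encoding repeats
print(snap_a.shape)                            # (16, 16)
print(np.allclose(snap_a, snap_c))             # True: output is periodic in t
print(np.allclose(snap_a, snap_b))             # False: the phase changes the field
```

In a trained model, the adversarial (or regression) loss would shape `W1` and `W2` so that each phase maps to the corresponding flow snapshot; the sketch only shows how the conditioning input enters the network.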
11:30 SC Seminar Room 32-349
Prof. Dr. Hermann G. Matthies, Institut für Wissenschaftliches Rechnen, TU Braunschweig
Probability, Algebra, Analysis, and Numerics
Probability theory was built axiomatically on the concept of measure by A. Kolmogorov in the early 1930s, with the probability measure and the sigma-algebra as primary objects, and random variables, i.e. measurable functions, and their expectations as secondary. Not long after Kolmogorov's work, developments in operator algebras connected to quantum theory in the early 1940s led to similar results in an approach where algebras of random variables and the expectation functional are the primary objects. Historically, this picks up the view implicitly contained in the early probabilistic theory of the Bernoullis.
This algebraic approach allows extensions to more complicated concepts such as non-commuting random variables and infinite-dimensional function spaces, as they occur e.g. in quantum field theory, random matrices, and tensor-valued random fields. It not only fully recovers the measure-theoretic approach, but can extend it considerably. For much practical and numerical work, which is often primarily concerned with random variables, expectations, and conditioning, it offers an independent theoretical underpinning. In short, it is “probability without measure theory”.
This functional-analytic setting also has strong connections to the spectral theory of linear operators, where analogies to integration become apparent if one looks for them. It then becomes possible to compute functions other than just polynomials of the algebraic random variables. These links extend in a twofold way to the concept of a weak distribution, which describes probability on infinite-dimensional vector spaces. Here the random elements are represented by linear mappings, and factorisations of linear maps are intimately connected with representations and tensor products, as they appear in numerical approximations.
Taking this conceptual basis of vector spaces, algebras, linear functionals, and operators gives a fresh view on the concepts of expectation and conditioning, as they occur in applications of Bayes’ theorem.
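One concrete instance of this view is conditioning as orthogonal projection in L²: restricted to linear maps, the conditional expectation becomes the familiar Gauss–Markov / linear Bayesian update. A small numpy sketch, with a hypothetical linear model X = 2Y + noise standing in for the observation relation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Samples from a hypothetical model X = 2*Y + noise.
N = 100_000
y = rng.standard_normal(N)
x = 2.0 * y + 0.5 * rng.standard_normal(N)

# Best linear approximation of E[X | Y]: project X onto the span of Y.
# Projection coefficient K = Cov(X, Y) / Var(Y), so
# phi(y) = E[X] + K * (y - E[Y])  (the Gauss-Markov / linear Bayesian update).
K = np.cov(x, y)[0, 1] / np.var(y)
phi = lambda yv: x.mean() + K * (yv - y.mean())

print(round(K, 2))                               # close to 2.0, the true coefficient
# The residual X - phi(Y) is (empirically) orthogonal to Y in L^2:
print(abs(np.mean((x - phi(y)) * y)) < 1e-2)     # True
```

The projection view generalizes: allowing arbitrary measurable functions of Y instead of linear ones recovers the full conditional expectation, which is the perspective behind many numerical Bayesian update schemes.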
For numerical computations, this makes it possible to represent random quantities not only as samples, as is usual, but as functions of an algebra of known random variables. These may be seen as functions on very high-dimensional domains. As already mentioned, tensor approximations appear naturally, and low-rank factorisations are one way to handle the resulting complexity. The resulting connections to an algebra may, for example, be used in the numerical processing of compressed representations of such high-dimensional objects.
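The role of low-rank factorisations can be illustrated with a truncated SVD. In this minimal numpy sketch, a smooth function of two variables sampled on a grid stands in for a high-dimensional parametric object; the example function is hypothetical.

```python
import numpy as np

# A smooth function u(x, y) = exp(-(x - y)^2), sampled on an n x n grid, stands
# in for a high-dimensional object.  Smoothness makes the sample matrix nearly
# low-rank, so a truncated SVD compresses it well.
n = 200
x = np.linspace(0.0, 1.0, n)
U_full = np.exp(-np.subtract.outer(x, x) ** 2)

# Rank-r factorisation U ~ A @ B stores r*(2n) numbers instead of n*n.
r = 5
u, s, vt = np.linalg.svd(U_full, full_matrices=False)
A = u[:, :r] * s[:r]            # n x r factor
B = vt[:r, :]                   # r x n factor
rel_err = np.linalg.norm(U_full - A @ B) / np.linalg.norm(U_full)

print(A.shape, B.shape)         # (200, 5) (5, 200)
print(rel_err)                  # small: the smooth kernel is numerically low-rank
```

The same principle, applied dimension by dimension, underlies low-rank tensor formats for functions of many random variables, where storing the full array is infeasible.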