Here you can find more information about the conference program.

*Program changes are possible at short notice.

**Registration is possible on all three days throughout the whole day.

4 October

Place: Foyer upstairs

Speaker: Mirjam Müller
Topic: Luck, circumstances, or strategy? Options for career planning in academia
Place: V47.02

Is it possible to plan an academic career, or does it depend on luck or circumstances? The keynote shows why it is important to take charge of your academic career and which aspects you should base your career planning on. Starting from a structured overview of the expected academic portfolio, it addresses priorities and career models in the German academic system. From a coaching perspective, strategies for career planning will be discussed.

Opening of Early Career Symposium by

  • Prof. Dr. Manfred Bischoff, Vice Rector for Research and Early Career Researchers
  • Prof. Dr. Johannes Kästner, Head of the SimTech Graduate Academy



Place: Foyer downstairs

Place: V47.02

Welcoming addresses by

  • Prof. Dr.-Ing. Wolfram Ressel, Rector of the University of Stuttgart and
  • Prof. Dr.-Ing. Wolfgang Nowak, Spokesperson of the Cluster of Excellence SimTech


Speaker: Max Welling
Topic: Neural Wave Representations
Place: V47.02


Good neural architectures are rooted in good inductive biases (a.k.a. priors). Equivariance under symmetries is a prime example of a successful physics-inspired prior which sometimes dramatically reduces the number of examples needed to learn predictive models. In this work we will try to extend this thinking to more flexible priors in the hidden variables of a neural network. In particular, we will impose wavelike dynamics in hidden variables under transformations of the inputs, which relaxes the stricter notion of equivariance. We find that under certain conditions, wavelike dynamics naturally arises in these hidden representations. We formalize this idea in a VAE-over-time architecture where the hidden dynamics is described by a Fokker-Planck (a.k.a. drift-diffusion) equation. This in turn leads to a new definition of a disentangled hidden representation of input states that can easily be manipulated to undergo transformations. If time allows, I will discuss very preliminary work on how the Schrödinger equation can also be used to move information in the hidden representations. Joint work with Andy T. Keller and Yue Song.
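The drift-diffusion (Fokker-Planck) dynamics mentioned in the abstract can be illustrated with a minimal 1-D finite-difference evolution of a density. This is only a generic sketch of the equation itself, not the speaker's architecture; the Ornstein-Uhlenbeck drift, grid, and noise level are invented for the example:

```python
import math

def fokker_planck_step(p, x, dt, drift, D):
    """One explicit finite-difference step of p_t = -(drift(x)*p)_x + D*p_xx,
    with the boundary values held fixed."""
    dx = x[1] - x[0]
    flux = [drift(xi) * pi for xi, pi in zip(x, p)]
    new_p = list(p)
    for i in range(1, len(p) - 1):
        adv = (flux[i + 1] - flux[i - 1]) / (2.0 * dx)       # advection (drift) term
        dif = (p[i + 1] - 2.0 * p[i] + p[i - 1]) / dx ** 2   # diffusion term
        new_p[i] = p[i] + dt * (-adv + D * dif)
    return new_p

# Ornstein-Uhlenbeck drift pulls the density toward the origin while diffusion
# spreads it; the mean of an initial bump at x = 1 therefore relaxes toward 0.
x = [-3.0 + 0.1 * i for i in range(61)]
p = [math.exp(-0.5 * ((xi - 1.0) / 0.3) ** 2) for xi in x]
mean_before = sum(xi * pi for xi, pi in zip(x, p)) / sum(p)
for _ in range(100):                      # 100 explicit steps of dt = 0.01
    p = fokker_planck_step(p, x, dt=0.01, drift=lambda z: -z, D=0.1)
mean_after = sum(xi * pi for xi, pi in zip(x, p)) / sum(p)
```

In the talk's setting, such dynamics are imposed on the hidden variables of a VAE over time rather than on an explicit grid.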

Max Welling (University of Amsterdam)

Max Welling is a full professor and research chair in machine learning at the University of Amsterdam. He is also a Distinguished Scientist at MSR. He is a fellow of the Canadian Institute for Advanced Research (CIFAR) and of the European Lab for Learning and Intelligent Systems (ELLIS), where he serves on the founding board. His previous appointments include VP at Qualcomm Technologies, professor at UC Irvine, postdoc at U. Toronto and UCL under the supervision of Prof. Geoffrey Hinton, and postdoc at Caltech under the supervision of Prof. Pietro Perona. He finished his PhD in theoretical high-energy physics under the supervision of Nobel laureate Prof. Gerard ’t Hooft. Max Welling served as associate editor-in-chief of IEEE TPAMI from 2011 to 2015, has served on the advisory board of the NeurIPS Foundation since 2015, and was program chair and general chair of NeurIPS in 2013 and 2014, respectively. He was also program chair of AISTATS in 2009 and ECCV in 2016, and general chair and co-founder of MIDL 2018. Max Welling is the recipient of the ECCV Koenderink Prize in 2010 and the ICML Test of Time Award in 2021.


Speaker: Mark Girolami
Topic: The Statistical Finite Element Method
Place: V47.02


The finite element method (FEM) is one of the great triumphs of applied mathematics, numerical analysis and software development. Recent developments in sensor and signalling technologies enable the phenomenological study of systems. The connection between sensor data and FEM has been restricted to solving inverse problems, placing unwarranted faith in the fidelity of the mathematical description of the system under study. If one concedes mis-specification between generative reality and the FEM, then a framework to systematically characterise this uncertainty is required. This talk will present a statistical construction of the FEM which systematically blends the mathematical description with data observations by endowing the Hilbert space of FEM solutions with the additional structure of a probability measure.

Mark Girolami (University of Cambridge)

Mark Girolami is the Sir Kirby Laing Professor of Civil Engineering at the University of Cambridge, where he also holds the Royal Academy of Engineering Research Chair in Data Centric Engineering. Prior to Cambridge he held the Chair of Statistics in the Department of Mathematics at Imperial College London. Girolami was one of the founding executive directors of the Alan Turing Institute in the UK, where he established and led the Data Centric Engineering programme of research; he currently serves as Chief Scientist at the Turing. Girolami is an elected fellow of the Royal Academy of Engineering and of the Royal Society of Edinburgh, and has held both Advanced and Established Career Research Fellowships from EPSRC, as well as a Royal Society Wolfson Research Merit Award. He is currently Editor-in-Chief of the Cambridge University Press journal Data-Centric Engineering.



Place: Foyer downstairs

MS 3 "Machine-learning supported materials design"
Place: V 47.03

MS 4 "Data-Integrated Control with Guarantees"
Place: V 47.02

Place: Conference Venue, Pfaffenwaldring 47, Foyer downstairs

The welcome reception will take place after the first day of the conference. It is a chance to catch up with friends, meet new people, make connections, and discuss the sessions, testing ideas and finding new insights in great company.

Drinks and food provided.

5 October

Speaker: Detlef Lohse
Topic: Melting of Ice
Place: V47.02

Ice Melting in salty water: Layering and non-monotonic dependence on the mean salinity

The presence of salt in ocean water strongly affects the melt rate and the shape evolution of ice, both of utmost relevance in geophysical and ocean flow and thus for the climate. To get a better quantitative understanding of the physical mechanisms at play in ice melting in salty water, we numerically investigate the lateral melting of an ice block in stably stratified saline water, using a realistic, nonlinear equation of state (EOS). The developing ice shape from our numerical results shows good agreement with the experiments and theory from Huppert & Turner. Furthermore, we find that the melt rate of ice depends non-monotonically on the mean ambient salinity: It first decreases for increasing salt concentration until a local minimum is attained, and then increases again. This non-monotonic behavior of the ice melt rate is due to the competition among salinity-driven buoyancy, temperature-driven buoyancy, and salinity-induced stratification. We develop a theoretical model based on the energy balance which gives a prediction of the salt concentration for which the melt rate is minimal, and is consistent with our data. Our findings give insight into the interplay between phase transitions and double-diffusive convective flows.
(Detlef Lohse, Rui Yang, Christopher J. Howland, Hao-Ran Liu, and Robert Verzicco)

Detlef Lohse (University of Twente)

Detlef Lohse studied physics at the Universities of Kiel & Bonn (Germany) and received his PhD from the University of Marburg (1992). He then joined the University of Chicago as a postdoc. After his habilitation (Marburg, 1997), in 1998 he became Chair at the University of Twente in the Netherlands, where he built up the Physics of Fluids group. Since 2015 he has also been a Member of the Max Planck Society and of the Max Planck Institute in Göttingen. Lohse's present research interests include turbulence and multiphase flow as well as micro- and nanofluidics (bubbles, drops, inkjet printing, wetting). He does both fundamental and more applied science and combines experimental, theoretical, and numerical methods. Lohse is Associate Editor of J. Fluid Mech. (among other journals) and serves as Chair of the Executive Board of the Division of Fluid Dynamics of the American Physical Society and as a Member of the Executive Board of IUTAM. He is a Member of the (American) National Academy of Engineering (2017), the Dutch Academy of Sciences (KNAW, 2005), and the German Academy of Sciences (Leopoldina, 2002), and a Fellow of the APS (2002). He has won various scientific prizes, among them the Spinoza Prize (NWO, 2005), the Simon Stevin Meester Prize (STW, 2009), the Physica Prize of the Dutch Physics Society (2011), the AkzoNobel Science Award (2012), three European Research Council Advanced Grants (2010, 2017 & 2023), the George K. Batchelor Prize (IUTAM, 2012), the APS Fluid Dynamics Prize (2017), the Balzan Prize (2018), and the Max Planck Medal (2019). In 2010, he was knighted “Ridder in de Orde van de Nederlandse Leeuw”.

Speaker: Karsten Reuter (Fritz-Haber-Institut/Max-Planck-Gesellschaft)
Topic: Data-Enhanced Multiscale Theory of Operando Energy Conversion Systems
Place: V47.02

Emerging operando spectroscopies and microscopies reveal a highly dynamic behavior of interfaces in energy conversion systems. Insufficient insight, and the concomitant inability to control or exploit the corresponding strong structural and compositional modifications, centrally limits the development of the high-performance catalysts, electrolyzers and batteries required for a sustainable energy supply for our society. Predictive-quality modeling and simulation has long become a major contributor to accelerated design all across the materials sciences, not least through powerful computational screening approaches. Current first-principles based methodology is nevertheless essentially unable to address the substantial, complex and continuous morphological transitions at the working interfaces of energy conversion systems. I will review this context from the perspective of first-principles based multiscale modeling, highlighting how the fusion with modern machine learning approaches increasingly allows us to tackle the true complexity of working systems. The approaches pursued by our group aim at maximum data efficiency, either by exploiting physical models wherever possible or through active learning that queries data only on demand.

Karsten Reuter (Fritz-Haber-Institut/Max-Planck-Gesellschaft)

Karsten Reuter's research concerns energy conversion and storage in the context of renewable energy technologies. Processes such as the photovoltaic generation of electric power, storage in batteries, or conversion into chemical energy carriers such as hydrogen and synthetic fuels are currently either not efficient enough or require rare or toxic materials and catalysts. The theorist specifically investigates the limiting steps in these processes at surfaces and interfaces, for which he develops modern multiscale modelling and simulation techniques. With the help of these techniques (and supercomputers) he is able to bridge many orders of magnitude and study these processes from the detailed elementary molecular reactions up to entire reactors and fuel cells.

Place: Foyer downstairs

Place: Foyer downstairs

Speaker: Daniel Tartakovsky (Stanford University)
Topic: Use and Abuse of Machine Learning in Scientific Discovery
Place: V47.02

My talk focuses on the limitations and potential of deep learning in the context of science-based predictions of dynamic phenomena. In this context, neural networks (NNs) are often used as surrogates or emulators of partial differential equations (PDEs) that describe the dynamics of complex systems. A virtually negligible computational cost of such surrogates renders them an attractive tool for ensemble-based computation, which requires a large number of repeated PDE solves. Since the latter are also needed to generate sufficient data for NN training, the usefulness of NN-based surrogates hinges on the balance between the training cost and the computational gain stemming from their deployment. We rely on multi-fidelity simulations to reduce the cost of data generation for subsequent training of a deep convolutional NN (CNN) using transfer learning. High- and low-fidelity images are generated by solving PDEs on fine and coarse meshes, respectively. We use theoretical results for multilevel Monte Carlo to guide our choice of the numbers of images of each kind. We demonstrate the performance of this multi-fidelity training strategy on the problem of estimation of the distribution of a quantity of interest, whose dynamics is governed by a system of nonlinear PDEs (parabolic PDEs of multi-phase flow in heterogeneous porous media) with uncertain/random parameters. Our numerical experiments demonstrate that a mixture of a comparatively large number of low-fidelity data and a smaller number of high-fidelity data provides an optimal balance of computational speed-up and prediction accuracy.
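The multilevel Monte Carlo guidance mentioned in the abstract can be sketched with the standard MLMC sample-allocation rule (N_l proportional to sqrt(V_l / C_l), scaled to a target estimator variance). This is only an illustration of that textbook rule, not the speaker's implementation; the per-level variance and cost numbers are invented:

```python
import math

def mlmc_sample_counts(variances, costs, eps):
    """Optimal MLMC sample counts N_l ~ sqrt(V_l / C_l), scaled so the total
    estimator variance sum(V_l / N_l) meets the target eps**2."""
    scale = sum(math.sqrt(v * c) for v, c in zip(variances, costs))
    return [math.ceil(math.sqrt(v / c) * scale / eps ** 2)
            for v, c in zip(variances, costs)]

# Two "levels": cheap coarse-mesh solves (higher per-sample variance is
# tolerated because each sample is cheap) and expensive fine-mesh solves.
counts = mlmc_sample_counts(variances=[4.0, 0.0625], costs=[1.0, 16.0], eps=0.25)
print(counts)  # -> [96, 3]: many low-fidelity images, few high-fidelity ones
```

The resulting split (many cheap, few expensive samples) mirrors the training-data mixture the abstract reports as optimal.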

Daniel Tartakovsky (Stanford University)

Daniel Tartakovsky is a Professor in the Energy Science and Engineering Department, the Institute for Computational Mathematics and Engineering, and Bio-X at Stanford University. Prior to joining Stanford, he was a Staff Scientist in the Theoretical Division at Los Alamos National Laboratory (1996-2005) and Professor of Mechanical and Aerospace Engineering at the University of California, San Diego (2005-2015). His research interests include data-driven modeling and simulations, parameter estimation, and uncertainty quantification, with applications ranging from electrochemical energy storage to biomedical engineering.

Place: Foyer downstairs

Place: FILDERHALLE - Convention & Event Center, Bahnhofstraße 61, 70771 Leinfelden-Echterdingen

The conference dinner starts directly after the last session, with an organized bus transfer from University Schleife. It offers registered conference participants an excellent chance to enjoy delicious food in a relaxed setting while getting to know each other and making new connections outside the main conference environment. There is also an organized transfer back to the conference venue.

6 October

Speakers: Timo Koch and Sarbani Roy
Topic: Publishing Simulation Data and Software
Place: V47.02

...Research Software Best Practices

Software is an integral part of much of today's research, which naturally poses the question of how to publish and archive software as an important part of research results. Research software is often developed by researchers themselves; the software may even be the subject of the research. The reproducibility of results obtained with such software (an integral part of the scientific process) can be achieved more efficiently (researchers never have enough time!) by following a few basic rules during development and publishing. Discussing best practices, I will argue that following guidelines has the added benefit of strengthening the robustness of the research process itself. Moreover, I will try to make the point that, for practical reasons, thinking about the reusability of developed research software, rather than the reproducibility of results, may be a more natural way to create research software that advances the scientific field of interest. Finally, I will discuss several "modes" of publishing software and how they may create a sustainable development cycle. This discussion includes experience gained in a recently completed project carried out with an interdisciplinary team of researchers from the University of Stuttgart.

Timo Koch (University of Oslo)

Timo is a Marie Skłodowska-Curie postdoctoral fellow (Scientia fellows II) at the Department of Mathematics, University of Oslo, and a guest researcher at the Simula Research Laboratory, Oslo. His research primarily centers around flow and transport phenomena in the brain and porous media at large, with a strong focus on developing computational methods and advancing numerical research software. He earned his PhD from the University of Stuttgart (Civil and Environmental Engineering and SRC Simulation Technology). As a scientific employee in Stuttgart, he designed and implemented best practices and techniques for integration and system testing in the context of research software. This initiative was a key aspect of a 3-year project titled "Quality assurance in numerical software frameworks". As a postdoctoral fellow at the University of Stuttgart and, from 2020 onwards, at the University of Oslo, he played a significant role in designing research software development workflows within the project "Sustainable infrastructure for the improved usability and archivability of research software" (funded by the DFG). He is an open-source enthusiast, a passionate research code developer (mainly C++ and Python), lead developer of the open-source research software framework DuMux, and a member of the core developer team of the Distributed and Unified Numerics Environment (DUNE).

...Archiving Guidelines

The sustainability of research in the domain of data-integrated simulation science is intricately linked to a viable interactive archival infrastructure for research data. Establishing a framework (namely a dataset) by linking and referencing research data and its sources within widely distributed archival ecosystems, and then assigning persistent identifiers to that framework, strengthens the foundation of reproducible research. Precise and descriptive metadata, coupled with domain-specific ontologies, functions as a guiding light in this context. Persistently accessible data is not only a key factor in achieving reproducible research; it also facilitates collaboration, promotes data integrity, enables peer review, and fosters knowledge preservation. During this talk, a comprehensive collection of guidelines and best practices will be outlined, providing a methodical and organized approach to archiving and thereby enhancing crucial aspects of achieving FAIR (Findable, Accessible, Interoperable, and Reusable) research standards. We will also provide a recommendation on selecting a data repository that aligns with these guidelines. Additionally, we will emphasize how standardized metadata improves the findability and understanding of research outcomes, underscoring its importance within this context.

Sarbani Roy (University of Stuttgart/SimTech)

Sarbani Roy is an enthusiast of open science and FAIR research. She holds a doctoral degree in Mathematics from the Indian Institute of Technology Kharagpur (IIT Kharagpur), India. Shortly after completing her doctorate, Sarbani joined the Department of Hydromechanics and Modelling of Hydrosystems at the University of Stuttgart as a postdoctoral researcher. Currently, Sarbani holds the position of research data manager at the Stuttgart Center for Simulation Science (SC SimTech) and is a scientific employee at the Competence Center for Research Data Management (FoKUS) within the University of Stuttgart. Her active involvement in the BETTY task area of NFDI4Ing (the German National Research Data Infrastructure for the engineering sciences) showcases her commitment to advancing research data management practices. Her research interests include establishing platforms for archiving sustainable research software, ensuring seamless reproducibility of scientific artifacts, and exploring the role of metadata in the long-term sustainability of research software. She is deeply committed to advocating for open science principles and to empowering researchers to enhance the accessibility, transparency, and reusability of their work. By fostering a culture of openness and effective research data management, and with a strong drive for interdisciplinary collaboration, Sarbani aims to bridge the gap between research and reproducibility.

Speaker: Ulrike von Luxburg
Topic: Explainability and regulations
Place: V47.02

Explainability is one of the concepts that dominate debates about the regulation of machine learning algorithms. In my presentation I will argue that, in their current form, post-hoc explanation algorithms are unsuitable for achieving the law's objectives, for rather fundamental reasons. In particular, most situations where explanations are requested are adversarial, meaning that the explanation provider and receiver have opposing interests and incentives, so that the provider might manipulate the explanation for her own ends. I will then discuss a theoretical analysis of Shapley-value-based explanation algorithms that opens the door to more formal guarantees for post-hoc explanations.
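For readers unfamiliar with the explanation algorithms the abstract analyzes: a Shapley value attributes a model's output to each input feature as its marginal contribution averaged over all feature orderings. A minimal exact computation on an invented toy value function (not the speaker's analysis) looks like this:

```python
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values: each player's marginal contribution to value(),
    averaged over all orderings of the players."""
    phi = {p: 0.0 for p in players}
    orderings = list(permutations(players))
    for order in orderings:
        coalition = set()
        for p in order:
            before = value(coalition)
            coalition.add(p)
            phi[p] += value(coalition) - before
    return {p: total / len(orderings) for p, total in phi.items()}

# Toy "model output" over feature subsets: feature 'a' contributes 2,
# 'b' contributes 1, and both together add a synergy of 1.
def v(subset):
    out = 2.0 * ('a' in subset) + 1.0 * ('b' in subset)
    if {'a', 'b'} <= subset:
        out += 1.0
    return out

print(shapley_values(['a', 'b'], v))  # -> {'a': 2.5, 'b': 1.5}
```

The synergy is split equally, and the attributions sum to the full output v({'a','b'}) = 4; real explanation tools approximate this average because the exact sum over orderings grows factorially.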

Ulrike von Luxburg (University of Tübingen)

Ulrike von Luxburg is a full professor for the Theory of Machine Learning at the University of Tübingen, Germany. Her research analyzes machine learning algorithms from a theoretical point of view, tries to understand their implicit mechanisms, and gives formal statistical guarantees for their performance. In this way, she reveals fundamental assumptions, biases, strengths and weaknesses of widely used machine learning algorithms, for example in the field of explainable machine learning. In addition to leading her own research group, she coordinates a large research consortium on Machine Learning in Science. She is an active participant in local debates about ethics and responsibility in machine learning.


Place: Foyer downstairs

MS 7 "Simulations@Operations - Pervasive simulations: anytime, anywhere"
Place: V 47.03

MS 8 "Accelerating Simulations through Quantum Computing"
Place: V 47.02

Place: Pfaffenwaldring 47, Foyer downstairs

Speaker: Miriah Meyer
Topic: Troubling Visualization
Place: V47.02

Visualization research is at an inflection point where the field is filled with increasingly diverse research interests and approaches, but where we are also struggling to make our tried-and-true approaches to research answer an increasingly complex range of questions. How do we consider people’s affective, emotional, and subjective relationships to data and visualization? How do we design novel visualizations in an increasingly complex and uncontrollable technology landscape? What are our ethical responsibilities to our collaborators, our participants, and each other? In this talk I’ll argue that it is time to trouble the foundational perspectives we hold around how we, as researchers, make sense of the world and design within it. I’ll talk about new perspectives we are using in the Vis Collective at Linköping University, and the myriad research opportunities these perspectives are pointing us to.

Miriah Meyer (Linköping University) 

Miriah is a professor in the Division of Media & Information Technology at Linköping University, supported through the WASP program. Her research focuses on the design of visualization systems for helping people make sense of complex data, and on the development of methods for helping visualization designers make sense of the world. She obtained her bachelor's degree in astronomy and astrophysics at Penn State University, earned a PhD in computer science from the University of Utah, and completed a postdoctoral fellowship at Harvard University. Prior to joining the faculty at LiU she was an associate professor in the School of Computing at the University of Utah and a member of the Vis Design Lab in the Scientific Computing & Imaging Institute. Miriah has received numerous awards and recognitions for her work, including being named a University of Utah Distinguished Alumni, both a TED Fellow and a PopTech Science Fellow, and a Microsoft Research Faculty Fellow, and being included on MIT Technology Review's TR35 list of the top young innovators. She was also awarded an AAAS Mass Media Fellowship that landed her a stint as a science writer for the Chicago Tribune.

Speaker: Robert Bitmead
Topic: Particle filters for feedback control - simulation-based tools for state estimation
Place: V47.02

The particle filter is a computational mechanism for propagating empirical probability densities and updating them to incorporate new data, using Bayes' rule via resampling. In a stochastic control context, the conditional density of the state is calculated in this way and then used for the construction of the next feedback control value. This latter calculation is in general intractable because it requires solution of the Stochastic Dynamic Programming Equation, so simplifying and approximate methods are needed. The most dramatic of these is Certainty Equivalence Control, where a specific single “best” value of the state is selected and used without regard to the nature of the rest of the density, such as its variance; often this is chosen to be the conditional mean. Other choices will be presented and described. The complexity of moving from the conditional state density to stochastic optimal control also limits the application of popular modern methods such as Model Predictive Control (MPC), because these rely on open-loop constrained optimization. The core issue is that stochastic optimal control is necessarily closed-loop and density-based, while MPC is an open-loop method in hiding. In the presentation, a middle path is developed which blends open and closed loop while maintaining computational tractability. Simulations in open and closed loop form part of the procedure to select a best state value. Alongside this, the particle filter operates to provide the correct density for the search.
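The mechanism described above (a bootstrap particle filter feeding a certainty-equivalence control law) can be sketched in a few lines. The scalar model, noise levels, and feedback gain below are invented for the illustration and are not the speaker's example:

```python
import math
import random

def particle_filter_step(particles, u, y, f, h, q_std, r_std):
    """One bootstrap particle-filter step: propagate each particle through the
    dynamics f (with process noise), weight it by the Gaussian likelihood of
    the measurement y under h, then resample with replacement so the returned
    particles form an equally weighted empirical density again."""
    predicted = [f(x, u) + random.gauss(0.0, q_std) for x in particles]
    weights = [math.exp(-0.5 * ((y - h(x)) / r_std) ** 2) for x in predicted]
    return random.choices(predicted, weights=weights, k=len(predicted))

# Certainty-equivalence control: collapse the conditional density to a single
# "best" value (here the conditional mean) and plug it into a fixed feedback law.
random.seed(0)
f = lambda x, u: 0.9 * x + u          # illustrative scalar linear dynamics
h = lambda x: x                        # direct but noisy state measurement
particles = [random.gauss(5.0, 1.0) for _ in range(500)]  # initial belief
u = 0.0
for _ in range(20):
    y = 0.0                            # measurements of a state regulated to zero
    particles = particle_filter_step(particles, u, y, f, h, q_std=0.1, r_std=0.5)
    x_hat = sum(particles) / len(particles)   # conditional-mean estimate
    u = -0.5 * x_hat                   # certainty-equivalence state feedback
```

After a few steps the conditional mean settles near the regulated value; the talk's point is that other summaries of the particle density can be substituted at the `x_hat` step, and that the full density carries information (e.g. variance) that certainty equivalence discards.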

Robert Bitmead (University of California, San Diego)

Bob Bitmead is a Distinguished Professor in Mechanical & Aerospace Engineering at the University of California, San Diego. He holds degrees in Applied Mathematics and Electrical Engineering from Sydney University and Newcastle University, both in Australia. He has held faculty positions at the Australian National University and James Cook University of North Queensland. He is a control theorist with long experience in control applications across many industrial sectors, and his theoretical work is strongly informed and guided by these applications. He was the recipient of the 2014 ASME Rufus Oldenburger Medal and of the 2015 IEEE Control Systems Transition to Practice Award. Bob was President of the IEEE Control Systems Society for 2019. He was a member of the IFAC Council from 1996 to 2002 and was Foundation Editor-in-Chief of the IFAC Journal of Systems & Control from 2017 to 2022. He is a Fellow of IEEE, IFAC and the Australian Academy of Technological Sciences and Engineering. Bob brews his own beer and is an accredited and active Australian Rules Football umpire training with the San Diego Lions Australian Football Club.

Presentation of the Best Poster Award, followed by closing and farewell remarks by Miriam Schulte.

Place: Pfaffenwaldring 47
Room: V47.02
