The Interdisciplinary Center for Scientific Computing (IWR) and its affiliated institutions organize a large number of workshops, conferences and other events for discussing the latest scientific results and identifying upcoming challenges in the field of Scientific Computing. In addition, the IWR regularly hosts events that emphasize broadening and improving the interdisciplinary dialogue.
September 23-27, 2019 • Heidelberg, Germany
The IWR School 2019 gives a crash course in machine learning with applications from the Natural and Life Sciences. We target young researchers from the Natural and Life Sciences who want to learn more about machine learning; a background in machine learning is not required. Besides introducing the basic concepts of machine learning, we teach selected topics in more depth, such as deep learning, metric learning, transfer learning, Bayesian inverse problems, and causality. Experts from machine learning and the Natural and Life Sciences explain how these approaches are used to solve problems in their respective fields of research.
Postgraduate students, PhD candidates, postdocs and young researchers:
- from the Natural and Life Sciences: Microscopy, Biology, Medicine, Physics, …
- with interest in Machine Learning
- Master's students from Heidelberg University (core course listed in LSF)
The IWR School 2019 is taught in a series of courses and single lectures by:
- Christoph Lampert, Institute of Science and Technology Austria
- Oliver Stegle, European Bioinformatics Institute
- Robert Scheichl, Heidelberg University
- Dominik Janzing, Max Planck Institute for Intelligent Systems
- Klaus Maier-Hein, German Cancer Research Center
- Bjoern Ommer, Heidelberg University
- Ullrich Köthe, Heidelberg University
- Anna Kreshuk, European Molecular Biology Laboratory
! Registration required ! (Deadline: July 7, 2019)
The IWR School 2019 is supported by the Cluster of Excellence STRUCTURES.
Location: Mathematikon • Im Neuenheimer Feld 205 · 69120 Heidelberg
Prof. Mark A. Girolami • University of Cambridge, UK
July 3, 2019 • 16:15
The finite element method (FEM) is one of the great triumphs of applied mathematics, numerical analysis and software development. Recent developments in sensor and signalling technologies enable the phenomenological study of systems. The connection between sensor data and FEM is restricted to solving inverse problems, placing unwarranted faith in the fidelity of the mathematical description of the system. If one concedes mis-specification between generative reality and the FEM, then a framework to systematically characterise this uncertainty is required. This talk will present a statistical construction of the FEM which systematically blends mathematical description with observations.
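The core idea of blending a model prediction with sensor data can be sketched by Gaussian conditioning. The following toy example is my illustration only, not the speaker's actual construction; the FEM solution, discrepancy covariance, and observation setup are all hypothetical stand-ins:

```python
import numpy as np

# Illustrative stand-in for a FEM solution on 20 nodes.
n = 20
u_fem = np.sin(np.linspace(0.0, np.pi, n))

# Model-discrepancy covariance: encodes that the FEM may be mis-specified.
P = 0.05**2 * np.eye(n)

# Noisy sensors observe every 4th node: y = H u + e, e ~ N(0, R).
H = np.eye(n)[::4]
R = 0.01**2 * np.eye(H.shape[0])
rng = np.random.default_rng(1)
y = H @ u_fem + 0.01 * rng.standard_normal(H.shape[0])

# Gaussian conditioning (Kalman-style update): the posterior blends the
# mathematical model with the observations, weighted by their covariances.
S = H @ P @ H.T + R
K = P @ H.T @ np.linalg.inv(S)
u_post = u_fem + K @ (y - H @ u_fem)
cov_post = P - K @ H @ P
```

At observed nodes the posterior variance shrinks below the prior model-error variance, which is the sense in which the data correct the mathematical description.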
Location: Mathematikon • Conference Room, Room 5/104, 5th Floor • Im Neuenheimer Feld 205 • 69120 Heidelberg
Dr. Andreas Potschka • Interdisciplinary Center for Scientific Computing (IWR)
May 16, 2019 • 14:15
We consider nonconvex and highly nonlinear mathematical programming problems including finite dimensional nonlinear programming problems as well as optimization problems with partial differential equations and control constraints. We present a novel numerical solution method, which is based on a projected gradient/anti-gradient flow for an augmented Lagrangian on the primal/dual variables. We show that under reasonable assumptions, the nonsmooth flow equations possess uniquely determined global solutions, whose limit points (provided that they exist) are critical, i.e., they satisfy a first-order necessary optimality condition. Under additional mild conditions, a critical point cannot be asymptotically stable if it has an emanating feasible curve along which the objective function decreases. This implies that small perturbations will make the flow escape critical points that are maxima or saddle points. If we apply a projected backward Euler method to the flow, we obtain a semismooth algebraic equation, whose solution can be traced for growing step sizes, e.g., by a continuation method with a local (inexact) semismooth Newton method as a corrector, until a singularity is encountered and the homotopy cannot be extended further. Moreover, the projected backward Euler equations admit an interpretation as necessary optimality conditions of a proximal-type regularization of the original problem. The prox-problems have favorable properties, which guarantee that the prox-problems have uniquely determined primal/dual solutions if the Euler step size is sufficiently small and the augmented Lagrangian parameter is sufficiently large. 
The prox-problems morph into the original problem when taking the step size to infinity, which allows the following active-set-type sequential homotopy method: From the current iterate, compute a projected backward Euler step by applying either local (inexact) semismooth Newton iterations on the step equations or local (inexact) SQP-type (sequential quadratic programming) methods on the prox-problems. If the homotopy cannot be continued much further, take the current result as a starting point for the next projected backward Euler step. If we can drive the step size all the way to infinity, we can transition to fast local convergence. We can interpret this sequential homotopy method as an extension of several well-known but seemingly unrelated optimization methods: a general globalization method for local inexact semismooth Newton methods and local inexact SQP-type methods, a proximal point algorithm for problems with explicit constraints, and an implicit version of the Arrow--Hurwicz gradient method for convex problems dating back to the 1950s, extended to nonconvex problems. We close the talk with numerical results for a class of highly nonlinear and badly conditioned control constrained elliptic optimal control problems with a semismooth Newton approach for the regularized subproblems.
Preprint available on arxiv.org/abs/1902.06984.
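For readers unfamiliar with the proximal-point view, the following toy sketch (my illustration, not the algorithm of the preprint) shows a projected backward Euler / prox step on a simple bound-constrained problem, with the step size driven toward infinity so the prox-problems morph into the original problem:

```python
import numpy as np

def prox_step(f_grad, x, h, proj, iters=200, lr=0.1):
    """One projected backward Euler (prox) step: approximately solve
    y = argmin_{y in C} f(y) + ||y - x||^2 / (2 h)
    by projected gradient iterations on the strongly convex subproblem."""
    y = x.copy()
    for _ in range(iters):
        g = f_grad(y) + (y - x) / h
        y = proj(y - lr * g)
    return y

# Toy problem: min 0.5*||y - b||^2  subject to  y >= 0.
b = np.array([1.0, -2.0, 0.5])
f_grad = lambda y: y - b
proj = lambda y: np.maximum(y, 0.0)   # projection onto the feasible set

x = np.zeros(3)
for h in [0.1, 1.0, 10.0, 1e6]:       # growing step sizes, h -> infinity
    x = prox_step(f_grad, x, h, proj)
print(x)  # approaches the solution proj(b) = [1.0, 0.0, 0.5]
```

Each prox-problem is uniquely solvable for small h; as h grows, the regularization vanishes and the iterates home in on a solution of the original constrained problem.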
Location: Mathematikon • Room 2/414, 2nd Floor • Im Neuenheimer Feld 205 • 69120 Heidelberg
Organizers: Katharina Anders, Bernhard Höfle, Hubert Mara
April 1-4, 2019
- Automatic methods for 3D geospatial data processing
- Geographic applications of 3D data analysis
- Hands-on: 3D point cloud and mesh analysis
- Programming and research challenge: Development of computational methods for 3D information extraction
- Prof. Dr. Andreas Nüchter, University of Würzburg
- Jorge Martínez Sánchez, University of Santiago de Compostela
Please register on the website of the Compact Course by February 15, 2019.
Contact: Katharina Anders
Location: Mathematikon • Conference Room, 5th Floor • Im Neuenheimer Feld 205 • 69120 Heidelberg
Prof. Motassem Al Arydah • Khalifa University, Department of Mathematics, Abu Dhabi, UAE
April 1, 2019 • 11:00
Lung cancer (LC) is the leading cause of cancer death in Canada in both men and women, and indoor radon is the second leading cause of LC after tobacco smoking. The Population Attributable Risk (PAR) is used to assess the risk from radon exposure. We use the PAR to identify the radon levels responsible for most LC cases. During the period 2006–2009, 6% of houses in Ontario, 9% in Alberta, 19% in Manitoba, 7% in Quebec, and 5% in British Columbia had radon levels higher than 200 Bq/m3, and these were responsible for about 913, 211, 260, 972, and 258 lost lives, respectively. Radon mitigation programs could have prevented these LC cases. We use the PAR as a function of two variables, the radon action level and the target level, to search for a possible optimal mitigation program. The PAR is a linear function of the target radon value, with an estimated slope of 0.0001 for Ontario, Alberta, Quebec, and British Columbia, and 0.0004 for Manitoba. The PAR is an increasing function of the radon action level. The PAR is sensitive to changes in the radon mitigation program and, as such, any improvement is a worthwhile investment.
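The standard population attributable risk fraction can be computed from an exposure prevalence and a relative risk. The sketch below uses the textbook formula with made-up numbers for illustration; neither the prevalence nor the relative risk is taken from the talk:

```python
def par(prevalence, relative_risk):
    """Population attributable risk fraction:
    PAR = p*(RR - 1) / (1 + p*(RR - 1)),
    the fraction of all cases attributable to the exposure."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical illustration (numbers NOT from the talk): if 6% of homes
# exceed the action level and the relative risk for exposed residents
# is 1.3, the attributable fraction of cases is:
frac = par(0.06, 1.3)            # roughly 0.018
cases_prevented = frac * 10_000  # out of 10,000 lung cancer cases
```

Scaling the attributable fraction by the provincial case count is what turns the PAR into the "lives" figures quoted in the abstract.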
Al-arydah, M. (2018). Estimating the Burden of Lung Cancer and the Efficiency of Home Radon Mitigation Systems in some Canadian Provinces. Science of the Total Environment, 626, 287–306.
Al-arydah, M. (2017). Population attributable risk associated with lung cancer induced by residential radon in Canada. Sensitivity to relative risk model and radon. Science of the Total Environment. dx.doi.org/10.1016/j.scitotenv.2017.04.067.
Location: Mathematikon • Seminar Room 10, 5th Floor • Im Neuenheimer Feld 205 • 69120 Heidelberg
Dr. Hadley Wickham • Chief Scientist at RStudio, Adjunct Professor at the University of Auckland, Stanford University and Rice University
March 13, 2019 • 16:15
Tidy data is a standard way of storing your data where columns are variables and rows are observations. Tidy data, particularly when coupled with tidy tools, makes data analysis easier because you can spend less time wrangling the output of one function so that it works as the input for another. Tidy data will make your analysis easier, but how do you get wild-caught data into a tidy form? In this talk, I'll discuss some of the tools that I have worked on for tidying data (e.g. the tidyr package), the limitations of those tools, and what I'm thinking about next. In particular, I'll discuss a new approach for "pivoting" data, and discuss some of the challenges posed by data stored in hierarchical form (e.g. JSON).
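The talk's tools (tidyr) live in R; for readers who work in Python, the same wide-to-long pivot can be illustrated with pandas. The data frame and column names below are invented for the example:

```python
import pandas as pd

# Wide ("messy") layout: one column per year, so the year variable is
# spread across column headers rather than stored in a column of its own.
wide = pd.DataFrame({
    "subject": ["a", "b"],
    "2018": [4, 5],
    "2019": [6, 7],
})

# Pivot longer: year headers become values of a 'year' variable and each
# row becomes one observation -- the tidy form (cf. tidyr's pivot_longer).
tidy = wide.melt(id_vars="subject", var_name="year", value_name="value")
print(tidy)
```

After the pivot, each of the four (subject, year) observations occupies its own row, so downstream grouping and plotting functions can treat `year` like any other variable.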
Hadley Wickham has pioneered the development of advanced data visualisation and analysis approaches for the R statistical computing platform. He holds a BSc in Human Biology, and a BSc and MSc in Statistics from the University of Auckland. He went on to work with Di Cook and Heike Hofmann at Iowa State University, and obtained his PhD in 2008. In 2007, Hadley released ggplot2, a data visualisation library based on Leland Wilkinson's 'The Grammar of Graphics', and in 2013 he unveiled 'The Tidyverse', a collection of libraries and methodological approaches for the efficient manipulation of complex data in R. His contributions to the field were recognised in 2008 with the John Chambers Award for Statistical Computing, and in 2015 he was made a fellow of the American Statistical Association.
Location: Mathematikon • Conference Room, 5th Floor • Im Neuenheimer Feld 205 • 69120 Heidelberg
February 25, 2019 • 17:00
The event is organized in close cooperation with the conference "Geometric Analysis meets Geometric Topology".
This touring exhibition, whose starting point is the 7th ECM held in July 2016 in Berlin, stems from the observation that nowadays, women still find it difficult to embrace a career in the mathematical academic world and the disparity between the proportion of men and that of women among professional mathematicians is still shamefully large.
The thirteen women mathematicians portrayed here share with us their experience, thus serving as role models to stimulate young women scientists to trust their own strength. In presenting mathematics through women mathematicians’ perspectives and samples of their life stories, we hope to highlight the human aspects of producing mathematics, making this discipline more tangible and therefore more accessible to outsiders or newcomers.
This exhibition and the catalogue (publishing house: Verlag am Fluss) are the result of the joint efforts of the photographer Noel Tovia Matoff and four mathematicians, Sylvie Paycha, Sara Azzali, Alexandra Antoniouk and Magdalena Georgescu, with the precious help of Maria Hoffmann-Dartevelle, who translated the interviews into German, Sara Munday, who proofread them, and, last but not least, our two inspired graphic designers Wenke Neunast/eckedesign (exhibition) and Gesine Krüger (catalogue).
The exhibition will be on display from February 26 to May 31, 2019, in the Foyer of the Mathematikon.
The event is kindly supported by the Heidelberg Laureate Forum Foundation (HLFF).
Link: Exhibition Homepage
Location: Mathematikon • Foyer • Im Neuenheimer Feld 205 • 69120 Heidelberg
January 24, 2019 • 14:00
Modern materials are among the central future-oriented topics in applied research. The use of state-of-the-art design processes in materials science and groundbreaking discoveries in nanoscience, organic electronics and computational theoretical chemistry have led to a revolution. As one of the key technologies of the 21st century, modern materials enable targeted use in numerous fields of application. Combining several properties makes ever more effective materials possible: for example, materials can be developed that are extremely light yet stable, or that allow substantially more efficient energy transport.
What developments can be expected from materials research in the coming decade? How do the different research directions interact? And what role does computer-based simulation play in this paradigm shift from the experimental development of new materials to the targeted design of material properties on the computer? These questions are the focus of the 16th Modellierungstag (Modelling Day). We have invited experts from universities, research and industry to discuss the central questions of this interdisciplinary field in keynote talks and practice-oriented discussions.
The Modellierungstag takes up the exciting question of the future of "modern materials" and fosters exchange between researchers, developers, theoreticians and practitioners. Contributions from different disciplines provide food for thought and a basis for discussion.
Location: Marsilius-Kolleg • Im Neuenheimer Feld 130.1 • 69120 Heidelberg
Dr. Sara Grundel • Max Planck Institute for Dynamics of Complex Technical Systems, Magdeburg
January 9, 2019 • 16:15
My recent work centers around the simulation of energy networks. In my talk, I will discuss the mathematical challenges as well as some recent results on efficiently simulating transient gas flow within a realistic gas pipeline transportation network. Mathematical models of such a system start from a set of hyperbolic partial differential equations, combined with ordinary differential equations and algebraic equations. Picking the necessary complexity is the first choice to make, and discretization again allows for a variety of choices, whose implications we briefly discuss. Finally, we obtain a system of ordinary differential or differential-algebraic equations (ODE/DAE) that can be successfully reduced in complexity via classical model order reduction techniques. We discuss and compare some known methods for different pipeline networks. Furthermore, we introduce an automatic clustering algorithm based on model order reduction principles and its application in power grids as well as water distribution networks.
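To give a flavour of projection-based model order reduction, here is a minimal proper orthogonal decomposition (POD) sketch for a linear ODE system. This is illustrative only; gas networks lead to nonlinear DAEs and require structure-preserving methods, and the system below is a random stand-in:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 5                                 # full and reduced dimensions
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))  # stable toy system
x0 = rng.standard_normal(n)

# Collect snapshots of x' = A x with explicit Euler.
dt, steps = 0.01, 200
X = np.empty((n, steps))
x = x0.copy()
for k in range(steps):
    X[:, k] = x
    x = x + dt * (A @ x)

# POD basis: leading left singular vectors of the snapshot matrix.
V = np.linalg.svd(X, full_matrices=False)[0][:, :r]
Ar = V.T @ A @ V             # r x r reduced system matrix (Galerkin)
z = V.T @ x0                 # reduced initial state
for k in range(steps):       # simulate the cheap reduced model
    z = z + dt * (Ar @ z)

err = np.linalg.norm(V @ z - x) / np.linalg.norm(x)
```

Because a smooth trajectory is well approximated by a low-dimensional subspace, the 5-dimensional reduced model tracks the 50-dimensional one at a fraction of the cost; clustering-based reduction for networks follows the same projection idea with a basis built from node groupings.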
Location: Mathematikon • Conference Room (5/300), 5th Floor • Im Neuenheimer Feld 205 • 69120 Heidelberg
Last Update: 16.05.2019 - 10:47