# HGS MathComp Curriculum & Events

## Winter Semester 2022/2023

**Compact Courses**

#### Introduction to Mathematics of Deep Learning

**Date:**
2022-10-26 - 14:30

**Speaker:** Romberg Guest Professor - Prof. Leonid Berlyand • Pennsylvania State University, USA

**Location:** Mathematikon • Conference Room, Room 5/104, 5th Floor • Im Neuenheimer Feld 205 • 69120 Heidelberg

**ECTS-Points:** not yet determined


Minicourse: Introduction to Mathematics of Deep Learning

Leonid Berlyand • Pennsylvania State University, USA

October 26-27, 2022 • 14:30 - 16:30

Abstract:

The goal of this minicourse of four lectures is to introduce basic concepts from deep learning in a rigorous mathematical fashion, e.g., mathematical definitions of deep neural networks (DNNs), loss functions, and the backpropagation algorithm. We attempt to identify, for each concept, the simplest setting that minimizes technicalities but still contains the key mathematics. This minicourse follows the upcoming book “Mathematics of Deep Learning: an introduction” by L. Berlyand and P.-E. Jabin. Publisher: De Gruyter (to appear).

Lecture 1. History, general perspective and basic notions of deep learning

In this lecture, we briefly discuss the general perspective of machine learning: what is it, and why study it? Next, we introduce the classification problem in a supervised learning context, and then the key concept of artificial neural networks (ANNs) as compositions of linear maps and nonlinear activation functions, followed by other basic definitions describing ANNs.

Lecture 2. DNNs and approximation theory

In this lecture, we discuss the universal approximation theorem, which describes the wide class of continuous functions that DNNs can approximate. This theorem explains the extensive use of DNNs in classification problems. Next, we introduce the concept of training via the gradient descent algorithm, which improves the approximate classifier by iteratively *learning* from the dataset.
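The gradient-descent training step described here can be illustrated with a minimal sketch; the one-parameter model and the data below are hypothetical, chosen only to keep the mathematics visible:

```python
# Minimal gradient-descent sketch: fit a one-parameter model y = w * x
# to data by minimizing the squared loss L(w) = (1/n) * sum (w*x_i - y_i)^2.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # (x, y) pairs, roughly y = 2x

def loss_gradient(w):
    # dL/dw = (2/n) * sum (w*x - y) * x
    n = len(data)
    return sum(2 * (w * x - y) * x for x, y in data) / n

w = 0.0    # initial guess
lr = 0.05  # learning rate (step size)
for _ in range(200):
    w -= lr * loss_gradient(w)
# w converges to the least-squares slope, close to 2
```

Each iteration moves `w` a small step against the gradient, so the loss decreases until `w` settles at the minimizer of the quadratic loss.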

Lecture 3. Backpropagation & CNNs

We begin by introducing the notion of computational complexity. Next, we introduce the backpropagation algorithm, which significantly reduces the computational cost of optimizing the loss function. This is done in the simplest setting of one neuron per layer, which, while not practical, allows us to explain the concept without many technicalities. Finally, if time permits, we briefly discuss convolutional neural networks and their properties.
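In the one-neuron-per-layer setting, backpropagation is just the chain rule applied layer by layer. A minimal sketch with a hypothetical two-layer sigmoid network (all names and numbers are illustrative, not from the course):

```python
import math

# One neuron per layer: x -> a1 = s(w1*x) -> a2 = s(w2*a1), sigmoid activation s.
def s(z):
    return 1.0 / (1.0 + math.exp(-z))

def s_prime(z):
    return s(z) * (1.0 - s(z))

w1, w2, x, target = 0.5, -0.3, 1.0, 1.0

# Forward pass: store pre-activations z and activations a for each layer.
z1 = w1 * x;  a1 = s(z1)
z2 = w2 * a1; a2 = s(z2)
loss = 0.5 * (a2 - target) ** 2

# Backward pass: propagate dL/dz from the output layer back to the input.
delta2 = (a2 - target) * s_prime(z2)  # dL/dz2
delta1 = delta2 * w2 * s_prime(z1)    # dL/dz1 (chain rule through layer 2)
grad_w2 = delta2 * a1                 # dL/dw2
grad_w1 = delta1 * x                  # dL/dw1
```

The backward pass reuses the stored forward quantities, which is exactly the source of the computational savings over differentiating each weight independently.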

Lecture 4. Implementing DNNs and Training: a brief overview of PyTorch

This lecture will be presented by my co-author P.-E. Jabin (Penn State).

We present here a short and very basic introduction to PyTorch, in the context of image classification. We assume passing familiarity with coding, and with Python in particular. Many further tutorials exist online; several can be found at https://pytorch.org/tutorials/

#### Short Course: Introduction to Optimization

**Date:**
2023-03-27 - 10:00

**Speaker:** Prof. Roland Herzog & Dr. Georg Müller • IWR

**Location:** Mathematikon • Conference Room, Room 5/104 & Seminar Rooms 10 + 11, 5th Floor • Im Neuenheimer Feld 205 • 69120 Heidelberg

**ECTS-Points:** 3

March 27-30, 2023 • 10:00-12:00 & 13:00-15:00

This 4-day course offers a compact introduction to mathematical optimization. We invite researchers in non-mathematical fields as well as in mathematics to attend the classes.

The goal of the course is to enable the participants to recognize the characteristics of a given optimization problem, to understand its difficulties and limitations, and to choose suitable solution methods as well as to develop ideas on how to model problems from their own field as optimization problems.

We will focus on the following four categories of optimization problems:

- Day 1: unconstrained optimization
- Day 2: convex optimization
- Day 3: nonlinear optimization
- Day 4: infinite-dimensional optimization

For each of these problem classes, we will study meaningful examples, the relevant theory, and prominent solution algorithms. Every day consists of a 90-minute lecture part in the morning and a 90-minute hands-on exercise session in the afternoon.
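As a small taste of the unconstrained case, Newton's method repeatedly minimizes a local quadratic model of the objective. A hypothetical one-dimensional sketch, not course material:

```python
# Newton's method for unconstrained minimization of f(x) = x^4 - 3x:
# iterate x <- x - f'(x) / f''(x), i.e. minimize the local quadratic model.
def f_prime(x):
    return 4 * x**3 - 3

def f_double_prime(x):
    return 12 * x**2

x = 1.0  # starting point
for _ in range(20):
    x -= f_prime(x) / f_double_prime(x)

# The minimizer satisfies f'(x) = 0, i.e. 4x^3 = 3, so x = (3/4)**(1/3)
```

Near the minimizer the iteration converges quadratically, which is why the loop settles to machine precision in a handful of steps.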

REGISTRATION REQUIRED!

Please register via the office of the graduate school: hgs@iwr.uni-heidelberg.de

For further information please visit the website:

https://scoop.iwr.uni-heidelberg.de/teaching/2022ws/short-course-optimization/

**IWR Colloquium**

#### IWR Colloquium & HGS MathComp "Mathematics for Life": "Neuromorphic Computing With Self-Organized Networks"

**Date:**
2022-11-30 - 16:15

**Speaker:** Dr. Johannes Zierenberg • Max Planck Institute for Dynamics and Self-Organization

**Location:** Mathematikon • Conference Room, Room 5/104, 5th Floor • Im Neuenheimer Feld 205 • 69120 Heidelberg

**ECTS-Points:** not yet determined

Our brains are composed of billions of neurons that form a complex network. This network is the result of both evolutionary optimization (fostering a modular arrangement including highly specialized areas) and our own experience (storing memories and skills by adapting connection strengths), and it determines how we process sensory input to produce meaningful responses. Since neurons communicate with short electrical pulses only when necessary, they are extremely energy efficient. Given the worldwide increase in computing demand, there is thus a strong incentive to develop low-energy neuromorphic computing paradigms that mimic the working principles of the brain.

But what are the relevant working principles of the brain? How does a neural network develop useful dynamics? In this seminar, I will present minimal principles that ensure stable collective neural dynamics from a statistical physics perspective, discuss how these can be used to tune network states to task requirements, and show how they can be applied to neuromorphic computing. While I mainly focus on experience-driven self-organization, I will finish with some ideas to include evolution-driven architectures in the future.

The IWR Colloquium will be held as an in-person event at the Mathematikon. In addition, it will be streamed via Zoom.

For more information please visit the website of the colloquium.

Link: www.iwr.uni-heidelberg.de/events/iwr-colloquium

“Mathematics of Life” is a special interest group organized by doctoral students of the HGS MathComp.

HGS MathComp Members will receive 1 ECTS credit for every 4 talks attended. Please make sure to include them in your BlueSheet.

#### IWR Colloquium: "Accelerated Sampling and Improved Synthesis in Diffusion Models"

**Date:**
2023-01-11 - 16:15

**Speaker:** Tim Dockhorn • David R. Cheriton School of Computer Science • University of Waterloo, Canada

**Location:** Mathematikon • Conference Room, Room 5/104, 5th Floor • Im Neuenheimer Feld 205 • 69120 Heidelberg

**ECTS-Points:** not yet determined

Having access to a powerful generative model allows for a wide range of downstream applications, such as probabilistic inference, sampling, data completion, density evaluation, outlier detection, etc. Diffusion models (DMs) are an emerging class of deep generative models that have demonstrated remarkable synthesis quality. DMs rely on a diffusion process that gradually perturbs the data towards white noise, while the generative model learns to denoise. A major drawback of DMs, compared to, for example, Generative Adversarial Networks, is that sampling can be relatively slow.

In this seminar, I will give an accessible introduction to DMs and present our work on critically-damped Langevin DMs (CLD), which is based on ideas from statistical mechanics. CLD can be interpreted as running a joint diffusion in an extended space, where the auxiliary variables can be considered "velocities" that are coupled to the data variables as in Hamiltonian dynamics. CLD significantly accelerates sampling compared to the original DM formulation; however, many further improvements can be made by borrowing ideas from the ODE solver literature.
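The forward perturbation underlying DMs, which gradually trades data for white noise, can be sketched with a generic variance-preserving interpolation (an illustration of the standard construction, not the CLD scheme from the talk):

```python
import math
import random

# Variance-preserving forward diffusion: at noise level t in [0, 1],
# x_t = sqrt(1 - t) * x0 + sqrt(t) * eps, with eps a standard Gaussian.
# At t = 0 the sample is pure data; at t = 1 it is pure white noise.
random.seed(0)

def perturb(x0, t):
    eps = random.gauss(0.0, 1.0)
    return math.sqrt(1.0 - t) * x0 + math.sqrt(t) * eps

x0 = 1.7  # a one-dimensional "data point"
trajectory = [perturb(x0, t / 10) for t in range(11)]
# trajectory[0] equals x0 exactly; trajectory[-1] is a pure noise sample
```

The generative model is then trained to run this corruption in reverse, denoising step by step; the slowness of sampling mentioned above comes from the many denoising steps this reversal requires.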

The IWR Colloquium will be held as an in-person event at the Mathematikon. In addition, it will be streamed via Zoom.

For more information please visit the website of the colloquium.

Link: www.iwr.uni-heidelberg.de/events/iwr-colloquium

HGS MathComp Members will receive 1 ECTS credit for every 4 talks attended. Please make sure to include them in your BlueSheet.

#### ONLINE EVENT / IWR Colloquium: "Randomization techniques for solving large scale linear algebra problems"

**Date:**
2023-02-01 - 17:00

**Speaker:** Prof. Laura Grigori • Director of Research, INRIA Paris • Alpines group, joint with Laboratoire J.L. Lions, Sorbonne University, Paris, France

**Location:** Online Event

**ECTS-Points:** not yet determined

In this talk we discuss randomization techniques for solving large-scale linear algebra problems, focusing in particular on linear systems of equations and eigenvalue problems. We first introduce a randomized Gram-Schmidt process for orthogonalizing a set of vectors, along with its block version, and discuss its efficiency and numerical stability, also under mixed precision. Further, randomized GMRES and randomized FOM methods are discussed for solving linear systems of equations, as well as a randomized Rayleigh-Ritz procedure for solving eigenvalue problems.
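To give a flavor of such randomization (a generic sketch-and-solve illustration, not the speaker's specific algorithms), one can compress a tall least-squares problem with a random Gaussian sketch before solving the much smaller compressed problem exactly:

```python
import random

random.seed(1)

# Sketch-and-solve for least squares: min ||A w - b|| with A of size m x 2.
# Replace (A, b) by (S A, S b), where S is a small k x m Gaussian matrix,
# then solve the compressed 2x2 normal equations exactly.
m, k = 200, 20
A = [[1.0, i / m] for i in range(m)]         # columns: intercept, slope
b = [2.0 + 3.0 * (i / m) for i in range(m)]  # consistent data, so w = (2, 3)

S = [[random.gauss(0.0, 1.0) for _ in range(m)] for _ in range(k)]
SA = [[sum(S[r][i] * A[i][c] for i in range(m)) for c in range(2)]
      for r in range(k)]
Sb = [sum(S[r][i] * b[i] for i in range(m)) for r in range(k)]

# Normal equations for the sketched problem: (SA)^T (SA) w = (SA)^T (Sb).
g = [[sum(SA[r][p] * SA[r][q] for r in range(k)) for q in range(2)]
     for p in range(2)]
h = [sum(SA[r][p] * Sb[r] for r in range(k)) for p in range(2)]
det = g[0][0] * g[1][1] - g[0][1] * g[1][0]
w = [(g[1][1] * h[0] - g[0][1] * h[1]) / det,
     (g[0][0] * h[1] - g[1][0] * h[0]) / det]
# w recovers (2, 3): the sketch shrinks the problem from 200 rows to 20
```

Because `b` lies exactly in the range of `A` here, the sketched solution is exact; in general, random sketching trades a small accuracy loss for a large reduction in problem size.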

The IWR Colloquium will be streamed via Zoom. For more information please visit the website of the colloquium.

Link: www.iwr.uni-heidelberg.de/events/iwr-colloquium

HGS MathComp Members will receive 1 ECTS credit for every 4 talks attended. Please make sure to include them in your BlueSheet.

**Lecture**

#### The large-data limit of the MBO scheme for data clustering

**Date:**
2022-10-31 - 11:00

**Speaker:** Prof. Tim Laux • Hausdorff Center for Mathematics, University of Bonn

**Location:** Mathematikon • Seminar-Room A • Im Neuenheimer Feld 205 • 69120 Heidelberg

**ECTS-Points:** not yet determined

The MBO scheme is an efficient algorithm for data clustering, the task of partitioning a given dataset into several meaningful clusters. In this talk, I will present the first rigorous analysis of this scheme in the large-data limit.

The starting point for the first part of the talk is that each iteration of the MBO scheme corresponds to one step of implicit gradient descent for the thresholding energy on the similarity graph of the dataset. It is then natural to think that outcomes of the MBO scheme are (local) minimizers of this energy. We prove that the algorithm is consistent, in the sense that these (local) minimizers converge to (local) minimizers of a suitably weighted optimal partition problem.

To study the dynamics of the scheme, we use the theory of viscosity solutions. The main ingredients are (i) a new abstract convergence result based on quantitative estimates for heat operators and (ii) the derivation of these estimates in the setting of random geometric graphs.

To implement the scheme in practice, two important parameters are the number of eigenvalues for computing the heat operator and the step size of the scheme. Our results give a theoretical justification for the choice of these parameters in relation to sample size and interaction width.
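The two alternating steps of the scheme, diffusion on the similarity graph followed by pointwise thresholding, can be sketched on a toy graph (a hypothetical illustration of the mechanism, not the paper's setting):

```python
# One iteration of the MBO scheme on a toy similarity graph:
# (1) diffuse the cluster indicator with the random-walk operator,
# (2) threshold the result back to {0, 1}.
# The graph is two triangles {0,1,2} and {3,4,5} joined by one weak edge.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6
adj = [[0.0] * n for _ in range(n)]
for i, j in edges:
    adj[i][j] = adj[j][i] = 1.0

deg = [sum(row) for row in adj]
# Row-stochastic diffusion operator P = D^{-1} A (a random-walk heat step).
P = [[adj[i][j] / deg[i] for j in range(n)] for i in range(n)]

u = [1.0, 1.0, 0.0, 0.0, 0.0, 0.0]  # noisy indicator: node 2 is mislabeled

# Diffusion step, then thresholding at 1/2.
u = [sum(P[i][j] * u[j] for j in range(n)) for i in range(n)]
labels = [1 if v >= 0.5 else 0 for v in u]
# labels recover the full cluster {0, 1, 2}
```

Diffusion smears the indicator along strong connections, so thresholding snaps the mislabeled node back to its cluster; the number of diffusion steps plays the role of the step size discussed above.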

This is joint work with Jona Lelmi (University of Bonn).

The lecture will also be streamed online: https://us02web.zoom.us/j/4889309058

#### A glimpse into academic publishing

**Date:**
2023-03-14 - 16:00

**Speaker:** Dr. Jan Holland • Senior Publisher, Springer & Dr. Remi Lodh • Senior Editor, Springer

**Location:** Mathematikon • Conference Room, Room 5/104, 5th Floor • Im Neuenheimer Feld 205 • 69120 Heidelberg

**ECTS-Points:** not yet determined

This will be a short presentation about the (Julius) Springer publishing house and contemporary academic publishing. We will begin by briefly touching upon Springer’s historical connections to the University of Heidelberg before providing some insights into the modern publishing process. In particular, we will offer hints as to how to successfully publish journal articles and books of various types, and take questions from the audience. This should be especially useful for graduate students and early-career researchers.

Following the lecture, there will be a get-together. It will take place in the adjoining Common Room at 17:00. Snacks and beverages will be provided.

This event is jointly organized by HGS MathComp and the Faculty of Mathematics and Computer Science.

**Seminar**

#### Double Seminar: "heiAIMS Transmitting Live" on the occasion of the International Day of Mathematics

**Date:**
2023-03-14 - 14:00

**Speaker:** Dr. Mafoya Landry Dassoundo • Department of Mathematical Sciences, African Institute for Mathematical Sciences (AIMS) & Dr. Michael Winckler • HGS MathComp

**Location:** Mathematikon • Seminar Room 11, 5th Floor • Im Neuenheimer Feld 205 • 69120 Heidelberg

**ECTS-Points:** not yet determined

Target audience: BSc students in Mathematics and Computer Science

This event is a double seminar between MATHEMATIKON, Heidelberg and AIMS, South Africa on the occasion of the International Day of Mathematics. In two short and interactive lectures we celebrate the day of mathematics and bring together students from the two institutions on a virtual platform.

The event is organized by "heiAIMS, the Heidelberg - Cape Town Network for Applied Mathematics and Scientific Computing" which is funded by the Baden-Württemberg-STIPENDIUM for University Students, a program established by the Baden-Württemberg Stiftung.

---

Title: Mathematics and African Arts

Speaker: Dr. Mafoya Landry Dassoundo • Department of Mathematical Sciences, African Institute for Mathematical Sciences (AIMS)

Abstract:

First, we will define braid groups, recall their basic properties, and connect them naturally with knot theory. Some of their applications in the hairstyles of African women will be presented. Second, we will share a cultural experience around one of the oldest African games, well known as the "African stones game".

---

Title: What Exactly is Infinity?

Speaker: Dr. Michael Winckler • HGS MathComp

Abstract:

The concept of infinity is hard for humans to grasp. Counting finite sets is a natural concept, but understanding what countably infinite sets are, and that we can even investigate uncountably infinite sets, is difficult to comprehend.

In this lecture we will start by comparing finite sets and establishing a rigorous method to compare sets by size. Expanding this to infinite sets will help us move forward and show that different levels of infinity exist. We will conclude the lecture by trying to visualize our results by drawing space-filling curves, thus experiencing that our notion of size does not easily extend to infinity.
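The size-comparison method described here, pairing the elements of two sets, can be tried out concretely: the map n -> 2n matches the natural numbers one-to-one with the even numbers, so both sets have the same "size" even though the evens look like only half of the naturals (an illustrative snippet, not course material):

```python
# Comparing set sizes by pairing: n -> 2n is a bijection between the
# natural numbers and the even numbers, with inverse m -> m // 2.
def to_even(n):
    return 2 * n

def from_even(m):
    return m // 2

# Check the pairing round-trips on a finite window of the infinite sets.
window = range(100)
assert all(from_even(to_even(n)) == n for n in window)
assert sorted(to_even(n) for n in window) == list(range(0, 200, 2))
```

Every natural number is paired with exactly one even number and vice versa, which is precisely the rigorous notion of "same size" the lecture builds on.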

Remark: Please bring paper and a pencil to class.