
(Main) Seminar: Connections of Deep Learning and PDEs for Visual Computing

Karl Schrader, Prof. Joachim Weickert

Summer Term 2021

(Main) Seminar (2 h)

Notice for bachelor/master students of mathematics: This is a »Hauptseminar« in the sense of these study programs.

Figure: Classification of a Swiss roll. (Authors: Eldad Haber and Lars Ruthotto)


NEWS:

April 09, 2021:
Announcement of time slot.

February 04, 2021:
Two additional papers were added to the list of topics.

February 03, 2021:
The registration is now closed.

January 28, 2021:
The seminar will be fully online.
All students will be added to a Microsoft Teams group; all communication will take place there.






Introductory meeting (mandatory):
The introductory meeting will take place on Friday, February 05, 2021, 2:15 p.m., via Teams.
In this meeting, we will assign the topics to the participants. Attendance is mandatory for all participants. You are expected to turn on your camera during the meeting. Do not forget to register first (see below).
All registered participants will be added to the Team prior to the meeting.

Regular meetings during the summer term 2021:
Every Wednesday at 16:00 s.t., starting May 05; web meetings via Teams.

Contents: Partial differential equations (PDEs) are at the core of many methods in image processing and computer vision, such as denoising, image inpainting, and deblurring. In recent years, neural networks have been challenging these model-based approaches in terms of performance, while lacking their compactness and theoretical foundations.
In this seminar, we cover two ways of bringing PDEs and neural networks together. First, we explore how PDEs can be solved efficiently with neural networks. Second, we transfer ideas from successful PDE models in image processing and computer vision to neural networks: examples include using stable discretisations of PDEs to build stable neural networks, and using reversible PDEs to construct arbitrarily deep networks. Two short sketches below illustrate both directions.
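To make the first direction concrete, the following is a minimal sketch of the physics-informed idea behind several papers on the topic list: a network u(x) is trained so that the residual of a PDE vanishes on a set of sample points. The toy problem (the 1D Poisson equation u''(x) = -sin(x) on (0, pi) with zero boundary values, whose exact solution is sin(x)), the network size, and all hyperparameters are illustrative choices; PyTorch is assumed, and nothing here is taken verbatim from a listed paper.

    import torch

    torch.manual_seed(0)

    # Small network u(x); architecture and width are illustrative choices.
    net = torch.nn.Sequential(
        torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
    )
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)

    # Collocation points in (0, pi) and the two boundary points.
    x = torch.linspace(0.0, torch.pi, 64).reshape(-1, 1).requires_grad_(True)
    boundary = torch.tensor([[0.0], [torch.pi]])

    for step in range(2000):
        u = net(x)
        # Derivatives of the network output w.r.t. its input via autograd.
        du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
        d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
        residual = d2u + torch.sin(x)            # u'' + sin(x) should vanish
        loss = (residual ** 2).mean() + (net(boundary) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

    # After training, net approximates the exact solution sin(x).

For the second direction, the key observation in several of the listed papers (e.g. those by Haber and Ruthotto and by Lu et al.) is that a residual block u_{k+1} = u_k + h f(u_k) is exactly one explicit Euler step for the evolution equation u'(t) = f(u(t)). The sketch below makes this reading explicit; choosing antisymmetric weight matrices is one simple device, discussed by Haber and Ruthotto, for keeping the forward propagation stable. Dimensions, depth, step size, and initialisation are again illustrative.

    import torch

    torch.manual_seed(0)

    d, depth, h = 4, 20, 0.1   # feature dimension, number of layers, step size

    # One weight matrix per layer; W = A - A^T is antisymmetric, so the
    # eigenvalues of the linearised dynamics lie on the imaginary axis,
    # a simple way to keep the forward propagation from exploding.
    A = torch.randn(depth, d, d)
    W = A - A.transpose(1, 2)
    b = torch.randn(depth, d)

    u = torch.randn(d)         # input features
    for k in range(depth):
        # One residual block = one explicit Euler step of u' = f(u).
        u = u + h * torch.tanh(W[k] @ u + b[k])

Shrinking the step size h, or replacing the explicit Euler step by a reversible two-step scheme as in the paper by Chang et al., then allows stacking arbitrarily many such layers.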

Prerequisites: The seminar is for advanced bachelor or master students in Visual Computing, Mathematics, or Computer Science. Basic mathematical knowledge (e.g. Mathematik für Informatiker I-III) and some knowledge of image processing and computer vision are required. Previous knowledge of neural networks is helpful but not required.

Language: All papers are written in English, and English is the language of presentation.


The registration for this course is closed. You can see the order of the registrations here.

Since the number of talks is limited, we ask for your understanding that participants will be considered strictly in the order of registration – no exceptions.

Regular attendance: You must attend all virtual seminar meetings and are expected to turn on your camera during the meetings. If you are sick, please send a medical certificate to Karl Schrader by e-mail. If you have technical difficulties, let us know as soon as possible.

Talk: Your talk consists of a 20-minute prerecorded video that is divided into 3-4 parts. A supervisor will stream these parts live during the virtual seminar meeting. After each part, there will be room for questions, and after the talk there will be a final discussion. Your presentation, your slides, and your write-up all have to be in English. Send your video files in mp4 format and your presentation slides in pdf format to Karl Schrader at least 24 hours before the seminar meeting.

Write-up: The write-up is due three weeks after the lecture period ends; the deadline is Tuesday, August 31, 23:59. It should summarise your talk and consist of 5 pages per speaker. Please adhere to the guidelines for write-ups posted in our Teams group, and submit your write-up in pdf format directly in the corresponding assignment in Teams.

Plagiarism: Adhere to the standards of scientific referencing and avoid plagiarism: quotations and copied material (such as images) must be clearly marked as such, and a bibliography is required. Otherwise the seminar counts as failed. See the write-up guidelines for a detailed explanation of how to cite correctly.

Mandatory consultation: Your talk preparation (including a preliminary video and presentation slides) has to be presented to your seminar supervisor no later than one week before the talk. It is your responsibility to contact us in good time to arrange an appointment for a video call.

No-shows: No-shows are unfair to your fellow students: some talks build on previous talks, and your seminar place might have prevented another student from participating. Thus, if you do not show up for your scheduled talk (except for reasons beyond your control), we reserve the right to exclude you from future seminars of our group.

Participation in discussions: The discussions after the presentations are a vital part of this seminar. The audience (i.e. all participants) is expected to pose questions and to identify positive and negative aspects of the proposed idea.

Being on time: To avoid interrupting the seminar, all participants have to be logged into the web meeting on time. Please log in early in case technical difficulties arise. Participants who are regularly late must expect a negative effect on their grade.


The slides of the introductory meeting will be uploaded here. They contain important information for preparing a good talk.



No.  Paper ID  Date   Speaker / Topic

1    1         05/05  Yashaswini Mysuru Udaya Kumar
     M. A. Ponti, L. S. F. Ribeiro, T. S. Nazare, T. Bui, J. Collomosse:
     Everything you Wanted to Know about Deep Learning for Computer Vision but were Afraid to Ask.
     Sections I-III.

2    2         05/05  Saurabh Pandey
     G. Cybenko:
     Approximation by Superpositions of a Sigmoidal Function.

3    3         12/05  Paul Bungert
     A. Baydin, B. Pearlmutter, A. Radul, J. Siskind:
     Automatic Differentiation in Machine Learning: a Survey.

4    4         12/05  Sohom Mukherjee
     S. Rudy, S. Brunton, J. Proctor, J. Kutz:
     Data-driven discovery of partial differential equations.

5    7         19/05  Simon Sabri Schönhofen
     M. Lichtenstein, G. Pai, R. Kimmel:
     Deep Eikonal Solver.

6    5         19/05
     M. Raissi, P. Perdikaris, G. Karniadakis:
     Physics Informed Deep Learning (Part I): Data-driven Solutions of Nonlinear Partial Differential Equations.

7    8         26/05  Erik Johnson
     Z. Li, N. Kovachki, K. Azizzadenesheli, B. Liu, K. Bhattacharya, A. Stuart, A. Anandkumar:
     Fourier Neural Operator for Parametric Partial Differential Equations.

8    9         26/05  Matthias Hock
     W. E, J. Han, A. Jentzen:
     Algorithms for Solving High Dimensional PDEs: From Nonlinear Monte Carlo to Machine Learning.

9    10        02/06  Yassir Janah
     T. Alt, J. Weickert, P. Peter:
     Translating Diffusion, Wavelets, and Regularisation into Residual Networks.

10   12        02/06
     Y. Lu, A. Zhong, Q. Li, B. Dong:
     Beyond Finite Layer Neural Networks: Bridging Deep Architectures and Numerical Differential Equations.

11   11        09/06
     Y. Chen, T. Pock:
     Trainable Nonlinear Reaction Diffusion: A Flexible Framework for Fast and Effective Image Restoration.

12   13        09/06  Beste Ekmen
     B. Chang, L. Meng, E. Haber, L. Ruthotto, D. Begert, E. Holtham:
     Reversible Architectures for Arbitrarily Deep Residual Neural Networks.

13   14        16/06  Danish Shahzad
     E. Haber, L. Ruthotto:
     Stable Architectures for Deep Neural Networks.

14   16        16/06  Christen Millerdurai
     T. Alt, P. Peter, J. Weickert, K. Schrader:
     Translating Numerical Concepts for PDEs into Neural Architectures.


